Mar 10 09:44:10 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 10 09:44:10 crc restorecon[4742]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: 
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 
09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:44:10 crc 
restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 
09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 
09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:10 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc 
restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 09:44:11 crc restorecon[4742]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:44:11 crc restorecon[4742]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 10 09:44:11 crc kubenswrapper[4794]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 10 09:44:11 crc kubenswrapper[4794]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 10 09:44:11 crc kubenswrapper[4794]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 10 09:44:11 crc kubenswrapper[4794]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
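The "Flag ... has been deprecated" notices above all carry the same remedy: set the value in the KubeletConfiguration file named by --config instead of on the command line. Below is a minimal sketch of what that migration looks like, using the upstream kubelet config API; the CRI-O socket path, taint, and reservation values are illustrative assumptions, not values read from this node.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	kubeletconfigv1beta1 "k8s.io/kubelet/config/v1beta1"
	"sigs.k8s.io/yaml"
)

func main() {
	// Config-file equivalents of the deprecated flags logged above.
	// All concrete values here are illustrative, not taken from this node.
	cfg := kubeletconfigv1beta1.KubeletConfiguration{
		TypeMeta: metav1.TypeMeta{
			APIVersion: "kubelet.config.k8s.io/v1beta1",
			Kind:       "KubeletConfiguration",
		},
		// replaces --container-runtime-endpoint
		ContainerRuntimeEndpoint: "unix:///var/run/crio/crio.sock",
		// replaces --volume-plugin-dir
		VolumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec",
		// replaces --register-with-taints
		RegisterWithTaints: []corev1.Taint{{
			Key:    "node-role.kubernetes.io/master",
			Effect: corev1.TaintEffectNoSchedule,
		}},
		// replaces --system-reserved
		SystemReserved: map[string]string{"cpu": "500m", "memory": "1Gi"},
	}

	out, err := yaml.Marshal(&cfg)
	if err != nil {
		panic(err)
	}
	// Prints a YAML document usable as the file named by --config.
	fmt.Print(string(out))
}

The one flag above without a config-file equivalent is --minimum-container-ttl-duration; its own deprecation message points at the eviction settings (the evictionHard / evictionSoft fields of the same struct) instead.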
Mar 10 09:44:11 crc kubenswrapper[4794]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 10 09:44:11 crc kubenswrapper[4794]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.732689 4794 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739849 4794 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739879 4794 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739887 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739894 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739904 4794 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739913 4794 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739920 4794 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739927 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739934 4794 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739942 4794 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739949 4794 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739955 4794 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739960 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739967 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739973 4794 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739979 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.739985 4794 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740000 4794 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740006 4794 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740012 4794 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740018 
4794 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740023 4794 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740029 4794 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740035 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740041 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740047 4794 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740053 4794 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740061 4794 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740068 4794 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740076 4794 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740083 4794 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740089 4794 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740095 4794 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740101 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740108 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740114 4794 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740120 4794 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740126 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740133 4794 feature_gate.go:330] unrecognized feature gate: Example Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740139 4794 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740145 4794 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740151 4794 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740158 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740164 4794 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740170 4794 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740175 4794 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 09:44:11 
crc kubenswrapper[4794]: W0310 09:44:11.740181 4794 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740187 4794 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740193 4794 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740199 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740204 4794 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740211 4794 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740216 4794 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740222 4794 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740228 4794 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740234 4794 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740240 4794 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740248 4794 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740255 4794 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740261 4794 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740267 4794 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740274 4794 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740282 4794 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
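The long run of "unrecognized feature gate" warnings above is expected on OpenShift: the cluster hands its full FeatureGate list (GatewayAPI, OnClusterBuild, and so on) to every component, and names the embedded Kubernetes gate table does not register are warned about and skipped rather than treated as fatal, so startup proceeds. A minimal sketch of that tolerant parsing, with a hypothetical two-entry gate table standing in for the real one in k8s.io/component-base/featuregate (this is illustrative, not the kubelet's actual feature_gate.go):

    package main

    import (
        "fmt"
        "strings"
    )

    // known maps gate names this binary understands to their default
    // state. These two entries are stand-ins for the much larger table
    // the kubelet registers.
    var known = map[string]bool{
        "CloudDualStackNodeIPs": true,
        "KMSv1":                 true,
    }

    // setGates mirrors the behavior visible in the log: unknown names
    // produce a warning and are skipped, known names are applied.
    func setGates(spec string) map[string]bool {
        out := map[string]bool{}
        for k, v := range known {
            out[k] = v
        }
        for _, kv := range strings.Split(spec, ",") {
            parts := strings.SplitN(kv, "=", 2)
            if len(parts) != 2 {
                continue
            }
            name := parts[0]
            if _, ok := known[name]; !ok {
                fmt.Printf("W unrecognized feature gate: %s\n", name)
                continue
            }
            out[name] = parts[1] == "true"
        }
        return out
    }

    func main() {
        fmt.Println(setGates("CloudDualStackNodeIPs=true,GatewayAPI=true"))
    }

Fed the spec from the log, each unknown name would yield exactly one warning while the known GA gates still take effect, which is the pattern the warning run above shows.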
Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740290 4794 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740298 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740307 4794 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740314 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740320 4794 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740326 4794 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740354 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.740362 4794 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740485 4794 flags.go:64] FLAG: --address="0.0.0.0" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740505 4794 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740517 4794 flags.go:64] FLAG: --anonymous-auth="true" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740529 4794 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740539 4794 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740547 4794 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740557 4794 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740565 4794 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740572 4794 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740579 4794 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740587 4794 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740594 4794 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740601 4794 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740608 4794 flags.go:64] FLAG: --cgroup-root="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740615 4794 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740622 4794 flags.go:64] FLAG: --client-ca-file="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740629 4794 flags.go:64] FLAG: --cloud-config="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740636 4794 flags.go:64] FLAG: --cloud-provider="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740642 4794 flags.go:64] FLAG: --cluster-dns="[]" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740652 4794 flags.go:64] FLAG: --cluster-domain="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740659 4794 
flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740671 4794 flags.go:64] FLAG: --config-dir="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740678 4794 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740686 4794 flags.go:64] FLAG: --container-log-max-files="5" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740695 4794 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740702 4794 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740709 4794 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740716 4794 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740724 4794 flags.go:64] FLAG: --contention-profiling="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740730 4794 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740737 4794 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740744 4794 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740751 4794 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740760 4794 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740767 4794 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740776 4794 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740783 4794 flags.go:64] FLAG: --enable-load-reader="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740792 4794 flags.go:64] FLAG: --enable-server="true" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740799 4794 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740808 4794 flags.go:64] FLAG: --event-burst="100" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740815 4794 flags.go:64] FLAG: --event-qps="50" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740822 4794 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740830 4794 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740836 4794 flags.go:64] FLAG: --eviction-hard="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740845 4794 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740851 4794 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740858 4794 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740866 4794 flags.go:64] FLAG: --eviction-soft="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740873 4794 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740879 4794 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 10 09:44:11 crc 
kubenswrapper[4794]: I0310 09:44:11.740886 4794 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740893 4794 flags.go:64] FLAG: --experimental-mounter-path="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740900 4794 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740910 4794 flags.go:64] FLAG: --fail-swap-on="true" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740917 4794 flags.go:64] FLAG: --feature-gates="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740926 4794 flags.go:64] FLAG: --file-check-frequency="20s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740933 4794 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740940 4794 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740947 4794 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740955 4794 flags.go:64] FLAG: --healthz-port="10248" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740962 4794 flags.go:64] FLAG: --help="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740969 4794 flags.go:64] FLAG: --hostname-override="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740976 4794 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740984 4794 flags.go:64] FLAG: --http-check-frequency="20s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740991 4794 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.740998 4794 flags.go:64] FLAG: --image-credential-provider-config="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741005 4794 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741013 4794 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741020 4794 flags.go:64] FLAG: --image-service-endpoint="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741027 4794 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741034 4794 flags.go:64] FLAG: --kube-api-burst="100" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741041 4794 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741047 4794 flags.go:64] FLAG: --kube-api-qps="50" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741056 4794 flags.go:64] FLAG: --kube-reserved="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741063 4794 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741070 4794 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741077 4794 flags.go:64] FLAG: --kubelet-cgroups="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741084 4794 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741091 4794 flags.go:64] FLAG: --lock-file="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741098 4794 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 
09:44:11.741105 4794 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741112 4794 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741122 4794 flags.go:64] FLAG: --log-json-split-stream="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741129 4794 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741135 4794 flags.go:64] FLAG: --log-text-split-stream="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741146 4794 flags.go:64] FLAG: --logging-format="text" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741153 4794 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741160 4794 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741167 4794 flags.go:64] FLAG: --manifest-url="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741174 4794 flags.go:64] FLAG: --manifest-url-header="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741183 4794 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741190 4794 flags.go:64] FLAG: --max-open-files="1000000" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741199 4794 flags.go:64] FLAG: --max-pods="110" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741207 4794 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741214 4794 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741221 4794 flags.go:64] FLAG: --memory-manager-policy="None" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741228 4794 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741236 4794 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741242 4794 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741251 4794 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741268 4794 flags.go:64] FLAG: --node-status-max-images="50" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741275 4794 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741282 4794 flags.go:64] FLAG: --oom-score-adj="-999" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741289 4794 flags.go:64] FLAG: --pod-cidr="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741296 4794 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741309 4794 flags.go:64] FLAG: --pod-manifest-path="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741316 4794 flags.go:64] FLAG: --pod-max-pids="-1" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741323 4794 flags.go:64] FLAG: --pods-per-core="0" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741352 4794 flags.go:64] FLAG: --port="10250" Mar 10 09:44:11 crc 
kubenswrapper[4794]: I0310 09:44:11.741363 4794 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741371 4794 flags.go:64] FLAG: --provider-id="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741378 4794 flags.go:64] FLAG: --qos-reserved="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741385 4794 flags.go:64] FLAG: --read-only-port="10255" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741393 4794 flags.go:64] FLAG: --register-node="true" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741400 4794 flags.go:64] FLAG: --register-schedulable="true" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741407 4794 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741420 4794 flags.go:64] FLAG: --registry-burst="10" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741432 4794 flags.go:64] FLAG: --registry-qps="5" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741440 4794 flags.go:64] FLAG: --reserved-cpus="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741447 4794 flags.go:64] FLAG: --reserved-memory="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741455 4794 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741463 4794 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741470 4794 flags.go:64] FLAG: --rotate-certificates="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741476 4794 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741483 4794 flags.go:64] FLAG: --runonce="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741490 4794 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741497 4794 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741505 4794 flags.go:64] FLAG: --seccomp-default="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741512 4794 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741519 4794 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741527 4794 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741534 4794 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741542 4794 flags.go:64] FLAG: --storage-driver-password="root" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741548 4794 flags.go:64] FLAG: --storage-driver-secure="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741555 4794 flags.go:64] FLAG: --storage-driver-table="stats" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741562 4794 flags.go:64] FLAG: --storage-driver-user="root" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741569 4794 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741576 4794 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741584 4794 flags.go:64] FLAG: --system-cgroups="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 
09:44:11.741590 4794 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741602 4794 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741608 4794 flags.go:64] FLAG: --tls-cert-file="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741615 4794 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741625 4794 flags.go:64] FLAG: --tls-min-version="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741632 4794 flags.go:64] FLAG: --tls-private-key-file="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741640 4794 flags.go:64] FLAG: --topology-manager-policy="none" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741647 4794 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741653 4794 flags.go:64] FLAG: --topology-manager-scope="container" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741660 4794 flags.go:64] FLAG: --v="2" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741673 4794 flags.go:64] FLAG: --version="false" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741682 4794 flags.go:64] FLAG: --vmodule="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741691 4794 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.741698 4794 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741877 4794 feature_gate.go:330] unrecognized feature gate: Example Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741886 4794 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741892 4794 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741899 4794 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741906 4794 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741912 4794 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741919 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741925 4794 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741931 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741937 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741943 4794 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741951 4794 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741957 4794 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741963 4794 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 
09:44:11.741969 4794 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741976 4794 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741983 4794 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741990 4794 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.741996 4794 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742002 4794 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742007 4794 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742013 4794 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742018 4794 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742023 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742029 4794 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742034 4794 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742040 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742045 4794 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742055 4794 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742061 4794 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742066 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742071 4794 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742077 4794 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742082 4794 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742088 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742093 4794 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742098 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742104 4794 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742109 4794 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742115 4794 feature_gate.go:330] unrecognized feature 
gate: CSIDriverSharedResource Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742120 4794 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742125 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742131 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742137 4794 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742143 4794 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742150 4794 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742157 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742163 4794 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742168 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742174 4794 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742180 4794 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742186 4794 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742193 4794 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742200 4794 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742206 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742213 4794 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742244 4794 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742252 4794 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
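The FLAG: lines in the middle of this run (flags.go:64) are the kubelet dumping every command-line flag with its effective post-parse value, defaults and overrides alike, which is why untouched flags such as --port="10250" sit next to machine-specific ones such as --node-ip="192.168.126.11". The kubelet builds its flag set with spf13/pflag, but the standard library shows the same walk; the two flags below are hypothetical stand-ins:

    package main

    import (
        "flag"
        "fmt"
    )

    func main() {
        // Hypothetical flags standing in for the kubelet's larger set.
        flag.String("node-ip", "", "IP address of the node")
        flag.Int("max-pods", 110, "maximum pods per node")
        flag.Parse()

        // After parsing, walk every registered flag and print its
        // effective value, exactly the shape of the FLAG: lines above.
        flag.VisitAll(func(f *flag.Flag) {
            fmt.Printf("FLAG: --%s=%q\n", f.Name, f.Value.String())
        })
    }

Printing the post-parse values rather than the raw argv is what makes this dump useful for debugging: it records what the process actually ran with, not just what was typed on the command line.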
Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742259 4794 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742265 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742273 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742279 4794 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742285 4794 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742290 4794 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742304 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742311 4794 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742316 4794 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742322 4794 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742327 4794 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742352 4794 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.742358 4794 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.742378 4794 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.754405 4794 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.754439 4794 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754536 4794 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
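After the per-gate warnings, feature_gate.go:386 prints the merged result: defaults plus explicit overrides, containing only the gates the binary actually knows (CloudDualStackNodeIPs, KMSv1, ValidatingAdmissionPolicy, and so on). The identical warning run and map repeat below, apparently as successive startup stages re-apply the same gate spec. The "Golang settings" line shows empty GOGC/GOMAXPROCS/GOTRACEBACK, consistent with logging the raw environment variables rather than the effective runtime values; a small sketch of that distinction, assuming that reading:

    package main

    import (
        "fmt"
        "os"
        "runtime"
        "runtime/debug"
    )

    func main() {
        // Raw environment values: empty when nothing overrides the
        // runtime defaults, matching GOGC="" GOMAXPROCS="" above.
        fmt.Printf("GOGC=%q GOMAXPROCS=%q GOTRACEBACK=%q\n",
            os.Getenv("GOGC"), os.Getenv("GOMAXPROCS"), os.Getenv("GOTRACEBACK"))

        // The effective values differ from the (empty) env vars.
        fmt.Println("effective GOMAXPROCS:", runtime.GOMAXPROCS(0))
        if bi, ok := debug.ReadBuildInfo(); ok {
            fmt.Println("built with:", bi.GoVersion)
        }
    }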
Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754548 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754555 4794 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754562 4794 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754571 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754576 4794 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754583 4794 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754589 4794 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754594 4794 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754602 4794 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754607 4794 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754613 4794 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754618 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754623 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754630 4794 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754635 4794 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754641 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754647 4794 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754655 4794 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754665 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754671 4794 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754679 4794 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754685 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754691 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754699 4794 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754707 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754713 4794 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754720 4794 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754727 4794 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754733 4794 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754740 4794 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754747 4794 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754754 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754760 4794 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754766 4794 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754772 4794 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754778 4794 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754784 4794 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754789 4794 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754795 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754808 4794 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754813 4794 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754819 4794 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754824 4794 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754830 4794 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754836 4794 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754841 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754846 4794 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754852 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754857 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754863 4794 feature_gate.go:330] 
unrecognized feature gate: Example Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754869 4794 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754874 4794 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754880 4794 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754885 4794 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754891 4794 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754897 4794 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754903 4794 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754908 4794 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754914 4794 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754919 4794 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754925 4794 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754932 4794 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754937 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754943 4794 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754948 4794 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754954 4794 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754959 4794 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754965 4794 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754970 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.754976 4794 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.754985 4794 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755150 4794 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755161 4794 feature_gate.go:353] 
Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755169 4794 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755176 4794 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755182 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755189 4794 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755197 4794 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755205 4794 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755212 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755219 4794 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755226 4794 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755232 4794 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755238 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755244 4794 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755249 4794 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755255 4794 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755261 4794 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755266 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755271 4794 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755278 4794 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755284 4794 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755289 4794 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755294 4794 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755300 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755306 4794 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755311 4794 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755317 4794 
feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755322 4794 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755350 4794 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755357 4794 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755364 4794 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755371 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755377 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755383 4794 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755389 4794 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755395 4794 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755400 4794 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755405 4794 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755411 4794 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755418 4794 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755424 4794 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755430 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755435 4794 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755441 4794 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755446 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755452 4794 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755458 4794 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755463 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755469 4794 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755474 4794 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755479 4794 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755485 4794 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 
10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755490 4794 feature_gate.go:330] unrecognized feature gate: Example Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755496 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755501 4794 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755508 4794 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755514 4794 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755520 4794 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755526 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755531 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755537 4794 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755542 4794 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755547 4794 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755553 4794 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755559 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755564 4794 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755570 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755575 4794 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755581 4794 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755586 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.755591 4794 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.755600 4794 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.757611 4794 server.go:940] "Client rotation is on, will bootstrap in background" Mar 10 09:44:11 crc kubenswrapper[4794]: E0310 09:44:11.761852 4794 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" 
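The UnhandledError above is the client-certificate bootstrap path: the client certificate embedded in /var/lib/kubelet/kubeconfig has passed its NotAfter date (2026-02-24), so the kubelet falls back to the bootstrap credentials and asks the control plane to sign a fresh CSR; that POST then fails with connection refused because the API server at api-int.crc.testing:6443 is not up yet this early in boot, and the rotation machinery is expected to retry once it is. A minimal sketch of the expiry check itself, standard library only (the real logic lives in the kubelet's certificate manager):

    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "os"
        "time"
    )

    // expired reports whether the first certificate in a PEM bundle has
    // passed its NotAfter date, roughly the check behind the
    // "bootstrap client certificate ... is expired" error above.
    func expired(path string, now time.Time) (bool, error) {
        data, err := os.ReadFile(path)
        if err != nil {
            return false, err
        }
        block, _ := pem.Decode(data)
        if block == nil {
            return false, fmt.Errorf("no PEM block in %s", path)
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            return false, err
        }
        return now.After(cert.NotAfter), nil
    }

    func main() {
        // Path taken from the log; any PEM client cert works.
        ok, err := expired("/var/lib/kubelet/pki/kubelet-client-current.pem", time.Now())
        fmt.Println(ok, err)
    }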
Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.766742 4794 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.766913 4794 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.769033 4794 server.go:997] "Starting client certificate rotation" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.769073 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.770448 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.799800 4794 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 09:44:11 crc kubenswrapper[4794]: E0310 09:44:11.802133 4794 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.804163 4794 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.824957 4794 log.go:25] "Validated CRI v1 runtime API" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.866427 4794 log.go:25] "Validated CRI v1 image API" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.869038 4794 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.874842 4794 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-10-09-39-32-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.874891 4794 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.900129 4794 manager.go:217] Machine: {Timestamp:2026-03-10 09:44:11.897750058 +0000 UTC m=+0.653920896 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:970a308b-8f2f-4747-b542-8544494e7e13 BootID:30640994-851b-4f33-a3b0-5689f89c6242 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 
Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:61:91:c8 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:61:91:c8 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:82:5b:67 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:44:8e:09 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:37:40:36 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ce:68:94 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:33:d5:1d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:22:db:a9:eb:56:a9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:8e:99:76:3e:94:4e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction 
Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.900405 4794 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.900573 4794 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.903912 4794 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.904241 4794 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.904303 4794 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 
09:44:11.904667 4794 topology_manager.go:138] "Creating topology manager with none policy" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.904686 4794 container_manager_linux.go:303] "Creating device plugin manager" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.905161 4794 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.905911 4794 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.906146 4794 state_mem.go:36] "Initialized new in-memory state store" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.906276 4794 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.910985 4794 kubelet.go:418] "Attempting to sync node with API server" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.911020 4794 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.911092 4794 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.911115 4794 kubelet.go:324] "Adding apiserver pod source" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.911134 4794 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.915915 4794 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.917158 4794 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
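The bootstrap sequence in the records above is driven by certificate expiry: the client certificate embedded in /var/lib/kubelet/kubeconfig expired on 2026-02-24, so the kubelet falls back to bootstrap credentials, loads /var/lib/kubelet/pki/kubelet-client-current.pem, and starts rotation; the first CSR POST to https://api-int.crc.testing:6443 then fails with connection refused simply because the API server is not yet serving at this point in startup. The expiry condition itself is easy to check by hand against the PEM bundles named in the log (a stdlib-only diagnostic sketch, not kubelet code; the same check applies to the cert data embedded in the kubeconfig):

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	// Path taken from the log; kubelet-server-current.pem works the same way.
	path := "/var/lib/kubelet/pki/kubelet-client-current.pem"
	data, err := os.ReadFile(path)
	if err != nil {
		log.Fatal(err)
	}
	// The file holds both the certificate and its key, so walk every PEM
	// block and parse only the CERTIFICATE ones.
	for block, rest := pem.Decode(data); block != nil; block, rest = pem.Decode(rest) {
		if block.Type != "CERTIFICATE" {
			continue
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Printf("subject=%s notAfter=%s expired=%v\n",
			cert.Subject, cert.NotAfter.Format(time.RFC3339), time.Now().After(cert.NotAfter))
	}
}

A NotAfter in the past is exactly the condition bootstrap.go:266 reports; once the API server comes up, the rotation manager retries the CSR and replaces the pair.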
Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.919259 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.919318 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 10 09:44:11 crc kubenswrapper[4794]: E0310 09:44:11.919434 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 10 09:44:11 crc kubenswrapper[4794]: E0310 09:44:11.919498 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.919732 4794 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.922371 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.922398 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.922405 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.922412 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.922429 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.922437 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.922445 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.922459 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.922468 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.922474 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.922485 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.922492 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.923963 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 
09:44:11.925027 4794 server.go:1280] "Started kubelet" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.926530 4794 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.926431 4794 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.927062 4794 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.927131 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 10 09:44:11 crc systemd[1]: Started Kubernetes Kubelet. Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.936563 4794 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.936633 4794 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 10 09:44:11 crc kubenswrapper[4794]: E0310 09:44:11.937150 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.937230 4794 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.937240 4794 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.937306 4794 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.937423 4794 server.go:460] "Adding debug handlers to kubelet server" Mar 10 09:44:11 crc kubenswrapper[4794]: E0310 09:44:11.937789 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="200ms" Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.938676 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 10 09:44:11 crc kubenswrapper[4794]: E0310 09:44:11.938798 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.939025 4794 factory.go:55] Registering systemd factory Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.939065 4794 factory.go:221] Registration of the systemd container factory successfully Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.939873 4794 factory.go:153] Registering CRI-O factory Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.939909 4794 factory.go:221] Registration of the crio container factory successfully Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.939984 4794 factory.go:219] Registration of the 
containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.940010 4794 factory.go:103] Registering Raw factory Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.940064 4794 manager.go:1196] Started watching for new ooms in manager Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.940934 4794 manager.go:319] Starting recovery of all containers Mar 10 09:44:11 crc kubenswrapper[4794]: E0310 09:44:11.938632 4794 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b71a9b07f9445 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,LastTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944252 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944299 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944313 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944325 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944352 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944363 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944376 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944388 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944401 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944411 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944422 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944432 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944445 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944459 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944469 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944480 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944492 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944503 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944514 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944524 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944534 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944543 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944553 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944565 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944577 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944587 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944600 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944612 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944622 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944633 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944644 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944655 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944666 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944676 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944687 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944699 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944714 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944728 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944742 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944762 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944775 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944790 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944802 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944816 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944832 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944845 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944860 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944872 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944887 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944903 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944915 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944926 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944944 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944958 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944972 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.944984 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945000 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945027 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945039 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945048 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945062 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945072 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945085 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945098 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945110 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945121 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945130 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945140 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945154 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945165 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945174 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945193 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945204 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945219 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945233 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945244 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945256 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945267 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945629 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945653 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945675 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945698 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945711 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945730 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945741 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945752 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945768 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945782 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945800 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945810 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945821 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945836 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945847 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945863 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945874 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.945884 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948141 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948182 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948209 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948222 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948234 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948258 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948272 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948289 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948312 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948598 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948620 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948637 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948655 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948672 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948685 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948699 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948710 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948725 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948735 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948749 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948773 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948787 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948797 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948811 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.948988 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.949006 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.950889 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.950907 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.950917 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.950926 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.950935 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.950945 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.950954 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.950963 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.950974 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.950984 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.950995 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.951005 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.951014 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.951023 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.954510 4794 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.954592 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 
09:44:11.954605 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955108 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955160 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955184 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955206 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955219 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955232 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955270 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955294 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955319 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955364 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955387 4794 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955411 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955426 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955439 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955451 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955900 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955911 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955922 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955933 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955943 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955953 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955963 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955974 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955985 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.955998 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956009 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956019 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956028 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956040 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956050 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956060 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956071 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956080 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956091 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956099 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956109 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956118 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956128 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956139 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956148 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956157 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956167 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956177 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956186 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956197 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956212 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956221 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956231 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956242 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956255 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956268 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956279 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956290 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956301 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956314 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956323 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956349 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956360 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956369 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956379 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956388 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956399 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956408 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956418 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956428 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956439 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956449 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956460 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956472 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956482 4794 reconstruct.go:97] "Volume reconstruction finished" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.956492 4794 reconciler.go:26] "Reconciler: start to sync state" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.960460 4794 manager.go:324] Recovery completed Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.967942 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.969117 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.969150 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.969159 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.970287 4794 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.970307 4794 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.970327 4794 state_mem.go:36] "Initialized new in-memory state store" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.984946 4794 policy_none.go:49] "None policy: Start" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.986508 4794 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.986540 4794 state_mem.go:35] "Initializing new in-memory state store" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.994559 4794 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.997749 4794 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.997792 4794 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 10 09:44:11 crc kubenswrapper[4794]: I0310 09:44:11.997821 4794 kubelet.go:2335] "Starting kubelet main sync loop" Mar 10 09:44:11 crc kubenswrapper[4794]: E0310 09:44:11.997879 4794 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 10 09:44:11 crc kubenswrapper[4794]: W0310 09:44:11.999026 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 10 09:44:11 crc kubenswrapper[4794]: E0310 09:44:11.999114 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 10 09:44:12 crc kubenswrapper[4794]: E0310 09:44:12.037981 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.041982 4794 manager.go:334] "Starting Device Plugin manager" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.042116 4794 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.042135 4794 server.go:79] "Starting device plugin registration server" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.042735 4794 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.042786 4794 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.043091 4794 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.043208 4794 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.043222 4794 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 10 09:44:12 crc kubenswrapper[4794]: E0310 09:44:12.048897 4794 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.098075 4794 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.098211 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.100006 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.100035 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.100044 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.100241 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.100816 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.100883 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.101415 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.101458 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.101470 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.101945 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.102518 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.102585 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.102730 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.102780 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.102808 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.103584 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.103614 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.103625 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.103746 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.103963 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.103977 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.104002 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.104013 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.104018 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.105174 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.105190 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.105205 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.105218 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.105208 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.105272 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.105349 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.105434 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.105463 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.106089 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.106120 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.106132 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.106166 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.106183 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.106194 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.106525 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.106586 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.108111 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.108143 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.108162 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:12 crc kubenswrapper[4794]: E0310 09:44:12.138911 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="400ms" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.143957 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.145069 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.145123 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.145141 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.145177 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:44:12 crc kubenswrapper[4794]: E0310 09:44:12.145774 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.158820 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.158870 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.158912 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.158946 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.158975 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.159062 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.159106 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.159141 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.159189 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.159218 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.159307 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.159413 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.159475 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.159508 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.159540 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: E0310 09:44:12.176955 4794 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b71a9b07f9445 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,LastTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.260804 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.260894 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.260938 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.260982 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261022 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261026 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261069 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261105 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261071 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261127 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261230 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261243 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261141 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261148 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261147 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261277 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261375 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261316 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261489 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261527 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261536 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261578 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261620 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261628 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261662 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:44:12 
crc kubenswrapper[4794]: I0310 09:44:12.261693 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261709 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261709 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261751 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.261775 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.346299 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.347966 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.348048 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.348074 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.348113 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:44:12 crc kubenswrapper[4794]: E0310 09:44:12.348755 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.448518 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.457987 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.478298 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.487762 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.494443 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 09:44:12 crc kubenswrapper[4794]: W0310 09:44:12.495901 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-27468303c723c99d99daea0bb2a3dae656872d06b4ad91c5143cf1455d27a797 WatchSource:0}: Error finding container 27468303c723c99d99daea0bb2a3dae656872d06b4ad91c5143cf1455d27a797: Status 404 returned error can't find the container with id 27468303c723c99d99daea0bb2a3dae656872d06b4ad91c5143cf1455d27a797 Mar 10 09:44:12 crc kubenswrapper[4794]: W0310 09:44:12.497132 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-7e2074bbe61fab250b414a1bac8d714f71745694f0621151921087ab6888daaa WatchSource:0}: Error finding container 7e2074bbe61fab250b414a1bac8d714f71745694f0621151921087ab6888daaa: Status 404 returned error can't find the container with id 7e2074bbe61fab250b414a1bac8d714f71745694f0621151921087ab6888daaa Mar 10 09:44:12 crc kubenswrapper[4794]: W0310 09:44:12.499441 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-01f08cfd78ea2d075fb56fe39d0d3ee066e3a2604392711b8e74774739f18c7f WatchSource:0}: Error finding container 01f08cfd78ea2d075fb56fe39d0d3ee066e3a2604392711b8e74774739f18c7f: Status 404 returned error can't find the container with id 01f08cfd78ea2d075fb56fe39d0d3ee066e3a2604392711b8e74774739f18c7f Mar 10 09:44:12 crc kubenswrapper[4794]: W0310 09:44:12.510615 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-8f3381b074899ea580533e471b77eacf31d0c3f80c749354480912cd176ebc4b WatchSource:0}: Error finding container 8f3381b074899ea580533e471b77eacf31d0c3f80c749354480912cd176ebc4b: Status 404 returned error can't find the container with id 8f3381b074899ea580533e471b77eacf31d0c3f80c749354480912cd176ebc4b Mar 10 09:44:12 crc kubenswrapper[4794]: E0310 09:44:12.541527 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="800ms" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.749415 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.750682 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.750722 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.750732 4794 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.750752 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:44:12 crc kubenswrapper[4794]: E0310 09:44:12.751191 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Mar 10 09:44:12 crc kubenswrapper[4794]: W0310 09:44:12.891991 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 10 09:44:12 crc kubenswrapper[4794]: E0310 09:44:12.892073 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 10 09:44:12 crc kubenswrapper[4794]: I0310 09:44:12.928304 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 10 09:44:12 crc kubenswrapper[4794]: W0310 09:44:12.970581 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 10 09:44:12 crc kubenswrapper[4794]: E0310 09:44:12.970666 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 10 09:44:13 crc kubenswrapper[4794]: I0310 09:44:13.002083 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2b7e2515416277afd6874c821de9b05b0bbbf6ed151532da3d6c9eb31349bd4d"} Mar 10 09:44:13 crc kubenswrapper[4794]: I0310 09:44:13.003263 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"01f08cfd78ea2d075fb56fe39d0d3ee066e3a2604392711b8e74774739f18c7f"} Mar 10 09:44:13 crc kubenswrapper[4794]: I0310 09:44:13.005907 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7e2074bbe61fab250b414a1bac8d714f71745694f0621151921087ab6888daaa"} Mar 10 09:44:13 crc kubenswrapper[4794]: I0310 09:44:13.007384 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"27468303c723c99d99daea0bb2a3dae656872d06b4ad91c5143cf1455d27a797"} Mar 10 09:44:13 crc kubenswrapper[4794]: 
I0310 09:44:13.016596 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8f3381b074899ea580533e471b77eacf31d0c3f80c749354480912cd176ebc4b"} Mar 10 09:44:13 crc kubenswrapper[4794]: W0310 09:44:13.094449 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 10 09:44:13 crc kubenswrapper[4794]: E0310 09:44:13.094808 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 10 09:44:13 crc kubenswrapper[4794]: E0310 09:44:13.342428 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="1.6s" Mar 10 09:44:13 crc kubenswrapper[4794]: W0310 09:44:13.481121 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 10 09:44:13 crc kubenswrapper[4794]: E0310 09:44:13.481288 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 10 09:44:13 crc kubenswrapper[4794]: I0310 09:44:13.551509 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:13 crc kubenswrapper[4794]: I0310 09:44:13.553554 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:13 crc kubenswrapper[4794]: I0310 09:44:13.553610 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:13 crc kubenswrapper[4794]: I0310 09:44:13.553640 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:13 crc kubenswrapper[4794]: I0310 09:44:13.553672 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:44:13 crc kubenswrapper[4794]: E0310 09:44:13.554198 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Mar 10 09:44:13 crc kubenswrapper[4794]: I0310 09:44:13.816442 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 09:44:13 crc kubenswrapper[4794]: E0310 09:44:13.817695 4794 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create 
certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 10 09:44:13 crc kubenswrapper[4794]: I0310 09:44:13.928938 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.021818 4794 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4" exitCode=0 Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.021921 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4"} Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.021964 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.023637 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.023711 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.023731 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.024464 4794 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1" exitCode=0 Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.024525 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1"} Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.024602 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.028317 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.028410 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.028436 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.033629 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0aac675904ab2ca1d18eed0251bb1dafa297d7819e95189e0ea6288b164cae86"} Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.033696 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"76c05c05697162298be5d52e9bf3012e05344a077cc3cf6a53839cee75b671a9"} Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.033729 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6c038c0c25a477050dd9f8907667961d329c0c2202ab280127d9e91d82922b68"} Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.033752 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c2e8a850bc5db9bf7a7e73a9a71933744f371c73ceba41cd04d0b175614309ef"} Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.033707 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.034936 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.034988 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.035006 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.036211 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f" exitCode=0 Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.036380 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f"} Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.036432 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.038314 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.038385 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.038405 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.039274 4794 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e" exitCode=0 Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.039355 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e"} Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.039444 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.040601 4794 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.040638 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.040650 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.042432 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.043504 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.043572 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.043583 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:14 crc kubenswrapper[4794]: I0310 09:44:14.928278 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 10 09:44:14 crc kubenswrapper[4794]: E0310 09:44:14.943363 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="3.2s" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.048213 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.048295 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c"} Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.049214 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.049239 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.049250 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.051613 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff"} Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.051638 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66"} Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.051649 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2"} Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.051713 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.052752 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.052795 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.052808 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.055567 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5"} Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.055596 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2"} Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.055608 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7"} Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.055619 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd"} Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.057474 4794 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540" exitCode=0 Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.057602 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.057613 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540"} Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.057675 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.058639 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.058667 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.058678 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 
09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.059391 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.059418 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.059430 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.155179 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.156493 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.156529 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.156539 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:15 crc kubenswrapper[4794]: I0310 09:44:15.156563 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:44:15 crc kubenswrapper[4794]: E0310 09:44:15.157015 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.065971 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f034690168432a340998e04b0939fd2fc650d3c640fdb8ca664c718c5110d8e1"} Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.066135 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.067772 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.067833 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.067858 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.069739 4794 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850" exitCode=0 Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.069833 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850"} Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.069870 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.069913 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.069936 4794 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.069957 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.071923 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.071951 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.071991 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.072011 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.071965 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.072097 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.072195 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.072254 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.072279 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.773482 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.773652 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.775006 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.775048 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:16 crc kubenswrapper[4794]: I0310 09:44:16.775075 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:17 crc kubenswrapper[4794]: I0310 09:44:17.077504 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866"} Mar 10 09:44:17 crc kubenswrapper[4794]: I0310 09:44:17.077591 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75"} Mar 10 09:44:17 crc kubenswrapper[4794]: I0310 09:44:17.077624 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7"} Mar 10 09:44:17 
crc kubenswrapper[4794]: I0310 09:44:17.077650 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d"} Mar 10 09:44:17 crc kubenswrapper[4794]: I0310 09:44:17.077532 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 09:44:17 crc kubenswrapper[4794]: I0310 09:44:17.077737 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:17 crc kubenswrapper[4794]: I0310 09:44:17.078752 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:17 crc kubenswrapper[4794]: I0310 09:44:17.078784 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:17 crc kubenswrapper[4794]: I0310 09:44:17.078795 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:17 crc kubenswrapper[4794]: I0310 09:44:17.995903 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.084101 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8"} Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.084312 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.085617 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.085703 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.085729 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.357853 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.358877 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.358941 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.358964 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.359000 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.364853 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.365003 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.365047 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 
09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.366659 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.366715 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:18 crc kubenswrapper[4794]: I0310 09:44:18.366731 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:19 crc kubenswrapper[4794]: I0310 09:44:19.088148 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:19 crc kubenswrapper[4794]: I0310 09:44:19.089291 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:19 crc kubenswrapper[4794]: I0310 09:44:19.089391 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:19 crc kubenswrapper[4794]: I0310 09:44:19.089417 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:19 crc kubenswrapper[4794]: I0310 09:44:19.267861 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:19 crc kubenswrapper[4794]: I0310 09:44:19.268422 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:19 crc kubenswrapper[4794]: I0310 09:44:19.270027 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:19 crc kubenswrapper[4794]: I0310 09:44:19.270075 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:19 crc kubenswrapper[4794]: I0310 09:44:19.270092 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:19 crc kubenswrapper[4794]: I0310 09:44:19.444742 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:19 crc kubenswrapper[4794]: I0310 09:44:19.724551 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.066637 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.066765 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.067857 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.067882 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.067891 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.091092 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.091157 4794 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.092914 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.092959 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.092978 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.093731 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.093799 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.093820 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.822225 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.822498 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.824107 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.824198 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.824219 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:20 crc kubenswrapper[4794]: I0310 09:44:20.983666 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 10 09:44:21 crc kubenswrapper[4794]: I0310 09:44:21.093451 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:21 crc kubenswrapper[4794]: I0310 09:44:21.094553 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:21 crc kubenswrapper[4794]: I0310 09:44:21.094609 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:21 crc kubenswrapper[4794]: I0310 09:44:21.094627 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:21 crc kubenswrapper[4794]: I0310 09:44:21.428907 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:21 crc kubenswrapper[4794]: I0310 09:44:21.429160 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:21 crc kubenswrapper[4794]: I0310 09:44:21.430862 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:21 crc kubenswrapper[4794]: I0310 09:44:21.430998 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:44:21 crc kubenswrapper[4794]: I0310 09:44:21.431020 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:22 crc kubenswrapper[4794]: E0310 09:44:22.049483 4794 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 09:44:22 crc kubenswrapper[4794]: I0310 09:44:22.861430 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:22 crc kubenswrapper[4794]: I0310 09:44:22.861652 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:22 crc kubenswrapper[4794]: I0310 09:44:22.864280 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:22 crc kubenswrapper[4794]: I0310 09:44:22.864367 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:22 crc kubenswrapper[4794]: I0310 09:44:22.864387 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:22 crc kubenswrapper[4794]: I0310 09:44:22.870208 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:23 crc kubenswrapper[4794]: I0310 09:44:23.067294 4794 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 09:44:23 crc kubenswrapper[4794]: I0310 09:44:23.067458 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 09:44:23 crc kubenswrapper[4794]: I0310 09:44:23.099409 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:23 crc kubenswrapper[4794]: I0310 09:44:23.100660 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:23 crc kubenswrapper[4794]: I0310 09:44:23.100731 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:23 crc kubenswrapper[4794]: I0310 09:44:23.100748 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:23 crc kubenswrapper[4794]: I0310 09:44:23.104696 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:24 crc kubenswrapper[4794]: I0310 09:44:24.101810 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:24 crc kubenswrapper[4794]: I0310 09:44:24.103043 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:24 crc 
kubenswrapper[4794]: I0310 09:44:24.103110 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:24 crc kubenswrapper[4794]: I0310 09:44:24.103127 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:25 crc kubenswrapper[4794]: W0310 09:44:25.639183 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 10 09:44:25 crc kubenswrapper[4794]: I0310 09:44:25.639319 4794 trace.go:236] Trace[1827448863]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 09:44:15.636) (total time: 10002ms): Mar 10 09:44:25 crc kubenswrapper[4794]: Trace[1827448863]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (09:44:25.639) Mar 10 09:44:25 crc kubenswrapper[4794]: Trace[1827448863]: [10.002293785s] [10.002293785s] END Mar 10 09:44:25 crc kubenswrapper[4794]: E0310 09:44:25.639395 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 09:44:25 crc kubenswrapper[4794]: W0310 09:44:25.650229 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 10 09:44:25 crc kubenswrapper[4794]: I0310 09:44:25.650415 4794 trace.go:236] Trace[517050099]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 09:44:15.648) (total time: 10001ms): Mar 10 09:44:25 crc kubenswrapper[4794]: Trace[517050099]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:44:25.650) Mar 10 09:44:25 crc kubenswrapper[4794]: Trace[517050099]: [10.001596646s] [10.001596646s] END Mar 10 09:44:25 crc kubenswrapper[4794]: E0310 09:44:25.650460 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 09:44:25 crc kubenswrapper[4794]: W0310 09:44:25.832580 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 10 09:44:25 crc kubenswrapper[4794]: I0310 09:44:25.832676 4794 trace.go:236] Trace[877114793]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 09:44:15.830) (total time: 10001ms): Mar 10 09:44:25 crc kubenswrapper[4794]: Trace[877114793]: ---"Objects listed" error:Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:44:25.832) Mar 10 09:44:25 crc kubenswrapper[4794]: Trace[877114793]: [10.001834539s] [10.001834539s] END Mar 10 09:44:25 crc kubenswrapper[4794]: E0310 09:44:25.832700 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 09:44:25 crc kubenswrapper[4794]: W0310 09:44:25.867706 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 10 09:44:25 crc kubenswrapper[4794]: I0310 09:44:25.867827 4794 trace.go:236] Trace[1471991380]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 09:44:15.865) (total time: 10001ms): Mar 10 09:44:25 crc kubenswrapper[4794]: Trace[1471991380]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:44:25.867) Mar 10 09:44:25 crc kubenswrapper[4794]: Trace[1471991380]: [10.001798563s] [10.001798563s] END Mar 10 09:44:25 crc kubenswrapper[4794]: E0310 09:44:25.867855 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 09:44:25 crc kubenswrapper[4794]: I0310 09:44:25.928365 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 10 09:44:26 crc kubenswrapper[4794]: E0310 09:44:26.152593 4794 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 09:44:26 crc kubenswrapper[4794]: E0310 09:44:26.155573 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:26Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 10 09:44:26 crc kubenswrapper[4794]: E0310 09:44:26.161634 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:26Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 09:44:26 crc kubenswrapper[4794]: I0310 09:44:26.167648 
4794 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 09:44:26 crc kubenswrapper[4794]: I0310 09:44:26.167710 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 09:44:26 crc kubenswrapper[4794]: E0310 09:44:26.169730 4794 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:26Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b71a9b07f9445 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,LastTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:44:26 crc kubenswrapper[4794]: I0310 09:44:26.174631 4794 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 09:44:26 crc kubenswrapper[4794]: I0310 09:44:26.174682 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 09:44:26 crc kubenswrapper[4794]: I0310 09:44:26.933036 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:26Z is after 2026-02-23T05:33:13Z Mar 10 09:44:27 crc kubenswrapper[4794]: I0310 09:44:27.112054 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 09:44:27 crc kubenswrapper[4794]: I0310 09:44:27.113904 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f034690168432a340998e04b0939fd2fc650d3c640fdb8ca664c718c5110d8e1" exitCode=255 Mar 10 09:44:27 crc kubenswrapper[4794]: I0310 09:44:27.113942 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f034690168432a340998e04b0939fd2fc650d3c640fdb8ca664c718c5110d8e1"} Mar 10 09:44:27 crc kubenswrapper[4794]: I0310 09:44:27.114110 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:27 crc kubenswrapper[4794]: I0310 09:44:27.118644 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:27 crc kubenswrapper[4794]: I0310 09:44:27.118702 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:27 crc kubenswrapper[4794]: I0310 09:44:27.118719 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:27 crc kubenswrapper[4794]: I0310 09:44:27.119533 4794 scope.go:117] "RemoveContainer" containerID="f034690168432a340998e04b0939fd2fc650d3c640fdb8ca664c718c5110d8e1" Mar 10 09:44:27 crc kubenswrapper[4794]: I0310 09:44:27.933901 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:27Z is after 2026-02-23T05:33:13Z Mar 10 09:44:28 crc kubenswrapper[4794]: I0310 09:44:28.119276 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 09:44:28 crc kubenswrapper[4794]: I0310 09:44:28.121841 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6c31b6c56c5ba692f9061be00f838e090d196cc0d971e624f775e557e9ec0268"} Mar 10 09:44:28 crc kubenswrapper[4794]: I0310 09:44:28.122025 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:28 crc kubenswrapper[4794]: I0310 09:44:28.123048 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:28 crc kubenswrapper[4794]: I0310 09:44:28.123228 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:28 crc kubenswrapper[4794]: I0310 09:44:28.123387 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:28 crc kubenswrapper[4794]: I0310 09:44:28.931513 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:28Z is after 2026-02-23T05:33:13Z Mar 10 09:44:29 crc kubenswrapper[4794]: W0310 09:44:29.083499 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:29Z is after 2026-02-23T05:33:13Z Mar 10 09:44:29 crc kubenswrapper[4794]: E0310 09:44:29.083594 4794 reflector.go:158] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 09:44:29 crc kubenswrapper[4794]: W0310 09:44:29.115601 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:29Z is after 2026-02-23T05:33:13Z Mar 10 09:44:29 crc kubenswrapper[4794]: E0310 09:44:29.115691 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.126023 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.127085 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.129179 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6c31b6c56c5ba692f9061be00f838e090d196cc0d971e624f775e557e9ec0268" exitCode=255 Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.129225 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6c31b6c56c5ba692f9061be00f838e090d196cc0d971e624f775e557e9ec0268"} Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.129284 4794 scope.go:117] "RemoveContainer" containerID="f034690168432a340998e04b0939fd2fc650d3c640fdb8ca664c718c5110d8e1" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.129519 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.130926 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.130979 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.131002 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.132042 4794 scope.go:117] "RemoveContainer" containerID="6c31b6c56c5ba692f9061be00f838e090d196cc0d971e624f775e557e9ec0268" Mar 10 09:44:29 crc kubenswrapper[4794]: E0310 09:44:29.132412 4794 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.268936 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.451112 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.754753 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.755003 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.756484 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.756531 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.756544 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.774616 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 10 09:44:29 crc kubenswrapper[4794]: I0310 09:44:29.932126 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:29Z is after 2026-02-23T05:33:13Z Mar 10 09:44:30 crc kubenswrapper[4794]: I0310 09:44:30.134164 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 09:44:30 crc kubenswrapper[4794]: I0310 09:44:30.138017 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:30 crc kubenswrapper[4794]: I0310 09:44:30.138101 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:30 crc kubenswrapper[4794]: I0310 09:44:30.139495 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:30 crc kubenswrapper[4794]: I0310 09:44:30.139796 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:30 crc kubenswrapper[4794]: I0310 09:44:30.139838 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:30 crc kubenswrapper[4794]: I0310 09:44:30.139630 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:30 crc kubenswrapper[4794]: I0310 09:44:30.139929 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:44:30 crc kubenswrapper[4794]: I0310 09:44:30.139950 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:30 crc kubenswrapper[4794]: I0310 09:44:30.141965 4794 scope.go:117] "RemoveContainer" containerID="6c31b6c56c5ba692f9061be00f838e090d196cc0d971e624f775e557e9ec0268" Mar 10 09:44:30 crc kubenswrapper[4794]: E0310 09:44:30.142205 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:44:30 crc kubenswrapper[4794]: I0310 09:44:30.149313 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:30 crc kubenswrapper[4794]: W0310 09:44:30.474125 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:30Z is after 2026-02-23T05:33:13Z Mar 10 09:44:30 crc kubenswrapper[4794]: E0310 09:44:30.474621 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 09:44:30 crc kubenswrapper[4794]: I0310 09:44:30.932860 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:30Z is after 2026-02-23T05:33:13Z Mar 10 09:44:31 crc kubenswrapper[4794]: I0310 09:44:31.141464 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:31 crc kubenswrapper[4794]: I0310 09:44:31.143708 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:31 crc kubenswrapper[4794]: I0310 09:44:31.143954 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:31 crc kubenswrapper[4794]: I0310 09:44:31.144128 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:31 crc kubenswrapper[4794]: I0310 09:44:31.145091 4794 scope.go:117] "RemoveContainer" containerID="6c31b6c56c5ba692f9061be00f838e090d196cc0d971e624f775e557e9ec0268" Mar 10 09:44:31 crc kubenswrapper[4794]: E0310 09:44:31.145569 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:44:31 crc kubenswrapper[4794]: W0310 09:44:31.683198 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:31Z is after 2026-02-23T05:33:13Z Mar 10 09:44:31 crc kubenswrapper[4794]: E0310 09:44:31.683297 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 09:44:31 crc kubenswrapper[4794]: I0310 09:44:31.933365 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:31Z is after 2026-02-23T05:33:13Z Mar 10 09:44:32 crc kubenswrapper[4794]: E0310 09:44:32.049662 4794 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 09:44:32 crc kubenswrapper[4794]: I0310 09:44:32.143612 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:32 crc kubenswrapper[4794]: I0310 09:44:32.144822 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:32 crc kubenswrapper[4794]: I0310 09:44:32.144887 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:32 crc kubenswrapper[4794]: I0310 09:44:32.144905 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:32 crc kubenswrapper[4794]: I0310 09:44:32.145755 4794 scope.go:117] "RemoveContainer" containerID="6c31b6c56c5ba692f9061be00f838e090d196cc0d971e624f775e557e9ec0268" Mar 10 09:44:32 crc kubenswrapper[4794]: E0310 09:44:32.146038 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:44:32 crc kubenswrapper[4794]: E0310 09:44:32.558822 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:32Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 09:44:32 crc kubenswrapper[4794]: I0310 09:44:32.562053 4794 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 10 09:44:32 crc kubenswrapper[4794]: I0310 09:44:32.563225 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:32 crc kubenswrapper[4794]: I0310 09:44:32.563267 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:32 crc kubenswrapper[4794]: I0310 09:44:32.563285 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:32 crc kubenswrapper[4794]: I0310 09:44:32.563312 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:44:32 crc kubenswrapper[4794]: E0310 09:44:32.565853 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:32Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 09:44:32 crc kubenswrapper[4794]: I0310 09:44:32.934147 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:32Z is after 2026-02-23T05:33:13Z Mar 10 09:44:33 crc kubenswrapper[4794]: I0310 09:44:33.068025 4794 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 09:44:33 crc kubenswrapper[4794]: I0310 09:44:33.068199 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 09:44:33 crc kubenswrapper[4794]: I0310 09:44:33.740487 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:33 crc kubenswrapper[4794]: I0310 09:44:33.740820 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:33 crc kubenswrapper[4794]: I0310 09:44:33.743261 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:33 crc kubenswrapper[4794]: I0310 09:44:33.743359 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:33 crc kubenswrapper[4794]: I0310 09:44:33.743379 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:33 crc kubenswrapper[4794]: I0310 09:44:33.744079 4794 scope.go:117] "RemoveContainer" containerID="6c31b6c56c5ba692f9061be00f838e090d196cc0d971e624f775e557e9ec0268" Mar 10 09:44:33 crc kubenswrapper[4794]: E0310 09:44:33.744387 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" 
with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:44:33 crc kubenswrapper[4794]: I0310 09:44:33.931499 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:33Z is after 2026-02-23T05:33:13Z Mar 10 09:44:34 crc kubenswrapper[4794]: I0310 09:44:34.604158 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 09:44:34 crc kubenswrapper[4794]: E0310 09:44:34.608776 4794 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 09:44:34 crc kubenswrapper[4794]: I0310 09:44:34.932415 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:34Z is after 2026-02-23T05:33:13Z Mar 10 09:44:35 crc kubenswrapper[4794]: I0310 09:44:35.932675 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:35Z is after 2026-02-23T05:33:13Z Mar 10 09:44:36 crc kubenswrapper[4794]: E0310 09:44:36.176611 4794 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:36Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b71a9b07f9445 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,LastTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:44:36 crc kubenswrapper[4794]: I0310 09:44:36.930729 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:36Z is after 2026-02-23T05:33:13Z Mar 10 09:44:37 crc 
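[Editor's note: every client call above fails for the same reason: the API server's serving certificate expired at 2026-02-23T05:33:13Z, well before the node's current clock. A minimal sketch for confirming that from the node by printing the validity window of the certificates the server presents; InsecureSkipVerify is used only so the handshake completes and the expired certificate can be read, and reachability of api-int.crc.testing:6443 is an assumption.]

```go
// Dial the API server endpoint from the log and dump each served
// certificate's NotBefore/NotAfter; NotAfter should match the
// 2026-02-23T05:33:13Z cutoff quoted in the x509 errors.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
)

func main() {
	conn, err := tls.Dial("tcp", "api-int.crc.testing:6443", &tls.Config{
		InsecureSkipVerify: true, // inspect only; do not trust
	})
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%q notBefore=%s notAfter=%s\n",
			cert.Subject.CommonName, cert.NotBefore, cert.NotAfter)
	}
}
```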
Mar 10 09:44:37 crc kubenswrapper[4794]: W0310 09:44:37.794140 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:37Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:37 crc kubenswrapper[4794]: E0310 09:44:37.794258 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 09:44:37 crc kubenswrapper[4794]: I0310 09:44:37.931583 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:37Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:38 crc kubenswrapper[4794]: I0310 09:44:38.932952 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:38Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:39 crc kubenswrapper[4794]: W0310 09:44:39.340893 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:39Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:39 crc kubenswrapper[4794]: E0310 09:44:39.341007 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 09:44:39 crc kubenswrapper[4794]: E0310 09:44:39.562310 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:39Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 10 09:44:39 crc kubenswrapper[4794]: I0310 09:44:39.566546 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:44:39 crc kubenswrapper[4794]: I0310 09:44:39.568058 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:44:39 crc kubenswrapper[4794]: I0310 09:44:39.568134 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:44:39 crc kubenswrapper[4794]: I0310 09:44:39.568161 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:44:39 crc kubenswrapper[4794]: I0310 09:44:39.568205 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 09:44:39 crc kubenswrapper[4794]: E0310 09:44:39.571575 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:39Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 10 09:44:39 crc kubenswrapper[4794]: I0310 09:44:39.935462 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:39Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:40 crc kubenswrapper[4794]: W0310 09:44:40.316584 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:40Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:40 crc kubenswrapper[4794]: E0310 09:44:40.316663 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 09:44:40 crc kubenswrapper[4794]: W0310 09:44:40.703301 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:40Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:40 crc kubenswrapper[4794]: E0310 09:44:40.703391 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 09:44:40 crc kubenswrapper[4794]: I0310 09:44:40.932643 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:40Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:41 crc kubenswrapper[4794]: I0310 09:44:41.933119 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:41Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:42 crc kubenswrapper[4794]: E0310 09:44:42.049945 4794 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 09:44:42 crc kubenswrapper[4794]: I0310 09:44:42.932235 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:42Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:43 crc kubenswrapper[4794]: I0310 09:44:43.067638 4794 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 09:44:43 crc kubenswrapper[4794]: I0310 09:44:43.067727 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 09:44:43 crc kubenswrapper[4794]: I0310 09:44:43.067817 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 09:44:43 crc kubenswrapper[4794]: I0310 09:44:43.068010 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:44:43 crc kubenswrapper[4794]: I0310 09:44:43.069813 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:44:43 crc kubenswrapper[4794]: I0310 09:44:43.069878 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:44:43 crc kubenswrapper[4794]: I0310 09:44:43.069898 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:44:43 crc kubenswrapper[4794]: I0310 09:44:43.070659 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"6c038c0c25a477050dd9f8907667961d329c0c2202ab280127d9e91d82922b68"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 10 09:44:43 crc kubenswrapper[4794]: I0310 09:44:43.070935 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://6c038c0c25a477050dd9f8907667961d329c0c2202ab280127d9e91d82922b68" gracePeriod=30
Mar 10 09:44:43 crc kubenswrapper[4794]: I0310 09:44:43.933430 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:43Z is after 2026-02-23T05:33:13Z
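[Editor's note: the cluster-policy-controller startup probe fails differently from the rest of the log: a plain client timeout against https://192.168.126.11:10357/healthz rather than a TLS verification error. A sketch that mimics the probe from the node; the 1-second timeout stands in for the probe's configured timeoutSeconds, which is an assumption.]

```go
// Mimic the kubelet's HTTPS startup probe against the cluster-policy-controller
// health endpoint; a hung listener reproduces the "Client.Timeout exceeded
// while awaiting headers" text seen in the probe output above.
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{
		Timeout: time.Second, // assumed probe timeout
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get("https://192.168.126.11:10357/healthz")
	if err != nil {
		fmt.Println("probe-style failure:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```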
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:43Z is after 2026-02-23T05:33:13Z Mar 10 09:44:44 crc kubenswrapper[4794]: I0310 09:44:44.179450 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 09:44:44 crc kubenswrapper[4794]: I0310 09:44:44.180959 4794 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6c038c0c25a477050dd9f8907667961d329c0c2202ab280127d9e91d82922b68" exitCode=255 Mar 10 09:44:44 crc kubenswrapper[4794]: I0310 09:44:44.181022 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6c038c0c25a477050dd9f8907667961d329c0c2202ab280127d9e91d82922b68"} Mar 10 09:44:44 crc kubenswrapper[4794]: I0310 09:44:44.181091 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3e7c9a20b63dc51728211c782f530307eafabeb7f97e5bc87bcdacd29a91658e"} Mar 10 09:44:44 crc kubenswrapper[4794]: I0310 09:44:44.181227 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:44 crc kubenswrapper[4794]: I0310 09:44:44.185284 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:44 crc kubenswrapper[4794]: I0310 09:44:44.185382 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:44 crc kubenswrapper[4794]: I0310 09:44:44.185405 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:44 crc kubenswrapper[4794]: I0310 09:44:44.933326 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:44Z is after 2026-02-23T05:33:13Z Mar 10 09:44:45 crc kubenswrapper[4794]: I0310 09:44:45.931824 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:45Z is after 2026-02-23T05:33:13Z Mar 10 09:44:46 crc kubenswrapper[4794]: E0310 09:44:46.183492 4794 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:46Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b71a9b07f9445 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,LastTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:44:46 crc kubenswrapper[4794]: E0310 09:44:46.567901 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:46Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 09:44:46 crc kubenswrapper[4794]: I0310 09:44:46.572217 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:46 crc kubenswrapper[4794]: I0310 09:44:46.573911 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:46 crc kubenswrapper[4794]: I0310 09:44:46.573957 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:46 crc kubenswrapper[4794]: I0310 09:44:46.573976 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:46 crc kubenswrapper[4794]: I0310 09:44:46.574010 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:44:46 crc kubenswrapper[4794]: E0310 09:44:46.578699 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:46Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 09:44:46 crc kubenswrapper[4794]: I0310 09:44:46.774286 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:46 crc kubenswrapper[4794]: I0310 09:44:46.774598 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:46 crc kubenswrapper[4794]: I0310 09:44:46.776087 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:46 crc kubenswrapper[4794]: I0310 09:44:46.776136 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:46 crc kubenswrapper[4794]: I0310 09:44:46.776155 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:46 crc kubenswrapper[4794]: I0310 09:44:46.932616 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:46Z is after 2026-02-23T05:33:13Z Mar 10 09:44:46 crc kubenswrapper[4794]: I0310 09:44:46.999273 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:47 crc kubenswrapper[4794]: I0310 09:44:47.000761 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 10 09:44:47 crc kubenswrapper[4794]: I0310 09:44:47.000832 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:47 crc kubenswrapper[4794]: I0310 09:44:47.000858 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:47 crc kubenswrapper[4794]: I0310 09:44:47.001870 4794 scope.go:117] "RemoveContainer" containerID="6c31b6c56c5ba692f9061be00f838e090d196cc0d971e624f775e557e9ec0268" Mar 10 09:44:47 crc kubenswrapper[4794]: I0310 09:44:47.932626 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:47Z is after 2026-02-23T05:33:13Z Mar 10 09:44:48 crc kubenswrapper[4794]: I0310 09:44:48.194384 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 09:44:48 crc kubenswrapper[4794]: I0310 09:44:48.195165 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 09:44:48 crc kubenswrapper[4794]: I0310 09:44:48.198359 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b210b83aa3e9184693096ec3a1dcc70ebc8060a90f1d6639bbc625dee9013aaa" exitCode=255 Mar 10 09:44:48 crc kubenswrapper[4794]: I0310 09:44:48.198392 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b210b83aa3e9184693096ec3a1dcc70ebc8060a90f1d6639bbc625dee9013aaa"} Mar 10 09:44:48 crc kubenswrapper[4794]: I0310 09:44:48.198474 4794 scope.go:117] "RemoveContainer" containerID="6c31b6c56c5ba692f9061be00f838e090d196cc0d971e624f775e557e9ec0268" Mar 10 09:44:48 crc kubenswrapper[4794]: I0310 09:44:48.198656 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:48 crc kubenswrapper[4794]: I0310 09:44:48.200075 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:48 crc kubenswrapper[4794]: I0310 09:44:48.200116 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:48 crc kubenswrapper[4794]: I0310 09:44:48.200134 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:48 crc kubenswrapper[4794]: I0310 09:44:48.201247 4794 scope.go:117] "RemoveContainer" containerID="b210b83aa3e9184693096ec3a1dcc70ebc8060a90f1d6639bbc625dee9013aaa" Mar 10 09:44:48 crc kubenswrapper[4794]: E0310 09:44:48.201582 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:44:48 crc kubenswrapper[4794]: I0310 09:44:48.932056 4794 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:48Z is after 2026-02-23T05:33:13Z Mar 10 09:44:49 crc kubenswrapper[4794]: I0310 09:44:49.205385 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 09:44:49 crc kubenswrapper[4794]: I0310 09:44:49.268318 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:44:49 crc kubenswrapper[4794]: I0310 09:44:49.268601 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:49 crc kubenswrapper[4794]: I0310 09:44:49.270172 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:49 crc kubenswrapper[4794]: I0310 09:44:49.270237 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:49 crc kubenswrapper[4794]: I0310 09:44:49.270261 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:49 crc kubenswrapper[4794]: I0310 09:44:49.271281 4794 scope.go:117] "RemoveContainer" containerID="b210b83aa3e9184693096ec3a1dcc70ebc8060a90f1d6639bbc625dee9013aaa" Mar 10 09:44:49 crc kubenswrapper[4794]: E0310 09:44:49.271696 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:44:49 crc kubenswrapper[4794]: I0310 09:44:49.932019 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:49Z is after 2026-02-23T05:33:13Z Mar 10 09:44:50 crc kubenswrapper[4794]: I0310 09:44:50.066782 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:44:50 crc kubenswrapper[4794]: I0310 09:44:50.067077 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:44:50 crc kubenswrapper[4794]: I0310 09:44:50.068937 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:44:50 crc kubenswrapper[4794]: I0310 09:44:50.068996 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:44:50 crc kubenswrapper[4794]: I0310 09:44:50.069022 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:44:50 crc kubenswrapper[4794]: I0310 09:44:50.933397 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:50Z is after 2026-02-23T05:33:13Z Mar 10 09:44:51 crc kubenswrapper[4794]: I0310 09:44:51.655966 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 09:44:51 crc kubenswrapper[4794]: E0310 09:44:51.663673 4794 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 09:44:51 crc kubenswrapper[4794]: E0310 09:44:51.665000 4794 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 10 09:44:51 crc kubenswrapper[4794]: I0310 09:44:51.932792 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:51Z is after 2026-02-23T05:33:13Z Mar 10 09:44:52 crc kubenswrapper[4794]: E0310 09:44:52.050612 4794 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 09:44:52 crc kubenswrapper[4794]: I0310 09:44:52.933058 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:52Z is after 2026-02-23T05:33:13Z Mar 10 09:44:53 crc kubenswrapper[4794]: I0310 09:44:53.066868 4794 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 09:44:53 crc kubenswrapper[4794]: I0310 09:44:53.066970 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 09:44:53 crc kubenswrapper[4794]: E0310 09:44:53.573891 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:53Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 09:44:53 crc kubenswrapper[4794]: 
Mar 10 09:44:53 crc kubenswrapper[4794]: I0310 09:44:53.579761 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:44:53 crc kubenswrapper[4794]: I0310 09:44:53.581553 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:44:53 crc kubenswrapper[4794]: I0310 09:44:53.581645 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:44:53 crc kubenswrapper[4794]: I0310 09:44:53.581664 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:44:53 crc kubenswrapper[4794]: I0310 09:44:53.581699 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 09:44:53 crc kubenswrapper[4794]: E0310 09:44:53.585164 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:53Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 10 09:44:53 crc kubenswrapper[4794]: I0310 09:44:53.740268 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:44:53 crc kubenswrapper[4794]: I0310 09:44:53.740547 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:44:53 crc kubenswrapper[4794]: I0310 09:44:53.742200 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:44:53 crc kubenswrapper[4794]: I0310 09:44:53.742268 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:44:53 crc kubenswrapper[4794]: I0310 09:44:53.742287 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:44:53 crc kubenswrapper[4794]: I0310 09:44:53.743197 4794 scope.go:117] "RemoveContainer" containerID="b210b83aa3e9184693096ec3a1dcc70ebc8060a90f1d6639bbc625dee9013aaa"
Mar 10 09:44:53 crc kubenswrapper[4794]: E0310 09:44:53.743546 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 09:44:53 crc kubenswrapper[4794]: I0310 09:44:53.932985 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:53Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:54 crc kubenswrapper[4794]: W0310 09:44:54.559085 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:54Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:54 crc kubenswrapper[4794]: E0310 09:44:54.559203 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 09:44:54 crc kubenswrapper[4794]: I0310 09:44:54.933739 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:54Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:55 crc kubenswrapper[4794]: I0310 09:44:55.933370 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:55Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:56 crc kubenswrapper[4794]: E0310 09:44:56.190148 4794 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:56Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b71a9b07f9445 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,LastTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:44:56 crc kubenswrapper[4794]: I0310 09:44:56.933040 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:56Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:57 crc kubenswrapper[4794]: W0310 09:44:57.273793 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:57Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:57 crc kubenswrapper[4794]: E0310 09:44:57.273903 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 09:44:57 crc kubenswrapper[4794]: I0310 09:44:57.933054 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:57Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:58 crc kubenswrapper[4794]: I0310 09:44:58.932880 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:58Z is after 2026-02-23T05:33:13Z
Mar 10 09:44:59 crc kubenswrapper[4794]: I0310 09:44:59.932748 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:44:59Z is after 2026-02-23T05:33:13Z
Mar 10 09:45:00 crc kubenswrapper[4794]: E0310 09:45:00.580073 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:00Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 10 09:45:00 crc kubenswrapper[4794]: I0310 09:45:00.585777 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:45:00 crc kubenswrapper[4794]: I0310 09:45:00.587303 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:00 crc kubenswrapper[4794]: I0310 09:45:00.587559 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:00 crc kubenswrapper[4794]: I0310 09:45:00.587620 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:00 crc kubenswrapper[4794]: I0310 09:45:00.587661 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 09:45:00 crc kubenswrapper[4794]: E0310 09:45:00.592772 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:00Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 10 09:45:00 crc kubenswrapper[4794]: I0310 09:45:00.829716 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 09:45:00 crc kubenswrapper[4794]: I0310 09:45:00.829907 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:45:00 crc kubenswrapper[4794]: I0310 09:45:00.831245 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:00 crc kubenswrapper[4794]: I0310 09:45:00.831307 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:00 crc kubenswrapper[4794]: I0310 09:45:00.831367 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:00 crc kubenswrapper[4794]: I0310 09:45:00.933593 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:00Z is after 2026-02-23T05:33:13Z
Mar 10 09:45:01 crc kubenswrapper[4794]: W0310 09:45:01.274253 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:01Z is after 2026-02-23T05:33:13Z
Mar 10 09:45:01 crc kubenswrapper[4794]: E0310 09:45:01.274424 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 09:45:01 crc kubenswrapper[4794]: W0310 09:45:01.709086 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:01Z is after 2026-02-23T05:33:13Z
Mar 10 09:45:01 crc kubenswrapper[4794]: E0310 09:45:01.709230 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 09:45:01 crc kubenswrapper[4794]: I0310 09:45:01.933047 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:01Z is after 2026-02-23T05:33:13Z
Mar 10 09:45:02 crc kubenswrapper[4794]: E0310 09:45:02.051585 4794 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 09:45:02 crc kubenswrapper[4794]: I0310 09:45:02.932622 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:02Z is after 2026-02-23T05:33:13Z
Mar 10 09:45:03 crc kubenswrapper[4794]: I0310 09:45:03.067571 4794 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 09:45:03 crc kubenswrapper[4794]: I0310 09:45:03.067992 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 09:45:03 crc kubenswrapper[4794]: I0310 09:45:03.933024 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:03Z is after 2026-02-23T05:33:13Z Mar 10 09:45:04 crc kubenswrapper[4794]: I0310 09:45:04.932445 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:04Z is after 2026-02-23T05:33:13Z Mar 10 09:45:05 crc kubenswrapper[4794]: I0310 09:45:05.933011 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:05Z is after 2026-02-23T05:33:13Z Mar 10 09:45:06 crc kubenswrapper[4794]: E0310 09:45:06.196480 4794 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:06Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b71a9b07f9445 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,LastTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:06 crc kubenswrapper[4794]: I0310 09:45:06.930761 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:06Z is after 2026-02-23T05:33:13Z Mar 10 09:45:06 crc kubenswrapper[4794]: I0310 09:45:06.998877 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:07 crc kubenswrapper[4794]: I0310 09:45:07.000403 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:07 crc kubenswrapper[4794]: I0310 
09:45:07.000456 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:07 crc kubenswrapper[4794]: I0310 09:45:07.000474 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:07 crc kubenswrapper[4794]: I0310 09:45:07.001245 4794 scope.go:117] "RemoveContainer" containerID="b210b83aa3e9184693096ec3a1dcc70ebc8060a90f1d6639bbc625dee9013aaa" Mar 10 09:45:07 crc kubenswrapper[4794]: E0310 09:45:07.001542 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:45:07 crc kubenswrapper[4794]: E0310 09:45:07.584547 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:07Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 09:45:07 crc kubenswrapper[4794]: I0310 09:45:07.593659 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:07 crc kubenswrapper[4794]: I0310 09:45:07.594769 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:07 crc kubenswrapper[4794]: I0310 09:45:07.594796 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:07 crc kubenswrapper[4794]: I0310 09:45:07.594805 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:07 crc kubenswrapper[4794]: I0310 09:45:07.594822 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:45:07 crc kubenswrapper[4794]: E0310 09:45:07.597868 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:07Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 09:45:07 crc kubenswrapper[4794]: I0310 09:45:07.933172 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:07Z is after 2026-02-23T05:33:13Z Mar 10 09:45:08 crc kubenswrapper[4794]: I0310 09:45:08.935257 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:09 crc kubenswrapper[4794]: I0310 09:45:09.935026 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 
09:45:10 crc kubenswrapper[4794]: I0310 09:45:10.936742 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:11 crc kubenswrapper[4794]: I0310 09:45:11.936234 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:12 crc kubenswrapper[4794]: E0310 09:45:12.051742 4794 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 09:45:12 crc kubenswrapper[4794]: I0310 09:45:12.933951 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.067877 4794 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.068454 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.068567 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.068815 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.071855 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.071909 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.071928 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.072675 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"3e7c9a20b63dc51728211c782f530307eafabeb7f97e5bc87bcdacd29a91658e"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.072858 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" containerID="cri-o://3e7c9a20b63dc51728211c782f530307eafabeb7f97e5bc87bcdacd29a91658e" gracePeriod=30 Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.284429 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.286182 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.286756 4794 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3e7c9a20b63dc51728211c782f530307eafabeb7f97e5bc87bcdacd29a91658e" exitCode=255 Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.286807 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3e7c9a20b63dc51728211c782f530307eafabeb7f97e5bc87bcdacd29a91658e"} Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.286858 4794 scope.go:117] "RemoveContainer" containerID="6c038c0c25a477050dd9f8907667961d329c0c2202ab280127d9e91d82922b68" Mar 10 09:45:13 crc kubenswrapper[4794]: I0310 09:45:13.932076 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:14 crc kubenswrapper[4794]: I0310 09:45:14.291883 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 10 09:45:14 crc kubenswrapper[4794]: I0310 09:45:14.294072 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"875e6fa3d8ad5a7c381c1bf3efd51ccfa1202dd3fd9b30bc3e05fac7eb9fc3f7"} Mar 10 09:45:14 crc kubenswrapper[4794]: I0310 09:45:14.294209 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:14 crc kubenswrapper[4794]: I0310 09:45:14.295396 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:14 crc kubenswrapper[4794]: I0310 09:45:14.295604 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:14 crc kubenswrapper[4794]: I0310 09:45:14.295781 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:14 crc kubenswrapper[4794]: E0310 09:45:14.591774 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 09:45:14 crc kubenswrapper[4794]: I0310 09:45:14.598825 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:14 crc kubenswrapper[4794]: I0310 09:45:14.600493 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:14 crc kubenswrapper[4794]: I0310 09:45:14.600532 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:14 crc kubenswrapper[4794]: I0310 09:45:14.600543 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:14 crc kubenswrapper[4794]: I0310 09:45:14.600571 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:45:14 crc kubenswrapper[4794]: E0310 09:45:14.604732 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 09:45:14 crc kubenswrapper[4794]: I0310 09:45:14.933012 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:15 crc kubenswrapper[4794]: I0310 09:45:15.297706 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:15 crc kubenswrapper[4794]: I0310 09:45:15.299165 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:15 crc kubenswrapper[4794]: I0310 09:45:15.299200 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:15 crc kubenswrapper[4794]: I0310 09:45:15.299215 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:15 crc kubenswrapper[4794]: I0310 09:45:15.935498 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.204314 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b07f9445 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,LastTimestamp:2026-03-10 09:44:11.924984901 +0000 UTC m=+0.681155759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.209099 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32153a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969139622 +0000 UTC m=+0.725310440,LastTimestamp:2026-03-10 09:44:11.969139622 +0000 UTC m=+0.725310440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.213649 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32193fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969156093 +0000 UTC m=+0.725326911,LastTimestamp:2026-03-10 09:44:11.969156093 +0000 UTC m=+0.725326911,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.214725 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b321b581 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969164673 +0000 UTC m=+0.725335491,LastTimestamp:2026-03-10 09:44:11.969164673 +0000 UTC m=+0.725335491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.218810 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b7b3eeb2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:12.045856434 +0000 UTC m=+0.802027262,LastTimestamp:2026-03-10 09:44:12.045856434 +0000 UTC m=+0.802027262,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.227205 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b32153a6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32153a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969139622 +0000 UTC m=+0.725310440,LastTimestamp:2026-03-10 09:44:12.100026791 +0000 UTC m=+0.856197609,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.232039 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b32193fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32193fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969156093 +0000 UTC m=+0.725326911,LastTimestamp:2026-03-10 09:44:12.100040291 +0000 UTC m=+0.856211099,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.236806 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b321b581\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b321b581 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969164673 +0000 UTC m=+0.725335491,LastTimestamp:2026-03-10 09:44:12.100049552 +0000 UTC m=+0.856220360,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.241278 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b32153a6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32153a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969139622 +0000 UTC m=+0.725310440,LastTimestamp:2026-03-10 09:44:12.101448162 +0000 UTC m=+0.857618990,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.247656 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b32193fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32193fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969156093 +0000 UTC m=+0.725326911,LastTimestamp:2026-03-10 09:44:12.101465123 +0000 UTC m=+0.857635951,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.252538 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b321b581\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b321b581 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969164673 +0000 UTC m=+0.725335491,LastTimestamp:2026-03-10 09:44:12.101476453 +0000 UTC m=+0.857647281,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.257927 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b32153a6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32153a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969139622 +0000 UTC m=+0.725310440,LastTimestamp:2026-03-10 09:44:12.102756661 +0000 UTC m=+0.858927499,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.264933 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b32193fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32193fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969156093 +0000 UTC m=+0.725326911,LastTimestamp:2026-03-10 09:44:12.102790302 +0000 UTC m=+0.858961130,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.269016 4794 event.go:359] "Server rejected event (will not retry!)" err="events 
\"crc.189b71a9b321b581\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b321b581 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969164673 +0000 UTC m=+0.725335491,LastTimestamp:2026-03-10 09:44:12.102818203 +0000 UTC m=+0.858989041,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.273813 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b32153a6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32153a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969139622 +0000 UTC m=+0.725310440,LastTimestamp:2026-03-10 09:44:12.103605006 +0000 UTC m=+0.859775834,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.278697 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b32193fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32193fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969156093 +0000 UTC m=+0.725326911,LastTimestamp:2026-03-10 09:44:12.103621146 +0000 UTC m=+0.859791974,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.283300 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b321b581\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b321b581 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969164673 +0000 UTC m=+0.725335491,LastTimestamp:2026-03-10 09:44:12.103631486 +0000 UTC m=+0.859802314,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc 
kubenswrapper[4794]: E0310 09:45:16.287236 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b32153a6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32153a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969139622 +0000 UTC m=+0.725310440,LastTimestamp:2026-03-10 09:44:12.103994837 +0000 UTC m=+0.860165665,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.290979 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b32193fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32193fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969156093 +0000 UTC m=+0.725326911,LastTimestamp:2026-03-10 09:44:12.104009318 +0000 UTC m=+0.860180146,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.295016 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b321b581\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b321b581 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969164673 +0000 UTC m=+0.725335491,LastTimestamp:2026-03-10 09:44:12.104019378 +0000 UTC m=+0.860190206,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.299791 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b32153a6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32153a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969139622 +0000 UTC m=+0.725310440,LastTimestamp:2026-03-10 09:44:12.105193473 +0000 UTC m=+0.861364301,Count:7,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.303718 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b32153a6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32153a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969139622 +0000 UTC m=+0.725310440,LastTimestamp:2026-03-10 09:44:12.105201473 +0000 UTC m=+0.861372301,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.308222 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b32193fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32193fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969156093 +0000 UTC m=+0.725326911,LastTimestamp:2026-03-10 09:44:12.105213164 +0000 UTC m=+0.861383992,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.314373 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b321b581\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b321b581 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969164673 +0000 UTC m=+0.725335491,LastTimestamp:2026-03-10 09:44:12.105223734 +0000 UTC m=+0.861394562,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.318504 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b71a9b32193fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b71a9b32193fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:11.969156093 +0000 UTC 
m=+0.725326911,LastTimestamp:2026-03-10 09:44:12.105251805 +0000 UTC m=+0.861422663,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.321287 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71a9d2e8121f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:12.502258207 +0000 UTC m=+1.258429015,LastTimestamp:2026-03-10 09:44:12.502258207 +0000 UTC m=+1.258429015,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.325407 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71a9d2e84361 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:12.502270817 +0000 UTC m=+1.258441645,LastTimestamp:2026-03-10 09:44:12.502270817 +0000 UTC m=+1.258441645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.331530 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b71a9d309dc60 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:12.504472672 +0000 UTC m=+1.260643530,LastTimestamp:2026-03-10 
09:44:12.504472672 +0000 UTC m=+1.260643530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.336006 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b71a9d35ef28a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:12.510048906 +0000 UTC m=+1.266219734,LastTimestamp:2026-03-10 09:44:12.510048906 +0000 UTC m=+1.266219734,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.341404 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71a9d3eb2085 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:12.519235717 +0000 UTC m=+1.275406555,LastTimestamp:2026-03-10 09:44:12.519235717 +0000 UTC m=+1.275406555,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.347904 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71a9f1f8b1b1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.023441329 +0000 UTC m=+1.779612147,LastTimestamp:2026-03-10 09:44:13.023441329 +0000 UTC m=+1.779612147,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 
crc kubenswrapper[4794]: E0310 09:45:16.353120 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71a9f20f2606 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.024912902 +0000 UTC m=+1.781083720,LastTimestamp:2026-03-10 09:44:13.024912902 +0000 UTC m=+1.781083720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.357604 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b71a9f24d5e27 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.028990503 +0000 UTC m=+1.785161321,LastTimestamp:2026-03-10 09:44:13.028990503 +0000 UTC m=+1.785161321,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.362478 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b71a9f2922a81 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.033499265 +0000 UTC m=+1.789670083,LastTimestamp:2026-03-10 09:44:13.033499265 +0000 UTC m=+1.789670083,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.366020 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71a9f2a411d8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.0346726 +0000 UTC m=+1.790843418,LastTimestamp:2026-03-10 09:44:13.0346726 +0000 UTC m=+1.790843418,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.369561 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71a9f2a55771 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.034755953 +0000 UTC m=+1.790926781,LastTimestamp:2026-03-10 09:44:13.034755953 +0000 UTC m=+1.790926781,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.373850 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71a9f2a85355 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.034951509 +0000 UTC m=+1.791122327,LastTimestamp:2026-03-10 09:44:13.034951509 +0000 UTC m=+1.791122327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.378683 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b71a9f2af1881 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.035395201 +0000 UTC m=+1.791566019,LastTimestamp:2026-03-10 09:44:13.035395201 +0000 UTC m=+1.791566019,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.384456 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71a9f2bad8c2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.036165314 +0000 UTC m=+1.792336132,LastTimestamp:2026-03-10 09:44:13.036165314 +0000 UTC m=+1.792336132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.389035 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b71a9f35f4021 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.046939681 +0000 UTC m=+1.803110509,LastTimestamp:2026-03-10 09:44:13.046939681 +0000 UTC m=+1.803110509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.395551 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71a9f39ef693 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.051115155 +0000 UTC m=+1.807285973,LastTimestamp:2026-03-10 09:44:13.051115155 +0000 UTC m=+1.807285973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.399980 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71aa0595de46 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.352508998 +0000 UTC m=+2.108679856,LastTimestamp:2026-03-10 09:44:13.352508998 +0000 UTC m=+2.108679856,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.404675 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71aa066a7d4b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.366443339 +0000 UTC m=+2.122614197,LastTimestamp:2026-03-10 09:44:13.366443339 +0000 UTC m=+2.122614197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.408129 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71aa06846cc0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.36814304 +0000 UTC m=+2.124313898,LastTimestamp:2026-03-10 09:44:13.36814304 +0000 UTC m=+2.124313898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.412207 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71aa15156a3e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.612526142 +0000 UTC m=+2.368696970,LastTimestamp:2026-03-10 09:44:13.612526142 +0000 UTC m=+2.368696970,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.417386 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71aa161fab6f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.629975407 +0000 UTC m=+2.386146325,LastTimestamp:2026-03-10 09:44:13.629975407 +0000 UTC m=+2.386146325,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.422302 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71aa16337a04 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.631273476 +0000 UTC m=+2.387444304,LastTimestamp:2026-03-10 09:44:13.631273476 +0000 UTC m=+2.387444304,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.426091 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71aa22faa9d1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.845653969 +0000 UTC m=+2.601824787,LastTimestamp:2026-03-10 09:44:13.845653969 +0000 UTC m=+2.601824787,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.430229 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71aa26158700 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.897746176 +0000 UTC m=+2.653917034,LastTimestamp:2026-03-10 09:44:13.897746176 +0000 UTC m=+2.653917034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.434426 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b71aa2de55b54 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.028806996 +0000 UTC m=+2.784977854,LastTimestamp:2026-03-10 09:44:14.028806996 +0000 UTC m=+2.784977854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.438400 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b71aa2e0a4b2c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.031227692 +0000 UTC m=+2.787398540,LastTimestamp:2026-03-10 09:44:14.031227692 +0000 UTC m=+2.787398540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.446132 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa2eb11a5b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.042159707 +0000 UTC m=+2.798330565,LastTimestamp:2026-03-10 09:44:14.042159707 +0000 UTC m=+2.798330565,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.450656 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aa2efb564e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.047024718 +0000 UTC m=+2.803195536,LastTimestamp:2026-03-10 09:44:14.047024718 +0000 UTC m=+2.803195536,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.454294 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa3d5b39eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.288189931 +0000 UTC m=+3.044360749,LastTimestamp:2026-03-10 09:44:14.288189931 +0000 UTC m=+3.044360749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.457451 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aa3d6675e5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.288926181 +0000 UTC m=+3.045097009,LastTimestamp:2026-03-10 09:44:14.288926181 +0000 UTC m=+3.045097009,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.461210 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b71aa3d6711ae openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.288966062 +0000 UTC m=+3.045136890,LastTimestamp:2026-03-10 09:44:14.288966062 +0000 UTC m=+3.045136890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.464890 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b71aa3d6f6435 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.289511477 +0000 UTC m=+3.045682295,LastTimestamp:2026-03-10 09:44:14.289511477 +0000 UTC m=+3.045682295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.468631 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b71aa3e4e2197 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.304108951 +0000 UTC m=+3.060279779,LastTimestamp:2026-03-10 09:44:14.304108951 +0000 UTC m=+3.060279779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.471918 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b71aa3e5b8633 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.304986675 +0000 UTC m=+3.061157483,LastTimestamp:2026-03-10 09:44:14.304986675 +0000 UTC m=+3.061157483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.475203 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa3e7c7cfb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.307147003 +0000 UTC m=+3.063317821,LastTimestamp:2026-03-10 09:44:14.307147003 +0000 UTC m=+3.063317821,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.479121 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa3e887d7d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.307933565 +0000 UTC m=+3.064104383,LastTimestamp:2026-03-10 09:44:14.307933565 +0000 UTC m=+3.064104383,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.482753 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b71aa3ecd55fc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.312445436 +0000 UTC m=+3.068616254,LastTimestamp:2026-03-10 09:44:14.312445436 +0000 UTC m=+3.068616254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.486206 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aa3ecef897 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.312552599 +0000 UTC m=+3.068723417,LastTimestamp:2026-03-10 09:44:14.312552599 +0000 UTC m=+3.068723417,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.491857 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa4a301b77 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.503467895 +0000 UTC m=+3.259638713,LastTimestamp:2026-03-10 09:44:14.503467895 +0000 UTC m=+3.259638713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.495080 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b71aa4a301b81 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.503467905 +0000 UTC m=+3.259638723,LastTimestamp:2026-03-10 09:44:14.503467905 +0000 UTC m=+3.259638723,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.500720 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa4b059d13 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.517460243 +0000 UTC m=+3.273631061,LastTimestamp:2026-03-10 09:44:14.517460243 +0000 UTC m=+3.273631061,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.505797 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa4b153c0d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.518483981 +0000 UTC m=+3.274654799,LastTimestamp:2026-03-10 09:44:14.518483981 +0000 UTC m=+3.274654799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.511755 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b71aa4b2ed550 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.520161616 +0000 UTC m=+3.276332434,LastTimestamp:2026-03-10 09:44:14.520161616 +0000 UTC m=+3.276332434,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.516488 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b71aa4b3956ff openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.520850175 +0000 UTC m=+3.277020993,LastTimestamp:2026-03-10 09:44:14.520850175 +0000 UTC m=+3.277020993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.522019 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa5614467d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.702970493 +0000 UTC m=+3.459141321,LastTimestamp:2026-03-10 09:44:14.702970493 +0000 UTC m=+3.459141321,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.527391 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b71aa563591f6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.705152502 +0000 UTC m=+3.461323330,LastTimestamp:2026-03-10 09:44:14.705152502 +0000 UTC m=+3.461323330,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.532543 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa56cc66ec openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.71503742 +0000 UTC m=+3.471208248,LastTimestamp:2026-03-10 09:44:14.71503742 +0000 UTC m=+3.471208248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.538983 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa56dbfcf3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.716058867 +0000 UTC m=+3.472229705,LastTimestamp:2026-03-10 09:44:14.716058867 +0000 UTC m=+3.472229705,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.545442 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b71aa5724f613 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.720841235 +0000 UTC m=+3.477012063,LastTimestamp:2026-03-10 09:44:14.720841235 +0000 UTC m=+3.477012063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.550145 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa61cdf642 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.899689026 +0000 UTC m=+3.655859844,LastTimestamp:2026-03-10 09:44:14.899689026 +0000 UTC m=+3.655859844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.554119 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa62938c90 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.912638096 +0000 UTC m=+3.668808914,LastTimestamp:2026-03-10 09:44:14.912638096 +0000 UTC m=+3.668808914,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.559298 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa62a37b8a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.913682314 +0000 UTC m=+3.669853132,LastTimestamp:2026-03-10 09:44:14.913682314 +0000 UTC m=+3.669853132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.566937 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aa6b5cefbb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:15.060053947 +0000 UTC m=+3.816224775,LastTimestamp:2026-03-10 09:44:15.060053947 +0000 UTC m=+3.816224775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.572155 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa6d2c72f3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:15.090430707 +0000 UTC m=+3.846601545,LastTimestamp:2026-03-10 09:44:15.090430707 +0000 UTC m=+3.846601545,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.576928 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa6e17bbfc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:15.105850364 +0000 UTC m=+3.862021202,LastTimestamp:2026-03-10 09:44:15.105850364 +0000 UTC m=+3.862021202,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.583326 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aa76318426 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:15.241757734 +0000 UTC m=+3.997928552,LastTimestamp:2026-03-10 09:44:15.241757734 +0000 UTC m=+3.997928552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.588117 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aa76fe9d52 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:15.255199058 +0000 UTC m=+4.011369876,LastTimestamp:2026-03-10 09:44:15.255199058 +0000 UTC m=+4.011369876,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.595812 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aaa7ce1256 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:16.074101334 +0000 UTC m=+4.830272192,LastTimestamp:2026-03-10 09:44:16.074101334 +0000 UTC m=+4.830272192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.602631 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aab71fe9ab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:16.331123115 +0000 UTC m=+5.087293963,LastTimestamp:2026-03-10 09:44:16.331123115 +0000 UTC m=+5.087293963,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.609617 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aab7dcbd1a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:16.34349801 +0000 UTC m=+5.099668858,LastTimestamp:2026-03-10 09:44:16.34349801 +0000 UTC m=+5.099668858,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.616091 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aab7f3f836 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:16.34502047 +0000 UTC m=+5.101191318,LastTimestamp:2026-03-10 09:44:16.34502047 +0000 UTC m=+5.101191318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.621323 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aac77df177 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:16.605720951 +0000 UTC m=+5.361891779,LastTimestamp:2026-03-10 09:44:16.605720951 +0000 UTC m=+5.361891779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.626799 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aac87783dd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:16.622076893 +0000 UTC m=+5.378247741,LastTimestamp:2026-03-10 09:44:16.622076893 +0000 UTC m=+5.378247741,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.632942 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aac8903b57 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:16.623696727 +0000 UTC m=+5.379867585,LastTimestamp:2026-03-10 09:44:16.623696727 +0000 UTC m=+5.379867585,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.637734 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aad5306572 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:16.835519858 +0000 UTC m=+5.591690676,LastTimestamp:2026-03-10 09:44:16.835519858 +0000 UTC m=+5.591690676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.642715 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aad630d315 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:16.852325141 +0000 UTC m=+5.608495959,LastTimestamp:2026-03-10 09:44:16.852325141 +0000 UTC m=+5.608495959,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.646903 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aad63fdcea openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:16.853310698 +0000 UTC m=+5.609481506,LastTimestamp:2026-03-10 09:44:16.853310698 +0000 UTC m=+5.609481506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.653228 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aae0193011 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:17.018548241 +0000 UTC m=+5.774719059,LastTimestamp:2026-03-10 09:44:17.018548241 +0000 UTC m=+5.774719059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.658193 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aae11d3def openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:17.035591151 +0000 UTC m=+5.791762019,LastTimestamp:2026-03-10 09:44:17.035591151 +0000 UTC m=+5.791762019,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.664360 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aae12c2dab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:17.036570027 +0000 UTC m=+5.792740845,LastTimestamp:2026-03-10 09:44:17.036570027 +0000 UTC m=+5.792740845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.669520 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aaebe46211 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:17.216414225 +0000 UTC m=+5.972585043,LastTimestamp:2026-03-10 09:44:17.216414225 +0000 UTC m=+5.972585043,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.673802 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b71aaec7cc3ca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:17.226400714 +0000 UTC m=+5.982571532,LastTimestamp:2026-03-10 09:44:17.226400714 +0000 UTC m=+5.982571532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.681798 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 10 09:45:16 crc kubenswrapper[4794]: &Event{ObjectMeta:{kube-controller-manager-crc.189b71ac48a37575 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
Mar 10 09:45:16 crc kubenswrapper[4794]: body:
Mar 10 09:45:16 crc kubenswrapper[4794]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:23.067407733 +0000 UTC m=+11.823578641,LastTimestamp:2026-03-10 09:44:23.067407733 +0000 UTC m=+11.823578641,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 10 09:45:16 crc kubenswrapper[4794]: >
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.687312 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71ac48a551d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:23.067529686 +0000 UTC m=+11.823700544,LastTimestamp:2026-03-10 09:44:23.067529686 +0000 UTC m=+11.823700544,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.694745 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 10 09:45:16 crc kubenswrapper[4794]: &Event{ObjectMeta:{kube-apiserver-crc.189b71ad016e0e7d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 10 09:45:16 crc kubenswrapper[4794]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot
get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 09:45:16 crc kubenswrapper[4794]: Mar 10 09:45:16 crc kubenswrapper[4794]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:26.167692925 +0000 UTC m=+14.923863743,LastTimestamp:2026-03-10 09:44:26.167692925 +0000 UTC m=+14.923863743,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 09:45:16 crc kubenswrapper[4794]: > Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.700104 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71ad016ea73a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:26.167732026 +0000 UTC m=+14.923902844,LastTimestamp:2026-03-10 09:44:26.167732026 +0000 UTC m=+14.923902844,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.707537 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b71ad016e0e7d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 09:45:16 crc kubenswrapper[4794]: &Event{ObjectMeta:{kube-apiserver-crc.189b71ad016e0e7d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 09:45:16 crc kubenswrapper[4794]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 09:45:16 crc kubenswrapper[4794]: Mar 10 09:45:16 crc kubenswrapper[4794]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:26.167692925 +0000 UTC m=+14.923863743,LastTimestamp:2026-03-10 09:44:26.174668449 +0000 UTC m=+14.930839277,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 09:45:16 crc kubenswrapper[4794]: > Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.713685 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b71ad016ea73a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71ad016ea73a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:26.167732026 +0000 UTC m=+14.923902844,LastTimestamp:2026-03-10 09:44:26.17470165 +0000 UTC m=+14.930872478,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.718505 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b71aa62a37b8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa62a37b8a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:14.913682314 +0000 UTC m=+3.669853132,LastTimestamp:2026-03-10 09:44:27.121116153 +0000 UTC m=+15.877287011,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.722954 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b71aa6d2c72f3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa6d2c72f3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:15.090430707 +0000 UTC m=+3.846601545,LastTimestamp:2026-03-10 09:44:27.328937507 +0000 UTC m=+16.085108345,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.727855 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b71aa6e17bbfc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b71aa6e17bbfc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:15.105850364 +0000 UTC m=+3.862021202,LastTimestamp:2026-03-10 09:44:27.337473596 +0000 UTC m=+16.093644424,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.735613 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 09:45:16 crc kubenswrapper[4794]: &Event{ObjectMeta:{kube-controller-manager-crc.189b71ae9cbaad82 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 09:45:16 crc kubenswrapper[4794]: body: Mar 10 09:45:16 crc kubenswrapper[4794]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:33.068150146 +0000 UTC m=+21.824320964,LastTimestamp:2026-03-10 09:44:33.068150146 +0000 UTC m=+21.824320964,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 09:45:16 crc kubenswrapper[4794]: > Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.740043 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71ae9cbbfeaa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:33.068236458 +0000 UTC m=+21.824407276,LastTimestamp:2026-03-10 09:44:33.068236458 +0000 UTC m=+21.824407276,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.747195 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b71ae9cbaad82\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Mar 10 09:45:16 crc kubenswrapper[4794]: &Event{ObjectMeta:{kube-controller-manager-crc.189b71ae9cbaad82 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 09:45:16 crc kubenswrapper[4794]: body: Mar 10 09:45:16 crc kubenswrapper[4794]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:33.068150146 +0000 UTC m=+21.824320964,LastTimestamp:2026-03-10 09:44:43.067700386 +0000 UTC m=+31.823871244,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 09:45:16 crc kubenswrapper[4794]: > Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.753564 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b71ae9cbbfeaa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71ae9cbbfeaa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:33.068236458 +0000 UTC m=+21.824407276,LastTimestamp:2026-03-10 09:44:43.067775228 +0000 UTC m=+31.823946086,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.757530 4794 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71b0f0f06232 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:43.070890546 +0000 UTC m=+31.827061404,LastTimestamp:2026-03-10 09:44:43.070890546 +0000 UTC m=+31.827061404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc 
kubenswrapper[4794]: E0310 09:45:16.761225 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b71a9f2bad8c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71a9f2bad8c2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.036165314 +0000 UTC m=+1.792336132,LastTimestamp:2026-03-10 09:44:43.191292314 +0000 UTC m=+31.947463182,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.765369 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b71aa0595de46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71aa0595de46 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.352508998 +0000 UTC m=+2.108679856,LastTimestamp:2026-03-10 09:44:43.41852726 +0000 UTC m=+32.174698118,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.769395 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b71aa066a7d4b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71aa066a7d4b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:13.366443339 +0000 UTC m=+2.122614197,LastTimestamp:2026-03-10 09:44:43.428874859 +0000 UTC m=+32.185045707,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc kubenswrapper[4794]: I0310 09:45:16.773936 
4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:45:16 crc kubenswrapper[4794]: I0310 09:45:16.774163 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.775581 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b71ae9cbaad82\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 09:45:16 crc kubenswrapper[4794]: &Event{ObjectMeta:{kube-controller-manager-crc.189b71ae9cbaad82 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 09:45:16 crc kubenswrapper[4794]: body: Mar 10 09:45:16 crc kubenswrapper[4794]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:33.068150146 +0000 UTC m=+21.824320964,LastTimestamp:2026-03-10 09:44:53.066943936 +0000 UTC m=+41.823114784,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 09:45:16 crc kubenswrapper[4794]: > Mar 10 09:45:16 crc kubenswrapper[4794]: I0310 09:45:16.776500 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:16 crc kubenswrapper[4794]: I0310 09:45:16.776603 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:16 crc kubenswrapper[4794]: I0310 09:45:16.776687 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:16 crc kubenswrapper[4794]: E0310 09:45:16.779593 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b71ae9cbbfeaa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b71ae9cbbfeaa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:33.068236458 +0000 UTC m=+21.824407276,LastTimestamp:2026-03-10 09:44:53.067022458 +0000 UTC m=+41.823193316,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:45:16 crc 
kubenswrapper[4794]: E0310 09:45:16.784677 4794 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b71ae9cbaad82\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 09:45:16 crc kubenswrapper[4794]: &Event{ObjectMeta:{kube-controller-manager-crc.189b71ae9cbaad82 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 09:45:16 crc kubenswrapper[4794]: body: Mar 10 09:45:16 crc kubenswrapper[4794]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:44:33.068150146 +0000 UTC m=+21.824320964,LastTimestamp:2026-03-10 09:45:03.067948846 +0000 UTC m=+51.824119724,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 09:45:16 crc kubenswrapper[4794]: > Mar 10 09:45:16 crc kubenswrapper[4794]: I0310 09:45:16.934103 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:17 crc kubenswrapper[4794]: I0310 09:45:17.932162 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:18 crc kubenswrapper[4794]: I0310 09:45:18.931014 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:19 crc kubenswrapper[4794]: I0310 09:45:19.931994 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:20 crc kubenswrapper[4794]: I0310 09:45:20.067139 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:45:20 crc kubenswrapper[4794]: I0310 09:45:20.067393 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:20 crc kubenswrapper[4794]: I0310 09:45:20.068780 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:20 crc kubenswrapper[4794]: I0310 09:45:20.068842 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:20 crc kubenswrapper[4794]: I0310 09:45:20.068866 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:20 crc kubenswrapper[4794]: 
I0310 09:45:20.074282 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:45:20 crc kubenswrapper[4794]: I0310 09:45:20.308873 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:20 crc kubenswrapper[4794]: I0310 09:45:20.310056 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:20 crc kubenswrapper[4794]: I0310 09:45:20.310126 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:20 crc kubenswrapper[4794]: I0310 09:45:20.310157 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:20 crc kubenswrapper[4794]: I0310 09:45:20.931823 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:20 crc kubenswrapper[4794]: I0310 09:45:20.998461 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:20 crc kubenswrapper[4794]: I0310 09:45:20.999804 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:20 crc kubenswrapper[4794]: I0310 09:45:20.999841 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:20 crc kubenswrapper[4794]: I0310 09:45:20.999856 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:21 crc kubenswrapper[4794]: I0310 09:45:21.000433 4794 scope.go:117] "RemoveContainer" containerID="b210b83aa3e9184693096ec3a1dcc70ebc8060a90f1d6639bbc625dee9013aaa" Mar 10 09:45:21 crc kubenswrapper[4794]: I0310 09:45:21.313030 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 09:45:21 crc kubenswrapper[4794]: I0310 09:45:21.314507 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe"} Mar 10 09:45:21 crc kubenswrapper[4794]: I0310 09:45:21.314630 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:21 crc kubenswrapper[4794]: I0310 09:45:21.315318 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:21 crc kubenswrapper[4794]: I0310 09:45:21.315367 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:21 crc kubenswrapper[4794]: I0310 09:45:21.315379 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:21 crc kubenswrapper[4794]: E0310 09:45:21.598019 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace 
\"kube-node-lease\"" interval="7s" Mar 10 09:45:21 crc kubenswrapper[4794]: I0310 09:45:21.605146 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:21 crc kubenswrapper[4794]: I0310 09:45:21.606802 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:21 crc kubenswrapper[4794]: I0310 09:45:21.606837 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:21 crc kubenswrapper[4794]: I0310 09:45:21.606850 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:21 crc kubenswrapper[4794]: I0310 09:45:21.606875 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:45:21 crc kubenswrapper[4794]: E0310 09:45:21.610932 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 09:45:21 crc kubenswrapper[4794]: I0310 09:45:21.932415 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:22 crc kubenswrapper[4794]: E0310 09:45:22.052115 4794 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 09:45:22 crc kubenswrapper[4794]: I0310 09:45:22.318186 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 09:45:22 crc kubenswrapper[4794]: I0310 09:45:22.318780 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 09:45:22 crc kubenswrapper[4794]: I0310 09:45:22.320608 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe" exitCode=255 Mar 10 09:45:22 crc kubenswrapper[4794]: I0310 09:45:22.320650 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe"} Mar 10 09:45:22 crc kubenswrapper[4794]: I0310 09:45:22.320707 4794 scope.go:117] "RemoveContainer" containerID="b210b83aa3e9184693096ec3a1dcc70ebc8060a90f1d6639bbc625dee9013aaa" Mar 10 09:45:22 crc kubenswrapper[4794]: I0310 09:45:22.320914 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:22 crc kubenswrapper[4794]: I0310 09:45:22.322978 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:22 crc kubenswrapper[4794]: I0310 09:45:22.323009 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:22 crc kubenswrapper[4794]: I0310 09:45:22.323022 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:22 crc 
kubenswrapper[4794]: I0310 09:45:22.323603 4794 scope.go:117] "RemoveContainer" containerID="bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe" Mar 10 09:45:22 crc kubenswrapper[4794]: E0310 09:45:22.323846 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:45:22 crc kubenswrapper[4794]: I0310 09:45:22.932012 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:23 crc kubenswrapper[4794]: I0310 09:45:23.324033 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 09:45:23 crc kubenswrapper[4794]: I0310 09:45:23.666555 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 09:45:23 crc kubenswrapper[4794]: I0310 09:45:23.686860 4794 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 09:45:23 crc kubenswrapper[4794]: I0310 09:45:23.740774 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:45:23 crc kubenswrapper[4794]: I0310 09:45:23.740941 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:23 crc kubenswrapper[4794]: I0310 09:45:23.742018 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:23 crc kubenswrapper[4794]: I0310 09:45:23.742054 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:23 crc kubenswrapper[4794]: I0310 09:45:23.742064 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:23 crc kubenswrapper[4794]: I0310 09:45:23.742508 4794 scope.go:117] "RemoveContainer" containerID="bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe" Mar 10 09:45:23 crc kubenswrapper[4794]: E0310 09:45:23.742851 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:45:23 crc kubenswrapper[4794]: I0310 09:45:23.933716 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:24 crc kubenswrapper[4794]: I0310 09:45:24.931557 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:25 crc kubenswrapper[4794]: I0310 09:45:25.928085 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:26 crc kubenswrapper[4794]: I0310 09:45:26.777812 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:45:26 crc kubenswrapper[4794]: I0310 09:45:26.777928 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:26 crc kubenswrapper[4794]: I0310 09:45:26.778932 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:26 crc kubenswrapper[4794]: I0310 09:45:26.778968 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:26 crc kubenswrapper[4794]: I0310 09:45:26.778982 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:26 crc kubenswrapper[4794]: I0310 09:45:26.934743 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:45:26 crc kubenswrapper[4794]: I0310 09:45:26.979946 4794 csr.go:261] certificate signing request csr-5c665 is approved, waiting to be issued Mar 10 09:45:26 crc kubenswrapper[4794]: I0310 09:45:26.988792 4794 csr.go:257] certificate signing request csr-5c665 is issued Mar 10 09:45:27 crc kubenswrapper[4794]: I0310 09:45:27.088804 4794 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 10 09:45:27 crc kubenswrapper[4794]: I0310 09:45:27.488392 4794 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 09:45:27 crc kubenswrapper[4794]: I0310 09:45:27.770635 4794 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 10 09:45:27 crc kubenswrapper[4794]: W0310 09:45:27.771046 4794 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Mar 10 09:45:27 crc kubenswrapper[4794]: I0310 09:45:27.989826 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-04 12:23:31.45964532 +0000 UTC Mar 10 09:45:27 crc kubenswrapper[4794]: I0310 09:45:27.989874 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7202h38m3.469773475s for next certificate rotation Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.611010 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.612453 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.612535 4794 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.612550 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.612696 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.618802 4794 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.619036 4794 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 10 09:45:28 crc kubenswrapper[4794]: E0310 09:45:28.619059 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.621959 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.622006 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.622016 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.622029 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.622039 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:28Z","lastTransitionTime":"2026-03-10T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:28 crc kubenswrapper[4794]: E0310 09:45:28.631168 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.638974 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.639028 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.639037 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.639052 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.639063 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:28Z","lastTransitionTime":"2026-03-10T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:28 crc kubenswrapper[4794]: E0310 09:45:28.647029 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.655906 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.655977 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.655996 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.656021 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.656038 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:28Z","lastTransitionTime":"2026-03-10T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:28 crc kubenswrapper[4794]: E0310 09:45:28.666700 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.673656 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.673693 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.673703 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.673718 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:28 crc kubenswrapper[4794]: I0310 09:45:28.673730 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:28Z","lastTransitionTime":"2026-03-10T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:28 crc kubenswrapper[4794]: E0310 09:45:28.683451 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:45:28 crc kubenswrapper[4794]: E0310 09:45:28.683559 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:45:28 crc kubenswrapper[4794]: E0310 09:45:28.683585 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:45:28 crc kubenswrapper[4794]: E0310 09:45:28.784692 4794 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:28 crc kubenswrapper[4794]: E0310 09:45:28.885194 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:28 crc kubenswrapper[4794]: E0310 09:45:28.986108 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:29 crc kubenswrapper[4794]: E0310 09:45:29.087272 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:29 crc kubenswrapper[4794]: E0310 09:45:29.188361 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:29 crc kubenswrapper[4794]: I0310 09:45:29.268400 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:45:29 crc kubenswrapper[4794]: I0310 09:45:29.268647 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:45:29 crc kubenswrapper[4794]: I0310 09:45:29.270098 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:29 crc kubenswrapper[4794]: I0310 09:45:29.270159 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:29 crc kubenswrapper[4794]: I0310 09:45:29.270184 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:29 crc kubenswrapper[4794]: I0310 09:45:29.271228 4794 scope.go:117] "RemoveContainer" containerID="bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe"
Mar 10 09:45:29 crc kubenswrapper[4794]: E0310 09:45:29.271549 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
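[Editor's note] The pod_workers entry just above shows kube-apiserver-check-endpoints held in a 40s restart back-off. As a rough illustration only (assuming the commonly described 10s initial delay doubling to a 5m cap; this is not the kubelet's actual code), the delay sequence that produces "back-off 40s" on the third failure looks like this:

```go
// Illustrative doubling back-off, assuming a 10s initial delay and a 5m cap.
// Not the kubelet's implementation; it just shows where 40s falls in the curve.
package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initial  = 10 * time.Second // assumed initial back-off
		maxDelay = 5 * time.Minute  // assumed cap
	)
	delay := initial
	for failure := 1; failure <= 6; failure++ {
		fmt.Printf("failure %d: back-off %v\n", failure, delay) // failure 3 prints 40s
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```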
Mar 10 09:45:29 crc kubenswrapper[4794]: E0310 09:45:29.289180 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:29 crc kubenswrapper[4794]: E0310 09:45:29.389543 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:29 crc kubenswrapper[4794]: E0310 09:45:29.490045 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:29 crc kubenswrapper[4794]: E0310 09:45:29.591077 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:29 crc kubenswrapper[4794]: E0310 09:45:29.691306 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:29 crc kubenswrapper[4794]: E0310 09:45:29.791798 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:29 crc kubenswrapper[4794]: E0310 09:45:29.891895 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:29 crc kubenswrapper[4794]: E0310 09:45:29.992686 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:30 crc kubenswrapper[4794]: E0310 09:45:30.093321 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:30 crc kubenswrapper[4794]: E0310 09:45:30.194147 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:30 crc kubenswrapper[4794]: E0310 09:45:30.294865 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:30 crc kubenswrapper[4794]: E0310 09:45:30.395183 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:30 crc kubenswrapper[4794]: E0310 09:45:30.496257 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:30 crc kubenswrapper[4794]: E0310 09:45:30.596690 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:30 crc kubenswrapper[4794]: E0310 09:45:30.697609 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:30 crc kubenswrapper[4794]: E0310 09:45:30.798553 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:30 crc kubenswrapper[4794]: E0310 09:45:30.899550 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:31 crc kubenswrapper[4794]: E0310 09:45:31.000141 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:31 crc kubenswrapper[4794]: E0310 09:45:31.101351 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:31 crc kubenswrapper[4794]: E0310 09:45:31.202231 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:31 crc kubenswrapper[4794]: E0310 09:45:31.303392 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:31 crc kubenswrapper[4794]: E0310 09:45:31.404224 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:31 crc kubenswrapper[4794]: E0310 09:45:31.504320 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:31 crc kubenswrapper[4794]: E0310 09:45:31.605181 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:31 crc kubenswrapper[4794]: E0310 09:45:31.706215 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:31 crc kubenswrapper[4794]: E0310 09:45:31.806806 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:31 crc kubenswrapper[4794]: E0310 09:45:31.907668 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:32 crc kubenswrapper[4794]: E0310 09:45:32.008149 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
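[Editor's note] The repeated kubelet_node_status.go:503 entries above come from lister lookups against an informer cache that does not yet contain the "crc" Node object. A minimal client-go sketch (an assumed setup for illustration, not the kubelet's own wiring) reproduces the same NotFound error:

```go
// Minimal sketch: a node lister backed by an empty informer cache returns
// NotFound, which the kubelet logs as err="node \"crc\" not found".
package main

import (
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes/fake"
)

func main() {
	// Fake API server holding no Node objects, standing in for the window
	// before node registration succeeds.
	client := fake.NewSimpleClientset()
	factory := informers.NewSharedInformerFactory(client, 0)
	nodeLister := factory.Core().V1().Nodes().Lister() // registers the Node informer

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)
	factory.WaitForCacheSync(stop)

	// With an empty cache the lookup fails exactly like the log entries above.
	if _, err := nodeLister.Get("crc"); apierrors.IsNotFound(err) {
		fmt.Println("lister lookup:", err)
	}
}
```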
Mar 10 09:45:32 crc kubenswrapper[4794]: E0310 09:45:32.053145 4794 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 09:45:32 crc kubenswrapper[4794]: E0310 09:45:32.108533 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:32 crc kubenswrapper[4794]: E0310 09:45:32.208706 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:32 crc kubenswrapper[4794]: E0310 09:45:32.308872 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:32 crc kubenswrapper[4794]: E0310 09:45:32.410081 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:32 crc kubenswrapper[4794]: E0310 09:45:32.510264 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:32 crc kubenswrapper[4794]: E0310 09:45:32.611011 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:32 crc kubenswrapper[4794]: E0310 09:45:32.711765 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:32 crc kubenswrapper[4794]: E0310 09:45:32.812938 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:32 crc kubenswrapper[4794]: E0310 09:45:32.913601 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:33 crc kubenswrapper[4794]: E0310 09:45:33.014758 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:33 crc kubenswrapper[4794]: E0310 09:45:33.115073 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:33 crc kubenswrapper[4794]: E0310 09:45:33.216267 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:33 crc kubenswrapper[4794]: E0310 09:45:33.316412 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:33 crc kubenswrapper[4794]: E0310 09:45:33.417018 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:33 crc kubenswrapper[4794]: E0310 09:45:33.518107 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:33 crc kubenswrapper[4794]: E0310 09:45:33.618947 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:33 crc kubenswrapper[4794]: E0310 09:45:33.719957 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:33 crc kubenswrapper[4794]: E0310 09:45:33.820437 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:33 crc kubenswrapper[4794]: E0310 09:45:33.920579 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:34 crc kubenswrapper[4794]: E0310 09:45:34.021558 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
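[Editor's note] The node-status patch failures earlier in the log all bottom out in the same dial error: nothing is listening on 127.0.0.1:9743, where the node.network-node-identity.openshift.io webhook should be serving. A small stand-alone probe (a hypothetical diagnostic, not an OpenShift tool) reproduces the connection-refused result until that webhook comes up:

```go
// Hypothetical connectivity probe for the failing webhook endpoint seen in
// the patch errors above: Post "https://127.0.0.1:9743/node?timeout=10s".
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"strings"
	"time"
)

func main() {
	client := &http.Client{
		Timeout: 10 * time.Second, // matches the ?timeout=10s in the webhook URL
		Transport: &http.Transport{
			// The webhook serves a cluster-internal certificate; skip
			// verification for this reachability check only.
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Post("https://127.0.0.1:9743/node?timeout=10s",
		"application/json", strings.NewReader("{}"))
	if err != nil {
		// Until the webhook is listening, this reproduces the kubelet's error:
		//   dial tcp 127.0.0.1:9743: connect: connection refused
		fmt.Println("webhook unreachable:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("webhook answered:", resp.Status)
}
```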
Mar 10 09:45:34 crc kubenswrapper[4794]: E0310 09:45:34.121702 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:34 crc kubenswrapper[4794]: E0310 09:45:34.222292 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:34 crc kubenswrapper[4794]: I0310 09:45:34.242921 4794 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 10 09:45:34 crc kubenswrapper[4794]: E0310 09:45:34.322944 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:34 crc kubenswrapper[4794]: E0310 09:45:34.423721 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:34 crc kubenswrapper[4794]: E0310 09:45:34.524654 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:34 crc kubenswrapper[4794]: E0310 09:45:34.625463 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:34 crc kubenswrapper[4794]: E0310 09:45:34.726264 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:34 crc kubenswrapper[4794]: E0310 09:45:34.827247 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:34 crc kubenswrapper[4794]: E0310 09:45:34.927435 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.019532 4794 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.031096 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.031151 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.031168 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.031201 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.031228 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:35Z","lastTransitionTime":"2026-03-10T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.134169 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.134208 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.134220 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.134238 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.134249 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:35Z","lastTransitionTime":"2026-03-10T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.237369 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.237420 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.237436 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.237460 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.237476 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:35Z","lastTransitionTime":"2026-03-10T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.340597 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.340672 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.340697 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.340746 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.340769 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:35Z","lastTransitionTime":"2026-03-10T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.444439 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.444493 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.444512 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.444536 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.444554 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:35Z","lastTransitionTime":"2026-03-10T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.547282 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.547363 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.547400 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.547428 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.547444 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:35Z","lastTransitionTime":"2026-03-10T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.650663 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.650760 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.650779 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.650804 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.650820 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:35Z","lastTransitionTime":"2026-03-10T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.753163 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.753193 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.753204 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.753218 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.753229 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:35Z","lastTransitionTime":"2026-03-10T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.856132 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.856172 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.856182 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.856198 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.856210 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:35Z","lastTransitionTime":"2026-03-10T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.958135 4794 apiserver.go:52] "Watching apiserver" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.959472 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.959529 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.959551 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.959582 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.959606 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:35Z","lastTransitionTime":"2026-03-10T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.966120 4794 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.966506 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.966943 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.966970 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:35 crc kubenswrapper[4794]: E0310 09:45:35.967058 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.967591 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:35 crc kubenswrapper[4794]: E0310 09:45:35.967642 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.967711 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.967864 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:35 crc kubenswrapper[4794]: E0310 09:45:35.967901 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.967951 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.970382 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.971071 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.971139 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.971870 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.973041 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.973234 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.973396 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.973474 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 09:45:35 crc kubenswrapper[4794]: I0310 09:45:35.973775 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.015229 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.029634 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.038246 4794 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044257 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044304 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044368 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044399 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044420 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044441 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044462 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044530 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044590 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044610 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044631 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044655 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044677 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044699 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044722 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044744 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 09:45:36 crc 
kubenswrapper[4794]: I0310 09:45:36.044768 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044791 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044810 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044844 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044905 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044956 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044867 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045073 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045118 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045214 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045246 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045280 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045316 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045373 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045405 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045439 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") 
pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045471 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045501 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045533 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045572 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045603 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045634 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045670 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045728 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045763 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045798 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045832 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045867 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045899 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045932 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045245 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045971 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045211 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045382 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045418 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045543 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045586 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045743 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046125 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045853 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045886 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045931 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045961 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.045965 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046215 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046241 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046266 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046291 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046295 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046316 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046354 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046374 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046381 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046424 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046452 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046529 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046554 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046578 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046603 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046625 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046646 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046670 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046693 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046718 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046738 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046759 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046758 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046782 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046804 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046826 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046849 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046865 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046871 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046912 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046940 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046965 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047254 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047279 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047303 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047377 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047406 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047429 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047455 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047481 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047503 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047529 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047552 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047574 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047598 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047638 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047668 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047691 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047714 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047740 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047765 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047788 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047812 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047837 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047858 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047890 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047907 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047925 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047949 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047972 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047995 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048023 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048045 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048067 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048094 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048119 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048142 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048169 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048192 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048212 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048229 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048249 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048272 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048295 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048317 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048357 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048379 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048399 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048420 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048441 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048464 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048487 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048509 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048530 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048621 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048649 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048671 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048691 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048714 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048736 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048763 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048787 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048814 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048836 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048858 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048881 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048903 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048926 4794 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048948 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048969 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048991 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049013 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049038 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049062 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049167 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049194 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049219 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" 
(UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049243 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049271 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049294 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049318 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049360 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049386 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049413 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049442 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049467 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049489 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049513 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049536 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049558 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049579 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049602 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049624 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049645 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049668 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049694 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049720 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049745 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049840 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049916 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049943 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049999 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050027 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050052 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050076 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050101 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050129 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050155 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050179 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050204 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050227 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050252 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050277 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050302 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050351 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050382 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050410 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" 
(UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050437 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050462 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050488 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050520 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050541 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050558 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050575 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050616 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050643 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:45:36 crc 
kubenswrapper[4794]: I0310 09:45:36.050665 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050684 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050705 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050724 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050741 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050764 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050800 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050826 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050851 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050878 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050907 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050931 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050996 4794 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051014 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051031 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051046 4794 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051058 4794 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051072 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051084 4794 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051101 4794 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051115 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051127 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051140 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051153 4794 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051166 4794 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051178 4794 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051195 4794 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051208 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051222 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051235 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051245 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.046976 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") 
pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047170 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047309 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047320 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047465 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047565 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047842 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.047962 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048483 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048476 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048782 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.048965 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049150 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049202 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049383 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049405 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.049972 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050373 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050392 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050413 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050760 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050796 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050812 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050841 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050877 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.050868 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051156 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051261 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051388 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051529 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051614 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.051585 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.052631 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.052709 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.052718 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.052803 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.053563 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.053601 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.053499 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.053690 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.053809 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.053878 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.054154 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.054205 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.054718 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.055045 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.055264 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.055291 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.055569 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.055559 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.055929 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.056036 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.056305 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.056487 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.056930 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.056974 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.057139 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.057445 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.057452 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.057449 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.057445 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.057697 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.057968 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.058049 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.058094 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.063240 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.061529 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.058112 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.058175 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.058298 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.058691 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.063308 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.058679 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.058986 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.059005 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.059041 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). 
InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.059407 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.059483 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.059531 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.059622 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.059698 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.059786 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.060254 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.060745 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.060472 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.058518 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.062666 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.062837 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.062956 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:45:36.562931135 +0000 UTC m=+85.319101963 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.063518 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.059609 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.063629 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.063909 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.063976 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.064045 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.064214 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.064220 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.064293 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). 
InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.064476 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.064501 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.064655 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.064932 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.064954 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.065155 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.066883 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.067102 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.067148 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.067304 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.067403 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.067418 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.067714 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:36.56769328 +0000 UTC m=+85.323864098 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.077585 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:36.577549464 +0000 UTC m=+85.333720322 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.066940 4794 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.077628 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.077649 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.077660 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.077673 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.077684 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:36Z","lastTransitionTime":"2026-03-10T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.044252 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.068045 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.063183 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.068160 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.068288 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.068386 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.068531 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
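
The setters.go:603 entry above carries the node's entire Ready condition as JSON. For reference, a self-contained Go sketch that decodes that exact payload; the struct is a local mirror of the logged keys (the same shape as a core/v1 NodeCondition), defined locally so the example needs only the standard library.

package main

import (
    "encoding/json"
    "fmt"
    "time"
)

type nodeCondition struct {
    Type               string    `json:"type"`
    Status             string    `json:"status"`
    LastHeartbeatTime  time.Time `json:"lastHeartbeatTime"`
    LastTransitionTime time.Time `json:"lastTransitionTime"`
    Reason             string    `json:"reason"`
    Message            string    `json:"message"`
}

func main() {
    // Condition payload copied from the "Node became not ready" entry above.
    raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:36Z","lastTransitionTime":"2026-03-10T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
    var c nodeCondition
    if err := json.Unmarshal([]byte(raw), &c); err != nil {
        panic(err)
    }
    fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}

That one condition explains most of the failures in this second of the trace: with no CNI configuration on disk yet, the node flips to NotReady while pod volumes are still being set up and torn down around it.
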
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.068603 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.069196 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.069719 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.070529 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.075426 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.075619 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.078499 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.079649 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.081990 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.082022 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.082041 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.082113 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:36.582088312 +0000 UTC m=+85.338259210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.082353 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.082374 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.082387 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.082431 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:36.582417633 +0000 UTC m=+85.338588511 (durationBeforeRetry 500ms). 
Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.082431 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:36.582417633 +0000 UTC m=+85.338588511 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.088548 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.089241 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.089559 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.091623 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.101150 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.103791 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.104044 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config".
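
The projected.go entries above show why a single kube-api-access volume fails with a list of errors: a projected volume is assembled from several sources (here the kube-root-ca.crt and openshift-service-ca.crt ConfigMaps), each source is resolved independently, and the per-source failures are aggregated rather than short-circuited. A standard-library sketch of that aggregation; the registered map stands in for the kubelet's object cache and is an assumption of this note.

package main

import (
    "errors"
    "fmt"
)

// Neither object is registered yet for this namespace, matching the log.
var registered = map[string]bool{
    `"openshift-network-diagnostics"/"kube-root-ca.crt"`:         false,
    `"openshift-network-diagnostics"/"openshift-service-ca.crt"`: false,
}

// Resolve every source of a projected volume, collecting failures.
func collectProjectedSources(sources []string) error {
    var errs []error
    for _, s := range sources {
        if !registered[s] {
            errs = append(errs, fmt.Errorf("object %s not registered", s))
        }
    }
    return errors.Join(errs...) // nil only if every source resolved
}

func main() {
    err := collectProjectedSources([]string{
        `"openshift-network-diagnostics"/"kube-root-ca.crt"`,
        `"openshift-network-diagnostics"/"openshift-service-ca.crt"`,
    })
    fmt.Println(err)
}

Because the aggregate is non-nil, the whole SetUp fails and lands in the same nestedpendingoperations backoff seen earlier, which is exactly the pair of "No retries permitted until ..." entries above.
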
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.106922 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.107049 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.107551 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.107576 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.107747 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.107788 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.109581 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.114689 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.115061 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.115734 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.116026 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.118589 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.120806 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.125712 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.125723 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.126778 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.128563 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.128742 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.128785 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.128865 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.128946 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.129027 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.129096 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.128590 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.129282 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.129308 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.129498 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.129600 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.129701 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.129960 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.130090 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.130394 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.130557 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.130657 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.130739 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.131004 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.131140 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.131638 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.132980 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.133213 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.133376 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.133482 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.133900 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.134037 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.134595 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.134689 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.134784 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.135057 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.135098 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.135501 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.135598 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.135856 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.136472 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.141924 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.143429 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.150971 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152639 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152687 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152728 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152751 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152768 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152785 4794 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152793 4794 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152803 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152812 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152820 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152828 4794 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152837 4794 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152837 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152845 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152878 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152887 4794 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152895 4794 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152903 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152911 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152920 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152928 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152936 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152945 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152953 4794 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 10 
09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152962 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152970 4794 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152978 4794 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152985 4794 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.152993 4794 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153001 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153009 4794 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153020 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153032 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153043 4794 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153053 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153064 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153075 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 
09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153084 4794 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153095 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153106 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153118 4794 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153129 4794 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153140 4794 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153150 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153162 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153172 4794 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153182 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153191 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153201 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153213 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 
09:45:36.153223 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153234 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153244 4794 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153255 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153268 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153280 4794 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153290 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153302 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153314 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153325 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153355 4794 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153366 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153377 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 
09:45:36.153388 4794 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153398 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153408 4794 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153418 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153428 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153449 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153460 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153471 4794 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153482 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153493 4794 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153503 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153515 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153525 4794 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 
09:45:36.153537 4794 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153549 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153560 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153571 4794 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153581 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153592 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153602 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153611 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153621 4794 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153631 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153642 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153651 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153662 4794 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153673 4794 reconciler_common.go:293] 
"Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153683 4794 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153693 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153705 4794 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153716 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153727 4794 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153738 4794 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153749 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153759 4794 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153770 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153780 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153788 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153797 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153804 4794 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153813 4794 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153821 4794 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153829 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153838 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153846 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153854 4794 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153862 4794 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153869 4794 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153878 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153885 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153894 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153903 4794 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153912 4794 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153920 4794 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153928 4794 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153936 4794 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153944 4794 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153952 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153960 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153967 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153975 4794 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153983 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.153991 4794 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154000 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154008 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154015 4794 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154023 4794 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154031 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154038 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154046 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154057 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154065 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154073 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154082 4794 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154071 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154090 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154199 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154212 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154224 4794 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154235 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154246 4794 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154256 4794 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154266 4794 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154276 4794 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154286 4794 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154296 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154306 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154316 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154327 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154361 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154372 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154382 4794 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154392 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154403 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154414 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154428 4794 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154439 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154449 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154461 4794 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154472 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154482 4794 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154492 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154503 4794 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154514 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154525 4794 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154535 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154545 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154555 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154564 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154574 4794 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154583 4794 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154592 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154602 4794 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154612 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.154623 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.160703 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.162922 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.163965 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.180170 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.180413 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.180504 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.180600 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.180677 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:36Z","lastTransitionTime":"2026-03-10T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.255576 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.255621 4794 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.283426 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.283545 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.283575 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.283606 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.283628 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:36Z","lastTransitionTime":"2026-03-10T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.300007 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.307317 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.315324 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:45:36 crc kubenswrapper[4794]: W0310 09:45:36.319198 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-93a0c86007b6e57c019ab0936fd1b51abc5bd642ae25a52a9c1f3d35e7ced468 WatchSource:0}: Error finding container 93a0c86007b6e57c019ab0936fd1b51abc5bd642ae25a52a9c1f3d35e7ced468: Status 404 returned error can't find the container with id 93a0c86007b6e57c019ab0936fd1b51abc5bd642ae25a52a9c1f3d35e7ced468 Mar 10 09:45:36 crc kubenswrapper[4794]: W0310 09:45:36.321722 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-6a9de9688f31c91d14e7499d4832f8889f87ef29f88b1a7323435c6ea746637c WatchSource:0}: Error finding container 6a9de9688f31c91d14e7499d4832f8889f87ef29f88b1a7323435c6ea746637c: Status 404 returned error can't find the container with id 6a9de9688f31c91d14e7499d4832f8889f87ef29f88b1a7323435c6ea746637c Mar 10 09:45:36 crc kubenswrapper[4794]: W0310 09:45:36.337787 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-082091f718827612b5fff38fb43385aa4e2f974912cfe80254e2df381469b365 WatchSource:0}: Error finding container 082091f718827612b5fff38fb43385aa4e2f974912cfe80254e2df381469b365: Status 404 returned error can't find the container with id 082091f718827612b5fff38fb43385aa4e2f974912cfe80254e2df381469b365 Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.364469 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"082091f718827612b5fff38fb43385aa4e2f974912cfe80254e2df381469b365"} Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.366904 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6a9de9688f31c91d14e7499d4832f8889f87ef29f88b1a7323435c6ea746637c"} Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.369599 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"93a0c86007b6e57c019ab0936fd1b51abc5bd642ae25a52a9c1f3d35e7ced468"} Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.386139 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.386194 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.386216 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.386248 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.386274 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:36Z","lastTransitionTime":"2026-03-10T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.489201 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.489246 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.489261 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.489278 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.489290 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:36Z","lastTransitionTime":"2026-03-10T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.590996 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.591050 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.591064 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.591082 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.591092 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:36Z","lastTransitionTime":"2026-03-10T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.659236 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.659393 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.659480 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:45:37.659444994 +0000 UTC m=+86.415615812 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.659547 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.659565 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.659596 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.659621 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:37.659599439 +0000 UTC m=+86.415770287 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.659651 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.659756 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.659771 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.659782 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.659816 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:37.659808326 +0000 UTC m=+86.415979144 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.659815 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.659816 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.659933 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:37.659910799 +0000 UTC m=+86.416081637 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.659840 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.659973 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:36 crc kubenswrapper[4794]: E0310 09:45:36.660015 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:37.660004632 +0000 UTC m=+86.416175470 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.694020 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.694068 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.694089 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.694113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.694125 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:36Z","lastTransitionTime":"2026-03-10T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.797150 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.797448 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.797461 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.797478 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.797490 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:36Z","lastTransitionTime":"2026-03-10T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.899915 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.899953 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.899964 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.899982 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:36 crc kubenswrapper[4794]: I0310 09:45:36.899994 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:36Z","lastTransitionTime":"2026-03-10T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.001914 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.001962 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.001974 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.001991 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.002004 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:37Z","lastTransitionTime":"2026-03-10T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.104876 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.104928 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.104948 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.104972 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.104991 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:37Z","lastTransitionTime":"2026-03-10T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.207804 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.207847 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.207858 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.207876 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.207888 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:37Z","lastTransitionTime":"2026-03-10T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.311662 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.311717 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.311735 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.311758 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.311777 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:37Z","lastTransitionTime":"2026-03-10T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.374170 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079"} Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.378047 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6"} Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.378145 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c"} Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.397679 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:37Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.414035 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:37Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.414535 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.414616 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.414644 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.414675 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.414699 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:37Z","lastTransitionTime":"2026-03-10T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.429405 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:37Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.445110 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:37Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.461209 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:37Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.475844 4794 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:37Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.490919 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:37Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.503886 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:37Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.517056 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:37Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.517999 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.518046 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.518054 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.518068 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.518080 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:37Z","lastTransitionTime":"2026-03-10T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.527831 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:37Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.538132 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:37Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.550984 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:37Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.620355 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.620415 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.620432 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.620456 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.620474 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:37Z","lastTransitionTime":"2026-03-10T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.666827 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.670827 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:45:39.666842936 +0000 UTC m=+88.423013794 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.670924 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.670990 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.671042 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.671083 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.671170 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.671230 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:39.6712142 +0000 UTC m=+88.427385018 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.671711 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.671764 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:39.671750426 +0000 UTC m=+88.427921244 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.671815 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.671852 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.671872 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.671962 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:39.671939342 +0000 UTC m=+88.428110200 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.671985 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.672081 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.673404 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.673545 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:39.673519811 +0000 UTC m=+88.429690669 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.723251 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.723295 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.723310 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.723354 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.723370 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:37Z","lastTransitionTime":"2026-03-10T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.826430 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.826487 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.826504 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.826530 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.826547 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:37Z","lastTransitionTime":"2026-03-10T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.929027 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.929074 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.929087 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.929102 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.929116 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:37Z","lastTransitionTime":"2026-03-10T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.998528 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.998577 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:37 crc kubenswrapper[4794]: I0310 09:45:37.998548 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.998666 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.998774 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:45:37 crc kubenswrapper[4794]: E0310 09:45:37.998856 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.003371 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.004154 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.005229 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.006172 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.007249 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.007952 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.009908 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.010697 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.011521 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.012174 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.012899 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.013857 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.014563 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.015214 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.015901 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.016606 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.017298 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.017824 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.018579 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.019293 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.021528 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.022392 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.022979 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.023944 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.024612 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.025444 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.026282 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.027081 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.028020 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.028518 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.028981 4794 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.029086 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.031079 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.031279 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.031312 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.031324 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.031356 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.031368 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:38Z","lastTransitionTime":"2026-03-10T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.031943 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.032597 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.034107 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.035204 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.036140 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.037234 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.038372 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.039183 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.040172 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.041125 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.042699 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.043597 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.044647 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.045769 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 10 09:45:38 
crc kubenswrapper[4794]: I0310 09:45:38.047031 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.047714 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.048517 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.049380 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.050561 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.051765 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.052655 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.134179 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.134220 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.134231 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.134247 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.134258 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:38Z","lastTransitionTime":"2026-03-10T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.236822 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.236885 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.236902 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.236928 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.236945 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:38Z","lastTransitionTime":"2026-03-10T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.339504 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.339549 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.339561 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.339578 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.339591 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:38Z","lastTransitionTime":"2026-03-10T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.442498 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.442542 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.442553 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.442571 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.442584 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:38Z","lastTransitionTime":"2026-03-10T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.546096 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.546141 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.546152 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.546167 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.546177 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:38Z","lastTransitionTime":"2026-03-10T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.649640 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.649695 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.649712 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.649753 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.649770 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:38Z","lastTransitionTime":"2026-03-10T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.752326 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.752381 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.752396 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.752413 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.752424 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:38Z","lastTransitionTime":"2026-03-10T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.787211 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.787243 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.787252 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.787264 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.787275 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:38Z","lastTransitionTime":"2026-03-10T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:38 crc kubenswrapper[4794]: E0310 09:45:38.807032 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:38Z is after 2025-08-24T17:21:41Z"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.811932 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.811983 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.812000 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.812023 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.812039 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:38Z","lastTransitionTime":"2026-03-10T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:38 crc kubenswrapper[4794]: E0310 09:45:38.831165 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:38Z is after 2025-08-24T17:21:41Z"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.835805 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.835855 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.835872 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.835896 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.835913 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:38Z","lastTransitionTime":"2026-03-10T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:38 crc kubenswrapper[4794]: E0310 09:45:38.853858 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:38Z is after 2025-08-24T17:21:41Z"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.858091 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.858170 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.858195 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.858225 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.858250 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:38Z","lastTransitionTime":"2026-03-10T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:38 crc kubenswrapper[4794]: E0310 09:45:38.875317 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.879404 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.879445 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.879459 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.879482 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.879498 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:38Z","lastTransitionTime":"2026-03-10T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:38 crc kubenswrapper[4794]: E0310 09:45:38.898185 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:38 crc kubenswrapper[4794]: E0310 09:45:38.898387 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.900000 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
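"update node status exceeds retry count" means the kubelet attempted the status PATCH a small fixed number of times within one sync (the nodeStatusUpdateRetry constant in kubelet_node_status.go) and every attempt failed the same way. A sketch of that bounded-retry shape — tryPatch is a hypothetical stand-in for the real API call, not kubelet code:

```go
// retry.go — the bounded-retry pattern behind the two log lines above:
// several "will retry" errors followed by one terminal "exceeds retry count".
package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // mirrors the kubelet constant

func tryPatch() error {
	// Stand-in for PATCH /api/v1/nodes/crc/status; here it always fails,
	// the way an expired webhook certificate makes every attempt fail.
	return errors.New("Internal error occurred: failed calling webhook")
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryPatch(); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println("Unable to update node status:", err)
	}
}
```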
event="NodeHasSufficientMemory" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.900059 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.900078 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.900101 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:38 crc kubenswrapper[4794]: I0310 09:45:38.900123 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:38Z","lastTransitionTime":"2026-03-10T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.002658 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.002707 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.002723 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.002745 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.002760 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:39Z","lastTransitionTime":"2026-03-10T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.105538 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.105605 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.105623 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.105649 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.105678 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:39Z","lastTransitionTime":"2026-03-10T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.210440 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.210544 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.210569 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.210598 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.210619 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:39Z","lastTransitionTime":"2026-03-10T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.314159 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.314204 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.314218 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.314240 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.314254 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:39Z","lastTransitionTime":"2026-03-10T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.383474 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b"} Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.406154 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.416300 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.416360 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.416379 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.416396 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.416406 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:39Z","lastTransitionTime":"2026-03-10T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
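Independently of the certificate problem, the node stays NotReady because the runtime's CNI readiness check finds no network configuration — ovn-kubernetes has not written one into /etc/kubernetes/cni/net.d/ yet. A minimal sketch that looks for config candidates the way libcni does (assuming the standard .conf/.conflist/.json extensions):

```go
// cnicheck.go — report whether any CNI network config exists; an empty
// result matches the "no CNI configuration file" condition repeated above.
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path from the kubelet log
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read", dir, "-", err)
		return
	}
	var confs []string
	for _, e := range entries {
		ext := strings.ToLower(filepath.Ext(e.Name()))
		// libcni considers .conf, .conflist and .json files.
		if !e.IsDir() && (ext == ".conf" || ext == ".conflist" || ext == ".json") {
			confs = append(confs, e.Name())
		}
	}
	if len(confs) == 0 {
		fmt.Println("no CNI configuration file in", dir, "- network plugin not ready")
		return
	}
	fmt.Println("CNI configs found:", confs)
}
```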
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.423023 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:39Z is after 2025-08-24T17:21:41Z"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.437990 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:39Z is after 2025-08-24T17:21:41Z"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.450388 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:39Z is after 2025-08-24T17:21:41Z"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.465424 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:39Z is after 2025-08-24T17:21:41Z"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.481197 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:39Z is after 2025-08-24T17:21:41Z"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.519428 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.519547 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.519585 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.519614 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.519650 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:39Z","lastTransitionTime":"2026-03-10T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.623625 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.623743 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.623850 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.623926 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.623949 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:39Z","lastTransitionTime":"2026-03-10T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.690588 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.690733 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.690794 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.690851 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.690909 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.691104 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.691152 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.691467 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.691553 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:45:43.691524932 +0000 UTC m=+92.447695770 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.691565 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.691600 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.691613 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.691634 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:43.691614595 +0000 UTC m=+92.447785473 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.691671 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:43.691653066 +0000 UTC m=+92.447823894 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.691970 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.692125 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:43.6920926 +0000 UTC m=+92.448263478 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.692166 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.692244 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:43.692225424 +0000 UTC m=+92.448396352 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
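The UnmountVolume failure above is a separate symptom: the hostpath CSI driver has not re-registered with the kubelet since the restart, so TearDown is parked on a 4-second backoff. Registered drivers are visible on the node's CSINode object; a client-go sketch (assumes a reachable admin kubeconfig exported in KUBECONFIG):

```go
// csidrivers.go — list the CSI drivers registered on node "crc", to confirm
// whether kubevirt.io.hostpath-provisioner is back before the unmount retry.
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// KUBECONFIG path is an assumption; point it at a working kubeconfig.
	cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
	if err != nil {
		log.Fatal(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	csiNode, err := client.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("registered CSI drivers on node crc:")
	for _, d := range csiNode.Spec.Drivers {
		fmt.Println(" -", d.Name) // kubevirt.io.hostpath-provisioner should appear once re-registered
	}
}
```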
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.726509 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.726563 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.726577 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.726598 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.726613 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:39Z","lastTransitionTime":"2026-03-10T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.828843 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.828871 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.828880 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.828893 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.828903 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:39Z","lastTransitionTime":"2026-03-10T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.932310 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.932392 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.932412 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.932436 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.932455 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:39Z","lastTransitionTime":"2026-03-10T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.998639 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.998748 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.998773 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.998961 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:45:39 crc kubenswrapper[4794]: I0310 09:45:39.999072 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:45:39 crc kubenswrapper[4794]: E0310 09:45:39.999236 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.035627 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.035668 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.035678 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.035693 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.035702 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:40Z","lastTransitionTime":"2026-03-10T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.138758 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.138830 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.138853 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.138885 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.138910 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:40Z","lastTransitionTime":"2026-03-10T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.241819 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.241860 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.241875 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.241894 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.241907 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:40Z","lastTransitionTime":"2026-03-10T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.347468 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.347552 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.347577 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.347610 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.347644 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:40Z","lastTransitionTime":"2026-03-10T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.451348 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.451410 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.451427 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.451447 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.451460 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:40Z","lastTransitionTime":"2026-03-10T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.554898 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.554947 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.554961 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.554982 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.554997 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:40Z","lastTransitionTime":"2026-03-10T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.657696 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.657742 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.657755 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.657771 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.657812 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:40Z","lastTransitionTime":"2026-03-10T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.761459 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.761507 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.761526 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.761549 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.761566 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:40Z","lastTransitionTime":"2026-03-10T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.864149 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.864188 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.864199 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.864215 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.864227 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:40Z","lastTransitionTime":"2026-03-10T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.966853 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.966911 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.966930 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.966956 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:40 crc kubenswrapper[4794]: I0310 09:45:40.966973 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:40Z","lastTransitionTime":"2026-03-10T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.069607 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.069668 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.069686 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.069710 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.069729 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:41Z","lastTransitionTime":"2026-03-10T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.172481 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.172540 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.172560 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.172584 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.172601 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:41Z","lastTransitionTime":"2026-03-10T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.275516 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.275545 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.275557 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.275571 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.275580 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:41Z","lastTransitionTime":"2026-03-10T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.377112 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.377143 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.377155 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.377170 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.377183 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:41Z","lastTransitionTime":"2026-03-10T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.479801 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.479856 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.479866 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.479882 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.479892 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:41Z","lastTransitionTime":"2026-03-10T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.582473 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.582561 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.582578 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.582635 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.582656 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:41Z","lastTransitionTime":"2026-03-10T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.685994 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.686042 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.686055 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.686077 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.686090 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:41Z","lastTransitionTime":"2026-03-10T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.789219 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.789287 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.789311 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.789374 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.789406 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:41Z","lastTransitionTime":"2026-03-10T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.891286 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.891353 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.891368 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.891389 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.891401 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:41Z","lastTransitionTime":"2026-03-10T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.993113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.993150 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.993161 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.993176 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.993187 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:41Z","lastTransitionTime":"2026-03-10T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.998627 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.998710 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:41 crc kubenswrapper[4794]: I0310 09:45:41.998812 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:41 crc kubenswrapper[4794]: E0310 09:45:41.998721 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:45:41 crc kubenswrapper[4794]: E0310 09:45:41.998982 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:45:41 crc kubenswrapper[4794]: E0310 09:45:41.998845 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.012684 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.025368 4794 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.041436 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.057918 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.074122 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.089939 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.095364 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.095402 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.095416 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.095431 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.095442 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:42Z","lastTransitionTime":"2026-03-10T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.198092 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.198130 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.198142 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.198158 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.198172 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:42Z","lastTransitionTime":"2026-03-10T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.300299 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.300602 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.300612 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.300632 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.300640 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:42Z","lastTransitionTime":"2026-03-10T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.402248 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.402292 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.402305 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.402322 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.402352 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:42Z","lastTransitionTime":"2026-03-10T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.504111 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.504148 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.504156 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.504171 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.504181 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:42Z","lastTransitionTime":"2026-03-10T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.606749 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.606860 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.606887 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.606918 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.606938 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:42Z","lastTransitionTime":"2026-03-10T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.709627 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.709671 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.709685 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.709704 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.709717 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:42Z","lastTransitionTime":"2026-03-10T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.811935 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.811969 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.811979 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.811994 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.812005 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:42Z","lastTransitionTime":"2026-03-10T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.915707 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.915794 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.915823 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.915855 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:42 crc kubenswrapper[4794]: I0310 09:45:42.915877 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:42Z","lastTransitionTime":"2026-03-10T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.019203 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.019278 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.019302 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.019329 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.019379 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:43Z","lastTransitionTime":"2026-03-10T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.121346 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.121391 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.121403 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.121419 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.121432 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:43Z","lastTransitionTime":"2026-03-10T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.224325 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.224441 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.224459 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.224482 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.224500 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:43Z","lastTransitionTime":"2026-03-10T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.326834 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.326907 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.326926 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.326948 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.326968 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:43Z","lastTransitionTime":"2026-03-10T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.430043 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.430084 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.430095 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.430110 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.430123 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:43Z","lastTransitionTime":"2026-03-10T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.532587 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.532660 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.532684 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.532714 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.532736 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:43Z","lastTransitionTime":"2026-03-10T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.634929 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.634968 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.634980 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.634996 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.635008 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:43Z","lastTransitionTime":"2026-03-10T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.722978 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.723109 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.723145 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:45:51.72311103 +0000 UTC m=+100.479281888 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.723196 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.723264 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.723304 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.723352 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.723374 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.723328 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") 
pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.723379 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.723466 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.723511 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.723545 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:51.723522973 +0000 UTC m=+100.479693841 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.723596 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:51.723563634 +0000 UTC m=+100.479734492 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.723388 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.723441 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.723770 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-10 09:45:51.723731539 +0000 UTC m=+100.479902397 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.723805 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:45:51.723790441 +0000 UTC m=+100.479961299 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.737856 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.737911 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.737933 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.737961 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.737983 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:43Z","lastTransitionTime":"2026-03-10T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.840149 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.840186 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.840200 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.840219 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.840234 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:43Z","lastTransitionTime":"2026-03-10T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.941748 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.941804 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.941822 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.941846 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.941866 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:43Z","lastTransitionTime":"2026-03-10T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.998822 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.998994 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:43 crc kubenswrapper[4794]: I0310 09:45:43.998995 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.999135 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.999199 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:45:43 crc kubenswrapper[4794]: E0310 09:45:43.999631 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.010367 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.012483 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.012952 4794 scope.go:117] "RemoveContainer" containerID="bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe" Mar 10 09:45:44 crc kubenswrapper[4794]: E0310 09:45:44.013067 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.043936 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.043987 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.044000 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.044018 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.044028 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:44Z","lastTransitionTime":"2026-03-10T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.146737 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.146777 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.146790 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.146807 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.146820 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:44Z","lastTransitionTime":"2026-03-10T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.249069 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.249100 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.249111 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.249126 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.249138 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:44Z","lastTransitionTime":"2026-03-10T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.352549 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.352589 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.352603 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.352622 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.352634 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:44Z","lastTransitionTime":"2026-03-10T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.396464 4794 scope.go:117] "RemoveContainer" containerID="bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe" Mar 10 09:45:44 crc kubenswrapper[4794]: E0310 09:45:44.396649 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.456148 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.456202 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.456214 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.456241 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.456259 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:44Z","lastTransitionTime":"2026-03-10T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.558206 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.558259 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.558270 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.558286 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.558296 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:44Z","lastTransitionTime":"2026-03-10T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.661045 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.661104 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.661124 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.661150 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.661166 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:44Z","lastTransitionTime":"2026-03-10T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.764577 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.764665 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.764694 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.764726 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.764750 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:44Z","lastTransitionTime":"2026-03-10T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.867230 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.867276 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.867297 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.867321 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.867370 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:44Z","lastTransitionTime":"2026-03-10T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.967865 4794 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.970285 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.970377 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.970397 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.970422 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:44 crc kubenswrapper[4794]: I0310 09:45:44.970442 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:44Z","lastTransitionTime":"2026-03-10T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.073964 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.074049 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.074067 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.074090 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.074107 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:45Z","lastTransitionTime":"2026-03-10T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.176631 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.176693 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.176710 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.176733 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.176751 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:45Z","lastTransitionTime":"2026-03-10T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.279624 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.279688 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.279711 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.279740 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.279776 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:45Z","lastTransitionTime":"2026-03-10T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.382307 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.382412 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.382437 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.382468 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.382492 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:45Z","lastTransitionTime":"2026-03-10T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.484398 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.484464 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.484476 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.484492 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.484503 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:45Z","lastTransitionTime":"2026-03-10T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.587068 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.587113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.587126 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.587147 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.587159 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:45Z","lastTransitionTime":"2026-03-10T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.689719 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.689759 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.689773 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.689790 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.689804 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:45Z","lastTransitionTime":"2026-03-10T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.792056 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.792100 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.792112 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.792133 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.792145 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:45Z","lastTransitionTime":"2026-03-10T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.895181 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.895222 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.895233 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.895248 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.895260 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:45Z","lastTransitionTime":"2026-03-10T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.997323 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.997376 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.997388 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.997405 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.997418 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:45Z","lastTransitionTime":"2026-03-10T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.998206 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.998231 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:45 crc kubenswrapper[4794]: E0310 09:45:45.998354 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:45:45 crc kubenswrapper[4794]: I0310 09:45:45.998420 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:45 crc kubenswrapper[4794]: E0310 09:45:45.998566 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:45:45 crc kubenswrapper[4794]: E0310 09:45:45.998670 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.100843 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.100918 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.100940 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.100970 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.101010 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:46Z","lastTransitionTime":"2026-03-10T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.202707 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.202741 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.202750 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.202763 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.202771 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:46Z","lastTransitionTime":"2026-03-10T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.305228 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.305286 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.305299 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.305319 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.305348 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:46Z","lastTransitionTime":"2026-03-10T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.407910 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.407952 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.407966 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.407982 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.407994 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:46Z","lastTransitionTime":"2026-03-10T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.511041 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.511103 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.511121 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.511149 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.511166 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:46Z","lastTransitionTime":"2026-03-10T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.613079 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.613145 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.613163 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.613188 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.613207 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:46Z","lastTransitionTime":"2026-03-10T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.715656 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.715692 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.715703 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.715716 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.715725 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:46Z","lastTransitionTime":"2026-03-10T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.818391 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.818455 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.818471 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.818496 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.818515 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:46Z","lastTransitionTime":"2026-03-10T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.921370 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.921416 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.921428 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.921447 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:46 crc kubenswrapper[4794]: I0310 09:45:46.921459 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:46Z","lastTransitionTime":"2026-03-10T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.024566 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.024645 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.024668 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.024697 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.024725 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:47Z","lastTransitionTime":"2026-03-10T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.127968 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.128036 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.128054 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.128081 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.128097 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:47Z","lastTransitionTime":"2026-03-10T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.231162 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.231197 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.231208 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.231223 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.231234 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:47Z","lastTransitionTime":"2026-03-10T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.333216 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.333436 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.333463 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.333491 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.333509 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:47Z","lastTransitionTime":"2026-03-10T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.435926 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.435974 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.435985 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.436002 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.436013 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:47Z","lastTransitionTime":"2026-03-10T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.538095 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.538141 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.538153 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.538170 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.538182 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:47Z","lastTransitionTime":"2026-03-10T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.640727 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.640780 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.640797 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.640825 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.640843 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:47Z","lastTransitionTime":"2026-03-10T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.743325 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.743419 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.743437 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.743461 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.743479 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:47Z","lastTransitionTime":"2026-03-10T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.846622 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.846667 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.846681 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.846701 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.846714 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:47Z","lastTransitionTime":"2026-03-10T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.949548 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.949781 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.949813 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.949844 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.949865 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:47Z","lastTransitionTime":"2026-03-10T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.998746 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.998810 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:47 crc kubenswrapper[4794]: I0310 09:45:47.998903 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:47 crc kubenswrapper[4794]: E0310 09:45:47.998926 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:45:47 crc kubenswrapper[4794]: E0310 09:45:47.999050 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:45:47 crc kubenswrapper[4794]: E0310 09:45:47.999217 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.053436 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.053499 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.053513 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.053531 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.053549 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:48Z","lastTransitionTime":"2026-03-10T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.155764 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.155815 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.155828 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.155846 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.155858 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:48Z","lastTransitionTime":"2026-03-10T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.258704 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.258778 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.258873 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.258930 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.258957 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:48Z","lastTransitionTime":"2026-03-10T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.361721 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.361759 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.361771 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.361787 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.361799 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:48Z","lastTransitionTime":"2026-03-10T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.463996 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.464081 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.464094 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.464112 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.464124 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:48Z","lastTransitionTime":"2026-03-10T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.566763 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.566808 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.566820 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.566837 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.566849 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:48Z","lastTransitionTime":"2026-03-10T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.669493 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.669592 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.669622 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.669652 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.669672 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:48Z","lastTransitionTime":"2026-03-10T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.772507 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.772588 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.772619 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.772648 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.772669 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:48Z","lastTransitionTime":"2026-03-10T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.875434 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.875465 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.875475 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.875489 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.875498 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:48Z","lastTransitionTime":"2026-03-10T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.978700 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.978768 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.978786 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.979171 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:48 crc kubenswrapper[4794]: I0310 09:45:48.979225 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:48Z","lastTransitionTime":"2026-03-10T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.040700 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.040734 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.040741 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.040754 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.040763 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:49Z","lastTransitionTime":"2026-03-10T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: E0310 09:45:49.057314 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.061315 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
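The "Error updating node status, will retry" entry above, and each identical retry that follows, fails for a reason unrelated to the patch payload itself: the API server's call to the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 dies during TLS verification because the webhook's serving certificate expired on 2025-08-24, well before the node's current clock time of 2026-03-10. The sketch below is a hedged illustration of how one might confirm that from the node, assuming a Go toolchain is available; the endpoint comes from the log, everything else is illustrative.

// certcheck.go: inspect the serving certificate presented by the webhook
// endpoint named in the log. InsecureSkipVerify lets the handshake complete
// even though verification would fail, so we can read the expiry dates;
// this is for diagnosis only, never for production trust decisions.
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	certs := conn.ConnectionState().PeerCertificates
	if len(certs) == 0 {
		fmt.Println("no peer certificate presented")
		return
	}
	leaf := certs[0]
	now := time.Now()
	fmt.Printf("serving cert valid %s to %s\n",
		leaf.NotBefore.Format(time.RFC3339), leaf.NotAfter.Format(time.RFC3339))
	// The same window test standard x509 verification applies:
	// "certificate has expired or is not yet valid".
	if now.After(leaf.NotAfter) || now.Before(leaf.NotBefore) {
		fmt.Printf("certificate invalid at current time %s\n", now.Format(time.RFC3339))
	}
}

Run against this node it would report a NotAfter of 2025-08-24T17:21:41Z, matching the x509 error in the log; until that certificate is rotated, every status patch attempt below will fail the same way regardless of how often the kubelet retries.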
event="NodeHasNoDiskPressure" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.061373 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.061388 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.061401 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:49Z","lastTransitionTime":"2026-03-10T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: E0310 09:45:49.080419 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.084582 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.084604 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.084612 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.084623 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.084632 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:49Z","lastTransitionTime":"2026-03-10T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: E0310 09:45:49.103242 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.107970 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.108011 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.108028 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.108050 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.108066 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:49Z","lastTransitionTime":"2026-03-10T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: E0310 09:45:49.124934 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.129619 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.129650 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.129662 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.129678 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.129690 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:49Z","lastTransitionTime":"2026-03-10T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: E0310 09:45:49.149539 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:49 crc kubenswrapper[4794]: E0310 09:45:49.149731 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.151918 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.151962 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.151975 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.151995 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.152006 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:49Z","lastTransitionTime":"2026-03-10T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.255045 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.255091 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.255105 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.255125 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.255139 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:49Z","lastTransitionTime":"2026-03-10T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.358098 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.358148 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.358166 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.358192 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.358209 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:49Z","lastTransitionTime":"2026-03-10T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.460410 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.460456 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.460474 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.460497 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.460514 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:49Z","lastTransitionTime":"2026-03-10T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.562832 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.562895 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.562919 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.562949 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.562973 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:49Z","lastTransitionTime":"2026-03-10T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.666276 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.666314 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.666325 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.666370 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.666383 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:49Z","lastTransitionTime":"2026-03-10T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.768740 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.768772 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.768781 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.768797 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.768806 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:49Z","lastTransitionTime":"2026-03-10T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.871938 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.871969 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.871980 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.871994 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.872006 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:49Z","lastTransitionTime":"2026-03-10T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.974805 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.974972 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.974993 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.975016 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.975033 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:49Z","lastTransitionTime":"2026-03-10T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.998827 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:49 crc kubenswrapper[4794]: E0310 09:45:49.998942 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.999252 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:49 crc kubenswrapper[4794]: E0310 09:45:49.999306 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:45:49 crc kubenswrapper[4794]: I0310 09:45:49.999385 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:49 crc kubenswrapper[4794]: E0310 09:45:49.999460 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.077510 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.077559 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.077579 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.077605 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.077622 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:50Z","lastTransitionTime":"2026-03-10T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.180699 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.180764 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.180781 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.180806 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.180823 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:50Z","lastTransitionTime":"2026-03-10T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.283744 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.283818 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.283836 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.283862 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.283885 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:50Z","lastTransitionTime":"2026-03-10T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.387738 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.387817 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.387841 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.387949 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.387989 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:50Z","lastTransitionTime":"2026-03-10T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.491054 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.491100 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.491113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.491130 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.491142 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:50Z","lastTransitionTime":"2026-03-10T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.593514 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.593547 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.593556 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.593570 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.593578 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:50Z","lastTransitionTime":"2026-03-10T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.695641 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.695705 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.695722 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.695747 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.695766 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:50Z","lastTransitionTime":"2026-03-10T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.798957 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.798990 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.799003 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.799021 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.799031 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:50Z","lastTransitionTime":"2026-03-10T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.901659 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.901711 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.901727 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.901751 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:50 crc kubenswrapper[4794]: I0310 09:45:50.901767 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:50Z","lastTransitionTime":"2026-03-10T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.003775 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.003824 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.003835 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.003853 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.003864 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:51Z","lastTransitionTime":"2026-03-10T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.106560 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.106724 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.106740 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.106758 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.106771 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:51Z","lastTransitionTime":"2026-03-10T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.208918 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.208973 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.208991 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.209015 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.209032 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:51Z","lastTransitionTime":"2026-03-10T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.312304 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.312411 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.312434 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.312462 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.312482 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:51Z","lastTransitionTime":"2026-03-10T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.414536 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.414594 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.414616 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.414649 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.414673 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:51Z","lastTransitionTime":"2026-03-10T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.517148 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.517202 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.517218 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.517246 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.517262 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:51Z","lastTransitionTime":"2026-03-10T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.620659 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.620899 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.620937 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.620969 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.620992 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:51Z","lastTransitionTime":"2026-03-10T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.724828 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.724896 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.724918 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.724945 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.724966 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:51Z","lastTransitionTime":"2026-03-10T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.797441 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.797563 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.797624 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:46:07.797586393 +0000 UTC m=+116.553757251 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.797690 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.797700 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.797759 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.797864 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:46:07.79782981 +0000 UTC m=+116.554000668 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.797746 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.798016 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.797907 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.798105 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 09:46:07.798071278 +0000 UTC m=+116.554242126 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.798108 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.798136 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.798150 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.798170 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.798177 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.798260 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:46:07.798238373 +0000 UTC m=+116.554409231 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.798295 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:46:07.798279194 +0000 UTC m=+116.554450162 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.827329 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.827416 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.827434 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.827458 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.827474 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:51Z","lastTransitionTime":"2026-03-10T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.930292 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.930393 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.930422 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.930447 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.930463 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:51Z","lastTransitionTime":"2026-03-10T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.998758 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.998979 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.999542 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:51 crc kubenswrapper[4794]: E0310 09:45:51.999753 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:45:51 crc kubenswrapper[4794]: I0310 09:45:51.999793 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:52 crc kubenswrapper[4794]: E0310 09:45:51.999999 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.017021 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a048416
65aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.031131 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.033118 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.033189 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.033216 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.033247 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.033272 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:52Z","lastTransitionTime":"2026-03-10T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.046948 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.069038 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.086191 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.099032 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.112798 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.127260 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z"
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.137221 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.137262 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.137273 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.137294 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.137305 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:52Z","lastTransitionTime":"2026-03-10T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.240276 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.240346 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.240360 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.240378 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.240390 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:52Z","lastTransitionTime":"2026-03-10T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.243793 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-dc7fw"]
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.244081 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dc7fw"
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.246878 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.246977 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.247663 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.270032 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.287226 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.301376 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/441d4601-197c-4325-84bf-cef005ae408b-hosts-file\") pod \"node-resolver-dc7fw\" (UID: 
\"441d4601-197c-4325-84bf-cef005ae408b\") " pod="openshift-dns/node-resolver-dc7fw" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.301510 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxxtq\" (UniqueName: \"kubernetes.io/projected/441d4601-197c-4325-84bf-cef005ae408b-kube-api-access-pxxtq\") pod \"node-resolver-dc7fw\" (UID: \"441d4601-197c-4325-84bf-cef005ae408b\") " pod="openshift-dns/node-resolver-dc7fw" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.308305 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.326062 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.340437 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.342773 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.342929 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.343015 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.343116 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.343205 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:52Z","lastTransitionTime":"2026-03-10T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.358484 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.375477 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.393422 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.402501 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxxtq\" (UniqueName: \"kubernetes.io/projected/441d4601-197c-4325-84bf-cef005ae408b-kube-api-access-pxxtq\") pod \"node-resolver-dc7fw\" (UID: \"441d4601-197c-4325-84bf-cef005ae408b\") " pod="openshift-dns/node-resolver-dc7fw" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.402593 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/441d4601-197c-4325-84bf-cef005ae408b-hosts-file\") pod \"node-resolver-dc7fw\" (UID: \"441d4601-197c-4325-84bf-cef005ae408b\") " pod="openshift-dns/node-resolver-dc7fw" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.402701 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/441d4601-197c-4325-84bf-cef005ae408b-hosts-file\") pod \"node-resolver-dc7fw\" (UID: \"441d4601-197c-4325-84bf-cef005ae408b\") " pod="openshift-dns/node-resolver-dc7fw" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.408452 4794 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.431080 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxxtq\" (UniqueName: \"kubernetes.io/projected/441d4601-197c-4325-84bf-cef005ae408b-kube-api-access-pxxtq\") pod \"node-resolver-dc7fw\" (UID: \"441d4601-197c-4325-84bf-cef005ae408b\") " pod="openshift-dns/node-resolver-dc7fw" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.446772 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.446831 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.446842 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.446886 4794 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.446902 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:52Z","lastTransitionTime":"2026-03-10T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.549531 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.549579 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.549590 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.549610 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.549625 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:52Z","lastTransitionTime":"2026-03-10T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.557795 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dc7fw" Mar 10 09:45:52 crc kubenswrapper[4794]: W0310 09:45:52.571310 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod441d4601_197c_4325_84bf_cef005ae408b.slice/crio-30f86fd41e15041f4051596477ce3b47d90fb53afacfbe93b367c69047634c28 WatchSource:0}: Error finding container 30f86fd41e15041f4051596477ce3b47d90fb53afacfbe93b367c69047634c28: Status 404 returned error can't find the container with id 30f86fd41e15041f4051596477ce3b47d90fb53afacfbe93b367c69047634c28 Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.641034 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-zr89w"] Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.641738 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-jpdth"] Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.641901 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-69278"] Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.642180 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.642268 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.642281 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.644570 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.645739 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.645775 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.645940 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.646088 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.646173 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.647154 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.647231 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.647791 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.647792 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.647913 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.647994 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.651583 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.651653 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.651671 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.651696 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.651714 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:52Z","lastTransitionTime":"2026-03-10T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.662048 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.677825 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.691518 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.700885 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705526 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/749e1060-d177-43e4-9f39-d1b3e9b573b3-system-cni-dir\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705553 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-multus-conf-dir\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705570 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/071a79a8-a892-4d38-a255-2a19483b64aa-proxy-tls\") pod 
\"machine-config-daemon-69278\" (UID: \"071a79a8-a892-4d38-a255-2a19483b64aa\") " pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705584 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-hostroot\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705629 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-run-k8s-cni-cncf-io\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705645 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11028118-385a-4a2a-8bc4-49aad67ce147-cni-binary-copy\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705660 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/749e1060-d177-43e4-9f39-d1b3e9b573b3-cnibin\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705675 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-multus-cni-dir\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705692 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-os-release\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705709 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl72n\" (UniqueName: \"kubernetes.io/projected/071a79a8-a892-4d38-a255-2a19483b64aa-kube-api-access-wl72n\") pod \"machine-config-daemon-69278\" (UID: \"071a79a8-a892-4d38-a255-2a19483b64aa\") " pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705723 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/749e1060-d177-43e4-9f39-d1b3e9b573b3-cni-binary-copy\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705737 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/749e1060-d177-43e4-9f39-d1b3e9b573b3-os-release\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705749 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/749e1060-d177-43e4-9f39-d1b3e9b573b3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705762 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-var-lib-cni-bin\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705780 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/749e1060-d177-43e4-9f39-d1b3e9b573b3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705796 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-system-cni-dir\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705809 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-var-lib-cni-multus\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705822 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-var-lib-kubelet\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705835 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/071a79a8-a892-4d38-a255-2a19483b64aa-rootfs\") pod \"machine-config-daemon-69278\" (UID: \"071a79a8-a892-4d38-a255-2a19483b64aa\") " pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705848 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-cnibin\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705862 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-etc-kubernetes\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705893 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drqmz\" (UniqueName: \"kubernetes.io/projected/749e1060-d177-43e4-9f39-d1b3e9b573b3-kube-api-access-drqmz\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705907 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-multus-socket-dir-parent\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705922 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/11028118-385a-4a2a-8bc4-49aad67ce147-multus-daemon-config\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705936 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v9tn\" (UniqueName: \"kubernetes.io/projected/11028118-385a-4a2a-8bc4-49aad67ce147-kube-api-access-8v9tn\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705950 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/071a79a8-a892-4d38-a255-2a19483b64aa-mcd-auth-proxy-config\") pod \"machine-config-daemon-69278\" (UID: \"071a79a8-a892-4d38-a255-2a19483b64aa\") " pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705965 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-run-netns\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.705979 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-run-multus-certs\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.712522 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.725845 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.737971 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.746131 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.753800 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.753834 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.753842 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.753855 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.753865 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:52Z","lastTransitionTime":"2026-03-10T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.759003 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.770545 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.784123 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.799473 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806619 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl72n\" (UniqueName: \"kubernetes.io/projected/071a79a8-a892-4d38-a255-2a19483b64aa-kube-api-access-wl72n\") pod \"machine-config-daemon-69278\" (UID: \"071a79a8-a892-4d38-a255-2a19483b64aa\") " pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806647 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/749e1060-d177-43e4-9f39-d1b3e9b573b3-cni-binary-copy\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806665 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/749e1060-d177-43e4-9f39-d1b3e9b573b3-os-release\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806679 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/749e1060-d177-43e4-9f39-d1b3e9b573b3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806694 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-var-lib-cni-bin\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806710 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/749e1060-d177-43e4-9f39-d1b3e9b573b3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806730 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-system-cni-dir\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806747 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-var-lib-cni-multus\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806759 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-var-lib-kubelet\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806773 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/071a79a8-a892-4d38-a255-2a19483b64aa-rootfs\") pod \"machine-config-daemon-69278\" (UID: \"071a79a8-a892-4d38-a255-2a19483b64aa\") " pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806789 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-etc-kubernetes\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806805 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-cnibin\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806841 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/749e1060-d177-43e4-9f39-d1b3e9b573b3-os-release\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806850 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drqmz\" (UniqueName: \"kubernetes.io/projected/749e1060-d177-43e4-9f39-d1b3e9b573b3-kube-api-access-drqmz\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806925 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-multus-socket-dir-parent\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806945 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v9tn\" (UniqueName: \"kubernetes.io/projected/11028118-385a-4a2a-8bc4-49aad67ce147-kube-api-access-8v9tn\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806953 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-etc-kubernetes\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807031 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-multus-socket-dir-parent\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806953 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-var-lib-kubelet\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807026 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-var-lib-cni-bin\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807069 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-cnibin\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " 
pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.806964 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/071a79a8-a892-4d38-a255-2a19483b64aa-mcd-auth-proxy-config\") pod \"machine-config-daemon-69278\" (UID: \"071a79a8-a892-4d38-a255-2a19483b64aa\") " pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807075 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/071a79a8-a892-4d38-a255-2a19483b64aa-rootfs\") pod \"machine-config-daemon-69278\" (UID: \"071a79a8-a892-4d38-a255-2a19483b64aa\") " pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807119 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/11028118-385a-4a2a-8bc4-49aad67ce147-multus-daemon-config\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807157 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-var-lib-cni-multus\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807163 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-system-cni-dir\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807184 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-run-netns\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807238 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-run-netns\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807281 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-run-multus-certs\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807369 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-run-multus-certs\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807366 4794 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-multus-conf-dir\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807414 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/749e1060-d177-43e4-9f39-d1b3e9b573b3-cni-binary-copy\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807424 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-multus-conf-dir\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807426 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/749e1060-d177-43e4-9f39-d1b3e9b573b3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807417 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/071a79a8-a892-4d38-a255-2a19483b64aa-proxy-tls\") pod \"machine-config-daemon-69278\" (UID: \"071a79a8-a892-4d38-a255-2a19483b64aa\") " pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807529 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/749e1060-d177-43e4-9f39-d1b3e9b573b3-system-cni-dir\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807612 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/749e1060-d177-43e4-9f39-d1b3e9b573b3-system-cni-dir\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807623 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-run-k8s-cni-cncf-io\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807675 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-host-run-k8s-cni-cncf-io\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807637 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/071a79a8-a892-4d38-a255-2a19483b64aa-mcd-auth-proxy-config\") pod \"machine-config-daemon-69278\" (UID: \"071a79a8-a892-4d38-a255-2a19483b64aa\") " pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807712 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-hostroot\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807785 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/749e1060-d177-43e4-9f39-d1b3e9b573b3-cnibin\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807801 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-hostroot\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807834 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/749e1060-d177-43e4-9f39-d1b3e9b573b3-cnibin\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807835 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-multus-cni-dir\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807881 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/11028118-385a-4a2a-8bc4-49aad67ce147-multus-daemon-config\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807883 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-os-release\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807925 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-os-release\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.807929 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11028118-385a-4a2a-8bc4-49aad67ce147-multus-cni-dir\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: 
I0310 09:45:52.807931 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11028118-385a-4a2a-8bc4-49aad67ce147-cni-binary-copy\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.808103 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/749e1060-d177-43e4-9f39-d1b3e9b573b3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.808472 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11028118-385a-4a2a-8bc4-49aad67ce147-cni-binary-copy\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.810101 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.812463 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/071a79a8-a892-4d38-a255-2a19483b64aa-proxy-tls\") pod \"machine-config-daemon-69278\" (UID: \"071a79a8-a892-4d38-a255-2a19483b64aa\") " pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.823128 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.823283 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v9tn\" (UniqueName: \"kubernetes.io/projected/11028118-385a-4a2a-8bc4-49aad67ce147-kube-api-access-8v9tn\") pod \"multus-jpdth\" (UID: \"11028118-385a-4a2a-8bc4-49aad67ce147\") " pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.824738 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drqmz\" (UniqueName: \"kubernetes.io/projected/749e1060-d177-43e4-9f39-d1b3e9b573b3-kube-api-access-drqmz\") pod \"multus-additional-cni-plugins-zr89w\" (UID: \"749e1060-d177-43e4-9f39-d1b3e9b573b3\") " pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.827522 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl72n\" (UniqueName: \"kubernetes.io/projected/071a79a8-a892-4d38-a255-2a19483b64aa-kube-api-access-wl72n\") pod \"machine-config-daemon-69278\" (UID: \"071a79a8-a892-4d38-a255-2a19483b64aa\") " pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.840239 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.853092 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.855940 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.855975 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.855985 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.856000 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.856010 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:52Z","lastTransitionTime":"2026-03-10T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.864048 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: 
I0310 09:45:52.880974 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.894150 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.910667 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.925708 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.943445 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.954322 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.957925 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 
09:45:52.957955 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.957964 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.957977 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.957987 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:52Z","lastTransitionTime":"2026-03-10T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.966740 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.976852 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zr89w" Mar 10 09:45:52 crc kubenswrapper[4794]: W0310 09:45:52.979374 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod071a79a8_a892_4d38_a255_2a19483b64aa.slice/crio-ab02ad95418cd3dea58a5ad7e06e269c6d5323d1f01c1de9b73eb1cdc002b3d5 WatchSource:0}: Error finding container ab02ad95418cd3dea58a5ad7e06e269c6d5323d1f01c1de9b73eb1cdc002b3d5: Status 404 returned error can't find the container with id ab02ad95418cd3dea58a5ad7e06e269c6d5323d1f01c1de9b73eb1cdc002b3d5 Mar 10 09:45:52 crc kubenswrapper[4794]: I0310 09:45:52.985861 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jpdth" Mar 10 09:45:52 crc kubenswrapper[4794]: W0310 09:45:52.987989 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod749e1060_d177_43e4_9f39_d1b3e9b573b3.slice/crio-385a1fe09f11a3bbd921bbb356db644dc3a78706de29c78152b506d109668ebf WatchSource:0}: Error finding container 385a1fe09f11a3bbd921bbb356db644dc3a78706de29c78152b506d109668ebf: Status 404 returned error can't find the container with id 385a1fe09f11a3bbd921bbb356db644dc3a78706de29c78152b506d109668ebf Mar 10 09:45:53 crc kubenswrapper[4794]: W0310 09:45:53.006970 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11028118_385a_4a2a_8bc4_49aad67ce147.slice/crio-d64c429adfec0875481a44d4f512a6b38d2a0cb584fd54925fcb0fe270518afd WatchSource:0}: Error finding container d64c429adfec0875481a44d4f512a6b38d2a0cb584fd54925fcb0fe270518afd: Status 404 returned error can't find the container with id d64c429adfec0875481a44d4f512a6b38d2a0cb584fd54925fcb0fe270518afd Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.024237 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mm9nq"] Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.025270 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.027662 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.028255 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.028124 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.028162 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.028220 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.028636 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.028738 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.040040 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.058951 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.060990 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.061027 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.061039 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.061057 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.061070 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:53Z","lastTransitionTime":"2026-03-10T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.072609 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.086450 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110094 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-ovnkube-script-lib\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110134 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-etc-openvswitch\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110153 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-run-ovn-kubernetes\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.109971 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110170 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-cni-netd\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110296 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-openvswitch\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110315 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-log-socket\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110344 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-ovnkube-config\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110375 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-cni-bin\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110394 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110409 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-node-log\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110428 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dhqd\" (UniqueName: \"kubernetes.io/projected/d6907de6-7eb7-440a-a101-f492ffa28e39-kube-api-access-5dhqd\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110445 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-systemd-units\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110543 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-kubelet\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110577 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-var-lib-openvswitch\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110604 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-run-netns\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110625 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-systemd\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110662 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-slash\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110702 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-ovn\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: 
I0310 09:45:53.110715 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-env-overrides\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.110744 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6907de6-7eb7-440a-a101-f492ffa28e39-ovn-node-metrics-cert\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.123516 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.135097 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.146781 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.158840 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.163252 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 
09:45:53.163533 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.163543 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.163565 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.163574 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:53Z","lastTransitionTime":"2026-03-10T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.169341 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef5
0262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.182426 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.193175 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.205655 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s 
restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.211735 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-ovn\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.211777 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-env-overrides\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.211807 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/d6907de6-7eb7-440a-a101-f492ffa28e39-ovn-node-metrics-cert\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.211827 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-etc-openvswitch\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.211842 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-run-ovn-kubernetes\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.211863 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-ovnkube-script-lib\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.211883 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-openvswitch\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.211893 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-ovn\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.211912 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-etc-openvswitch\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.211904 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-cni-netd\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.211959 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-run-ovn-kubernetes\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.211975 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-log-socket\") pod 
\"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212001 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-ovnkube-config\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212017 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-openvswitch\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.211941 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-cni-netd\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212050 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-cni-bin\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212080 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212107 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-node-log\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212130 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dhqd\" (UniqueName: \"kubernetes.io/projected/d6907de6-7eb7-440a-a101-f492ffa28e39-kube-api-access-5dhqd\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212152 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-systemd-units\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212176 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-kubelet\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212197 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-var-lib-openvswitch\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212220 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-run-netns\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212240 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-systemd\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212267 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-slash\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212410 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-slash\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212486 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-cni-bin\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212055 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-log-socket\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212555 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212578 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-var-lib-openvswitch\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 
09:45:53.212581 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-run-netns\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212607 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-systemd\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212619 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-kubelet\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212611 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-systemd-units\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212635 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-node-log\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212701 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-env-overrides\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212784 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-ovnkube-script-lib\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.212844 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-ovnkube-config\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.216065 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6907de6-7eb7-440a-a101-f492ffa28e39-ovn-node-metrics-cert\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.226789 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dhqd\" (UniqueName: \"kubernetes.io/projected/d6907de6-7eb7-440a-a101-f492ffa28e39-kube-api-access-5dhqd\") pod \"ovnkube-node-mm9nq\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.266806 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.266847 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.266857 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.266873 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.266885 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:53Z","lastTransitionTime":"2026-03-10T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.369473 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.369505 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.369513 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.369526 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.369535 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:53Z","lastTransitionTime":"2026-03-10T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.371418 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq"
Mar 10 09:45:53 crc kubenswrapper[4794]: W0310 09:45:53.388735 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6907de6_7eb7_440a_a101_f492ffa28e39.slice/crio-37811c6f400dc349945266a557e31553144442b3b962784c47f1679857265df2 WatchSource:0}: Error finding container 37811c6f400dc349945266a557e31553144442b3b962784c47f1679857265df2: Status 404 returned error can't find the container with id 37811c6f400dc349945266a557e31553144442b3b962784c47f1679857265df2
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.424547 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerStarted","Data":"37811c6f400dc349945266a557e31553144442b3b962784c47f1679857265df2"}
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.428497 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755"}
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.428598 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142"}
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.428613 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"ab02ad95418cd3dea58a5ad7e06e269c6d5323d1f01c1de9b73eb1cdc002b3d5"}
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.431081 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpdth" event={"ID":"11028118-385a-4a2a-8bc4-49aad67ce147","Type":"ContainerStarted","Data":"31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec"}
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.431153 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpdth" event={"ID":"11028118-385a-4a2a-8bc4-49aad67ce147","Type":"ContainerStarted","Data":"d64c429adfec0875481a44d4f512a6b38d2a0cb584fd54925fcb0fe270518afd"}
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.434633 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" event={"ID":"749e1060-d177-43e4-9f39-d1b3e9b573b3","Type":"ContainerStarted","Data":"9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8"}
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.434686 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" event={"ID":"749e1060-d177-43e4-9f39-d1b3e9b573b3","Type":"ContainerStarted","Data":"385a1fe09f11a3bbd921bbb356db644dc3a78706de29c78152b506d109668ebf"}
Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.439370 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dc7fw" event={"ID":"441d4601-197c-4325-84bf-cef005ae408b","Type":"ContainerStarted","Data":"2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2"}
Mar 10 09:45:53 crc
kubenswrapper[4794]: I0310 09:45:53.439438 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dc7fw" event={"ID":"441d4601-197c-4325-84bf-cef005ae408b","Type":"ContainerStarted","Data":"30f86fd41e15041f4051596477ce3b47d90fb53afacfbe93b367c69047634c28"} Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.442049 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.454933 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec
8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.468894 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.472463 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.472506 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.472518 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.472537 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.472550 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:53Z","lastTransitionTime":"2026-03-10T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.486475 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.500056 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.515754 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.528735 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.540005 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.559360 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":
\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.571816 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.574272 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.574302 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.574310 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.574388 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.574399 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:53Z","lastTransitionTime":"2026-03-10T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.583463 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.595007 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.605309 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.625215 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.662492 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.675763 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.675809 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.675820 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.675838 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.675850 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:53Z","lastTransitionTime":"2026-03-10T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.678242 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.687702 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.707738 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.718232 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.730526 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.742717 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"
host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.753657 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.764832 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.777856 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.777908 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.777919 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.777938 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.777949 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:53Z","lastTransitionTime":"2026-03-10T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.779420 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.793680 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.810851 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":
\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.880823 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.880869 4794 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.880879 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.880896 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.880909 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:53Z","lastTransitionTime":"2026-03-10T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.983877 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.983952 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.983997 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.984023 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.984041 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:53Z","lastTransitionTime":"2026-03-10T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.998447 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.998536 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:53 crc kubenswrapper[4794]: I0310 09:45:53.998631 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:53 crc kubenswrapper[4794]: E0310 09:45:53.998650 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:45:53 crc kubenswrapper[4794]: E0310 09:45:53.998762 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:45:53 crc kubenswrapper[4794]: E0310 09:45:53.998863 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.087567 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.087643 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.087667 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.087702 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.087724 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:54Z","lastTransitionTime":"2026-03-10T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.190586 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.190663 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.190689 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.190720 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.190748 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:54Z","lastTransitionTime":"2026-03-10T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.293445 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.293504 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.293525 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.293547 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.293564 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:54Z","lastTransitionTime":"2026-03-10T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.397043 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.397133 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.397155 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.397184 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.397207 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:54Z","lastTransitionTime":"2026-03-10T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.444573 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerID="40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd" exitCode=0 Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.444644 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerDied","Data":"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd"} Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.447062 4794 generic.go:334] "Generic (PLEG): container finished" podID="749e1060-d177-43e4-9f39-d1b3e9b573b3" containerID="9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8" exitCode=0 Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.447092 4794 generic.go:334] "Generic (PLEG): container finished" podID="749e1060-d177-43e4-9f39-d1b3e9b573b3" containerID="6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e" exitCode=0 Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.447113 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" event={"ID":"749e1060-d177-43e4-9f39-d1b3e9b573b3","Type":"ContainerDied","Data":"9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8"} Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.447137 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" event={"ID":"749e1060-d177-43e4-9f39-d1b3e9b573b3","Type":"ContainerDied","Data":"6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e"} Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.466004 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.480816 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.498285 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.500270 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.500311 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.500328 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.500382 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.500399 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:54Z","lastTransitionTime":"2026-03-10T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.510542 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.527657 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.537448 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.551856 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.567236 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"
host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.582416 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.592025 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.603902 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.603943 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.603955 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.603972 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.603985 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:54Z","lastTransitionTime":"2026-03-10T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.604462 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.614995 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.635557 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z 
is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.655273 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.673759 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.699967 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z 
is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.706041 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.706077 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.706091 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.706107 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.706119 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:54Z","lastTransitionTime":"2026-03-10T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.713930 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.727599 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.741970 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.756137 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.770783 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.778898 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.792692 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.806775 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.808430 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.808463 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.808474 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.808492 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.808503 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:54Z","lastTransitionTime":"2026-03-10T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.819055 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.829195 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.911899 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.911942 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.911954 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.911969 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:54 crc kubenswrapper[4794]: I0310 09:45:54.911978 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:54Z","lastTransitionTime":"2026-03-10T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.018278 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.018327 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.018369 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.018392 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.018409 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:55Z","lastTransitionTime":"2026-03-10T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.121010 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.121055 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.121071 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.121093 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.121110 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:55Z","lastTransitionTime":"2026-03-10T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.223515 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.223595 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.223612 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.223638 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.223660 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:55Z","lastTransitionTime":"2026-03-10T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.327066 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.327119 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.327141 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.327171 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.327192 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:55Z","lastTransitionTime":"2026-03-10T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.430636 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.430703 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.430725 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.430755 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.430780 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:55Z","lastTransitionTime":"2026-03-10T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.454501 4794 generic.go:334] "Generic (PLEG): container finished" podID="749e1060-d177-43e4-9f39-d1b3e9b573b3" containerID="d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4" exitCode=0 Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.454544 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" event={"ID":"749e1060-d177-43e4-9f39-d1b3e9b573b3","Type":"ContainerDied","Data":"d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.461069 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerStarted","Data":"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.461111 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerStarted","Data":"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.461133 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerStarted","Data":"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.461151 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerStarted","Data":"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.461168 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerStarted","Data":"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.461195 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerStarted","Data":"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.483775 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 
09:45:55.499877 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.511135 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.522752 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.533367 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.533414 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.533427 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.533444 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.533455 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:55Z","lastTransitionTime":"2026-03-10T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.534208 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.546431 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff
0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.555072 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.570113 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.580363 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.591048 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.601043 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.616662 4794 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.630363 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.636420 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.636468 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.636482 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.636502 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.636518 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:55Z","lastTransitionTime":"2026-03-10T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.738135 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.738173 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.738182 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.738198 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.738207 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:55Z","lastTransitionTime":"2026-03-10T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.841025 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.841053 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.841061 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.841074 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.841082 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:55Z","lastTransitionTime":"2026-03-10T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.944112 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.944156 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.944168 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.944188 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.944198 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:55Z","lastTransitionTime":"2026-03-10T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.998106 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.998200 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.998368 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:55 crc kubenswrapper[4794]: E0310 09:45:55.998448 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:45:55 crc kubenswrapper[4794]: E0310 09:45:55.998548 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:45:55 crc kubenswrapper[4794]: E0310 09:45:55.998660 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:45:55 crc kubenswrapper[4794]: I0310 09:45:55.998908 4794 scope.go:117] "RemoveContainer" containerID="bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe" Mar 10 09:45:55 crc kubenswrapper[4794]: E0310 09:45:55.999274 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.046490 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.046532 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.046549 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.046571 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.046587 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:56Z","lastTransitionTime":"2026-03-10T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.149870 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.149930 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.149947 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.149970 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.149993 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:56Z","lastTransitionTime":"2026-03-10T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.253426 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.253473 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.253483 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.253497 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.253505 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:56Z","lastTransitionTime":"2026-03-10T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.356079 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.356493 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.356508 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.356529 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.356545 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:56Z","lastTransitionTime":"2026-03-10T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.459751 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.459833 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.459859 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.459886 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.459914 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:56Z","lastTransitionTime":"2026-03-10T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.466862 4794 generic.go:334] "Generic (PLEG): container finished" podID="749e1060-d177-43e4-9f39-d1b3e9b573b3" containerID="06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32" exitCode=0 Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.466914 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" event={"ID":"749e1060-d177-43e4-9f39-d1b3e9b573b3","Type":"ContainerDied","Data":"06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32"} Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.480193 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:56Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.494650 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:56Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.517986 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:56Z 
is after 2025-08-24T17:21:41Z" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.530936 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:56Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.541595 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:56Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.554284 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:56Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.562015 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.562079 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.562099 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.562413 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.562451 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:56Z","lastTransitionTime":"2026-03-10T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.565095 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:56Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.577490 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:56Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.586753 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:56Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.599939 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:56Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.611551 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:56Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.623728 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:56Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.637431 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:56Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.665379 4794 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.665416 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.665428 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.665446 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.665458 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:56Z","lastTransitionTime":"2026-03-10T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.768075 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.768136 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.768154 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.768179 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.768196 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:56Z","lastTransitionTime":"2026-03-10T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.871048 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.871106 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.871134 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.871167 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.871192 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:56Z","lastTransitionTime":"2026-03-10T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.973464 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.973503 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.973511 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.973531 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:56 crc kubenswrapper[4794]: I0310 09:45:56.973540 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:56Z","lastTransitionTime":"2026-03-10T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.013436 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.076053 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.076093 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.076103 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.076120 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.076129 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:57Z","lastTransitionTime":"2026-03-10T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.178844 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.178886 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.178898 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.178915 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.178928 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:57Z","lastTransitionTime":"2026-03-10T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.281551 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.281581 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.281589 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.281602 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.281612 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:57Z","lastTransitionTime":"2026-03-10T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.383893 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.383925 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.383933 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.383946 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.383955 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:57Z","lastTransitionTime":"2026-03-10T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.474209 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerStarted","Data":"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd"} Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.478251 4794 generic.go:334] "Generic (PLEG): container finished" podID="749e1060-d177-43e4-9f39-d1b3e9b573b3" containerID="080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c" exitCode=0 Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.478422 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" event={"ID":"749e1060-d177-43e4-9f39-d1b3e9b573b3","Type":"ContainerDied","Data":"080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c"} Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.485980 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.486018 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.486030 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.486045 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.486056 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:57Z","lastTransitionTime":"2026-03-10T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.498229 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drq
mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.511643 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.526468 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.536804 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.550819 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.561579 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.573628 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.584809 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.589009 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.589035 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.589044 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.589058 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.589067 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:57Z","lastTransitionTime":"2026-03-10T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.602081 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f
2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.622099 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1
f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.634758 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.646924 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.656756 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.667721 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.691742 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.691776 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.691787 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.691803 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.691814 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:57Z","lastTransitionTime":"2026-03-10T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.795435 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.795768 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.795786 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.795813 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.795831 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:57Z","lastTransitionTime":"2026-03-10T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.908015 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.908068 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.908085 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.908112 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.908130 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:57Z","lastTransitionTime":"2026-03-10T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.998611 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.998699 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:57 crc kubenswrapper[4794]: E0310 09:45:57.998760 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:45:57 crc kubenswrapper[4794]: I0310 09:45:57.998777 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:57 crc kubenswrapper[4794]: E0310 09:45:57.998950 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:45:57 crc kubenswrapper[4794]: E0310 09:45:57.999089 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.009890 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.009926 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.009938 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.009955 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.009968 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:58Z","lastTransitionTime":"2026-03-10T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.112659 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.112718 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.112733 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.112755 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.112770 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:58Z","lastTransitionTime":"2026-03-10T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.215781 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.215842 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.215860 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.215885 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.215904 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:58Z","lastTransitionTime":"2026-03-10T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.318433 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.318481 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.318494 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.318511 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.318526 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:58Z","lastTransitionTime":"2026-03-10T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.421737 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.421776 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.421787 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.421805 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.421816 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:58Z","lastTransitionTime":"2026-03-10T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.487223 4794 generic.go:334] "Generic (PLEG): container finished" podID="749e1060-d177-43e4-9f39-d1b3e9b573b3" containerID="4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648" exitCode=0 Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.487328 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" event={"ID":"749e1060-d177-43e4-9f39-d1b3e9b573b3","Type":"ContainerDied","Data":"4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648"} Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.504705 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.519393 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.523966 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.524013 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.524025 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.524045 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.524061 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:58Z","lastTransitionTime":"2026-03-10T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.530578 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.541893 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.554593 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.567928 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.584517 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.594364 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.607104 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.617256 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.626393 4794 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.626428 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.626440 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.626456 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.626468 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:58Z","lastTransitionTime":"2026-03-10T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.632399 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.643552 4794 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.664609 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mount
Path\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-10T09:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.685524 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747
401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.728475 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.728513 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.728522 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.728538 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.728550 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:58Z","lastTransitionTime":"2026-03-10T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.830459 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.830509 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.830523 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.830541 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.830553 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:58Z","lastTransitionTime":"2026-03-10T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.932918 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.932956 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.932967 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.932983 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:58 crc kubenswrapper[4794]: I0310 09:45:58.932996 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:58Z","lastTransitionTime":"2026-03-10T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.014247 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-gp22j"] Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.014953 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gp22j" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.016870 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.017219 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.018815 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.019121 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.035627 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.035682 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.035701 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.035725 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.035744 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:59Z","lastTransitionTime":"2026-03-10T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.037127 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.057178 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff
0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.070875 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.093149 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.113409 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.130013 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.139048 4794 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.139080 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.139089 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.139111 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.139119 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:59Z","lastTransitionTime":"2026-03-10T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.148903 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.168560 4794 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkub
e-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574
53265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.171897 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ccf2d24b-7ac8-4da4-8629-a56d006f1292-serviceca\") pod \"node-ca-gp22j\" (UID: \"ccf2d24b-7ac8-4da4-8629-a56d006f1292\") " pod="openshift-image-registry/node-ca-gp22j" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.171937 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccf2d24b-7ac8-4da4-8629-a56d006f1292-host\") pod \"node-ca-gp22j\" (UID: \"ccf2d24b-7ac8-4da4-8629-a56d006f1292\") " pod="openshift-image-registry/node-ca-gp22j" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.171958 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8fgn\" (UniqueName: \"kubernetes.io/projected/ccf2d24b-7ac8-4da4-8629-a56d006f1292-kube-api-access-k8fgn\") pod \"node-ca-gp22j\" (UID: \"ccf2d24b-7ac8-4da4-8629-a56d006f1292\") " pod="openshift-image-registry/node-ca-gp22j" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.201379 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25f
c2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.218952 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.235878 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.241329 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.241407 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.241426 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.241450 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.241470 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:59Z","lastTransitionTime":"2026-03-10T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.250361 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.262922 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.273191 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ccf2d24b-7ac8-4da4-8629-a56d006f1292-serviceca\") pod \"node-ca-gp22j\" (UID: \"ccf2d24b-7ac8-4da4-8629-a56d006f1292\") " pod="openshift-image-registry/node-ca-gp22j" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.273254 4794 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccf2d24b-7ac8-4da4-8629-a56d006f1292-host\") pod \"node-ca-gp22j\" (UID: \"ccf2d24b-7ac8-4da4-8629-a56d006f1292\") " pod="openshift-image-registry/node-ca-gp22j" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.273283 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8fgn\" (UniqueName: \"kubernetes.io/projected/ccf2d24b-7ac8-4da4-8629-a56d006f1292-kube-api-access-k8fgn\") pod \"node-ca-gp22j\" (UID: \"ccf2d24b-7ac8-4da4-8629-a56d006f1292\") " pod="openshift-image-registry/node-ca-gp22j" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.274853 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccf2d24b-7ac8-4da4-8629-a56d006f1292-host\") pod \"node-ca-gp22j\" (UID: \"ccf2d24b-7ac8-4da4-8629-a56d006f1292\") " pod="openshift-image-registry/node-ca-gp22j" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.276723 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ccf2d24b-7ac8-4da4-8629-a56d006f1292-serviceca\") pod \"node-ca-gp22j\" (UID: \"ccf2d24b-7ac8-4da4-8629-a56d006f1292\") " pod="openshift-image-registry/node-ca-gp22j" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.280058 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.295452 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.295814 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8fgn\" (UniqueName: \"kubernetes.io/projected/ccf2d24b-7ac8-4da4-8629-a56d006f1292-kube-api-access-k8fgn\") pod \"node-ca-gp22j\" (UID: \"ccf2d24b-7ac8-4da4-8629-a56d006f1292\") " pod="openshift-image-registry/node-ca-gp22j" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.327809 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gp22j" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.346722 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.346770 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.346782 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.346799 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.346815 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:59Z","lastTransitionTime":"2026-03-10T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.450040 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.450103 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.450120 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.450138 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.450151 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:59Z","lastTransitionTime":"2026-03-10T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.525880 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.525916 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.525927 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.525943 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.525954 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:59Z","lastTransitionTime":"2026-03-10T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.531726 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" event={"ID":"749e1060-d177-43e4-9f39-d1b3e9b573b3","Type":"ContainerStarted","Data":"80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.533028 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gp22j" event={"ID":"ccf2d24b-7ac8-4da4-8629-a56d006f1292","Type":"ContainerStarted","Data":"4771bfd3072752086c0eff07f1797e1993024b89f0da7be7d40c02f7e9c033da"} Mar 10 09:45:59 crc kubenswrapper[4794]: E0310 09:45:59.544372 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z"
Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.548882 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerStarted","Data":"129f4f2f028ce030a9031869e327954dca94ca04e3cff5fe553e344d932b3301"}
Mar 10 09:45:59
crc kubenswrapper[4794]: I0310 09:45:59.549222 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.549271 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.549482 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.549501 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.549514 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.549529 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.549546 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.549540 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:59Z","lastTransitionTime":"2026-03-10T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.556083 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: E0310 09:45:59.563227 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z"
Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.566687 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.566718 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.566729 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.566743 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.566753 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:59Z","lastTransitionTime":"2026-03-10T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.570168 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.576995 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.580493 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:45:59 crc kubenswrapper[4794]: E0310 09:45:59.582403 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z"
Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.584156 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.585853 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.585877 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.585888 4794 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.585905 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.585917 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:59Z","lastTransitionTime":"2026-03-10T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.596909 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: E0310 09:45:59.598635 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.602297 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.602354 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.602369 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.602391 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.602407 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:59Z","lastTransitionTime":"2026-03-10T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.608041 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: E0310 09:45:59.614899 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: E0310 09:45:59.615061 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.616682 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.616723 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.616735 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.616752 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.616763 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:59Z","lastTransitionTime":"2026-03-10T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.623262 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.633890 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.646002 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.660944 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.675145 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.687480 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.706786 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.719121 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.719163 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.719172 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.719189 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.719201 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:59Z","lastTransitionTime":"2026-03-10T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.719355 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.731211 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.749468 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z 
is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.762926 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.773828 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.793228 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129f4f2f028ce030a9031869e327954dca94ca04
e3cff5fe553e344d932b3301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.810492 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25f
c2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.821627 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.821662 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.821671 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.821684 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.821694 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:59Z","lastTransitionTime":"2026-03-10T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.822666 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.836364 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.848873 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.858801 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.872918 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.886539 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.898922 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cn
i/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.913502 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.923987 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.924066 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.924085 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.924108 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.924128 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:45:59Z","lastTransitionTime":"2026-03-10T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.926422 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.940725 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.950698 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.998230 4794 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:45:59 crc kubenswrapper[4794]: E0310 09:45:59.998391 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.998780 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:45:59 crc kubenswrapper[4794]: E0310 09:45:59.998847 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:45:59 crc kubenswrapper[4794]: I0310 09:45:59.998872 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:45:59 crc kubenswrapper[4794]: E0310 09:45:59.998921 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.026627 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.026668 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.026682 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.026703 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.026717 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:00Z","lastTransitionTime":"2026-03-10T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.128779 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.128813 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.128823 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.128837 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.128846 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:00Z","lastTransitionTime":"2026-03-10T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.231420 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.231475 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.231492 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.231516 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.231564 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:00Z","lastTransitionTime":"2026-03-10T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.334195 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.334273 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.334298 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.334329 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.334391 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:00Z","lastTransitionTime":"2026-03-10T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.437589 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.437664 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.437688 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.437718 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.437739 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:00Z","lastTransitionTime":"2026-03-10T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.540517 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.540568 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.540579 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.540598 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.540612 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:00Z","lastTransitionTime":"2026-03-10T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.554090 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gp22j" event={"ID":"ccf2d24b-7ac8-4da4-8629-a56d006f1292","Type":"ContainerStarted","Data":"dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81"} Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.570534 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.584820 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.601766 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: 
I0310 09:46:00.616700 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.638980 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129f4f2f028ce030a9031869e327954dca94ca04
e3cff5fe553e344d932b3301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.642728 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.642769 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.642780 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.642797 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.642808 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:00Z","lastTransitionTime":"2026-03-10T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.659707 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.672881 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.684393 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.694486 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.704645 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.718707 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.739392 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.745656 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.745696 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.745710 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.745729 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.745740 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:00Z","lastTransitionTime":"2026-03-10T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.754700 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.772729 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff
0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.788880 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.849097 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.849181 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.849199 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.849223 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.849243 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:00Z","lastTransitionTime":"2026-03-10T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.953553 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.953647 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.953669 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.953700 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:00 crc kubenswrapper[4794]: I0310 09:46:00.953737 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:00Z","lastTransitionTime":"2026-03-10T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.056908 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.056999 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.057020 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.057045 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.057063 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:01Z","lastTransitionTime":"2026-03-10T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.160562 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.160652 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.160708 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.160737 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.160761 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:01Z","lastTransitionTime":"2026-03-10T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.263255 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.263391 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.263428 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.263469 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.263495 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:01Z","lastTransitionTime":"2026-03-10T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.366831 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.366918 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.366940 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.366967 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.366986 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:01Z","lastTransitionTime":"2026-03-10T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.469914 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.469978 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.470003 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.470032 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.470055 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:01Z","lastTransitionTime":"2026-03-10T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.572666 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.572753 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.572772 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.572797 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.572816 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:01Z","lastTransitionTime":"2026-03-10T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.675140 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.675189 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.675201 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.675220 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.675232 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:01Z","lastTransitionTime":"2026-03-10T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.777620 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.777659 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.777668 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.777681 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.777690 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:01Z","lastTransitionTime":"2026-03-10T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.880353 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.880828 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.880922 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.881008 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.881122 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:01Z","lastTransitionTime":"2026-03-10T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.983943 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.983981 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.983993 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.984011 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.984023 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:01Z","lastTransitionTime":"2026-03-10T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.998938 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.999011 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:01 crc kubenswrapper[4794]: I0310 09:46:01.998938 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:01 crc kubenswrapper[4794]: E0310 09:46:01.999072 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:01 crc kubenswrapper[4794]: E0310 09:46:01.999136 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:01 crc kubenswrapper[4794]: E0310 09:46:01.999239 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.017541 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.031595 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.050896 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129f4f2f028ce030a9031869e327954dca94ca04
e3cff5fe553e344d932b3301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.076428 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25f
c2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.086413 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.086447 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.086457 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.086482 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.086492 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:02Z","lastTransitionTime":"2026-03-10T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.089797 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.106035 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.116028 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.127009 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.138836 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.150676 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: E0310 09:46:02.154816 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6907de6_7eb7_440a_a101_f492ffa28e39.slice/crio-conmon-129f4f2f028ce030a9031869e327954dca94ca04e3cff5fe553e344d932b3301.scope\": RecentStats: unable to find data in memory cache]" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.163082 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.175158 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.188422 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.188853 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.188898 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.188910 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.188927 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.188940 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:02Z","lastTransitionTime":"2026-03-10T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.200838 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.210703 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.291220 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.291276 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.291296 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.291320 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.291368 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:02Z","lastTransitionTime":"2026-03-10T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.395543 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.395598 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.395615 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.395641 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.395657 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:02Z","lastTransitionTime":"2026-03-10T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.497574 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.497605 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.497613 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.497626 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.497634 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:02Z","lastTransitionTime":"2026-03-10T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.564959 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/0.log" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.569306 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerID="129f4f2f028ce030a9031869e327954dca94ca04e3cff5fe553e344d932b3301" exitCode=1 Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.569376 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerDied","Data":"129f4f2f028ce030a9031869e327954dca94ca04e3cff5fe553e344d932b3301"} Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.570580 4794 scope.go:117] "RemoveContainer" containerID="129f4f2f028ce030a9031869e327954dca94ca04e3cff5fe553e344d932b3301" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.591604 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.600679 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.600742 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.600766 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.600796 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.600819 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:02Z","lastTransitionTime":"2026-03-10T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.612065 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.643192 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.663622 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.681093 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.703276 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.703309 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.703322 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.703389 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.703403 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:02Z","lastTransitionTime":"2026-03-10T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.710699 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129f4f2f028ce030a9031869e327954dca94ca04e3cff5fe553e344d932b3301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129f4f2f028ce030a9031869e327954dca94ca04e3cff5fe553e344d932b3301\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:02Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.111927 6641 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112060 6641 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112249 6641 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112716 6641 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:46:02.112769 6641 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:46:02.112826 6641 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 09:46:02.112858 6641 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 09:46:02.112824 6641 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:46:02.112907 6641 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:46:02.112924 6641 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:02.112979 6641 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:02.113019 6641 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:02.113068 6641 factory.go:656] Stopping watch factory\\\\nI0310 09:46:02.113107 6641 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
09:46:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.724784 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.740763 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.757914 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.775565 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.787256 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.806419 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.806469 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.806485 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.806508 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.806525 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:02Z","lastTransitionTime":"2026-03-10T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.806964 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.819890 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.839546 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.857518 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:02Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.908765 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.908807 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.908820 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.908840 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:02 crc kubenswrapper[4794]: I0310 09:46:02.908853 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:02Z","lastTransitionTime":"2026-03-10T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.011672 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.011727 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.011746 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.011769 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.011786 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:03Z","lastTransitionTime":"2026-03-10T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.113820 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.113859 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.113871 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.113888 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.113901 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:03Z","lastTransitionTime":"2026-03-10T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.216036 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.216085 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.216102 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.216123 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.216138 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:03Z","lastTransitionTime":"2026-03-10T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.318008 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.318060 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.318079 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.318103 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.318119 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:03Z","lastTransitionTime":"2026-03-10T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.420440 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.420488 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.420502 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.420520 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.420533 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:03Z","lastTransitionTime":"2026-03-10T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.522832 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.522875 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.522896 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.522916 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.522934 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:03Z","lastTransitionTime":"2026-03-10T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.574177 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/1.log" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.574947 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/0.log" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.578320 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerID="f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425" exitCode=1 Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.578445 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerDied","Data":"f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425"} Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.578525 4794 scope.go:117] "RemoveContainer" containerID="129f4f2f028ce030a9031869e327954dca94ca04e3cff5fe553e344d932b3301" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.579167 4794 scope.go:117] "RemoveContainer" containerID="f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425" Mar 10 09:46:03 crc kubenswrapper[4794]: E0310 09:46:03.579459 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.601482 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.619056 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.626407 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.626469 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.626495 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.626526 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.626549 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:03Z","lastTransitionTime":"2026-03-10T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.641032 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.655108 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.668975 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.689792 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2
a4414ac88e3b14feca08b425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129f4f2f028ce030a9031869e327954dca94ca04e3cff5fe553e344d932b3301\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:02Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.111927 6641 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112060 6641 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112249 6641 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112716 6641 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:46:02.112769 6641 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:46:02.112826 6641 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 09:46:02.112858 6641 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 09:46:02.112824 6641 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:46:02.112907 6641 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:46:02.112924 6641 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:02.112979 6641 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:02.113019 6641 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:02.113068 6641 factory.go:656] Stopping watch factory\\\\nI0310 09:46:02.113107 6641 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:46:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:03Z\\\",\\\"message\\\":\\\"j_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nF0310 09:46:03.457799 6791 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:46:03.457805 6791 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0310 09:46:03.457647 6791 ovn.go:134] Ensuring zone local for Pod 
openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0310 09:46:03.457819 6791 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerI
D\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.705676 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.718991 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.729107 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.729161 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.729176 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.729192 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.729204 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:03Z","lastTransitionTime":"2026-03-10T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.731787 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.741457 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.753043 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.768468 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.778188 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.797618 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.813111 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.831790 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.831855 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.831873 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.831899 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.831917 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:03Z","lastTransitionTime":"2026-03-10T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.934712 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.934823 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.934890 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.934927 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.934950 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:03Z","lastTransitionTime":"2026-03-10T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.998624 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.998700 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:03 crc kubenswrapper[4794]: I0310 09:46:03.998793 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:03 crc kubenswrapper[4794]: E0310 09:46:03.998796 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:03 crc kubenswrapper[4794]: E0310 09:46:03.998949 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:03 crc kubenswrapper[4794]: E0310 09:46:03.999049 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.037429 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.037489 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.037507 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.037531 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.037551 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:04Z","lastTransitionTime":"2026-03-10T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.140287 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.140409 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.140437 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.140467 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.140490 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:04Z","lastTransitionTime":"2026-03-10T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.243009 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.243074 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.243093 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.243121 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.243145 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:04Z","lastTransitionTime":"2026-03-10T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.345871 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.345942 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.345966 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.345995 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.346018 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:04Z","lastTransitionTime":"2026-03-10T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.448905 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.448968 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.448992 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.449022 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.449042 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:04Z","lastTransitionTime":"2026-03-10T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.551177 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.551210 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.551221 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.551237 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.551249 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:04Z","lastTransitionTime":"2026-03-10T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.584024 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/1.log" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.654468 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.654562 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.654583 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.654606 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.654620 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:04Z","lastTransitionTime":"2026-03-10T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.757406 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.757475 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.757489 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.757515 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.757530 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:04Z","lastTransitionTime":"2026-03-10T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.860773 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.860833 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.860850 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.860876 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.860893 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:04Z","lastTransitionTime":"2026-03-10T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.963791 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.963877 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.963895 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.963922 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.963944 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:04Z","lastTransitionTime":"2026-03-10T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.996040 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2"] Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.996705 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" Mar 10 09:46:04 crc kubenswrapper[4794]: I0310 09:46:04.999420 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.003138 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.014629 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.027097 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.048349 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.066609 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.066648 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.066663 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.066686 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.066701 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:05Z","lastTransitionTime":"2026-03-10T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.069554 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.090656 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.114130 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2
a4414ac88e3b14feca08b425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129f4f2f028ce030a9031869e327954dca94ca04e3cff5fe553e344d932b3301\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:02Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.111927 6641 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112060 6641 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112249 6641 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112716 6641 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:46:02.112769 6641 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:46:02.112826 6641 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 09:46:02.112858 6641 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 09:46:02.112824 6641 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:46:02.112907 6641 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:46:02.112924 6641 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:02.112979 6641 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:02.113019 6641 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:02.113068 6641 factory.go:656] Stopping watch factory\\\\nI0310 09:46:02.113107 6641 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:46:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:03Z\\\",\\\"message\\\":\\\"j_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nF0310 09:46:03.457799 6791 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:46:03.457805 6791 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0310 09:46:03.457647 6791 ovn.go:134] Ensuring zone local for Pod 
openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0310 09:46:03.457819 6791 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerI
D\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.129438 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.135527 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f99fa7c7-084a-43b1-acbf-5fbc67e1a66b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9j8d2\" (UID: \"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.135630 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f99fa7c7-084a-43b1-acbf-5fbc67e1a66b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9j8d2\" (UID: \"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.135681 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f99fa7c7-084a-43b1-acbf-5fbc67e1a66b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9j8d2\" (UID: \"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\") 
" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.135722 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l2p5\" (UniqueName: \"kubernetes.io/projected/f99fa7c7-084a-43b1-acbf-5fbc67e1a66b-kube-api-access-8l2p5\") pod \"ovnkube-control-plane-749d76644c-9j8d2\" (UID: \"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.146995 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.163832 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.169193 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.169246 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.169266 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.169290 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.169305 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:05Z","lastTransitionTime":"2026-03-10T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.182045 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.196666 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.213069 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.231000 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.236614 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f99fa7c7-084a-43b1-acbf-5fbc67e1a66b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9j8d2\" (UID: \"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.236718 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f99fa7c7-084a-43b1-acbf-5fbc67e1a66b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9j8d2\" (UID: \"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.236778 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f99fa7c7-084a-43b1-acbf-5fbc67e1a66b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9j8d2\" (UID: \"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.236820 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l2p5\" (UniqueName: \"kubernetes.io/projected/f99fa7c7-084a-43b1-acbf-5fbc67e1a66b-kube-api-access-8l2p5\") pod \"ovnkube-control-plane-749d76644c-9j8d2\" (UID: \"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.237877 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f99fa7c7-084a-43b1-acbf-5fbc67e1a66b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9j8d2\" (UID: \"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.238349 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f99fa7c7-084a-43b1-acbf-5fbc67e1a66b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9j8d2\" (UID: \"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.245477 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f99fa7c7-084a-43b1-acbf-5fbc67e1a66b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9j8d2\" (UID: \"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.248113 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.262911 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l2p5\" (UniqueName: \"kubernetes.io/projected/f99fa7c7-084a-43b1-acbf-5fbc67e1a66b-kube-api-access-8l2p5\") pod \"ovnkube-control-plane-749d76644c-9j8d2\" (UID: 
\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.273673 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.273528 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.273721 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.273882 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.273908 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.273923 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:05Z","lastTransitionTime":"2026-03-10T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.293033 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{
\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.312704 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" Mar 10 09:46:05 crc kubenswrapper[4794]: W0310 09:46:05.333411 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf99fa7c7_084a_43b1_acbf_5fbc67e1a66b.slice/crio-367b3f9c217569388491354fe34fde68ddccca523205ba1ea951b409e0dc1c02 WatchSource:0}: Error finding container 367b3f9c217569388491354fe34fde68ddccca523205ba1ea951b409e0dc1c02: Status 404 returned error can't find the container with id 367b3f9c217569388491354fe34fde68ddccca523205ba1ea951b409e0dc1c02 Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.377213 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.377267 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.377282 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.377302 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.377315 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:05Z","lastTransitionTime":"2026-03-10T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.481823 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.481891 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.481904 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.481922 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.482288 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:05Z","lastTransitionTime":"2026-03-10T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.585431 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.585478 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.585490 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.585508 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.585519 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:05Z","lastTransitionTime":"2026-03-10T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.596030 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" event={"ID":"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b","Type":"ContainerStarted","Data":"5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5"} Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.596142 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" event={"ID":"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b","Type":"ContainerStarted","Data":"367b3f9c217569388491354fe34fde68ddccca523205ba1ea951b409e0dc1c02"} Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.688293 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.688390 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.688414 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.688445 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.688466 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:05Z","lastTransitionTime":"2026-03-10T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.752963 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-jl52w"] Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.753789 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:05 crc kubenswrapper[4794]: E0310 09:46:05.753902 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.766982 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.791442 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.791491 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.791502 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.791526 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:05 crc kubenswrapper[4794]: 
I0310 09:46:05.791536 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:05Z","lastTransitionTime":"2026-03-10T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.791989 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.818149 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.841977 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.843319 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs\") pod \"network-metrics-daemon-jl52w\" (UID: \"befc934b-d5ba-4fb4-afc6-97614b624ebc\") " pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.843388 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbh5s\" (UniqueName: \"kubernetes.io/projected/befc934b-d5ba-4fb4-afc6-97614b624ebc-kube-api-access-gbh5s\") pod \"network-metrics-daemon-jl52w\" (UID: \"befc934b-d5ba-4fb4-afc6-97614b624ebc\") " pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.859636 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.870255 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.883562 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.892670 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.893715 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.893748 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.893792 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.893810 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.893822 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:05Z","lastTransitionTime":"2026-03-10T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.905975 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf
344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.917118 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.927070 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.942090 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.944500 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbh5s\" (UniqueName: \"kubernetes.io/projected/befc934b-d5ba-4fb4-afc6-97614b624ebc-kube-api-access-gbh5s\") pod \"network-metrics-daemon-jl52w\" (UID: \"befc934b-d5ba-4fb4-afc6-97614b624ebc\") " pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.944610 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs\") pod \"network-metrics-daemon-jl52w\" (UID: \"befc934b-d5ba-4fb4-afc6-97614b624ebc\") " pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:05 crc kubenswrapper[4794]: E0310 09:46:05.944732 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:46:05 crc kubenswrapper[4794]: E0310 09:46:05.944794 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs podName:befc934b-d5ba-4fb4-afc6-97614b624ebc nodeName:}" failed. No retries permitted until 2026-03-10 09:46:06.444777655 +0000 UTC m=+115.200948473 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs") pod "network-metrics-daemon-jl52w" (UID: "befc934b-d5ba-4fb4-afc6-97614b624ebc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.953686 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 
2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.965719 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbh5s\" (UniqueName: \"kubernetes.io/projected/befc934b-d5ba-4fb4-afc6-97614b624ebc-kube-api-access-gbh5s\") pod \"network-metrics-daemon-jl52w\" (UID: \"befc934b-d5ba-4fb4-afc6-97614b624ebc\") " pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.977016 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.995416 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.995457 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.995468 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.995487 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.995500 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:05Z","lastTransitionTime":"2026-03-10T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.997058 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:05Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.998275 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.998323 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:05 crc kubenswrapper[4794]: E0310 09:46:05.998446 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:05 crc kubenswrapper[4794]: I0310 09:46:05.998465 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:05 crc kubenswrapper[4794]: E0310 09:46:05.998551 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:05 crc kubenswrapper[4794]: E0310 09:46:05.998702 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.008965 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.024943 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129f4f2f028ce030a9031869e327954dca94ca04e3cff5fe553e344d932b3301\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:02Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.111927 6641 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112060 6641 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112249 6641 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112716 6641 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:46:02.112769 6641 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:46:02.112826 6641 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 09:46:02.112858 6641 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 09:46:02.112824 6641 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:46:02.112907 6641 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:46:02.112924 6641 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:02.112979 6641 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:02.113019 6641 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:02.113068 6641 factory.go:656] Stopping watch factory\\\\nI0310 09:46:02.113107 6641 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:46:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:03Z\\\",\\\"message\\\":\\\"j_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nF0310 09:46:03.457799 6791 ovnkube.go:137] failed to 
run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:46:03.457805 6791 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0310 09:46:03.457647 6791 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0310 09:46:03.457819 6791 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.097957 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.098004 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.098019 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.098042 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.098056 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:06Z","lastTransitionTime":"2026-03-10T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.201320 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.201400 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.201415 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.201435 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.201445 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:06Z","lastTransitionTime":"2026-03-10T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.303818 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.303878 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.303893 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.303914 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.303930 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:06Z","lastTransitionTime":"2026-03-10T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.407193 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.407248 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.407266 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.407289 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.407306 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:06Z","lastTransitionTime":"2026-03-10T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.450568 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs\") pod \"network-metrics-daemon-jl52w\" (UID: \"befc934b-d5ba-4fb4-afc6-97614b624ebc\") " pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:06 crc kubenswrapper[4794]: E0310 09:46:06.450747 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:46:06 crc kubenswrapper[4794]: E0310 09:46:06.450841 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs podName:befc934b-d5ba-4fb4-afc6-97614b624ebc nodeName:}" failed. No retries permitted until 2026-03-10 09:46:07.450817518 +0000 UTC m=+116.206988376 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs") pod "network-metrics-daemon-jl52w" (UID: "befc934b-d5ba-4fb4-afc6-97614b624ebc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.511378 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.511492 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.511573 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.511604 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.511628 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:06Z","lastTransitionTime":"2026-03-10T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.601520 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" event={"ID":"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b","Type":"ContainerStarted","Data":"1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f"} Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.613837 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.613914 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.613939 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.613970 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.613989 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:06Z","lastTransitionTime":"2026-03-10T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.623764 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.642975 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.662847 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.678518 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.694778 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.712667 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 
09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.717164 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.717271 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.717287 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.717309 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.717323 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:06Z","lastTransitionTime":"2026-03-10T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.729999 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.744466 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.762134 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.776808 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.796589 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.814385 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.819781 4794 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.819817 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.819829 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.819845 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.819858 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:06Z","lastTransitionTime":"2026-03-10T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.829614 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.853527 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25f
c2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.868530 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.880025 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.898630 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129f4f2f028ce030a9031869e327954dca94ca04e3cff5fe553e344d932b3301\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:02Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.111927 6641 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112060 6641 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112249 6641 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112716 6641 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:46:02.112769 6641 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:46:02.112826 6641 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 09:46:02.112858 6641 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 09:46:02.112824 6641 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:46:02.112907 6641 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:46:02.112924 6641 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:02.112979 6641 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:02.113019 6641 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:02.113068 6641 factory.go:656] Stopping watch factory\\\\nI0310 09:46:02.113107 6641 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:46:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:03Z\\\",\\\"message\\\":\\\"j_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nF0310 09:46:03.457799 6791 ovnkube.go:137] failed to 
run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:46:03.457805 6791 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0310 09:46:03.457647 6791 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0310 09:46:03.457819 6791 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:06Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.921849 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.921892 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.921903 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.921919 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.921931 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:06Z","lastTransitionTime":"2026-03-10T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.998008 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:06 crc kubenswrapper[4794]: E0310 09:46:06.998310 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:06 crc kubenswrapper[4794]: I0310 09:46:06.998721 4794 scope.go:117] "RemoveContainer" containerID="bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.024229 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.024266 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.024278 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.024300 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.024313 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:07Z","lastTransitionTime":"2026-03-10T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.127998 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.128274 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.128286 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.128302 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.128312 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:07Z","lastTransitionTime":"2026-03-10T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.231447 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.231485 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.231494 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.231512 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.231523 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:07Z","lastTransitionTime":"2026-03-10T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.334242 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.334278 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.334288 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.334304 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.334329 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:07Z","lastTransitionTime":"2026-03-10T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.437546 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.437583 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.437592 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.437604 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.437613 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:07Z","lastTransitionTime":"2026-03-10T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
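The repeating Ready=False condition is the kubelet relaying the runtime's NetworkReady check: nothing is found in /etc/kubernetes/cni/net.d/ because the ovnkube-controller container that would write the OVN-Kubernetes CNI config keeps exiting (see the crash earlier in the journal). A rough sketch of the directory probe behind the "no CNI configuration file" message follows; the path is taken from the log, but the extension list is a simplification of the runtime's real config loader, not CRI-O's actual code.

    import glob
    import os

    # Path taken from the NetworkReady message; the extension list mirrors
    # common CNI config names and is an assumption, not CRI-O's loader.
    conf_dir = "/etc/kubernetes/cni/net.d"
    confs = [p for p in sorted(glob.glob(os.path.join(conf_dir, "*")))
             if p.endswith((".conf", ".conflist", ".json"))]
    print(confs or f"no CNI configuration file in {conf_dir} -> NotReady")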
Has your network provider started?"} Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.461710 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs\") pod \"network-metrics-daemon-jl52w\" (UID: \"befc934b-d5ba-4fb4-afc6-97614b624ebc\") " pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.461848 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.461911 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs podName:befc934b-d5ba-4fb4-afc6-97614b624ebc nodeName:}" failed. No retries permitted until 2026-03-10 09:46:09.461899033 +0000 UTC m=+118.218069851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs") pod "network-metrics-daemon-jl52w" (UID: "befc934b-d5ba-4fb4-afc6-97614b624ebc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.539713 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.539792 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.539806 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.539826 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.539842 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:07Z","lastTransitionTime":"2026-03-10T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
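The metrics-certs failure just above also shows the volume manager's retry throttling: the SetUp attempt that failed at 09:46:07.461899 is barred from retrying before 09:46:09.461899 (durationBeforeRetry 2s). Here is a toy model of that per-operation backoff, assuming the common doubling-with-cap scheme; only the 2s base is taken from the log, while the doubling factor and the cap are assumptions for illustration.

    from datetime import datetime, timedelta, timezone

    def next_retry(failed_at, attempt, base=timedelta(seconds=2),
                   cap=timedelta(minutes=2)):
        # Attempt 0 waits base, attempt 1 waits 2*base, ... up to cap.
        return failed_at + min(base * (2 ** attempt), cap)

    failed_at = datetime(2026, 3, 10, 9, 46, 7, tzinfo=timezone.utc)
    print(next_retry(failed_at, 0).time())   # 09:46:09, as in the log entry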
Has your network provider started?"} Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.608410 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.610954 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822"} Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.611771 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.632930 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\
\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.642953 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.643010 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.643025 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.643054 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.643068 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:07Z","lastTransitionTime":"2026-03-10T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.652867 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.665688 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.681973 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.695910 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.713379 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.725843 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.739992 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.745968 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.746022 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.746038 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.746056 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.746070 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:07Z","lastTransitionTime":"2026-03-10T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.763923 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129f4f2f028ce030a9031869e327954dca94ca04e3cff5fe553e344d932b3301\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:02Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.111927 6641 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112060 6641 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112249 6641 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112716 6641 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:46:02.112769 6641 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:46:02.112826 6641 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 09:46:02.112858 6641 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 09:46:02.112824 6641 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:46:02.112907 6641 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:46:02.112924 6641 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:02.112979 6641 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:02.113019 6641 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:02.113068 6641 factory.go:656] Stopping watch factory\\\\nI0310 09:46:02.113107 6641 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
09:46:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:03Z\\\",\\\"message\\\":\\\"j_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nF0310 09:46:03.457799 6791 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:46:03.457805 6791 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0310 09:46:03.457647 6791 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0310 09:46:03.457819 6791 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-machine-config-operat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.794415 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab4854
05a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.809572 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.824321 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.841353 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.848594 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.848655 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.848679 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.848711 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.848736 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:07Z","lastTransitionTime":"2026-03-10T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.866004 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.866203 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.866269 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:46:39.866235003 +0000 UTC m=+148.622405871 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.866417 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.866438 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.866453 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.866631 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.866545 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.866690 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:46:07 crc 
kubenswrapper[4794]: E0310 09:46:07.866714 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.866562 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.866760 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.866769 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:46:39.866732899 +0000 UTC m=+148.622903747 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.866898 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:46:39.866872423 +0000 UTC m=+148.623043291 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.867000 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.867038 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:46:39.867010027 +0000 UTC m=+148.623180895 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.867065 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.867108 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:46:39.86709704 +0000 UTC m=+148.623267948 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.876465 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"
podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.897129 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.911812 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.922368 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:07Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.951638 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.951691 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.951703 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.951720 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.951733 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:07Z","lastTransitionTime":"2026-03-10T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.998269 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.998370 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.998440 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:07 crc kubenswrapper[4794]: I0310 09:46:07.998476 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.998615 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:07 crc kubenswrapper[4794]: E0310 09:46:07.998710 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.054727 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.054784 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.054802 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.054827 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.054849 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:08Z","lastTransitionTime":"2026-03-10T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.157754 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.157804 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.157821 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.157843 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.157861 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:08Z","lastTransitionTime":"2026-03-10T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.260504 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.260545 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.260553 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.260568 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.260578 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:08Z","lastTransitionTime":"2026-03-10T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.363962 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.364003 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.364014 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.364032 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.364043 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:08Z","lastTransitionTime":"2026-03-10T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.467580 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.467643 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.467666 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.467694 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.467717 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:08Z","lastTransitionTime":"2026-03-10T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.570570 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.570656 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.570671 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.570691 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.570729 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:08Z","lastTransitionTime":"2026-03-10T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.672989 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.673081 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.673102 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.673157 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.673175 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:08Z","lastTransitionTime":"2026-03-10T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.776569 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.776618 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.776654 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.776676 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.776687 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:08Z","lastTransitionTime":"2026-03-10T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.879414 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.879451 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.879463 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.879479 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.879493 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:08Z","lastTransitionTime":"2026-03-10T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.981626 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.981664 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.981676 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.981694 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.981706 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:08Z","lastTransitionTime":"2026-03-10T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:08 crc kubenswrapper[4794]: I0310 09:46:08.998325 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:08 crc kubenswrapper[4794]: E0310 09:46:08.998575 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.084393 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.084458 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.084481 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.084508 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.084529 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:09Z","lastTransitionTime":"2026-03-10T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.187382 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.187445 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.187466 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.187489 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.187506 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:09Z","lastTransitionTime":"2026-03-10T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.291055 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.291130 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.291150 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.291174 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.291193 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:09Z","lastTransitionTime":"2026-03-10T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.394155 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.394216 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.394233 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.394256 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.394273 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:09Z","lastTransitionTime":"2026-03-10T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.485415 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs\") pod \"network-metrics-daemon-jl52w\" (UID: \"befc934b-d5ba-4fb4-afc6-97614b624ebc\") " pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:09 crc kubenswrapper[4794]: E0310 09:46:09.485643 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:46:09 crc kubenswrapper[4794]: E0310 09:46:09.485773 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs podName:befc934b-d5ba-4fb4-afc6-97614b624ebc nodeName:}" failed. No retries permitted until 2026-03-10 09:46:13.485746464 +0000 UTC m=+122.241917312 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs") pod "network-metrics-daemon-jl52w" (UID: "befc934b-d5ba-4fb4-afc6-97614b624ebc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.497023 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.497105 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.497132 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.497162 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.497184 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:09Z","lastTransitionTime":"2026-03-10T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.599990 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.600027 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.600039 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.600055 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.600066 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:09Z","lastTransitionTime":"2026-03-10T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.703082 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.703126 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.703141 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.703162 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.703177 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:09Z","lastTransitionTime":"2026-03-10T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.716920 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.716975 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.716989 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.717010 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.717034 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:09Z","lastTransitionTime":"2026-03-10T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:09 crc kubenswrapper[4794]: E0310 09:46:09.728331 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.732148 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.732184 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.732195 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.732211 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.732224 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:09Z","lastTransitionTime":"2026-03-10T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:09 crc kubenswrapper[4794]: E0310 09:46:09.744128 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.747573 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.747613 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.747624 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.747641 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.747653 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:09Z","lastTransitionTime":"2026-03-10T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:09 crc kubenswrapper[4794]: E0310 09:46:09.757498 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.760585 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.760621 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.760635 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.760658 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.760671 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:09Z","lastTransitionTime":"2026-03-10T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:09 crc kubenswrapper[4794]: E0310 09:46:09.771312 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.774729 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.774788 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.774806 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.774834 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.774853 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:09Z","lastTransitionTime":"2026-03-10T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:09 crc kubenswrapper[4794]: E0310 09:46:09.787414 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:09 crc kubenswrapper[4794]: E0310 09:46:09.787577 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.805768 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.805796 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.805803 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.805818 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.805830 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:09Z","lastTransitionTime":"2026-03-10T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.908214 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.908255 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.908267 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.908284 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.908296 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:09Z","lastTransitionTime":"2026-03-10T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.998524 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.998555 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:09 crc kubenswrapper[4794]: E0310 09:46:09.998703 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:09 crc kubenswrapper[4794]: E0310 09:46:09.998827 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:09 crc kubenswrapper[4794]: I0310 09:46:09.998927 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:09 crc kubenswrapper[4794]: E0310 09:46:09.999028 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.010896 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.010949 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.010967 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.010990 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.011008 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:10Z","lastTransitionTime":"2026-03-10T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.114236 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.114274 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.114287 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.114303 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.114314 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:10Z","lastTransitionTime":"2026-03-10T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.217499 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.217540 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.217555 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.217589 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.217608 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:10Z","lastTransitionTime":"2026-03-10T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.319547 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.319604 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.319616 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.319635 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.319659 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:10Z","lastTransitionTime":"2026-03-10T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.422888 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.422917 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.422927 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.422942 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.422952 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:10Z","lastTransitionTime":"2026-03-10T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.525666 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.525693 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.525702 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.525714 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.525722 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:10Z","lastTransitionTime":"2026-03-10T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.632783 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.632866 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.632890 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.632922 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.632943 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:10Z","lastTransitionTime":"2026-03-10T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.735396 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.735440 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.735453 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.735475 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.735488 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:10Z","lastTransitionTime":"2026-03-10T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.838629 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.838682 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.838699 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.838721 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.838738 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:10Z","lastTransitionTime":"2026-03-10T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.941886 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.941951 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.941974 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.942005 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.942025 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:10Z","lastTransitionTime":"2026-03-10T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:10 crc kubenswrapper[4794]: I0310 09:46:10.998719 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:10 crc kubenswrapper[4794]: E0310 09:46:10.998974 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.009204 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.044992 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.045055 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.045068 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.045087 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.045102 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:11Z","lastTransitionTime":"2026-03-10T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.148010 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.148049 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.148058 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.148072 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.148081 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:11Z","lastTransitionTime":"2026-03-10T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.251820 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.251876 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.251896 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.251924 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.251948 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:11Z","lastTransitionTime":"2026-03-10T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.354707 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.354778 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.354811 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.354842 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.354863 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:11Z","lastTransitionTime":"2026-03-10T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.457463 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.457520 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.457538 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.457562 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.457579 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:11Z","lastTransitionTime":"2026-03-10T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.560829 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.560884 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.560907 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.560936 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.560957 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:11Z","lastTransitionTime":"2026-03-10T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.664280 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.664401 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.664421 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.664452 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.664469 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:11Z","lastTransitionTime":"2026-03-10T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.767289 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.767335 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.767369 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.767388 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.767400 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:11Z","lastTransitionTime":"2026-03-10T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.869784 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.870114 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.870190 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.870287 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.870375 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:11Z","lastTransitionTime":"2026-03-10T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:11 crc kubenswrapper[4794]: E0310 09:46:11.971189 4794 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.998042 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.998041 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:11 crc kubenswrapper[4794]: E0310 09:46:11.998476 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:11 crc kubenswrapper[4794]: E0310 09:46:11.998584 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:11 crc kubenswrapper[4794]: I0310 09:46:11.998866 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:11 crc kubenswrapper[4794]: E0310 09:46:11.999475 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.033012 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf69
4a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.048141 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.066527 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: E0310 09:46:12.070999 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.104094 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a1
9e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129f4f2f028ce030a9031869e327954dca94ca04e3cff5fe553e344d932b3301\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:02Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.111927 6641 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112060 6641 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112249 6641 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 09:46:02.112716 6641 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:46:02.112769 6641 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:46:02.112826 6641 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 09:46:02.112858 6641 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 09:46:02.112824 6641 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:46:02.112907 6641 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:46:02.112924 6641 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:02.112979 6641 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:02.113019 6641 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:02.113068 6641 factory.go:656] Stopping watch factory\\\\nI0310 09:46:02.113107 6641 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
09:46:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:03Z\\\",\\\"message\\\":\\\"j_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nF0310 09:46:03.457799 6791 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:46:03.457805 6791 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0310 09:46:03.457647 6791 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0310 09:46:03.457819 6791 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-machine-config-operat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.124270 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.138243 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.151816 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.161818 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.171912 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.183277 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 
09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.195088 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.207724 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.220829 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{
\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"con
tainerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.234957 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\"
,\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.249105 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.259979 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.270646 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.280809 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:12 crc kubenswrapper[4794]: I0310 09:46:12.998828 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:12 crc kubenswrapper[4794]: E0310 09:46:12.999023 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:13 crc kubenswrapper[4794]: I0310 09:46:13.528994 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs\") pod \"network-metrics-daemon-jl52w\" (UID: \"befc934b-d5ba-4fb4-afc6-97614b624ebc\") " pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:13 crc kubenswrapper[4794]: E0310 09:46:13.529220 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:46:13 crc kubenswrapper[4794]: E0310 09:46:13.529494 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs podName:befc934b-d5ba-4fb4-afc6-97614b624ebc nodeName:}" failed. No retries permitted until 2026-03-10 09:46:21.529469854 +0000 UTC m=+130.285640702 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs") pod "network-metrics-daemon-jl52w" (UID: "befc934b-d5ba-4fb4-afc6-97614b624ebc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:46:13 crc kubenswrapper[4794]: I0310 09:46:13.998325 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:13 crc kubenswrapper[4794]: I0310 09:46:13.998467 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:13 crc kubenswrapper[4794]: E0310 09:46:13.998500 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:13 crc kubenswrapper[4794]: I0310 09:46:13.998351 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:13 crc kubenswrapper[4794]: E0310 09:46:13.998651 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:13 crc kubenswrapper[4794]: E0310 09:46:13.999289 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.000235 4794 scope.go:117] "RemoveContainer" containerID="f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.018529 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.039572 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.056571 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.091929 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25f
c2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.107768 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.122478 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.149697 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:03Z\\\",\\\"message\\\":\\\"j_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nF0310 09:46:03.457799 6791 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:46:03.457805 6791 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0310 09:46:03.457647 6791 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0310 09:46:03.457819 6791 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.168653 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.185985 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.204311 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.226375 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.239748 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.253227 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.266074 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 
09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.284884 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.303659 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.322951 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.340252 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.634375 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/1.log" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.638227 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerStarted","Data":"ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc"} Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.638699 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.652860 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.676581 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8
c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.692305 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-co
nf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.704473 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.713698 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.728655 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.740443 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.760249 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25f
c2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.774820 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.786588 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.808159 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:03Z\\\",\\\"message\\\":\\\"j_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nF0310 09:46:03.457799 6791 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:46:03.457805 6791 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0310 09:46:03.457647 6791 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0310 09:46:03.457819 6791 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-machine-config-operat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initCo
ntainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.818683 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.830737 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.843663 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.855581 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.865878 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.878322 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 
09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.891302 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:14 crc kubenswrapper[4794]: I0310 09:46:14.998218 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:14 crc kubenswrapper[4794]: E0310 09:46:14.998386 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.644080 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/2.log" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.644771 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/1.log" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.649375 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerID="ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc" exitCode=1 Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.649431 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerDied","Data":"ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc"} Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.649485 4794 scope.go:117] "RemoveContainer" containerID="f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.650267 4794 scope.go:117] "RemoveContainer" containerID="ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc" Mar 10 09:46:15 crc kubenswrapper[4794]: E0310 09:46:15.650506 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.669967 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.679944 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.690739 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.704119 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 
09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.715526 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.729227 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.742200 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.753700 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.767386 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.776572 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.788724 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.799447 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.812073 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.824707 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.837759 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.858881 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad3
28aaa24ec0faa047536068dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02214c7d7705fc02dcc65fa178be79e26797ae2a4414ac88e3b14feca08b425\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:03Z\\\",\\\"message\\\":\\\"j_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nF0310 09:46:03.457799 6791 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:03Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:46:03.457805 6791 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0310 09:46:03.457647 6791 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0310 09:46:03.457819 6791 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:14Z\\\",\\\"message\\\":\\\" handler 7\\\\nI0310 09:46:14.916611 7063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:46:14.916631 7063 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:14.916647 7063 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:14.916690 7063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:46:14.916708 7063 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916737 7063 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:14.916775 7063 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:46:14.916785 7063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:46:14.916793 7063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:46:14.916814 7063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916819 7063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:46:14.916818 7063 factory.go:656] Stopping watch factory\\\\nI0310 09:46:14.916836 7063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:46:14.916845 7063 
ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:46:14.916890 7063 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:46:14.916957 7063 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.888187 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25f
c2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.907054 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:15Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.998683 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.998723 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:15 crc kubenswrapper[4794]: I0310 09:46:15.998816 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:15 crc kubenswrapper[4794]: E0310 09:46:15.998894 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:15 crc kubenswrapper[4794]: E0310 09:46:15.999033 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:15 crc kubenswrapper[4794]: E0310 09:46:15.999173 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.655047 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/2.log" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.660622 4794 scope.go:117] "RemoveContainer" containerID="ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc" Mar 10 09:46:16 crc kubenswrapper[4794]: E0310 09:46:16.660976 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.674806 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.689717 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.701927 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.715490 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.746858 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad3
28aaa24ec0faa047536068dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:14Z\\\",\\\"message\\\":\\\" handler 7\\\\nI0310 09:46:14.916611 7063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:46:14.916631 7063 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:14.916647 7063 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:14.916690 7063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:46:14.916708 7063 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916737 7063 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:14.916775 7063 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:46:14.916785 7063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:46:14.916793 7063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:46:14.916814 7063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916819 7063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:46:14.916818 7063 factory.go:656] Stopping watch factory\\\\nI0310 09:46:14.916836 7063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:46:14.916845 7063 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:46:14.916890 7063 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:46:14.916957 7063 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.778623 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f3
09c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.794813 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.814805 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.827933 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.839974 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.854868 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 
09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.867763 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.880326 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.892608 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.912302 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.931944 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.946311 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.960974 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:16 crc kubenswrapper[4794]: I0310 09:46:16.998793 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:16 crc kubenswrapper[4794]: E0310 09:46:16.998955 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:17 crc kubenswrapper[4794]: E0310 09:46:17.072659 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:46:17 crc kubenswrapper[4794]: I0310 09:46:17.998480 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:17 crc kubenswrapper[4794]: I0310 09:46:17.998598 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:17 crc kubenswrapper[4794]: E0310 09:46:17.998784 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:17 crc kubenswrapper[4794]: E0310 09:46:17.998954 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:17 crc kubenswrapper[4794]: I0310 09:46:17.999041 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:17 crc kubenswrapper[4794]: E0310 09:46:17.999115 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:18 crc kubenswrapper[4794]: I0310 09:46:18.998254 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:18 crc kubenswrapper[4794]: E0310 09:46:18.998459 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.272672 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.288493 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.302988 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.316272 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.341059 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25f
c2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.354891 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.366969 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.388957 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:14Z\\\",\\\"message\\\":\\\" handler 7\\\\nI0310 09:46:14.916611 7063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:46:14.916631 7063 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:14.916647 7063 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:14.916690 7063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:46:14.916708 7063 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916737 7063 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:14.916775 7063 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:46:14.916785 7063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:46:14.916793 7063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:46:14.916814 7063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916819 7063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:46:14.916818 7063 factory.go:656] Stopping watch factory\\\\nI0310 09:46:14.916836 7063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:46:14.916845 7063 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:46:14.916890 7063 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:46:14.916957 7063 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.407223 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.426951 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.445525 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.465278 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.483402 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.498446 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.515957 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 
09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.534427 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.550236 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.571303 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.590381 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.998095 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.998162 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:19 crc kubenswrapper[4794]: E0310 09:46:19.998240 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:19 crc kubenswrapper[4794]: E0310 09:46:19.998421 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:19 crc kubenswrapper[4794]: I0310 09:46:19.998502 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:19 crc kubenswrapper[4794]: E0310 09:46:19.998594 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.054487 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.054561 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.054585 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.054613 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.054634 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:20Z","lastTransitionTime":"2026-03-10T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:20 crc kubenswrapper[4794]: E0310 09:46:20.076103 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.081092 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.081156 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.081176 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.081198 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.081214 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:20Z","lastTransitionTime":"2026-03-10T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:20 crc kubenswrapper[4794]: E0310 09:46:20.102050 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.108444 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.108505 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.108527 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.108554 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.108585 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:20Z","lastTransitionTime":"2026-03-10T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:20 crc kubenswrapper[4794]: E0310 09:46:20.133185 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.138365 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.138429 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.138453 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.138482 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.138505 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:20Z","lastTransitionTime":"2026-03-10T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:20 crc kubenswrapper[4794]: E0310 09:46:20.158271 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.162810 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.162867 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.162892 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.162922 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.162943 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:20Z","lastTransitionTime":"2026-03-10T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:20 crc kubenswrapper[4794]: E0310 09:46:20.182974 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:20 crc kubenswrapper[4794]: E0310 09:46:20.183193 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:46:20 crc kubenswrapper[4794]: I0310 09:46:20.998552 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:20 crc kubenswrapper[4794]: E0310 09:46:20.999500 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:21 crc kubenswrapper[4794]: I0310 09:46:21.614917 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs\") pod \"network-metrics-daemon-jl52w\" (UID: \"befc934b-d5ba-4fb4-afc6-97614b624ebc\") " pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:21 crc kubenswrapper[4794]: E0310 09:46:21.615100 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:46:21 crc kubenswrapper[4794]: E0310 09:46:21.615182 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs podName:befc934b-d5ba-4fb4-afc6-97614b624ebc nodeName:}" failed. No retries permitted until 2026-03-10 09:46:37.615158961 +0000 UTC m=+146.371329809 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs") pod "network-metrics-daemon-jl52w" (UID: "befc934b-d5ba-4fb4-afc6-97614b624ebc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:46:21 crc kubenswrapper[4794]: I0310 09:46:21.998657 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:21 crc kubenswrapper[4794]: I0310 09:46:21.998621 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:21 crc kubenswrapper[4794]: E0310 09:46:21.998834 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:21 crc kubenswrapper[4794]: I0310 09:46:21.998977 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:21 crc kubenswrapper[4794]: E0310 09:46:21.999022 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:21 crc kubenswrapper[4794]: E0310 09:46:21.999164 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.032519 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5
d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.054354 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.071472 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: E0310 09:46:22.073423 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.104655 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a1
9e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:14Z\\\",\\\"message\\\":\\\" handler 7\\\\nI0310 09:46:14.916611 7063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:46:14.916631 7063 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:14.916647 7063 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:14.916690 7063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:46:14.916708 7063 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916737 7063 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:14.916775 7063 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:46:14.916785 7063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:46:14.916793 7063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:46:14.916814 7063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916819 7063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:46:14.916818 7063 factory.go:656] Stopping watch factory\\\\nI0310 09:46:14.916836 7063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:46:14.916845 7063 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:46:14.916890 7063 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:46:14.916957 7063 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.117476 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.132121 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.149169 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.162101 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.179780 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.193391 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.207911 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.227077 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.242124 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.267278 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.282778 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.296109 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.309014 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.321886 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.981273 4794 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 09:46:22 crc kubenswrapper[4794]: I0310 09:46:22.998106 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:22 crc kubenswrapper[4794]: E0310 09:46:22.998292 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:23 crc kubenswrapper[4794]: I0310 09:46:23.998893 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:23 crc kubenswrapper[4794]: I0310 09:46:23.998967 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:24 crc kubenswrapper[4794]: E0310 09:46:23.999509 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:24 crc kubenswrapper[4794]: I0310 09:46:23.999054 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:24 crc kubenswrapper[4794]: E0310 09:46:23.999651 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:24 crc kubenswrapper[4794]: E0310 09:46:23.999852 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:24 crc kubenswrapper[4794]: I0310 09:46:24.998705 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:24 crc kubenswrapper[4794]: E0310 09:46:24.998954 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:25 crc kubenswrapper[4794]: I0310 09:46:25.998181 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:25 crc kubenswrapper[4794]: I0310 09:46:25.998288 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:25 crc kubenswrapper[4794]: E0310 09:46:25.998411 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:25 crc kubenswrapper[4794]: I0310 09:46:25.998442 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:25 crc kubenswrapper[4794]: E0310 09:46:25.998577 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:25 crc kubenswrapper[4794]: E0310 09:46:25.998703 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:26 crc kubenswrapper[4794]: I0310 09:46:26.998972 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:26 crc kubenswrapper[4794]: E0310 09:46:26.999448 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:27 crc kubenswrapper[4794]: I0310 09:46:27.011626 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 09:46:27 crc kubenswrapper[4794]: E0310 09:46:27.074763 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:46:27 crc kubenswrapper[4794]: I0310 09:46:27.998749 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:27 crc kubenswrapper[4794]: I0310 09:46:27.998875 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:27 crc kubenswrapper[4794]: I0310 09:46:27.998900 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:27 crc kubenswrapper[4794]: E0310 09:46:27.998990 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:28 crc kubenswrapper[4794]: E0310 09:46:27.999144 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:28 crc kubenswrapper[4794]: E0310 09:46:27.999786 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:28 crc kubenswrapper[4794]: I0310 09:46:28.000459 4794 scope.go:117] "RemoveContainer" containerID="ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc" Mar 10 09:46:28 crc kubenswrapper[4794]: E0310 09:46:28.000759 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" Mar 10 09:46:28 crc kubenswrapper[4794]: I0310 09:46:28.998481 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:28 crc kubenswrapper[4794]: E0310 09:46:28.998644 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:29 crc kubenswrapper[4794]: I0310 09:46:29.998944 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:29 crc kubenswrapper[4794]: I0310 09:46:29.998972 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:29 crc kubenswrapper[4794]: I0310 09:46:29.999052 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:29 crc kubenswrapper[4794]: E0310 09:46:29.999120 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:29 crc kubenswrapper[4794]: E0310 09:46:29.999235 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:29 crc kubenswrapper[4794]: E0310 09:46:29.999464 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.244515 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.244587 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.244623 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.244652 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.244674 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:30Z","lastTransitionTime":"2026-03-10T09:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:46:30 crc kubenswrapper[4794]: E0310 09:46:30.267969 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:30Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.278920 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.278978 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.278995 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.279017 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.279034 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:30Z","lastTransitionTime":"2026-03-10T09:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:30 crc kubenswrapper[4794]: E0310 09:46:30.300522 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:30Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.305575 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.305629 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.305650 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.305680 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.305701 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:30Z","lastTransitionTime":"2026-03-10T09:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:30 crc kubenswrapper[4794]: E0310 09:46:30.326256 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:30Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.331983 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.332052 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.332075 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.332104 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.332126 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:30Z","lastTransitionTime":"2026-03-10T09:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:30 crc kubenswrapper[4794]: E0310 09:46:30.353731 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:30Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.359593 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.359650 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.359673 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.359702 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.359725 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:30Z","lastTransitionTime":"2026-03-10T09:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:30 crc kubenswrapper[4794]: E0310 09:46:30.380250 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:30Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:30 crc kubenswrapper[4794]: E0310 09:46:30.380423 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:46:30 crc kubenswrapper[4794]: I0310 09:46:30.999059 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:30 crc kubenswrapper[4794]: E0310 09:46:30.999315 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:31 crc kubenswrapper[4794]: I0310 09:46:31.998759 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:31 crc kubenswrapper[4794]: I0310 09:46:31.998789 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:31 crc kubenswrapper[4794]: E0310 09:46:31.999444 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:31 crc kubenswrapper[4794]: I0310 09:46:31.998911 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:31 crc kubenswrapper[4794]: E0310 09:46:31.999713 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:32 crc kubenswrapper[4794]: E0310 09:46:31.999883 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.015098 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.031701 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.046530 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.066784 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37e93748-d1b9-4fcb-8dd2-3e4af26b9498\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875e6fa3d8ad5a7c381c1bf3efd51ccfa1202dd3fd9b30bc3e05fac7eb9fc3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7c9a20b63dc51728211c782f530307eafabeb7f97e5bc87bcdacd29a91658e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 09:44:43.522265 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 09:44:43.523853 1 observer_polling.go:159] Starting file observer\\\\nI0310 09:44:43.529139 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 09:44:43.530568 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 09:45:12.957570 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 09:45:13.082840 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 09:45:13.082951 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e8a850bc5db9bf7a7e73a9a71933744f371c73ceba41cd04d0b175614309ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c05c05697162298be5d52e9bf3012e05344a077cc3cf6a53839cee75b671a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aac675904ab2ca1d18eed0251bb1dafa297d7819e95189e0ea6288b164cae86\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-con
troller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: E0310 09:46:32.075626 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.102263 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.123137 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.145476 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.174588 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad3
28aaa24ec0faa047536068dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:14Z\\\",\\\"message\\\":\\\" handler 7\\\\nI0310 09:46:14.916611 7063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:46:14.916631 7063 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:14.916647 7063 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:14.916690 7063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:46:14.916708 7063 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916737 7063 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:14.916775 7063 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:46:14.916785 7063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:46:14.916793 7063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:46:14.916814 7063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916819 7063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:46:14.916818 7063 factory.go:656] Stopping watch factory\\\\nI0310 09:46:14.916836 7063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:46:14.916845 7063 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:46:14.916890 7063 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:46:14.916957 7063 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.191499 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.207305 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.221052 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.236021 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.255614 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.269917 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.283997 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.302352 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.317467 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.339626 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.361291 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:32 crc kubenswrapper[4794]: I0310 09:46:32.998877 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:32 crc kubenswrapper[4794]: E0310 09:46:32.999543 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:33 crc kubenswrapper[4794]: I0310 09:46:33.998508 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:33 crc kubenswrapper[4794]: E0310 09:46:33.998696 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:33 crc kubenswrapper[4794]: I0310 09:46:33.998833 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:33 crc kubenswrapper[4794]: E0310 09:46:33.998976 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:33 crc kubenswrapper[4794]: I0310 09:46:33.999033 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:33 crc kubenswrapper[4794]: E0310 09:46:33.999188 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:34 crc kubenswrapper[4794]: I0310 09:46:34.998478 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:34 crc kubenswrapper[4794]: E0310 09:46:34.998681 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:35 crc kubenswrapper[4794]: I0310 09:46:35.998831 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:35 crc kubenswrapper[4794]: I0310 09:46:35.998886 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:35 crc kubenswrapper[4794]: I0310 09:46:35.998827 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:35 crc kubenswrapper[4794]: E0310 09:46:35.999048 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:35 crc kubenswrapper[4794]: E0310 09:46:35.999164 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:35 crc kubenswrapper[4794]: E0310 09:46:35.999323 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:36 crc kubenswrapper[4794]: I0310 09:46:36.998319 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:36 crc kubenswrapper[4794]: E0310 09:46:36.998569 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:37 crc kubenswrapper[4794]: E0310 09:46:37.076975 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:46:37 crc kubenswrapper[4794]: I0310 09:46:37.680297 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs\") pod \"network-metrics-daemon-jl52w\" (UID: \"befc934b-d5ba-4fb4-afc6-97614b624ebc\") " pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:37 crc kubenswrapper[4794]: E0310 09:46:37.680564 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:46:37 crc kubenswrapper[4794]: E0310 09:46:37.680706 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs podName:befc934b-d5ba-4fb4-afc6-97614b624ebc nodeName:}" failed. No retries permitted until 2026-03-10 09:47:09.680672909 +0000 UTC m=+178.436843767 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs") pod "network-metrics-daemon-jl52w" (UID: "befc934b-d5ba-4fb4-afc6-97614b624ebc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:46:37 crc kubenswrapper[4794]: I0310 09:46:37.998143 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:37 crc kubenswrapper[4794]: I0310 09:46:37.998216 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:37 crc kubenswrapper[4794]: I0310 09:46:37.998157 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:37 crc kubenswrapper[4794]: E0310 09:46:37.998367 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:37 crc kubenswrapper[4794]: E0310 09:46:37.998442 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:37 crc kubenswrapper[4794]: E0310 09:46:37.998512 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:38 crc kubenswrapper[4794]: I0310 09:46:38.998448 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:38 crc kubenswrapper[4794]: E0310 09:46:38.998665 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.746304 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpdth_11028118-385a-4a2a-8bc4-49aad67ce147/kube-multus/0.log" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.746786 4794 generic.go:334] "Generic (PLEG): container finished" podID="11028118-385a-4a2a-8bc4-49aad67ce147" containerID="31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec" exitCode=1 Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.746835 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpdth" event={"ID":"11028118-385a-4a2a-8bc4-49aad67ce147","Type":"ContainerDied","Data":"31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec"} Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.747480 4794 scope.go:117] "RemoveContainer" containerID="31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.773926 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.794104 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.809762 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.828095 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.844645 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 
09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.868306 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.892798 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.906061 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.906224 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:43.906198536 +0000 UTC m=+212.662369404 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.906268 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.906339 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.906423 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.906454 4794 configmap.go:193] 
Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.906484 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.906505 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.906518 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.906502 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.906567 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:47:43.906535867 +0000 UTC m=+212.662706735 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.906652 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.906690 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.906710 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.906728 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.906934 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-10 09:47:43.906909409 +0000 UTC m=+212.663080297 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.906987 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:47:43.906971541 +0000 UTC m=+212.663142399 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.907015 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:47:43.907001772 +0000 UTC m=+212.663172630 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.917267 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.938518 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:39Z\\\",\\\"message\\\":\\\"2026-03-10T09:45:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5c90214b-8170-4311-8f5b-bb523456972d\\\\n2026-03-10T09:45:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5c90214b-8170-4311-8f5b-bb523456972d to /host/opt/cni/bin/\\\\n2026-03-10T09:45:54Z [verbose] multus-daemon started\\\\n2026-03-10T09:45:54Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:46:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.964650 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.979987 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.996574 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.998796 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.998877 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.998940 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.999041 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:39 crc kubenswrapper[4794]: I0310 09:46:39.999172 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:39 crc kubenswrapper[4794]: E0310 09:46:39.999284 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.014833 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.031733 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.055554 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: 
I0310 09:46:40.072705 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.106047 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad3
28aaa24ec0faa047536068dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:14Z\\\",\\\"message\\\":\\\" handler 7\\\\nI0310 09:46:14.916611 7063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:46:14.916631 7063 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:14.916647 7063 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:14.916690 7063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:46:14.916708 7063 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916737 7063 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:14.916775 7063 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:46:14.916785 7063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:46:14.916793 7063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:46:14.916814 7063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916819 7063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:46:14.916818 7063 factory.go:656] Stopping watch factory\\\\nI0310 09:46:14.916836 7063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:46:14.916845 7063 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:46:14.916890 7063 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:46:14.916957 7063 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.127162 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37e93748-d1b9-4fcb-8dd2-3e4af26b9498\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875e6fa3d8ad5a7c381c1bf3efd51ccfa1202dd3fd9b30bc3e05fac7eb9fc3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7c9a20b63dc51728211c782f530307eafabeb7f97e5bc87bcdacd29a91658e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 09:44:43.522265 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 09:44:43.523853 1 observer_polling.go:159] Starting file observer\\\\nI0310 09:44:43.529139 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 09:44:43.530568 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 09:45:12.957570 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 09:45:13.082840 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 09:45:13.082951 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e8a850bc5db9bf7a7e73a9a71933744f371c73ceba41cd04d0b175614309ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c05c05697162298be5d52e9bf3012e05344a077cc3cf6a53839cee75b671a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aac675904ab2ca1d18eed0251bb1dafa297d7819e95189e0ea6288b164cae86\\\",\\\"image
\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.153222 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.449270 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.449394 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.449421 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.449545 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.449635 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:40Z","lastTransitionTime":"2026-03-10T09:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:40 crc kubenswrapper[4794]: E0310 09:46:40.468814 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.473060 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.473123 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.473141 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.473165 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.473184 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:40Z","lastTransitionTime":"2026-03-10T09:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:40 crc kubenswrapper[4794]: E0310 09:46:40.490732 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.495561 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.495632 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.495676 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.495705 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.495733 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:40Z","lastTransitionTime":"2026-03-10T09:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:40 crc kubenswrapper[4794]: E0310 09:46:40.518370 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.522883 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.522918 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.522931 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.522945 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.522955 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:40Z","lastTransitionTime":"2026-03-10T09:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:40 crc kubenswrapper[4794]: E0310 09:46:40.533545 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.537157 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.537204 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.537217 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.537234 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.537248 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:40Z","lastTransitionTime":"2026-03-10T09:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:40 crc kubenswrapper[4794]: E0310 09:46:40.551908 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: E0310 09:46:40.552049 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.755064 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-jpdth_11028118-385a-4a2a-8bc4-49aad67ce147/kube-multus/0.log" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.755148 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpdth" event={"ID":"11028118-385a-4a2a-8bc4-49aad67ce147","Type":"ContainerStarted","Data":"7132e8357d430e33f2f8e8172b67b9aba8deac7f2725d5222c6ccece6e4589d1"} Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.779290 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e
6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.797063 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.818679 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.839231 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7132e8357d430e33f2f8e8172b67b9aba8deac7f2725d5222c6ccece6e4589d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:39Z\\\",\\\"message\\\":\\\"2026-03-10T09:45:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5c90214b-8170-4311-8f5b-bb523456972d\\\\n2026-03-10T09:45:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5c90214b-8170-4311-8f5b-bb523456972d to /host/opt/cni/bin/\\\\n2026-03-10T09:45:54Z [verbose] multus-daemon started\\\\n2026-03-10T09:45:54Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:46:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.857209 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.874699 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.891574 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.913312 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37e93748-d1b9-4fcb-8dd2-3e4af26b9498\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875e6fa3d8ad5a7c381c1bf3efd51ccfa1202dd3fd9b30bc3e05fac7eb9fc3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7c9a20b63dc51728211c782f530307eafabeb7f97e5bc87bcdacd29a91658e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 09:44:43.522265 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is 26s.\\\\nI0310 09:44:43.523853 1 observer_polling.go:159] Starting file observer\\\\nI0310 09:44:43.529139 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 09:44:43.530568 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 09:45:12.957570 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 09:45:13.082840 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 09:45:13.082951 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e8a850bc5db9bf7a7e73a9a71933744f371c73ceba41cd04d0b175614309ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c05c05697162298be5d52e9bf3012e05344a077cc3cf6a53839cee75b671a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aac675904ab2ca1d18eed0251bb1dafa297d7819e95189e0ea6288b164cae86\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-con
troller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.946118 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.968295 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.986010 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:40 crc kubenswrapper[4794]: I0310 09:46:40.999038 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:41 crc kubenswrapper[4794]: E0310 09:46:40.999259 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:41 crc kubenswrapper[4794]: I0310 09:46:41.010926 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257
453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:14Z\\\",\\\"message\\\":\\\" handler 7\\\\nI0310 09:46:14.916611 7063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:46:14.916631 7063 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:14.916647 7063 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:14.916690 7063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:46:14.916708 7063 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916737 7063 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:14.916775 7063 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:46:14.916785 7063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:46:14.916793 7063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:46:14.916814 7063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916819 7063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:46:14.916818 7063 factory.go:656] Stopping watch factory\\\\nI0310 09:46:14.916836 7063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:46:14.916845 7063 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:46:14.916890 7063 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:46:14.916957 7063 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed 
container=ovnkube-controller pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:41 crc kubenswrapper[4794]: I0310 09:46:41.028736 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:41 crc kubenswrapper[4794]: I0310 09:46:41.047231 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:41 crc kubenswrapper[4794]: I0310 09:46:41.064823 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:41 crc kubenswrapper[4794]: I0310 09:46:41.080376 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:41 crc kubenswrapper[4794]: I0310 09:46:41.096803 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:41 crc kubenswrapper[4794]: I0310 09:46:41.111340 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:41 crc kubenswrapper[4794]: I0310 09:46:41.127537 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 
09:46:41 crc kubenswrapper[4794]: I0310 09:46:41.998136 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:41 crc kubenswrapper[4794]: I0310 09:46:41.998218 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:41 crc kubenswrapper[4794]: I0310 09:46:41.998245 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:41 crc kubenswrapper[4794]: E0310 09:46:41.998436 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:41 crc kubenswrapper[4794]: E0310 09:46:41.998697 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:41 crc kubenswrapper[4794]: E0310 09:46:41.999148 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:41.999532 4794 scope.go:117] "RemoveContainer" containerID="ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.020377 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.043155 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 
2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.065405 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7132e8357d430e33f2f8e8172b67b9aba8deac7f2725d5222c6ccece6e4589d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:39Z\\\",\\\"message\\\":\\\"2026-03-10T09:45:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5c90214b-8170-4311-8f5b-bb523456972d\\\\n2026-03-10T09:45:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5c90214b-8170-4311-8f5b-bb523456972d to /host/opt/cni/bin/\\\\n2026-03-10T09:45:54Z [verbose] multus-daemon started\\\\n2026-03-10T09:45:54Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:46:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: E0310 09:46:42.077831 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.089586 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.111823 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.128449 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.150052 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.174443 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25f
c2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.191609 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.207525 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.232490 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:14Z\\\",\\\"message\\\":\\\" handler 7\\\\nI0310 09:46:14.916611 7063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:46:14.916631 7063 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:14.916647 7063 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:14.916690 7063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:46:14.916708 7063 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916737 7063 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:14.916775 7063 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:46:14.916785 7063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:46:14.916793 7063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:46:14.916814 7063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916819 7063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:46:14.916818 7063 factory.go:656] Stopping watch factory\\\\nI0310 09:46:14.916836 7063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:46:14.916845 7063 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:46:14.916890 7063 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:46:14.916957 7063 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.247550 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37e93748-d1b9-4fcb-8dd2-3e4af26b9498\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875e6fa3d8ad5a7c381c1bf3efd51ccfa1202dd3fd9b30bc3e05fac7eb9fc3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7c9a20b63dc51728211c782f530307eafabeb7f97e5bc87bcdacd29a91658e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 09:44:43.522265 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 09:44:43.523853 1 observer_polling.go:159] Starting file observer\\\\nI0310 09:44:43.529139 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 09:44:43.530568 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 09:45:12.957570 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 09:45:13.082840 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 09:45:13.082951 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e8a850bc5db9bf7a7e73a9a71933744f371c73ceba41cd04d0b175614309ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c05c05697162298be5d52e9bf3012e05344a077cc3cf6a53839cee75b671a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aac675904ab2ca1d18eed0251bb1dafa297d7819e95189e0ea6288b164cae86\\\",\\\"image
\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.266592 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.282104 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.294172 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.304221 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.314447 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.329892 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 
09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.343406 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.765654 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/2.log" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.768805 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerStarted","Data":"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58"} Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.769312 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.786150 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.797057 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.806756 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.818918 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.839561 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db67319402db24bdefe093e2db9fa12880350420
8bed6a528ba39f3dc8996c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:14Z\\\",\\\"message\\\":\\\" handler 7\\\\nI0310 09:46:14.916611 7063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:46:14.916631 7063 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:14.916647 7063 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:14.916690 7063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:46:14.916708 7063 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916737 7063 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:14.916775 7063 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:46:14.916785 7063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:46:14.916793 7063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:46:14.916814 7063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916819 7063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:46:14.916818 7063 factory.go:656] Stopping watch factory\\\\nI0310 09:46:14.916836 7063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:46:14.916845 7063 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:46:14.916890 7063 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:46:14.916957 7063 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.855801 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37e93748-d1b9-4fcb-8dd2-3e4af26b9498\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875e6fa3d8ad5a7c381c1bf3efd51ccfa1202dd3fd9b30bc3e05fac7eb9fc3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7c9a20b63dc51728211c782f530307eafabeb7f97e5bc87bcdacd29a91658e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec 
cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 09:44:43.522265 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 09:44:43.523853 1 observer_polling.go:159] Starting file observer\\\\nI0310 09:44:43.529139 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 09:44:43.530568 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 09:45:12.957570 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 09:45:13.082840 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 09:45:13.082951 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e8a850bc5db9bf7a7e73a9a71933744f371c73ceba41cd04d0b175614309ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c05c05697162298be5d52e9bf3012e05344a077cc3cf6a53839cee75b671a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\
\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aac675904ab2ca1d18eed0251bb1dafa297d7819e95189e0ea6288b164cae86\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.879175 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25fc2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cb
b7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.892926 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.903915 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.913984 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.924612 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.936986 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 
09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.949759 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.963009 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.972502 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.984800 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7132e8357d430e33f2f8e8172b67b9aba8deac7f2725d5222c6ccece6e4589d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:39Z\\\",\\\"message\\\":\\\"2026-03-10T09:45:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5c90214b-8170-4311-8f5b-bb523456972d\\\\n2026-03-10T09:45:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5c90214b-8170-4311-8f5b-bb523456972d to /host/opt/cni/bin/\\\\n2026-03-10T09:45:54Z [verbose] multus-daemon started\\\\n2026-03-10T09:45:54Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:46:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:42 crc kubenswrapper[4794]: I0310 09:46:42.998073 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:42 crc kubenswrapper[4794]: E0310 09:46:42.998257 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.001998 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:43Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.012061 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:43Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.027966 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:43Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.774828 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/3.log" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.775954 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/2.log" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.779623 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerID="db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58" exitCode=1 Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.779678 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerDied","Data":"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58"} Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.779778 4794 scope.go:117] "RemoveContainer" containerID="ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.780605 4794 scope.go:117] "RemoveContainer" containerID="db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58" Mar 10 09:46:43 crc kubenswrapper[4794]: E0310 09:46:43.780875 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.800967 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19
a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:43Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.825943 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f7142
5c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:43Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.848501 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7132e8357d430e33f2f8e8172b67b9aba8deac7f2725d5222c6ccece6e4589d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:39Z\\\",\\\"message\\\":\\\"2026-03-10T09:45:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5c90214b-8170-4311-8f5b-bb523456972d\\\\n2026-03-10T09:45:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5c90214b-8170-4311-8f5b-bb523456972d to /host/opt/cni/bin/\\\\n2026-03-10T09:45:54Z [verbose] multus-daemon started\\\\n2026-03-10T09:45:54Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:46:39Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:43Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.877190 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:43Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.893936 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:43Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.910000 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:43Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.926944 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:43Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.948922 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25f
c2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:43Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.964223 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:43Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.980427 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:43Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.998620 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:43 crc kubenswrapper[4794]: E0310 09:46:43.998767 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.998976 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:43 crc kubenswrapper[4794]: E0310 09:46:43.999045 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:43 crc kubenswrapper[4794]: I0310 09:46:43.999158 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:43 crc kubenswrapper[4794]: E0310 09:46:43.999283 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.009099 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db67319402db24bdefe093e2db9fa12880350420
8bed6a528ba39f3dc8996c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d9a5f2346b5bb8356e359d723fd5fdfa5ad328aaa24ec0faa047536068dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:14Z\\\",\\\"message\\\":\\\" handler 7\\\\nI0310 09:46:14.916611 7063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:46:14.916631 7063 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 09:46:14.916647 7063 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 09:46:14.916690 7063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:46:14.916708 7063 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916737 7063 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:46:14.916775 7063 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:46:14.916785 7063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:46:14.916793 7063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:46:14.916814 7063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:46:14.916819 7063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:46:14.916818 7063 factory.go:656] Stopping watch factory\\\\nI0310 09:46:14.916836 7063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:46:14.916845 7063 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:46:14.916890 7063 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:46:14.916957 7063 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:42Z\\\",\\\"message\\\":\\\" failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:46:42.915586 7366 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-zr89w\\\\nI0310 09:46:42.915584 7366 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-69278\\\\nI0310 09:46:42.915592 7366 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-zr89w in node crc\\\\nI0310 09:46:42.915595 7366 
services_controller.go:356] Processing sync for service openshift-kube-controller-manager/kube-controller-manager for network=default\\\\nI0310 09:46:42.915493 7366 obj_retry.go:386] Retry su\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.022926 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37e93748-d1b9-4fcb-8dd2-3e4af26b9498\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875e6fa3d8ad5a7c381c1bf3efd51ccfa1202dd3fd9b30bc3e05fac7eb9fc3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7c9a20b63dc51728211c782f530307eafabeb7f97e5bc87bcdacd29a91658e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec 
cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 09:44:43.522265 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 09:44:43.523853 1 observer_polling.go:159] Starting file observer\\\\nI0310 09:44:43.529139 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 09:44:43.530568 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 09:45:12.957570 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 09:45:13.082840 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 09:45:13.082951 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e8a850bc5db9bf7a7e73a9a71933744f371c73ceba41cd04d0b175614309ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c05c05697162298be5d52e9bf3012e05344a077cc3cf6a53839cee75b671a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\
\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aac675904ab2ca1d18eed0251bb1dafa297d7819e95189e0ea6288b164cae86\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.035526 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.050590 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.065990 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.075904 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.087951 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.101776 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 
09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.115282 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.786307 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/3.log" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.793129 4794 scope.go:117] "RemoveContainer" containerID="db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58" Mar 10 09:46:44 crc kubenswrapper[4794]: E0310 09:46:44.793709 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.810940 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.827579 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.846826 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.883153 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25f
c2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.906393 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.922521 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.949998 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:42Z\\\",\\\"message\\\":\\\" failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:46:42.915586 7366 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-zr89w\\\\nI0310 09:46:42.915584 7366 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-69278\\\\nI0310 09:46:42.915592 7366 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-zr89w in node crc\\\\nI0310 09:46:42.915595 7366 services_controller.go:356] Processing sync for service openshift-kube-controller-manager/kube-controller-manager for network=default\\\\nI0310 09:46:42.915493 7366 obj_retry.go:386] Retry su\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.969706 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37e93748-d1b9-4fcb-8dd2-3e4af26b9498\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875e6fa3d8ad5a7c381c1bf3efd51ccfa1202dd3fd9b30bc3e05fac7eb9fc3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7c9a20b63dc51728211c782f530307eafabeb7f97e5bc87bcdacd29a91658e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 09:44:43.522265 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 09:44:43.523853 1 observer_polling.go:159] Starting file observer\\\\nI0310 09:44:43.529139 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 09:44:43.530568 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 09:45:12.957570 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 09:45:13.082840 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 09:45:13.082951 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e8a850bc5db9bf7a7e73a9a71933744f371c73ceba41cd04d0b175614309ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c05c05697162298be5d52e9bf3012e05344a077cc3cf6a53839cee75b671a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aac675904ab2ca1d18eed0251bb1dafa297d7819e95189e0ea6288b164cae86\\\",\\\"image
\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.990092 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:44 crc kubenswrapper[4794]: I0310 09:46:44.998788 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:44 crc kubenswrapper[4794]: E0310 09:46:44.999049 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:45 crc kubenswrapper[4794]: I0310 09:46:45.006639 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:45Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:45 crc kubenswrapper[4794]: I0310 09:46:45.024803 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:45Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:45 crc kubenswrapper[4794]: I0310 09:46:45.041183 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:45Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:45 crc kubenswrapper[4794]: I0310 09:46:45.056896 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:45Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:45 crc kubenswrapper[4794]: I0310 09:46:45.071160 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:45Z is after 2025-08-24T17:21:41Z" Mar 10 
09:46:45 crc kubenswrapper[4794]: I0310 09:46:45.088573 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:45Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:45 crc kubenswrapper[4794]: I0310 09:46:45.104314 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:45Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:45 crc kubenswrapper[4794]: I0310 09:46:45.127662 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{
\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"con
tainerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:45Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:45 crc kubenswrapper[4794]: I0310 09:46:45.149005 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7132e8357d430e33f2f8e8172b67b9aba8deac7f2725d5222c6ccece6e4589d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:39Z\\\",\\\"message\\\":\\\"2026-03-10T09:45:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5c90214b-8170-4311-8f5b-bb523456972d\\\\n2026-03-10T09:45:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5c90214b-8170-4311-8f5b-bb523456972d to /host/opt/cni/bin/\\\\n2026-03-10T09:45:54Z [verbose] multus-daemon started\\\\n2026-03-10T09:45:54Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:46:39Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:45Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:45 crc kubenswrapper[4794]: I0310 09:46:45.171732 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:45Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:45 crc kubenswrapper[4794]: I0310 09:46:45.998409 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:45 crc kubenswrapper[4794]: I0310 09:46:45.998527 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:45 crc kubenswrapper[4794]: I0310 09:46:45.998670 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:45 crc kubenswrapper[4794]: E0310 09:46:45.998814 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:45 crc kubenswrapper[4794]: E0310 09:46:45.998974 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:45 crc kubenswrapper[4794]: E0310 09:46:45.999129 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:46 crc kubenswrapper[4794]: I0310 09:46:46.998611 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:46 crc kubenswrapper[4794]: E0310 09:46:46.998853 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:47 crc kubenswrapper[4794]: E0310 09:46:47.079298 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:46:47 crc kubenswrapper[4794]: I0310 09:46:47.998846 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:47 crc kubenswrapper[4794]: I0310 09:46:47.998916 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:47 crc kubenswrapper[4794]: E0310 09:46:47.999000 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:47 crc kubenswrapper[4794]: I0310 09:46:47.998847 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:47 crc kubenswrapper[4794]: E0310 09:46:47.999162 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:47 crc kubenswrapper[4794]: E0310 09:46:47.999226 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:48 crc kubenswrapper[4794]: I0310 09:46:48.998592 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:48 crc kubenswrapper[4794]: E0310 09:46:48.998735 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:49 crc kubenswrapper[4794]: I0310 09:46:49.998657 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:49 crc kubenswrapper[4794]: I0310 09:46:49.998748 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:49 crc kubenswrapper[4794]: I0310 09:46:49.998832 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:49 crc kubenswrapper[4794]: E0310 09:46:49.999025 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:49 crc kubenswrapper[4794]: E0310 09:46:49.999142 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:49 crc kubenswrapper[4794]: E0310 09:46:49.999291 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.813843 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.813901 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.813919 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.813941 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.813957 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:50Z","lastTransitionTime":"2026-03-10T09:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:50 crc kubenswrapper[4794]: E0310 09:46:50.831887 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:50Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.834852 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.834914 4794 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.834927 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.834942 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.834950 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:50Z","lastTransitionTime":"2026-03-10T09:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:50 crc kubenswrapper[4794]: E0310 09:46:50.847023 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:50Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.850777 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.850854 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.850898 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.850919 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.850932 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:50Z","lastTransitionTime":"2026-03-10T09:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:50 crc kubenswrapper[4794]: E0310 09:46:50.871135 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:50Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.875387 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.875476 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.875494 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.875552 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.875570 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:50Z","lastTransitionTime":"2026-03-10T09:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:50 crc kubenswrapper[4794]: E0310 09:46:50.894257 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:50Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.898493 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.898534 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.898549 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.898572 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.898588 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:46:50Z","lastTransitionTime":"2026-03-10T09:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:46:50 crc kubenswrapper[4794]: E0310 09:46:50.918498 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:50Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:50 crc kubenswrapper[4794]: E0310 09:46:50.918745 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:46:50 crc kubenswrapper[4794]: I0310 09:46:50.998040 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:50 crc kubenswrapper[4794]: E0310 09:46:50.998247 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:51 crc kubenswrapper[4794]: I0310 09:46:51.998532 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:51 crc kubenswrapper[4794]: E0310 09:46:51.998678 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:51 crc kubenswrapper[4794]: I0310 09:46:51.998783 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:51 crc kubenswrapper[4794]: I0310 09:46:51.998792 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:51 crc kubenswrapper[4794]: E0310 09:46:51.998931 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:51 crc kubenswrapper[4794]: E0310 09:46:51.999027 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.010090 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f008ec7-aeae-48e2-8ce6-2d24f9a74cb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ecb656c11edc3bfd23f7b6362d5ac8cad055186b134498b59e58b868ebd157c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff582c5b9b19a04841665aef50262d055e093a8e80200961a93e1e1ee0391b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 
2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.023244 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zr89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"749e1060-d177-43e4-9f39-d1b3e9b573b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80687a118cd90a6d4538b0de1757e2f3a17d7f2d5e2532d4a3c4396e838bdce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7b1089f71425c1e393f53f827c2bd0dd87615752c6c5655ef28b2346de21d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e448f6f31e1ada006f54b8381012841a94a4e0101fba5630d715f1ab89e842e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4d92d9cb37c7766aaabeee4dc2a40c043f0e8178dfe21b6ca9f91fa6313fed4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a10ffd91a8c26bf9a3affc09b1c698bcf344aebafadbb12e37e4cc0fb8fb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://080520ae745ae721854727f7b8ea882c3b1073ef487362681973f5befe15a79c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9e1da83ee6ac0c4f792e1ec3da1a150b5ee013e83dd2e1b0e86bf5b5fcc648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drqmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zr89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.041329 4794 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-jpdth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11028118-385a-4a2a-8bc4-49aad67ce147\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7132e8357d430e33f2f8e8172b67b9aba8deac7f2725d5222c6ccece6e4589d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:39Z\\\",\\\"message\\\":\\\"2026-03-10T09:45:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5c90214b-8170-4311-8f5b-bb523456972d\\\\n2026-03-10T09:45:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5c90214b-8170-4311-8f5b-bb523456972d to /host/opt/cni/bin/\\\\n2026-03-10T09:45:54Z [verbose] multus-daemon started\\\\n2026-03-10T09:45:54Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:46:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8v9tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpdth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.054444 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c04f5dfc-8cde-406b-9e01-2e529e0c0f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:45:21.579222 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:45:21.579381 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:45:21.580347 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2897458058/tls.crt::/tmp/serving-cert-2897458058/tls.key\\\\\\\"\\\\nI0310 09:45:21.843230 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:45:21.845521 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:45:21.845546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:45:21.845567 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:45:21.845572 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:45:21.854911 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:45:21.854948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:45:21.854960 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:45:21.854966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 09:45:21.854970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:45:21.854974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 09:45:21.854995 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 09:45:21.858504 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.064903 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"071a79a8-a892-4d38-a255-2a19483b64aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f596b5be3f96e424f4c4f0c6f81241ec50ee64356cb1449294fa6790aa3f7755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-69278\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.079189 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jl52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"befc934b-d5ba-4fb4-afc6-97614b624ebc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbh5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jl52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: E0310 09:46:52.080084 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.100949 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.129987 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e0df489-afb8-4101-acd6-7274ee4ebddf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714aace000a0c1b32bd3c863ca1e38e9e44df606fcf3fddb6573eaf8faea4ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b38ec46bd15bd61a9748f309c0f8fd16d1ab485405a653f6974b6b37952f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b3bcab9c36a963edfb820142c1c917f0cc5d18a09dc3aa65b4539c2248fb866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55280e0d81cf78ca765eb09242df44355b2e25f
c2d3e5f88b78ef9f14a44ada8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ed250f61efe96b4fac3af4d8d8958a60df304df0120052a7a1ef2d716cc025d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1cbb7471e4c1bc60e6bff82ff47be1667d540e2f73c1e118371b1f887c6e22e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8b205e76d6211c7df0521baf694a871c2e747401dd85f694fe3b6712995540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fee74b6caaaf89e473d672c7699753699a594043f013d2e5106d232ad87d850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.146572 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e66c82b5c376bc659e9597f6714923192b2658db0530c1e20b73533bfb9079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.163427 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca44cdf468c33b025c75f0f104435e413e525fba85c006f98fa92ff2f2a4e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.193185 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6907de6-7eb7-440a-a101-f492ffa28e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:46:42Z\\\",\\\"message\\\":\\\" failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:42Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:46:42.915586 7366 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-zr89w\\\\nI0310 09:46:42.915584 7366 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-69278\\\\nI0310 09:46:42.915592 7366 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-zr89w in node crc\\\\nI0310 09:46:42.915595 7366 services_controller.go:356] Processing sync for service openshift-kube-controller-manager/kube-controller-manager for network=default\\\\nI0310 09:46:42.915493 7366 obj_retry.go:386] Retry su\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:46:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dhqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mm9nq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.210698 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37e93748-d1b9-4fcb-8dd2-3e4af26b9498\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875e6fa3d8ad5a7c381c1bf3efd51ccfa1202dd3fd9b30bc3e05fac7eb9fc3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7c9a20b63dc51728211c782f530307eafabeb7f97e5bc87bcdacd29a91658e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:45:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 09:44:43.522265 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 09:44:43.523853 1 observer_polling.go:159] Starting file observer\\\\nI0310 09:44:43.529139 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 09:44:43.530568 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 09:45:12.957570 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 09:45:13.082840 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 09:45:13.082951 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e8a850bc5db9bf7a7e73a9a71933744f371c73ceba41cd04d0b175614309ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c05c05697162298be5d52e9bf3012e05344a077cc3cf6a53839cee75b671a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aac675904ab2ca1d18eed0251bb1dafa297d7819e95189e0ea6288b164cae86\\\",\\\"image
\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.226151 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.240089 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.260140 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a99da5e73d983b28a12e4063cb3dcbef725b0797abd5b5beff2d066fb4b43d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://faad85d1ef96c3c23ba54d03d456195b33f24bed1a5fc64cec481108d087d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.271424 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dc7fw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"441d4601-197c-4325-84bf-cef005ae408b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2367448989b7eb7235c3277100a8d4945551beee942453ce2efe152e17d24ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxxtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dc7fw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.285351 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp22j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccf2d24b-7ac8-4da4-8629-a56d006f1292\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5c1e6650959d5e7fe835284d6684f8d165aaec2e2e5c6537668db196c0ba81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:45:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp22j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.299931 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99fa7c7-084a-43b1-acbf-5fbc67e1a66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aec2bb176e581a0b8ab9808299a341c97cb05dcb3bf8a44d65e65b642cecec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df90543899f07349ecc9cbe5282c9e766f2cf21e6a6da40920a77f6f40a9c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:46:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9j8d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 
09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.314955 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f31d53b-bd82-4075-97b0-bab5614c1182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cab1094176cbf7505724e0c9b08a4c294e6d85e4d5ad3b06535e0dbab270fbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb1eebc0d6deb9837da08cb3090cb0f25f2dc66b49afb402fd66f4e73784d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7350cf82749601985e69e238d1c8a61ffa7788dd01f1b91a34f1f7574d3c86ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab34114bba85670ea6caba8768b650dcbad9e92ca9034f0a21da9b355af87b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:44:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:44:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:46:52Z is after 2025-08-24T17:21:41Z" Mar 10 09:46:52 crc kubenswrapper[4794]: I0310 09:46:52.998901 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:52 crc kubenswrapper[4794]: E0310 09:46:52.999195 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:53 crc kubenswrapper[4794]: I0310 09:46:53.998257 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:53 crc kubenswrapper[4794]: I0310 09:46:53.998321 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:53 crc kubenswrapper[4794]: E0310 09:46:53.998531 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:53 crc kubenswrapper[4794]: I0310 09:46:53.998600 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:53 crc kubenswrapper[4794]: E0310 09:46:53.998836 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:54 crc kubenswrapper[4794]: E0310 09:46:53.998973 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:54 crc kubenswrapper[4794]: I0310 09:46:54.998064 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:54 crc kubenswrapper[4794]: E0310 09:46:54.998258 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:55 crc kubenswrapper[4794]: I0310 09:46:55.998756 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:55 crc kubenswrapper[4794]: I0310 09:46:55.999138 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:55 crc kubenswrapper[4794]: I0310 09:46:55.998600 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:55 crc kubenswrapper[4794]: E0310 09:46:55.999565 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:55 crc kubenswrapper[4794]: E0310 09:46:55.999700 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:55 crc kubenswrapper[4794]: E0310 09:46:55.999824 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:56 crc kubenswrapper[4794]: I0310 09:46:56.001144 4794 scope.go:117] "RemoveContainer" containerID="db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58" Mar 10 09:46:56 crc kubenswrapper[4794]: E0310 09:46:56.001444 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" Mar 10 09:46:56 crc kubenswrapper[4794]: I0310 09:46:56.998886 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:56 crc kubenswrapper[4794]: E0310 09:46:56.999494 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:57 crc kubenswrapper[4794]: E0310 09:46:57.081730 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:46:57 crc kubenswrapper[4794]: I0310 09:46:57.998390 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:57 crc kubenswrapper[4794]: I0310 09:46:57.998466 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:57 crc kubenswrapper[4794]: E0310 09:46:57.998959 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:57 crc kubenswrapper[4794]: I0310 09:46:57.998476 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:57 crc kubenswrapper[4794]: E0310 09:46:57.999104 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:57 crc kubenswrapper[4794]: E0310 09:46:57.999374 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:46:58 crc kubenswrapper[4794]: I0310 09:46:58.998895 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:46:58 crc kubenswrapper[4794]: E0310 09:46:58.999077 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:46:59 crc kubenswrapper[4794]: I0310 09:46:59.998779 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:46:59 crc kubenswrapper[4794]: I0310 09:46:59.998881 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:46:59 crc kubenswrapper[4794]: E0310 09:46:59.999040 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:46:59 crc kubenswrapper[4794]: I0310 09:46:59.999424 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:46:59 crc kubenswrapper[4794]: E0310 09:46:59.999562 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:46:59 crc kubenswrapper[4794]: E0310 09:46:59.999916 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:00 crc kubenswrapper[4794]: I0310 09:47:00.949358 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:47:00 crc kubenswrapper[4794]: I0310 09:47:00.949422 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:47:00 crc kubenswrapper[4794]: I0310 09:47:00.949442 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:47:00 crc kubenswrapper[4794]: I0310 09:47:00.949473 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:47:00 crc kubenswrapper[4794]: I0310 09:47:00.949507 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:47:00Z","lastTransitionTime":"2026-03-10T09:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:47:00 crc kubenswrapper[4794]: E0310 09:47:00.970400 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:47:00Z is after 
2025-08-24T17:21:41Z" Mar 10 09:47:00 crc kubenswrapper[4794]: I0310 09:47:00.981021 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:47:00 crc kubenswrapper[4794]: I0310 09:47:00.981367 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:47:00 crc kubenswrapper[4794]: I0310 09:47:00.981453 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:47:00 crc kubenswrapper[4794]: I0310 09:47:00.981537 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:47:00 crc kubenswrapper[4794]: I0310 09:47:00.981601 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:47:00Z","lastTransitionTime":"2026-03-10T09:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:47:00 crc kubenswrapper[4794]: E0310 09:47:00.996543 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:47:00Z is after 
2025-08-24T17:21:41Z" Mar 10 09:47:00 crc kubenswrapper[4794]: I0310 09:47:00.998471 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:00 crc kubenswrapper[4794]: E0310 09:47:00.999304 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.002232 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.002285 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.002321 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.002392 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.002416 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:47:01Z","lastTransitionTime":"2026-03-10T09:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:47:01 crc kubenswrapper[4794]: E0310 09:47:01.022501 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:47:01Z is after 
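From here the log settles into a loop: each status patch is rejected by the expired webhook certificate, and the Ready condition stays False because /etc/kubernetes/cni/net.d/ contains no CNI configuration file. A minimal sketch of the second check, assuming a Go toolchain and that it is run on the CRC node itself (this program is a hypothetical diagnostic added for illustration, not part of the captured log):

// cni_check.go - hypothetical diagnostic sketch, not part of the captured log.
// Lists /etc/kubernetes/cni/net.d/, the directory the kubelet reports as
// missing a CNI configuration file.
package main

import (
	"fmt"
	"os"
)

func main() {
	const dir = "/etc/kubernetes/cni/net.d/"
	entries, err := os.ReadDir(dir)
	if err != nil {
		// Directory unreadable or absent: consistent with NetworkPluginNotReady.
		fmt.Println("cannot read", dir, ":", err)
		return
	}
	if len(entries) == 0 {
		fmt.Println(dir, "is empty: matches the NetworkReady=false condition")
		return
	}
	for _, e := range entries {
		fmt.Println(e.Name()) // a *.conf or *.conflist here is what the kubelet is waiting for
	}
}

Once the network provider writes its configuration into that directory, the NetworkPluginNotReady error should clear on the kubelet's next sync.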
2025-08-24T17:21:41Z" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.027665 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.027731 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.027745 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.027767 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.027781 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:47:01Z","lastTransitionTime":"2026-03-10T09:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:47:01 crc kubenswrapper[4794]: E0310 09:47:01.045604 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:47:01Z is after 
2025-08-24T17:21:41Z" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.051296 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.051691 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.051746 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.051781 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.051808 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:47:01Z","lastTransitionTime":"2026-03-10T09:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:47:01 crc kubenswrapper[4794]: E0310 09:47:01.066454 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:47:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"30640994-851b-4f33-a3b0-5689f89c6242\\\",\\\"systemUUID\\\":\\\"970a308b-8f2f-4747-b542-8544494e7e13\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:47:01Z is after 
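Every retry above fails for the same reason the first attempt did: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2026-03-10. A minimal sketch for confirming the validity window the kubelet reports, again assuming it runs on the node itself (hypothetical diagnostic, not part of the captured log):

// cert_check.go - hypothetical diagnostic sketch, not part of the captured log.
// Dials the webhook endpoint named in the error and prints the serving
// certificate's validity window.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
)

func main() {
	// InsecureSkipVerify is deliberate: the goal is to inspect the expired
	// certificate, not to trust it.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%v notBefore=%v notAfter=%v\n", cert.Subject, cert.NotBefore, cert.NotAfter)
	}
}

A notAfter matching 2025-08-24T17:21:41Z would confirm this is purely a certificate-rotation problem: until the webhook's certificate is reissued, every node status patch will be rejected regardless of network state.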
2025-08-24T17:21:41Z" Mar 10 09:47:01 crc kubenswrapper[4794]: E0310 09:47:01.066666 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.998758 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:01 crc kubenswrapper[4794]: E0310 09:47:01.999043 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.999223 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:01 crc kubenswrapper[4794]: I0310 09:47:01.999415 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:02 crc kubenswrapper[4794]: E0310 09:47:02.000011 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:02 crc kubenswrapper[4794]: E0310 09:47:01.999536 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:47:02 crc kubenswrapper[4794]: I0310 09:47:02.055242 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.055208113 podStartE2EDuration="1m18.055208113s" podCreationTimestamp="2026-03-10 09:45:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:02.039649978 +0000 UTC m=+170.795820836" watchObservedRunningTime="2026-03-10 09:47:02.055208113 +0000 UTC m=+170.811378971" Mar 10 09:47:02 crc kubenswrapper[4794]: I0310 09:47:02.056107 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=78.056093792 podStartE2EDuration="1m18.056093792s" podCreationTimestamp="2026-03-10 09:45:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:02.055942717 +0000 UTC m=+170.812113615" watchObservedRunningTime="2026-03-10 09:47:02.056093792 +0000 UTC m=+170.812264650" Mar 10 09:47:02 crc kubenswrapper[4794]: E0310 09:47:02.082538 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:47:02 crc kubenswrapper[4794]: I0310 09:47:02.105131 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zr89w" podStartSLOduration=99.10511242 podStartE2EDuration="1m39.10511242s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:02.086569821 +0000 UTC m=+170.842740649" watchObservedRunningTime="2026-03-10 09:47:02.10511242 +0000 UTC m=+170.861283248" Mar 10 09:47:02 crc kubenswrapper[4794]: I0310 09:47:02.117577 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jpdth" podStartSLOduration=99.117559796 podStartE2EDuration="1m39.117559796s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:02.105821643 +0000 UTC m=+170.861992491" watchObservedRunningTime="2026-03-10 09:47:02.117559796 +0000 UTC m=+170.873730634" Mar 10 09:47:02 crc kubenswrapper[4794]: I0310 09:47:02.144608 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-69278" podStartSLOduration=99.144587656 podStartE2EDuration="1m39.144587656s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:02.130916421 +0000 UTC m=+170.887087249" watchObservedRunningTime="2026-03-10 09:47:02.144587656 +0000 UTC m=+170.900758494" Mar 10 09:47:02 crc kubenswrapper[4794]: I0310 09:47:02.220263 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podStartSLOduration=35.220241931 podStartE2EDuration="35.220241931s" podCreationTimestamp="2026-03-10 09:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:02.161979509 +0000 UTC m=+170.918150337" watchObservedRunningTime="2026-03-10 09:47:02.220241931 +0000 UTC m=+170.976412769" Mar 10 09:47:02 crc kubenswrapper[4794]: I0310 09:47:02.220889 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=65.220881512 podStartE2EDuration="1m5.220881512s" podCreationTimestamp="2026-03-10 09:45:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:02.219296501 +0000 UTC m=+170.975467329" watchObservedRunningTime="2026-03-10 09:47:02.220881512 +0000 UTC m=+170.977052350" Mar 10 09:47:02 crc kubenswrapper[4794]: I0310 09:47:02.288694 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gp22j" podStartSLOduration=99.288670208 podStartE2EDuration="1m39.288670208s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:02.288151281 +0000 UTC m=+171.044322109" watchObservedRunningTime="2026-03-10 09:47:02.288670208 +0000 UTC m=+171.044841046" Mar 10 09:47:02 crc kubenswrapper[4794]: I0310 09:47:02.306692 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9j8d2" podStartSLOduration=99.30667525 podStartE2EDuration="1m39.30667525s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:02.305977447 +0000 UTC m=+171.062148265" watchObservedRunningTime="2026-03-10 09:47:02.30667525 +0000 UTC m=+171.062846068" Mar 10 09:47:02 crc kubenswrapper[4794]: I0310 09:47:02.318423 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=51.318409213 podStartE2EDuration="51.318409213s" podCreationTimestamp="2026-03-10 09:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:02.31799636 +0000 UTC m=+171.074167178" watchObservedRunningTime="2026-03-10 09:47:02.318409213 +0000 UTC m=+171.074580031" Mar 10 09:47:02 crc kubenswrapper[4794]: I0310 09:47:02.371554 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dc7fw" podStartSLOduration=99.371532452 podStartE2EDuration="1m39.371532452s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:02.370664474 +0000 UTC m=+171.126835292" watchObservedRunningTime="2026-03-10 09:47:02.371532452 +0000 UTC m=+171.127703280" Mar 10 09:47:02 crc kubenswrapper[4794]: I0310 09:47:02.998053 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:02 crc kubenswrapper[4794]: E0310 09:47:02.998273 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:03 crc kubenswrapper[4794]: I0310 09:47:03.998255 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:03 crc kubenswrapper[4794]: I0310 09:47:03.998307 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:03 crc kubenswrapper[4794]: I0310 09:47:03.998348 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:03 crc kubenswrapper[4794]: E0310 09:47:03.998456 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:47:03 crc kubenswrapper[4794]: E0310 09:47:03.998610 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:47:03 crc kubenswrapper[4794]: E0310 09:47:03.998591 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:04 crc kubenswrapper[4794]: I0310 09:47:04.999026 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:05 crc kubenswrapper[4794]: E0310 09:47:04.999228 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:05 crc kubenswrapper[4794]: I0310 09:47:05.998114 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:05 crc kubenswrapper[4794]: E0310 09:47:05.998373 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:47:05 crc kubenswrapper[4794]: I0310 09:47:05.998405 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:05 crc kubenswrapper[4794]: I0310 09:47:05.998457 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:05 crc kubenswrapper[4794]: E0310 09:47:05.998509 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:05 crc kubenswrapper[4794]: E0310 09:47:05.998648 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:47:06 crc kubenswrapper[4794]: I0310 09:47:06.998784 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:06 crc kubenswrapper[4794]: E0310 09:47:06.999095 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:07 crc kubenswrapper[4794]: E0310 09:47:07.083993 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:47:07 crc kubenswrapper[4794]: I0310 09:47:07.998962 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:07 crc kubenswrapper[4794]: I0310 09:47:07.999115 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:08 crc kubenswrapper[4794]: I0310 09:47:07.999288 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:08 crc kubenswrapper[4794]: E0310 09:47:07.999445 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:47:08 crc kubenswrapper[4794]: E0310 09:47:07.999518 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:47:08 crc kubenswrapper[4794]: E0310 09:47:07.999719 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:08 crc kubenswrapper[4794]: I0310 09:47:08.998018 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:08 crc kubenswrapper[4794]: E0310 09:47:08.998167 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:09 crc kubenswrapper[4794]: I0310 09:47:09.733726 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs\") pod \"network-metrics-daemon-jl52w\" (UID: \"befc934b-d5ba-4fb4-afc6-97614b624ebc\") " pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:09 crc kubenswrapper[4794]: E0310 09:47:09.733838 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:47:09 crc kubenswrapper[4794]: E0310 09:47:09.733893 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs podName:befc934b-d5ba-4fb4-afc6-97614b624ebc nodeName:}" failed. No retries permitted until 2026-03-10 09:48:13.733877592 +0000 UTC m=+242.490048420 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs") pod "network-metrics-daemon-jl52w" (UID: "befc934b-d5ba-4fb4-afc6-97614b624ebc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:47:09 crc kubenswrapper[4794]: I0310 09:47:09.998539 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:09 crc kubenswrapper[4794]: I0310 09:47:09.998599 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:09 crc kubenswrapper[4794]: I0310 09:47:09.998560 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:09 crc kubenswrapper[4794]: E0310 09:47:09.998727 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:47:09 crc kubenswrapper[4794]: E0310 09:47:09.999301 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:47:09 crc kubenswrapper[4794]: E0310 09:47:09.999476 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:10 crc kubenswrapper[4794]: I0310 09:47:09.999968 4794 scope.go:117] "RemoveContainer" containerID="db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58" Mar 10 09:47:10 crc kubenswrapper[4794]: E0310 09:47:10.000760 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mm9nq_openshift-ovn-kubernetes(d6907de6-7eb7-440a-a101-f492ffa28e39)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" Mar 10 09:47:10 crc kubenswrapper[4794]: I0310 09:47:10.998211 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:10 crc kubenswrapper[4794]: E0310 09:47:10.998494 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.133675 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.133771 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.133797 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.133829 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.133852 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:47:11Z","lastTransitionTime":"2026-03-10T09:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.194157 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z"] Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.194897 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.199564 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.200091 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.200300 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.201481 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.245714 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/becfb047-8496-4c5a-9f65-02721f50e13d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.245781 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/becfb047-8496-4c5a-9f65-02721f50e13d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.245836 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/becfb047-8496-4c5a-9f65-02721f50e13d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.245867 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/becfb047-8496-4c5a-9f65-02721f50e13d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.245912 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/becfb047-8496-4c5a-9f65-02721f50e13d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.347183 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/becfb047-8496-4c5a-9f65-02721f50e13d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.347246 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/becfb047-8496-4c5a-9f65-02721f50e13d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.347292 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/becfb047-8496-4c5a-9f65-02721f50e13d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.347326 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/becfb047-8496-4c5a-9f65-02721f50e13d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.347438 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/becfb047-8496-4c5a-9f65-02721f50e13d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.347550 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/becfb047-8496-4c5a-9f65-02721f50e13d-etc-ssl-certs\") 
pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.348039 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/becfb047-8496-4c5a-9f65-02721f50e13d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.349977 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/becfb047-8496-4c5a-9f65-02721f50e13d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.358164 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/becfb047-8496-4c5a-9f65-02721f50e13d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.381181 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/becfb047-8496-4c5a-9f65-02721f50e13d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vft7z\" (UID: \"becfb047-8496-4c5a-9f65-02721f50e13d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.520423 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" Mar 10 09:47:11 crc kubenswrapper[4794]: W0310 09:47:11.538048 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbecfb047_8496_4c5a_9f65_02721f50e13d.slice/crio-b2e1f58cdf28b80a2162e70a1dc870c42eaf477ea91e61235e4fc748134d26aa WatchSource:0}: Error finding container b2e1f58cdf28b80a2162e70a1dc870c42eaf477ea91e61235e4fc748134d26aa: Status 404 returned error can't find the container with id b2e1f58cdf28b80a2162e70a1dc870c42eaf477ea91e61235e4fc748134d26aa Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.890652 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" event={"ID":"becfb047-8496-4c5a-9f65-02721f50e13d","Type":"ContainerStarted","Data":"7c8eeb8db2cf844d0534063ae71be699bc429a7e2e6912c99b2be7b3fdbd1abd"} Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.890739 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" event={"ID":"becfb047-8496-4c5a-9f65-02721f50e13d","Type":"ContainerStarted","Data":"b2e1f58cdf28b80a2162e70a1dc870c42eaf477ea91e61235e4fc748134d26aa"} Mar 10 09:47:11 crc kubenswrapper[4794]: I0310 09:47:11.907520 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vft7z" podStartSLOduration=108.90749961 podStartE2EDuration="1m48.90749961s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:11.906797248 +0000 UTC m=+180.662968086" watchObservedRunningTime="2026-03-10 09:47:11.90749961 +0000 UTC m=+180.663670428" Mar 10 09:47:12 crc kubenswrapper[4794]: I0310 09:47:12.000706 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:12 crc kubenswrapper[4794]: E0310 09:47:12.000978 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:47:12 crc kubenswrapper[4794]: I0310 09:47:12.001364 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:12 crc kubenswrapper[4794]: E0310 09:47:12.001495 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:47:12 crc kubenswrapper[4794]: I0310 09:47:12.001651 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:12 crc kubenswrapper[4794]: E0310 09:47:12.001775 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:12 crc kubenswrapper[4794]: I0310 09:47:12.036580 4794 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 10 09:47:12 crc kubenswrapper[4794]: I0310 09:47:12.054154 4794 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 09:47:12 crc kubenswrapper[4794]: E0310 09:47:12.084631 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:47:12 crc kubenswrapper[4794]: I0310 09:47:12.999157 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:12 crc kubenswrapper[4794]: E0310 09:47:12.999348 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:13 crc kubenswrapper[4794]: I0310 09:47:13.999116 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:13 crc kubenswrapper[4794]: E0310 09:47:13.999320 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:47:13 crc kubenswrapper[4794]: I0310 09:47:13.999707 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:13 crc kubenswrapper[4794]: I0310 09:47:13.999724 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:13 crc kubenswrapper[4794]: E0310 09:47:13.999842 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:47:14 crc kubenswrapper[4794]: E0310 09:47:14.000025 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:14 crc kubenswrapper[4794]: I0310 09:47:14.998759 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:14 crc kubenswrapper[4794]: E0310 09:47:14.998919 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:15 crc kubenswrapper[4794]: I0310 09:47:15.998917 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:15 crc kubenswrapper[4794]: I0310 09:47:15.998935 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:16 crc kubenswrapper[4794]: E0310 09:47:15.999112 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:47:16 crc kubenswrapper[4794]: I0310 09:47:15.999128 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:16 crc kubenswrapper[4794]: E0310 09:47:15.999425 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:16 crc kubenswrapper[4794]: E0310 09:47:15.999516 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:47:16 crc kubenswrapper[4794]: I0310 09:47:16.999066 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:16 crc kubenswrapper[4794]: E0310 09:47:16.999212 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:17 crc kubenswrapper[4794]: E0310 09:47:17.086101 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:47:17 crc kubenswrapper[4794]: I0310 09:47:17.998999 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:17 crc kubenswrapper[4794]: I0310 09:47:17.999116 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:17 crc kubenswrapper[4794]: E0310 09:47:17.999218 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:47:17 crc kubenswrapper[4794]: I0310 09:47:17.999246 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:17 crc kubenswrapper[4794]: E0310 09:47:17.999381 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:47:17 crc kubenswrapper[4794]: E0310 09:47:17.999479 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:18 crc kubenswrapper[4794]: I0310 09:47:18.998302 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:18 crc kubenswrapper[4794]: E0310 09:47:18.998487 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:19 crc kubenswrapper[4794]: I0310 09:47:19.998886 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:19 crc kubenswrapper[4794]: I0310 09:47:19.998969 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:19 crc kubenswrapper[4794]: E0310 09:47:19.999393 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:47:19 crc kubenswrapper[4794]: E0310 09:47:19.999513 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:47:19 crc kubenswrapper[4794]: I0310 09:47:19.999053 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:19 crc kubenswrapper[4794]: E0310 09:47:19.999619 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:20 crc kubenswrapper[4794]: I0310 09:47:20.999000 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:20 crc kubenswrapper[4794]: E0310 09:47:20.999229 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:21 crc kubenswrapper[4794]: I0310 09:47:21.998824 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:21 crc kubenswrapper[4794]: I0310 09:47:21.998868 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:22 crc kubenswrapper[4794]: E0310 09:47:22.000856 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:22 crc kubenswrapper[4794]: I0310 09:47:22.000912 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:22 crc kubenswrapper[4794]: E0310 09:47:22.001007 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:47:22 crc kubenswrapper[4794]: E0310 09:47:22.001163 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:47:22 crc kubenswrapper[4794]: E0310 09:47:22.086725 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:47:22 crc kubenswrapper[4794]: I0310 09:47:22.998425 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:22 crc kubenswrapper[4794]: E0310 09:47:22.998611 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:23 crc kubenswrapper[4794]: I0310 09:47:23.998950 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:23 crc kubenswrapper[4794]: I0310 09:47:23.999036 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:23 crc kubenswrapper[4794]: E0310 09:47:23.999097 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:23 crc kubenswrapper[4794]: I0310 09:47:23.999201 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:23 crc kubenswrapper[4794]: E0310 09:47:23.999250 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:47:23 crc kubenswrapper[4794]: E0310 09:47:23.999412 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:47:24 crc kubenswrapper[4794]: I0310 09:47:24.999072 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:24 crc kubenswrapper[4794]: E0310 09:47:24.999298 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:25 crc kubenswrapper[4794]: I0310 09:47:25.000238 4794 scope.go:117] "RemoveContainer" containerID="db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58" Mar 10 09:47:25 crc kubenswrapper[4794]: I0310 09:47:25.811695 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jl52w"] Mar 10 09:47:25 crc kubenswrapper[4794]: I0310 09:47:25.937965 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpdth_11028118-385a-4a2a-8bc4-49aad67ce147/kube-multus/1.log" Mar 10 09:47:25 crc kubenswrapper[4794]: I0310 09:47:25.938699 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpdth_11028118-385a-4a2a-8bc4-49aad67ce147/kube-multus/0.log" Mar 10 09:47:25 crc kubenswrapper[4794]: I0310 09:47:25.938751 4794 generic.go:334] "Generic (PLEG): container finished" podID="11028118-385a-4a2a-8bc4-49aad67ce147" containerID="7132e8357d430e33f2f8e8172b67b9aba8deac7f2725d5222c6ccece6e4589d1" exitCode=1 Mar 10 09:47:25 crc kubenswrapper[4794]: I0310 09:47:25.938815 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpdth" event={"ID":"11028118-385a-4a2a-8bc4-49aad67ce147","Type":"ContainerDied","Data":"7132e8357d430e33f2f8e8172b67b9aba8deac7f2725d5222c6ccece6e4589d1"} Mar 10 09:47:25 crc kubenswrapper[4794]: I0310 09:47:25.938851 4794 scope.go:117] "RemoveContainer" containerID="31569151ef9bf456980ec9d56c85d54ea77739b3991e26844385936dfceb24ec" Mar 10 09:47:25 crc kubenswrapper[4794]: I0310 09:47:25.939480 4794 scope.go:117] "RemoveContainer" containerID="7132e8357d430e33f2f8e8172b67b9aba8deac7f2725d5222c6ccece6e4589d1" Mar 10 09:47:25 crc kubenswrapper[4794]: E0310 09:47:25.939885 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-jpdth_openshift-multus(11028118-385a-4a2a-8bc4-49aad67ce147)\"" pod="openshift-multus/multus-jpdth" podUID="11028118-385a-4a2a-8bc4-49aad67ce147" Mar 10 09:47:25 crc kubenswrapper[4794]: I0310 09:47:25.942929 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/3.log" Mar 10 09:47:25 crc kubenswrapper[4794]: I0310 09:47:25.947928 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:25 crc kubenswrapper[4794]: I0310 09:47:25.947949 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerStarted","Data":"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57"} Mar 10 09:47:25 crc kubenswrapper[4794]: E0310 09:47:25.948028 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:25 crc kubenswrapper[4794]: I0310 09:47:25.948594 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:47:26 crc kubenswrapper[4794]: I0310 09:47:26.001006 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:26 crc kubenswrapper[4794]: I0310 09:47:26.001039 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:26 crc kubenswrapper[4794]: I0310 09:47:26.001047 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:26 crc kubenswrapper[4794]: E0310 09:47:26.001154 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:47:26 crc kubenswrapper[4794]: E0310 09:47:26.001440 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:26 crc kubenswrapper[4794]: E0310 09:47:26.001553 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:47:26 crc kubenswrapper[4794]: I0310 09:47:26.004767 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podStartSLOduration=123.004754402 podStartE2EDuration="2m3.004754402s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:26.003968897 +0000 UTC m=+194.760139725" watchObservedRunningTime="2026-03-10 09:47:26.004754402 +0000 UTC m=+194.760925230" Mar 10 09:47:26 crc kubenswrapper[4794]: I0310 09:47:26.954477 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpdth_11028118-385a-4a2a-8bc4-49aad67ce147/kube-multus/1.log" Mar 10 09:47:27 crc kubenswrapper[4794]: E0310 09:47:27.088241 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:47:27 crc kubenswrapper[4794]: I0310 09:47:27.998183 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:47:27 crc kubenswrapper[4794]: I0310 09:47:27.998262 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w"
Mar 10 09:47:27 crc kubenswrapper[4794]: E0310 09:47:27.998388 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 09:47:27 crc kubenswrapper[4794]: I0310 09:47:27.998408 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:47:27 crc kubenswrapper[4794]: E0310 09:47:27.998588 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc"
Mar 10 09:47:27 crc kubenswrapper[4794]: I0310 09:47:27.998675 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:47:27 crc kubenswrapper[4794]: E0310 09:47:27.998717 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:47:27 crc kubenswrapper[4794]: E0310 09:47:27.998886 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 09:47:29 crc kubenswrapper[4794]: I0310 09:47:29.998682 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:47:29 crc kubenswrapper[4794]: I0310 09:47:29.998756 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:47:29 crc kubenswrapper[4794]: E0310 09:47:29.998867 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:47:29 crc kubenswrapper[4794]: I0310 09:47:29.998887 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w"
Mar 10 09:47:29 crc kubenswrapper[4794]: I0310 09:47:29.999138 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:47:29 crc kubenswrapper[4794]: E0310 09:47:29.999126 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 09:47:29 crc kubenswrapper[4794]: E0310 09:47:29.999253 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc"
Mar 10 09:47:29 crc kubenswrapper[4794]: E0310 09:47:29.999443 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 09:47:31 crc kubenswrapper[4794]: I0310 09:47:31.998539 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w"
Mar 10 09:47:31 crc kubenswrapper[4794]: I0310 09:47:31.998595 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:47:31 crc kubenswrapper[4794]: I0310 09:47:31.998604 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:47:31 crc kubenswrapper[4794]: E0310 09:47:31.999455 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc"
Mar 10 09:47:31 crc kubenswrapper[4794]: I0310 09:47:31.999479 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:47:31 crc kubenswrapper[4794]: E0310 09:47:31.999523 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 09:47:31 crc kubenswrapper[4794]: E0310 09:47:31.999574 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 09:47:31 crc kubenswrapper[4794]: E0310 09:47:31.999653 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:47:32 crc kubenswrapper[4794]: E0310 09:47:32.088787 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 10 09:47:33 crc kubenswrapper[4794]: I0310 09:47:33.998639 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w"
Mar 10 09:47:33 crc kubenswrapper[4794]: I0310 09:47:33.998718 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:47:33 crc kubenswrapper[4794]: E0310 09:47:33.998776 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc"
Mar 10 09:47:33 crc kubenswrapper[4794]: I0310 09:47:33.998646 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:47:33 crc kubenswrapper[4794]: I0310 09:47:33.998916 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:47:34 crc kubenswrapper[4794]: E0310 09:47:33.998838 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:47:34 crc kubenswrapper[4794]: E0310 09:47:33.999161 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 09:47:34 crc kubenswrapper[4794]: E0310 09:47:33.999256 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 09:47:35 crc kubenswrapper[4794]: I0310 09:47:35.998675 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:47:35 crc kubenswrapper[4794]: I0310 09:47:35.998945 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w"
Mar 10 09:47:36 crc kubenswrapper[4794]: I0310 09:47:35.998711 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:47:36 crc kubenswrapper[4794]: E0310 09:47:35.999011 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 09:47:36 crc kubenswrapper[4794]: I0310 09:47:35.998742 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:47:36 crc kubenswrapper[4794]: E0310 09:47:35.999205 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc"
Mar 10 09:47:36 crc kubenswrapper[4794]: E0310 09:47:35.999327 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 09:47:36 crc kubenswrapper[4794]: E0310 09:47:35.999424 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:47:37 crc kubenswrapper[4794]: E0310 09:47:37.090857 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 10 09:47:37 crc kubenswrapper[4794]: I0310 09:47:37.998746 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:47:37 crc kubenswrapper[4794]: I0310 09:47:37.998788 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:47:37 crc kubenswrapper[4794]: I0310 09:47:37.998815 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w"
Mar 10 09:47:37 crc kubenswrapper[4794]: E0310 09:47:37.998909 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 09:47:37 crc kubenswrapper[4794]: I0310 09:47:37.998958 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:47:37 crc kubenswrapper[4794]: E0310 09:47:37.999057 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc"
Mar 10 09:47:37 crc kubenswrapper[4794]: E0310 09:47:37.999187 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:47:37 crc kubenswrapper[4794]: E0310 09:47:37.999398 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 09:47:38 crc kubenswrapper[4794]: I0310 09:47:38.999307 4794 scope.go:117] "RemoveContainer" containerID="7132e8357d430e33f2f8e8172b67b9aba8deac7f2725d5222c6ccece6e4589d1"
Mar 10 09:47:39 crc kubenswrapper[4794]: I0310 09:47:39.999150 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w"
Mar 10 09:47:39 crc kubenswrapper[4794]: E0310 09:47:39.999669 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc"
Mar 10 09:47:40 crc kubenswrapper[4794]: I0310 09:47:40.001631 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:47:40 crc kubenswrapper[4794]: I0310 09:47:40.001690 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:47:40 crc kubenswrapper[4794]: I0310 09:47:40.001742 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:47:40 crc kubenswrapper[4794]: E0310 09:47:40.001866 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 09:47:40 crc kubenswrapper[4794]: E0310 09:47:40.002029 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 09:47:40 crc kubenswrapper[4794]: E0310 09:47:40.002296 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:47:40 crc kubenswrapper[4794]: I0310 09:47:40.003892 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpdth_11028118-385a-4a2a-8bc4-49aad67ce147/kube-multus/1.log"
Mar 10 09:47:40 crc kubenswrapper[4794]: I0310 09:47:40.004701 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpdth" event={"ID":"11028118-385a-4a2a-8bc4-49aad67ce147","Type":"ContainerStarted","Data":"3920ff9b5b2c0054283f96181e8561137507b6a3b05185c78e1f5bf2968a1845"}
Mar 10 09:47:41 crc kubenswrapper[4794]: I0310 09:47:41.998982 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:42 crc kubenswrapper[4794]: I0310 09:47:41.999143 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:42 crc kubenswrapper[4794]: I0310 09:47:42.000250 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:42 crc kubenswrapper[4794]: E0310 09:47:42.001632 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:47:42 crc kubenswrapper[4794]: E0310 09:47:42.001914 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:47:42 crc kubenswrapper[4794]: E0310 09:47:42.002186 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl52w" podUID="befc934b-d5ba-4fb4-afc6-97614b624ebc" Mar 10 09:47:42 crc kubenswrapper[4794]: E0310 09:47:42.002267 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:47:43 crc kubenswrapper[4794]: I0310 09:47:43.910722 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:43 crc kubenswrapper[4794]: E0310 09:47:43.910960 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:49:45.910923436 +0000 UTC m=+334.667094264 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:43 crc kubenswrapper[4794]: I0310 09:47:43.911082 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:43 crc kubenswrapper[4794]: I0310 09:47:43.911115 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:43 crc kubenswrapper[4794]: I0310 09:47:43.911138 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:43 crc kubenswrapper[4794]: I0310 09:47:43.911177 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:43 crc kubenswrapper[4794]: E0310 09:47:43.911249 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:47:43 crc kubenswrapper[4794]: E0310 09:47:43.911314 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:47:43 crc kubenswrapper[4794]: E0310 09:47:43.911462 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:47:43 crc kubenswrapper[4794]: E0310 09:47:43.911315 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:49:45.911300698 +0000 UTC m=+334.667471606 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:47:43 crc kubenswrapper[4794]: E0310 09:47:43.911482 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:47:43 crc kubenswrapper[4794]: E0310 09:47:43.911507 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:47:43 crc kubenswrapper[4794]: E0310 09:47:43.911516 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:49:45.911496464 +0000 UTC m=+334.667667282 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:47:43 crc kubenswrapper[4794]: E0310 09:47:43.911547 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:49:45.911535795 +0000 UTC m=+334.667706633 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:47:43 crc kubenswrapper[4794]: E0310 09:47:43.911644 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:47:43 crc kubenswrapper[4794]: E0310 09:47:43.911691 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:47:43 crc kubenswrapper[4794]: E0310 09:47:43.911711 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:47:43 crc kubenswrapper[4794]: E0310 09:47:43.911793 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:49:45.911769462 +0000 UTC m=+334.667940310 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:47:43 crc kubenswrapper[4794]: I0310 09:47:43.999062 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:47:43 crc kubenswrapper[4794]: I0310 09:47:43.999120 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:47:43 crc kubenswrapper[4794]: I0310 09:47:43.999062 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:47:43 crc kubenswrapper[4794]: I0310 09:47:43.999390 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:47:44 crc kubenswrapper[4794]: I0310 09:47:44.002541 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 09:47:44 crc kubenswrapper[4794]: I0310 09:47:44.002778 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 09:47:44 crc kubenswrapper[4794]: I0310 09:47:44.003313 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 09:47:44 crc kubenswrapper[4794]: I0310 09:47:44.003416 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 09:47:44 crc kubenswrapper[4794]: I0310 09:47:44.005188 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 09:47:44 crc kubenswrapper[4794]: I0310 09:47:44.005398 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.592091 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.640515 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n64xh"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.641192 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4jppl"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.642244 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.643004 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.644800 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.645382 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.645452 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.645816 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.646312 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.649632 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m2lbn"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.652222 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.654156 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.666077 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.678593 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.678976 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.679160 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.679236 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.679448 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.678594 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.679687 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.679450 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.678797 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.684569 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.684768 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.684864 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.684944 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.685035 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.685125 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.685131 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.685288 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 
09:47:51.685374 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.685405 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.685516 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.685827 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.686179 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.685290 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.686585 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.686743 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.686586 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.685379 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.686636 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.686650 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.686656 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.686838 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.686865 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.687643 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xtppb"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.687800 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.688223 4794 util.go:30] "No sandbox for pod can be found. 
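The "Caches populated" records at 09:47:44 are the resolution of the earlier "object ... not registered" mount errors: kubelet keeps a reflector-backed cache per ConfigMap/Secret referenced by its pods, and a volume SetUp can only proceed once the corresponding cache has synced. The NodeReady event at 09:47:51 then releases the flood of "SyncLoop ADD" pods that follows. A minimal client-go sketch of the same populate-then-read pattern (a hypothetical standalone program, not kubelet code; assumes a reachable kubeconfig):

    package main

    import (
    	"fmt"
    	"time"

    	"k8s.io/client-go/informers"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/cache"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
    	if err != nil {
    		panic(err)
    	}
    	clientset := kubernetes.NewForConfigOrDie(cfg)

    	// Watch only the namespace whose objects were "not registered" above.
    	factory := informers.NewSharedInformerFactoryWithOptions(
    		clientset, 10*time.Minute,
    		informers.WithNamespace("openshift-network-diagnostics"))
    	cmInformer := factory.Core().V1().ConfigMaps()
    	informer := cmInformer.Informer()

    	stop := make(chan struct{})
    	defer close(stop)
    	factory.Start(stop)

    	// The equivalent of waiting for "Caches populated": reads before
    	// this sync fail just like the kube-root-ca.crt lookups did.
    	if !cache.WaitForCacheSync(stop, informer.HasSynced) {
    		panic("cache never synced")
    	}

    	cm, err := cmInformer.Lister().ConfigMaps("openshift-network-diagnostics").
    		Get("kube-root-ca.crt")
    	if err != nil {
    		panic(err)
    	}
    	fmt.Println("cache populated, found:", cm.Name)
    }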
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.689756 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.693405 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.699524 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.699527 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.700361 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.709227 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.709487 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.709746 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.709932 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.710116 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.710241 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.710454 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.711075 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-etcd-serving-ca\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.712992 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bff3d354-0064-4a96-8945-51df3cd2d7e7-serving-cert\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713017 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-etcd-client\") pod 
\"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713040 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1d20393-8c02-48fd-83ad-eb270b721313-config\") pod \"machine-api-operator-5694c8668f-m2lbn\" (UID: \"a1d20393-8c02-48fd-83ad-eb270b721313\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713093 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bff3d354-0064-4a96-8945-51df3cd2d7e7-audit-dir\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713115 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1d20393-8c02-48fd-83ad-eb270b721313-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m2lbn\" (UID: \"a1d20393-8c02-48fd-83ad-eb270b721313\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713149 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02821716-8fb0-46bc-9c95-4c7ca46500b4-serving-cert\") pod \"route-controller-manager-6576b87f9c-jscb4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713173 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713195 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-config\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713218 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713239 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a1d20393-8c02-48fd-83ad-eb270b721313-images\") pod \"machine-api-operator-5694c8668f-m2lbn\" (UID: \"a1d20393-8c02-48fd-83ad-eb270b721313\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713293 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713315 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-audit-dir\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713373 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02821716-8fb0-46bc-9c95-4c7ca46500b4-client-ca\") pod \"route-controller-manager-6576b87f9c-jscb4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713395 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-image-import-ca\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713416 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhffj\" (UniqueName: \"kubernetes.io/projected/02821716-8fb0-46bc-9c95-4c7ca46500b4-kube-api-access-hhffj\") pod \"route-controller-manager-6576b87f9c-jscb4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713440 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdbvb\" (UniqueName: \"kubernetes.io/projected/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-kube-api-access-zdbvb\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713464 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-audit-policies\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713486 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-client-ca\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc 
kubenswrapper[4794]: I0310 09:47:51.713835 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgp2d\" (UniqueName: \"kubernetes.io/projected/bff3d354-0064-4a96-8945-51df3cd2d7e7-kube-api-access-tgp2d\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713892 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97qwh\" (UniqueName: \"kubernetes.io/projected/19ba7a1d-a381-49f2-aa2e-6463336559fe-kube-api-access-97qwh\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713915 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-config\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713956 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bff3d354-0064-4a96-8945-51df3cd2d7e7-etcd-client\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.713981 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02821716-8fb0-46bc-9c95-4c7ca46500b4-config\") pod \"route-controller-manager-6576b87f9c-jscb4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.714005 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bff3d354-0064-4a96-8945-51df3cd2d7e7-node-pullsecrets\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.714026 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bff3d354-0064-4a96-8945-51df3cd2d7e7-encryption-config\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.714053 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-audit\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.714081 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-serving-cert\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.714104 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-encryption-config\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.714136 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19ba7a1d-a381-49f2-aa2e-6463336559fe-serving-cert\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.714166 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.714186 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grm58\" (UniqueName: \"kubernetes.io/projected/a1d20393-8c02-48fd-83ad-eb270b721313-kube-api-access-grm58\") pod \"machine-api-operator-5694c8668f-m2lbn\" (UID: \"a1d20393-8c02-48fd-83ad-eb270b721313\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.710572 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.710296 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.710323 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.710266 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.714936 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n64xh"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.714990 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.715393 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-skqw6"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.715729 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.716150 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.716672 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n94r9"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.717026 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.719538 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.719964 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9vsxq"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.720313 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q8jhf"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.725031 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q8jhf" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.725659 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.725946 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9vsxq" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.729398 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zp5hg"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.729898 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.730251 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qnrc8"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.732615 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.733059 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.733184 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.733306 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.736871 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-txkm8"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.737171 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.737442 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qnrc8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.737702 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.748349 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.750742 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.750970 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.751102 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.751581 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.751886 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.751971 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.752090 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.752568 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.752655 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.752751 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.752885 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.753115 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.753314 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.753519 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.753680 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.753808 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.753925 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.754035 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.754162 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.754184 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.754282 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.754453 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.754486 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.754624 4794 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.758209 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.758365 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.758372 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.758283 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.758409 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.758477 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.758573 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.758661 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.758791 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.759267 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.762619 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.764808 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.765068 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.765249 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q62bc"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.765321 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.765519 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.765707 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.765918 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q62bc" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.765988 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.766161 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.766231 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.766419 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.769917 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.770133 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.770276 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.770452 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.770576 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.770614 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-tlkl5"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.770984 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.771104 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.771190 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.771831 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.772160 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.772450 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.773272 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.773699 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.773886 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.776294 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.776932 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.777359 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.777535 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.777888 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.778992 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.779444 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.785467 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.786261 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2tbqn"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.786494 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.786636 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-2tbqn" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.787551 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.788258 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.791367 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.791557 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.792235 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qk8nd"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.792768 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.792848 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-qk8nd" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.812664 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.813033 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.824036 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bff3d354-0064-4a96-8945-51df3cd2d7e7-etcd-client\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.825419 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.825883 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.826315 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.829155 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4f8f1872-3b59-424c-ad20-9332a4ffc4b2-tmpfs\") pod \"packageserver-d55dfcdfc-rjjct\" (UID: \"4f8f1872-3b59-424c-ad20-9332a4ffc4b2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.829265 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bff3d354-0064-4a96-8945-51df3cd2d7e7-encryption-config\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.829369 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4f8f1872-3b59-424c-ad20-9332a4ffc4b2-webhook-cert\") pod \"packageserver-d55dfcdfc-rjjct\" (UID: \"4f8f1872-3b59-424c-ad20-9332a4ffc4b2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.829807 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02821716-8fb0-46bc-9c95-4c7ca46500b4-config\") pod \"route-controller-manager-6576b87f9c-jscb4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.829863 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bff3d354-0064-4a96-8945-51df3cd2d7e7-node-pullsecrets\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.829890 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-audit\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.829915 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b70fe17f-97eb-4abf-b88e-0e24e5d01c48-serving-cert\") pod \"openshift-config-operator-7777fb866f-l4mhp\" (UID: \"b70fe17f-97eb-4abf-b88e-0e24e5d01c48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.829957 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/eb31e03a-1724-4272-bb5f-d0a7b6b80059-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-q62bc\" (UID: \"eb31e03a-1724-4272-bb5f-d0a7b6b80059\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q62bc" Mar 10 09:47:51 crc 
kubenswrapper[4794]: I0310 09:47:51.829984 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e0afe5a-17db-4dd6-b871-792a4e0f81a9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jnffm\" (UID: \"3e0afe5a-17db-4dd6-b871-792a4e0f81a9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830004 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr4s9\" (UniqueName: \"kubernetes.io/projected/eb31e03a-1724-4272-bb5f-d0a7b6b80059-kube-api-access-gr4s9\") pod \"control-plane-machine-set-operator-78cbb6b69f-q62bc\" (UID: \"eb31e03a-1724-4272-bb5f-d0a7b6b80059\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q62bc" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830026 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3abe2691-57ff-4b50-8b21-23d7125489bb-etcd-client\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830051 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19ba7a1d-a381-49f2-aa2e-6463336559fe-serving-cert\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830079 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-serving-cert\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830106 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-encryption-config\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830129 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e0afe5a-17db-4dd6-b871-792a4e0f81a9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jnffm\" (UID: \"3e0afe5a-17db-4dd6-b871-792a4e0f81a9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830149 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ef589c7-76d3-4f7e-8221-50e60876c39f-config\") pod \"kube-controller-manager-operator-78b949d7b-sqlbx\" (UID: \"5ef589c7-76d3-4f7e-8221-50e60876c39f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830169 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxk64\" (UniqueName: \"kubernetes.io/projected/180c50b5-f178-4b42-922c-edfd1deb91d4-kube-api-access-qxk64\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830201 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ef589c7-76d3-4f7e-8221-50e60876c39f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-sqlbx\" (UID: \"5ef589c7-76d3-4f7e-8221-50e60876c39f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830236 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830264 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grm58\" (UniqueName: \"kubernetes.io/projected/a1d20393-8c02-48fd-83ad-eb270b721313-kube-api-access-grm58\") pod \"machine-api-operator-5694c8668f-m2lbn\" (UID: \"a1d20393-8c02-48fd-83ad-eb270b721313\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830291 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jqtj2\" (UID: \"e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830315 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55bf603-9ac8-4c71-8540-4a3ea6958d0f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-glg2p\" (UID: \"c55bf603-9ac8-4c71-8540-4a3ea6958d0f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830368 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-etcd-client\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830410 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1d20393-8c02-48fd-83ad-eb270b721313-config\") pod \"machine-api-operator-5694c8668f-m2lbn\" (UID: \"a1d20393-8c02-48fd-83ad-eb270b721313\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830439 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-etcd-serving-ca\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830440 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bff3d354-0064-4a96-8945-51df3cd2d7e7-etcd-client\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.830700 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bff3d354-0064-4a96-8945-51df3cd2d7e7-node-pullsecrets\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.831052 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bff3d354-0064-4a96-8945-51df3cd2d7e7-serving-cert\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.831139 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3abe2691-57ff-4b50-8b21-23d7125489bb-etcd-ca\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.831213 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7pgh\" (UniqueName: \"kubernetes.io/projected/c55bf603-9ac8-4c71-8540-4a3ea6958d0f-kube-api-access-b7pgh\") pod \"openshift-controller-manager-operator-756b6f6bc6-glg2p\" (UID: \"c55bf603-9ac8-4c71-8540-4a3ea6958d0f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.831310 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bff3d354-0064-4a96-8945-51df3cd2d7e7-audit-dir\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.831363 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1d20393-8c02-48fd-83ad-eb270b721313-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m2lbn\" (UID: \"a1d20393-8c02-48fd-83ad-eb270b721313\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.831407 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-audit\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " 
pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.831570 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-etcd-serving-ca\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.831647 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bff3d354-0064-4a96-8945-51df3cd2d7e7-audit-dir\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.831661 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n2b4p"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.832320 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.832498 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.832573 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.831722 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/180c50b5-f178-4b42-922c-edfd1deb91d4-metrics-certs\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.832497 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1d20393-8c02-48fd-83ad-eb270b721313-config\") pod \"machine-api-operator-5694c8668f-m2lbn\" (UID: \"a1d20393-8c02-48fd-83ad-eb270b721313\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.833519 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bff3d354-0064-4a96-8945-51df3cd2d7e7-encryption-config\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.834050 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbh8v\" (UniqueName: \"kubernetes.io/projected/e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc-kube-api-access-qbh8v\") pod \"cluster-samples-operator-665b6dd947-jqtj2\" (UID: \"e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.834119 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02821716-8fb0-46bc-9c95-4c7ca46500b4-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-jscb4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.834204 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb7db\" (UniqueName: \"kubernetes.io/projected/4f8f1872-3b59-424c-ad20-9332a4ffc4b2-kube-api-access-lb7db\") pod \"packageserver-d55dfcdfc-rjjct\" (UID: \"4f8f1872-3b59-424c-ad20-9332a4ffc4b2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.834239 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhknc\" (UniqueName: \"kubernetes.io/projected/531c4a6c-a67a-4629-bfeb-a89565d11497-kube-api-access-jhknc\") pod \"migrator-59844c95c7-q8jhf\" (UID: \"531c4a6c-a67a-4629-bfeb-a89565d11497\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q8jhf" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.834721 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-etcd-client\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.834745 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m2lbn"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.834571 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19ba7a1d-a381-49f2-aa2e-6463336559fe-serving-cert\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.834842 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.834903 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-924n5\" (UniqueName: \"kubernetes.io/projected/3e0afe5a-17db-4dd6-b871-792a4e0f81a9-kube-api-access-924n5\") pod \"cluster-image-registry-operator-dc59b4c8b-jnffm\" (UID: \"3e0afe5a-17db-4dd6-b871-792a4e0f81a9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.834928 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55bf603-9ac8-4c71-8540-4a3ea6958d0f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-glg2p\" (UID: \"c55bf603-9ac8-4c71-8540-4a3ea6958d0f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.834967 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-config\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.835190 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-serving-cert\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.835263 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.835367 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a1d20393-8c02-48fd-83ad-eb270b721313-images\") pod \"machine-api-operator-5694c8668f-m2lbn\" (UID: \"a1d20393-8c02-48fd-83ad-eb270b721313\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.835395 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.835739 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-audit-dir\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.835793 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4f8f1872-3b59-424c-ad20-9332a4ffc4b2-apiservice-cert\") pod \"packageserver-d55dfcdfc-rjjct\" (UID: \"4f8f1872-3b59-424c-ad20-9332a4ffc4b2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.835822 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-audit-dir\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836048 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836114 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ef589c7-76d3-4f7e-8221-50e60876c39f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-sqlbx\" (UID: \"5ef589c7-76d3-4f7e-8221-50e60876c39f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836153 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1d20393-8c02-48fd-83ad-eb270b721313-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m2lbn\" (UID: \"a1d20393-8c02-48fd-83ad-eb270b721313\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836165 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836363 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abe2691-57ff-4b50-8b21-23d7125489bb-serving-cert\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836446 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02821716-8fb0-46bc-9c95-4c7ca46500b4-client-ca\") pod \"route-controller-manager-6576b87f9c-jscb4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836489 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-image-import-ca\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836525 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3abe2691-57ff-4b50-8b21-23d7125489bb-etcd-service-ca\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836553 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhffj\" (UniqueName: \"kubernetes.io/projected/02821716-8fb0-46bc-9c95-4c7ca46500b4-kube-api-access-hhffj\") pod \"route-controller-manager-6576b87f9c-jscb4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836574 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdbvb\" (UniqueName: 
\"kubernetes.io/projected/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-kube-api-access-zdbvb\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836595 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/180c50b5-f178-4b42-922c-edfd1deb91d4-stats-auth\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836623 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-client-ca\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836642 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgp2d\" (UniqueName: \"kubernetes.io/projected/bff3d354-0064-4a96-8945-51df3cd2d7e7-kube-api-access-tgp2d\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836664 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-audit-policies\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836688 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97qwh\" (UniqueName: \"kubernetes.io/projected/19ba7a1d-a381-49f2-aa2e-6463336559fe-kube-api-access-97qwh\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836707 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-config\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836720 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836727 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abe2691-57ff-4b50-8b21-23d7125489bb-config\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836723 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a1d20393-8c02-48fd-83ad-eb270b721313-images\") pod \"machine-api-operator-5694c8668f-m2lbn\" (UID: \"a1d20393-8c02-48fd-83ad-eb270b721313\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836746 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/180c50b5-f178-4b42-922c-edfd1deb91d4-service-ca-bundle\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836792 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e0afe5a-17db-4dd6-b871-792a4e0f81a9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jnffm\" (UID: \"3e0afe5a-17db-4dd6-b871-792a4e0f81a9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836815 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc4hx\" (UniqueName: \"kubernetes.io/projected/3abe2691-57ff-4b50-8b21-23d7125489bb-kube-api-access-xc4hx\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836838 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/180c50b5-f178-4b42-922c-edfd1deb91d4-default-certificate\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836857 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b70fe17f-97eb-4abf-b88e-0e24e5d01c48-available-featuregates\") pod \"openshift-config-operator-7777fb866f-l4mhp\" (UID: \"b70fe17f-97eb-4abf-b88e-0e24e5d01c48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.836878 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z6j4\" (UniqueName: \"kubernetes.io/projected/b70fe17f-97eb-4abf-b88e-0e24e5d01c48-kube-api-access-8z6j4\") pod \"openshift-config-operator-7777fb866f-l4mhp\" (UID: \"b70fe17f-97eb-4abf-b88e-0e24e5d01c48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.837040 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-config\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.837282 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02821716-8fb0-46bc-9c95-4c7ca46500b4-client-ca\") pod \"route-controller-manager-6576b87f9c-jscb4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.837400 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-client-ca\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.837650 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552266-9tldc"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.837697 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-audit-policies\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.837823 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.837870 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-config\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.837913 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bff3d354-0064-4a96-8945-51df3cd2d7e7-image-import-ca\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.839184 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02821716-8fb0-46bc-9c95-4c7ca46500b4-config\") pod \"route-controller-manager-6576b87f9c-jscb4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.839226 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-encryption-config\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.839332 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552266-9tldc" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.839712 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bff3d354-0064-4a96-8945-51df3cd2d7e7-serving-cert\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.839812 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.840273 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.841025 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.841824 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.842080 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-bknsm"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.842608 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-bknsm" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.844492 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02821716-8fb0-46bc-9c95-4c7ca46500b4-serving-cert\") pod \"route-controller-manager-6576b87f9c-jscb4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.845061 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.846652 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.847587 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-sl6l8"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.848915 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-sl6l8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.855664 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.858280 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.859455 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-skqw6"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.859860 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xtppb"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.861710 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.862141 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.865874 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.867090 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.871661 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4jppl"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.872846 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-txkm8"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.873837 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n94r9"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.874900 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.883682 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2tbqn"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.884774 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.885217 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.886833 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qnrc8"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.888821 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.890325 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q62bc"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.892413 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.893780 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.895111 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.895925 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q8jhf"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.898166 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.898701 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zp5hg"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.901447 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9vsxq"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.905029 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.905249 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.906371 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.908449 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rkl2f"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.909151 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rkl2f" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.909700 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552266-9tldc"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.910832 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n2b4p"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.912066 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.913214 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qk8nd"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.914310 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.915538 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.916662 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.917774 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bknsm"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.918802 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rkl2f"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.919860 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.921251 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.922314 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ckvtc"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.924192 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2lthx"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.924353 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.924680 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.924846 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2lthx"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.924899 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-2lthx" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.926093 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ckvtc"] Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.937688 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abe2691-57ff-4b50-8b21-23d7125489bb-config\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.937721 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/180c50b5-f178-4b42-922c-edfd1deb91d4-service-ca-bundle\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.937750 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e0afe5a-17db-4dd6-b871-792a4e0f81a9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jnffm\" (UID: \"3e0afe5a-17db-4dd6-b871-792a4e0f81a9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.937769 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc4hx\" (UniqueName: \"kubernetes.io/projected/3abe2691-57ff-4b50-8b21-23d7125489bb-kube-api-access-xc4hx\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.937790 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/180c50b5-f178-4b42-922c-edfd1deb91d4-default-certificate\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.937817 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b70fe17f-97eb-4abf-b88e-0e24e5d01c48-available-featuregates\") pod \"openshift-config-operator-7777fb866f-l4mhp\" (UID: \"b70fe17f-97eb-4abf-b88e-0e24e5d01c48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.937843 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z6j4\" (UniqueName: \"kubernetes.io/projected/b70fe17f-97eb-4abf-b88e-0e24e5d01c48-kube-api-access-8z6j4\") pod \"openshift-config-operator-7777fb866f-l4mhp\" (UID: \"b70fe17f-97eb-4abf-b88e-0e24e5d01c48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.937867 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4f8f1872-3b59-424c-ad20-9332a4ffc4b2-tmpfs\") pod \"packageserver-d55dfcdfc-rjjct\" (UID: \"4f8f1872-3b59-424c-ad20-9332a4ffc4b2\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.937894 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4f8f1872-3b59-424c-ad20-9332a4ffc4b2-webhook-cert\") pod \"packageserver-d55dfcdfc-rjjct\" (UID: \"4f8f1872-3b59-424c-ad20-9332a4ffc4b2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.937932 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b70fe17f-97eb-4abf-b88e-0e24e5d01c48-serving-cert\") pod \"openshift-config-operator-7777fb866f-l4mhp\" (UID: \"b70fe17f-97eb-4abf-b88e-0e24e5d01c48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.937961 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/eb31e03a-1724-4272-bb5f-d0a7b6b80059-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-q62bc\" (UID: \"eb31e03a-1724-4272-bb5f-d0a7b6b80059\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q62bc" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.937986 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e0afe5a-17db-4dd6-b871-792a4e0f81a9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jnffm\" (UID: \"3e0afe5a-17db-4dd6-b871-792a4e0f81a9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938017 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3abe2691-57ff-4b50-8b21-23d7125489bb-etcd-client\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938052 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr4s9\" (UniqueName: \"kubernetes.io/projected/eb31e03a-1724-4272-bb5f-d0a7b6b80059-kube-api-access-gr4s9\") pod \"control-plane-machine-set-operator-78cbb6b69f-q62bc\" (UID: \"eb31e03a-1724-4272-bb5f-d0a7b6b80059\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q62bc" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938145 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e0afe5a-17db-4dd6-b871-792a4e0f81a9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jnffm\" (UID: \"3e0afe5a-17db-4dd6-b871-792a4e0f81a9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938168 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ef589c7-76d3-4f7e-8221-50e60876c39f-config\") pod \"kube-controller-manager-operator-78b949d7b-sqlbx\" (UID: \"5ef589c7-76d3-4f7e-8221-50e60876c39f\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938188 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxk64\" (UniqueName: \"kubernetes.io/projected/180c50b5-f178-4b42-922c-edfd1deb91d4-kube-api-access-qxk64\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938206 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ef589c7-76d3-4f7e-8221-50e60876c39f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-sqlbx\" (UID: \"5ef589c7-76d3-4f7e-8221-50e60876c39f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938278 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jqtj2\" (UID: \"e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938303 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55bf603-9ac8-4c71-8540-4a3ea6958d0f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-glg2p\" (UID: \"c55bf603-9ac8-4c71-8540-4a3ea6958d0f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938327 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3abe2691-57ff-4b50-8b21-23d7125489bb-etcd-ca\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938364 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7pgh\" (UniqueName: \"kubernetes.io/projected/c55bf603-9ac8-4c71-8540-4a3ea6958d0f-kube-api-access-b7pgh\") pod \"openshift-controller-manager-operator-756b6f6bc6-glg2p\" (UID: \"c55bf603-9ac8-4c71-8540-4a3ea6958d0f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938400 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/180c50b5-f178-4b42-922c-edfd1deb91d4-metrics-certs\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938474 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b70fe17f-97eb-4abf-b88e-0e24e5d01c48-available-featuregates\") pod \"openshift-config-operator-7777fb866f-l4mhp\" (UID: \"b70fe17f-97eb-4abf-b88e-0e24e5d01c48\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938673 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abe2691-57ff-4b50-8b21-23d7125489bb-config\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938475 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbh8v\" (UniqueName: \"kubernetes.io/projected/e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc-kube-api-access-qbh8v\") pod \"cluster-samples-operator-665b6dd947-jqtj2\" (UID: \"e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938784 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb7db\" (UniqueName: \"kubernetes.io/projected/4f8f1872-3b59-424c-ad20-9332a4ffc4b2-kube-api-access-lb7db\") pod \"packageserver-d55dfcdfc-rjjct\" (UID: \"4f8f1872-3b59-424c-ad20-9332a4ffc4b2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938808 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4f8f1872-3b59-424c-ad20-9332a4ffc4b2-tmpfs\") pod \"packageserver-d55dfcdfc-rjjct\" (UID: \"4f8f1872-3b59-424c-ad20-9332a4ffc4b2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938816 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhknc\" (UniqueName: \"kubernetes.io/projected/531c4a6c-a67a-4629-bfeb-a89565d11497-kube-api-access-jhknc\") pod \"migrator-59844c95c7-q8jhf\" (UID: \"531c4a6c-a67a-4629-bfeb-a89565d11497\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q8jhf" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938891 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-924n5\" (UniqueName: \"kubernetes.io/projected/3e0afe5a-17db-4dd6-b871-792a4e0f81a9-kube-api-access-924n5\") pod \"cluster-image-registry-operator-dc59b4c8b-jnffm\" (UID: \"3e0afe5a-17db-4dd6-b871-792a4e0f81a9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938922 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55bf603-9ac8-4c71-8540-4a3ea6958d0f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-glg2p\" (UID: \"c55bf603-9ac8-4c71-8540-4a3ea6958d0f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.938961 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4f8f1872-3b59-424c-ad20-9332a4ffc4b2-apiservice-cert\") pod \"packageserver-d55dfcdfc-rjjct\" (UID: \"4f8f1872-3b59-424c-ad20-9332a4ffc4b2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.939020 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ef589c7-76d3-4f7e-8221-50e60876c39f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-sqlbx\" (UID: \"5ef589c7-76d3-4f7e-8221-50e60876c39f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.939065 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abe2691-57ff-4b50-8b21-23d7125489bb-serving-cert\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.939093 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3abe2691-57ff-4b50-8b21-23d7125489bb-etcd-service-ca\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.940104 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3abe2691-57ff-4b50-8b21-23d7125489bb-etcd-ca\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.940293 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/180c50b5-f178-4b42-922c-edfd1deb91d4-stats-auth\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.940271 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e0afe5a-17db-4dd6-b871-792a4e0f81a9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jnffm\" (UID: \"3e0afe5a-17db-4dd6-b871-792a4e0f81a9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.940487 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55bf603-9ac8-4c71-8540-4a3ea6958d0f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-glg2p\" (UID: \"c55bf603-9ac8-4c71-8540-4a3ea6958d0f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.941191 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3abe2691-57ff-4b50-8b21-23d7125489bb-etcd-service-ca\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.941591 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e0afe5a-17db-4dd6-b871-792a4e0f81a9-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-jnffm\" (UID: \"3e0afe5a-17db-4dd6-b871-792a4e0f81a9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.946406 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55bf603-9ac8-4c71-8540-4a3ea6958d0f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-glg2p\" (UID: \"c55bf603-9ac8-4c71-8540-4a3ea6958d0f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.946818 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.948538 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abe2691-57ff-4b50-8b21-23d7125489bb-serving-cert\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.949559 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3abe2691-57ff-4b50-8b21-23d7125489bb-etcd-client\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.951187 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jqtj2\" (UID: \"e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.951538 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b70fe17f-97eb-4abf-b88e-0e24e5d01c48-serving-cert\") pod \"openshift-config-operator-7777fb866f-l4mhp\" (UID: \"b70fe17f-97eb-4abf-b88e-0e24e5d01c48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.964985 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 09:47:51 crc kubenswrapper[4794]: I0310 09:47:51.985571 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.004954 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.024987 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.051982 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.064913 4794 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.085044 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.105632 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.113145 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/eb31e03a-1724-4272-bb5f-d0a7b6b80059-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-q62bc\" (UID: \"eb31e03a-1724-4272-bb5f-d0a7b6b80059\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q62bc" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.125545 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.145451 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.165468 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.186162 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.192038 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/180c50b5-f178-4b42-922c-edfd1deb91d4-default-certificate\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.205675 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.213836 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/180c50b5-f178-4b42-922c-edfd1deb91d4-stats-auth\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.225814 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.235186 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/180c50b5-f178-4b42-922c-edfd1deb91d4-metrics-certs\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.245246 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.265825 4794 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.269450 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/180c50b5-f178-4b42-922c-edfd1deb91d4-service-ca-bundle\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.285105 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.305456 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.326411 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.345779 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.353632 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4f8f1872-3b59-424c-ad20-9332a4ffc4b2-webhook-cert\") pod \"packageserver-d55dfcdfc-rjjct\" (UID: \"4f8f1872-3b59-424c-ad20-9332a4ffc4b2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.355036 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4f8f1872-3b59-424c-ad20-9332a4ffc4b2-apiservice-cert\") pod \"packageserver-d55dfcdfc-rjjct\" (UID: \"4f8f1872-3b59-424c-ad20-9332a4ffc4b2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.366489 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.386594 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.406265 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.426022 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.447061 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.485855 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.505962 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.525902 4794 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.546396 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.565939 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.586763 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.606908 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.625942 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.645937 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.665441 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.687567 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.705219 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.712028 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ef589c7-76d3-4f7e-8221-50e60876c39f-config\") pod \"kube-controller-manager-operator-78b949d7b-sqlbx\" (UID: \"5ef589c7-76d3-4f7e-8221-50e60876c39f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.725496 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.734120 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ef589c7-76d3-4f7e-8221-50e60876c39f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-sqlbx\" (UID: \"5ef589c7-76d3-4f7e-8221-50e60876c39f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.745662 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.765546 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.785591 4794 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"signing-key" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.804412 4794 request.go:700] Waited for 1.017591404s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/secrets?fieldSelector=metadata.name%3Dservice-ca-dockercfg-pn86c&limit=500&resourceVersion=0 Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.806808 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.825982 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.845873 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.865565 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.906100 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.924769 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.946667 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.965702 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 09:47:52 crc kubenswrapper[4794]: I0310 09:47:52.985138 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.006011 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.025676 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.045842 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.065494 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.085995 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.126032 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.131644 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grm58\" (UniqueName: \"kubernetes.io/projected/a1d20393-8c02-48fd-83ad-eb270b721313-kube-api-access-grm58\") pod 
\"machine-api-operator-5694c8668f-m2lbn\" (UID: \"a1d20393-8c02-48fd-83ad-eb270b721313\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.147024 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.167058 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.184506 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.205041 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.229153 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.244735 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.258052 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.265188 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.285362 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.305846 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.336067 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.376285 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhffj\" (UniqueName: \"kubernetes.io/projected/02821716-8fb0-46bc-9c95-4c7ca46500b4-kube-api-access-hhffj\") pod \"route-controller-manager-6576b87f9c-jscb4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.381414 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgp2d\" (UniqueName: \"kubernetes.io/projected/bff3d354-0064-4a96-8945-51df3cd2d7e7-kube-api-access-tgp2d\") pod \"apiserver-76f77b778f-4jppl\" (UID: \"bff3d354-0064-4a96-8945-51df3cd2d7e7\") " pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.407379 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.410216 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdbvb\" (UniqueName: 
\"kubernetes.io/projected/32cb5682-0c0e-4f56-8fcc-cd73067c41c7-kube-api-access-zdbvb\") pod \"apiserver-7bbb656c7d-cxgqz\" (UID: \"32cb5682-0c0e-4f56-8fcc-cd73067c41c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.426308 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.426772 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97qwh\" (UniqueName: \"kubernetes.io/projected/19ba7a1d-a381-49f2-aa2e-6463336559fe-kube-api-access-97qwh\") pod \"controller-manager-879f6c89f-n64xh\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.445183 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.453063 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m2lbn"] Mar 10 09:47:53 crc kubenswrapper[4794]: W0310 09:47:53.459301 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1d20393_8c02_48fd_83ad_eb270b721313.slice/crio-daa32138b56c2648acf010d824ff0da431bcdfccdee8492935508a533f6e65da WatchSource:0}: Error finding container daa32138b56c2648acf010d824ff0da431bcdfccdee8492935508a533f6e65da: Status 404 returned error can't find the container with id daa32138b56c2648acf010d824ff0da431bcdfccdee8492935508a533f6e65da Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.465710 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.485131 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.493480 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.505253 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.505622 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.516532 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.525176 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.537607 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.545296 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.565921 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.586259 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.606384 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.625443 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.646166 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.665378 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.682788 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4jppl"] Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.685476 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.706829 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 09:47:53 crc kubenswrapper[4794]: W0310 09:47:53.708182 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbff3d354_0064_4a96_8945_51df3cd2d7e7.slice/crio-d27401f44bfb69e9b4188881a92454abc242b23454f347679b15576173051b6a WatchSource:0}: Error finding container d27401f44bfb69e9b4188881a92454abc242b23454f347679b15576173051b6a: Status 404 returned error can't find the container with id d27401f44bfb69e9b4188881a92454abc242b23454f347679b15576173051b6a Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.719769 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n64xh"] Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.726049 4794 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.744496 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.765296 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.787113 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.805131 4794 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.823400 4794 request.go:700] Waited for 1.898260438s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0 Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.825159 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.863644 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z6j4\" (UniqueName: \"kubernetes.io/projected/b70fe17f-97eb-4abf-b88e-0e24e5d01c48-kube-api-access-8z6j4\") pod \"openshift-config-operator-7777fb866f-l4mhp\" (UID: \"b70fe17f-97eb-4abf-b88e-0e24e5d01c48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.878937 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr4s9\" (UniqueName: \"kubernetes.io/projected/eb31e03a-1724-4272-bb5f-d0a7b6b80059-kube-api-access-gr4s9\") pod \"control-plane-machine-set-operator-78cbb6b69f-q62bc\" (UID: \"eb31e03a-1724-4272-bb5f-d0a7b6b80059\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q62bc" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.899936 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhknc\" (UniqueName: \"kubernetes.io/projected/531c4a6c-a67a-4629-bfeb-a89565d11497-kube-api-access-jhknc\") pod \"migrator-59844c95c7-q8jhf\" (UID: \"531c4a6c-a67a-4629-bfeb-a89565d11497\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q8jhf" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.918035 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e0afe5a-17db-4dd6-b871-792a4e0f81a9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jnffm\" (UID: \"3e0afe5a-17db-4dd6-b871-792a4e0f81a9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.938676 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc4hx\" (UniqueName: \"kubernetes.io/projected/3abe2691-57ff-4b50-8b21-23d7125489bb-kube-api-access-xc4hx\") pod \"etcd-operator-b45778765-txkm8\" (UID: \"3abe2691-57ff-4b50-8b21-23d7125489bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.961585 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4"] Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.963290 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz"] Mar 10 09:47:53 crc kubenswrapper[4794]: W0310 09:47:53.966508 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02821716_8fb0_46bc_9c95_4c7ca46500b4.slice/crio-f27f1f6a52bdb3a32252853fe633bf32a368271ef3f581f4f0cdfe063462c49f WatchSource:0}: Error finding container f27f1f6a52bdb3a32252853fe633bf32a368271ef3f581f4f0cdfe063462c49f: Status 404 returned error can't find the container 
with id f27f1f6a52bdb3a32252853fe633bf32a368271ef3f581f4f0cdfe063462c49f Mar 10 09:47:53 crc kubenswrapper[4794]: W0310 09:47:53.970561 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32cb5682_0c0e_4f56_8fcc_cd73067c41c7.slice/crio-7fadcfb38890627c39a04060ce07880cf1fe6fc59a08d0d354ce217de95090a5 WatchSource:0}: Error finding container 7fadcfb38890627c39a04060ce07880cf1fe6fc59a08d0d354ce217de95090a5: Status 404 returned error can't find the container with id 7fadcfb38890627c39a04060ce07880cf1fe6fc59a08d0d354ce217de95090a5 Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.972528 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb7db\" (UniqueName: \"kubernetes.io/projected/4f8f1872-3b59-424c-ad20-9332a4ffc4b2-kube-api-access-lb7db\") pod \"packageserver-d55dfcdfc-rjjct\" (UID: \"4f8f1872-3b59-424c-ad20-9332a4ffc4b2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.982144 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q8jhf" Mar 10 09:47:53 crc kubenswrapper[4794]: I0310 09:47:53.984415 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxk64\" (UniqueName: \"kubernetes.io/projected/180c50b5-f178-4b42-922c-edfd1deb91d4-kube-api-access-qxk64\") pod \"router-default-5444994796-tlkl5\" (UID: \"180c50b5-f178-4b42-922c-edfd1deb91d4\") " pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.008623 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7pgh\" (UniqueName: \"kubernetes.io/projected/c55bf603-9ac8-4c71-8540-4a3ea6958d0f-kube-api-access-b7pgh\") pod \"openshift-controller-manager-operator-756b6f6bc6-glg2p\" (UID: \"c55bf603-9ac8-4c71-8540-4a3ea6958d0f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.029205 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ef589c7-76d3-4f7e-8221-50e60876c39f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-sqlbx\" (UID: \"5ef589c7-76d3-4f7e-8221-50e60876c39f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.037563 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.049291 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbh8v\" (UniqueName: \"kubernetes.io/projected/e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc-kube-api-access-qbh8v\") pod \"cluster-samples-operator-665b6dd947-jqtj2\" (UID: \"e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.055482 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" event={"ID":"19ba7a1d-a381-49f2-aa2e-6463336559fe","Type":"ContainerStarted","Data":"3beee2cde0164bfcb074685175ca919eab419dc359bf571b0f008bc511270c73"} Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.055520 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" event={"ID":"19ba7a1d-a381-49f2-aa2e-6463336559fe","Type":"ContainerStarted","Data":"ecae1904760a9d8f6c89da64b4468cca3e743d52c67ecd3df1d1217cd808866a"} Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.055791 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.057963 4794 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-n64xh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.057999 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" podUID="19ba7a1d-a381-49f2-aa2e-6463336559fe" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.058395 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" event={"ID":"32cb5682-0c0e-4f56-8fcc-cd73067c41c7","Type":"ContainerStarted","Data":"7fadcfb38890627c39a04060ce07880cf1fe6fc59a08d0d354ce217de95090a5"} Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.061927 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" event={"ID":"a1d20393-8c02-48fd-83ad-eb270b721313","Type":"ContainerStarted","Data":"01b5b611c0252ee1e7cad559092141da34f6eb5512921bc6f3ec0348b39bacaa"} Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.061968 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" event={"ID":"a1d20393-8c02-48fd-83ad-eb270b721313","Type":"ContainerStarted","Data":"ad876beb88c0f9f4a78108c6306c05b97bbf80f4149cd9aa0f9f1608755d1eba"} Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.061978 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" event={"ID":"a1d20393-8c02-48fd-83ad-eb270b721313","Type":"ContainerStarted","Data":"daa32138b56c2648acf010d824ff0da431bcdfccdee8492935508a533f6e65da"} Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 
09:47:54.063386 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.065747 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-924n5\" (UniqueName: \"kubernetes.io/projected/3e0afe5a-17db-4dd6-b871-792a4e0f81a9-kube-api-access-924n5\") pod \"cluster-image-registry-operator-dc59b4c8b-jnffm\" (UID: \"3e0afe5a-17db-4dd6-b871-792a4e0f81a9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.066218 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4jppl" event={"ID":"bff3d354-0064-4a96-8945-51df3cd2d7e7","Type":"ContainerStarted","Data":"d27401f44bfb69e9b4188881a92454abc242b23454f347679b15576173051b6a"} Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.067167 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" event={"ID":"02821716-8fb0-46bc-9c95-4c7ca46500b4","Type":"ContainerStarted","Data":"f27f1f6a52bdb3a32252853fe633bf32a368271ef3f581f4f0cdfe063462c49f"} Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.076974 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q62bc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.085903 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.091630 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.117089 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.174848 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-trusted-ca-bundle\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.174889 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcs2g\" (UniqueName: \"kubernetes.io/projected/2d3943fb-fdf7-495d-9a69-f04362b8b319-kube-api-access-pcs2g\") pod \"openshift-apiserver-operator-796bbdcf4f-zzsr8\" (UID: \"2d3943fb-fdf7-495d-9a69-f04362b8b319\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.174938 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a0f2e0e-60dc-472e-ad6b-fc17442a65ed-serving-cert\") pod \"service-ca-operator-777779d784-hxsrd\" (UID: \"7a0f2e0e-60dc-472e-ad6b-fc17442a65ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.174957 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d3943fb-fdf7-495d-9a69-f04362b8b319-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zzsr8\" (UID: \"2d3943fb-fdf7-495d-9a69-f04362b8b319\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.174975 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.174989 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/577c9773-972c-42bb-99c1-b3d44cf21403-config\") pod \"kube-apiserver-operator-766d6c64bb-xwx4g\" (UID: \"577c9773-972c-42bb-99c1-b3d44cf21403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.175008 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.175026 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a138cfa4-f8ff-4627-b3da-f72aeb474795-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.175048 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.175063 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx8pl\" (UniqueName: \"kubernetes.io/projected/a138cfa4-f8ff-4627-b3da-f72aeb474795-kube-api-access-lx8pl\") pod \"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.175079 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.175094 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.175109 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d3943fb-fdf7-495d-9a69-f04362b8b319-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zzsr8\" (UID: \"2d3943fb-fdf7-495d-9a69-f04362b8b319\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.176279 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0f2e0e-60dc-472e-ad6b-fc17442a65ed-config\") pod \"service-ca-operator-777779d784-hxsrd\" (UID: \"7a0f2e0e-60dc-472e-ad6b-fc17442a65ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.176406 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.176478 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-console-config\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.176513 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d7615b4-8f9f-4200-9e55-1baa01a6b14d-trusted-ca\") pod \"console-operator-58897d9998-qnrc8\" (UID: \"9d7615b4-8f9f-4200-9e55-1baa01a6b14d\") " pod="openshift-console-operator/console-operator-58897d9998-qnrc8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.176536 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bgv2\" (UniqueName: \"kubernetes.io/projected/797ba833-0538-46ba-b3f1-4a3b2415c7f1-kube-api-access-5bgv2\") pod \"service-ca-9c57cc56f-2tbqn\" (UID: \"797ba833-0538-46ba-b3f1-4a3b2415c7f1\") " pod="openshift-service-ca/service-ca-9c57cc56f-2tbqn" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.176678 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982fe064-1be6-4ff0-b6ba-6f04ee269140-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dkz9c\" (UID: \"982fe064-1be6-4ff0-b6ba-6f04ee269140\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.176796 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-audit-policies\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.176831 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.176856 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/982fe064-1be6-4ff0-b6ba-6f04ee269140-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dkz9c\" (UID: \"982fe064-1be6-4ff0-b6ba-6f04ee269140\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.176879 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b739a9e-64b9-4415-bc45-dc9307aa49d3-metrics-tls\") pod \"ingress-operator-5b745b69d9-bhj55\" (UID: \"7b739a9e-64b9-4415-bc45-dc9307aa49d3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.176905 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.176929 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.176967 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-service-ca\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.176987 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa-proxy-tls\") pod \"machine-config-controller-84d6567774-h4k77\" (UID: \"cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.177034 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: E0310 09:47:54.177071 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:54.677056546 +0000 UTC m=+223.433227464 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.177096 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrgvc\" (UniqueName: \"kubernetes.io/projected/983001a9-70eb-40e1-895f-5e3fc80f538e-kube-api-access-lrgvc\") pod \"dns-operator-744455d44c-9vsxq\" (UID: \"983001a9-70eb-40e1-895f-5e3fc80f538e\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vsxq" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.177127 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d7615b4-8f9f-4200-9e55-1baa01a6b14d-config\") pod \"console-operator-58897d9998-qnrc8\" (UID: \"9d7615b4-8f9f-4200-9e55-1baa01a6b14d\") " pod="openshift-console-operator/console-operator-58897d9998-qnrc8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.177169 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-bound-sa-token\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.177192 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxn5t\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-kube-api-access-wxn5t\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.178024 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a138cfa4-f8ff-4627-b3da-f72aeb474795-serving-cert\") pod \"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.178068 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d56da28c-c09d-4fff-b73e-c3b5c787c300-audit-dir\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.178089 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: 
I0310 09:47:54.178161 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7827a543-d8b2-460b-aee5-212ea1208c0d-console-oauth-config\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.178242 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/577c9773-972c-42bb-99c1-b3d44cf21403-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xwx4g\" (UID: \"577c9773-972c-42bb-99c1-b3d44cf21403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.178267 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a138cfa4-f8ff-4627-b3da-f72aeb474795-config\") pod \"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.178443 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9j4m\" (UniqueName: \"kubernetes.io/projected/7827a543-d8b2-460b-aee5-212ea1208c0d-kube-api-access-s9j4m\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.178510 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-registry-tls\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.178533 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/797ba833-0538-46ba-b3f1-4a3b2415c7f1-signing-key\") pod \"service-ca-9c57cc56f-2tbqn\" (UID: \"797ba833-0538-46ba-b3f1-4a3b2415c7f1\") " pod="openshift-service-ca/service-ca-9c57cc56f-2tbqn" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.178555 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45vcn\" (UniqueName: \"kubernetes.io/projected/cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa-kube-api-access-45vcn\") pod \"machine-config-controller-84d6567774-h4k77\" (UID: \"cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.178664 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d7615b4-8f9f-4200-9e55-1baa01a6b14d-serving-cert\") pod \"console-operator-58897d9998-qnrc8\" (UID: \"9d7615b4-8f9f-4200-9e55-1baa01a6b14d\") " pod="openshift-console-operator/console-operator-58897d9998-qnrc8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.178721 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh9rx\" (UniqueName: \"kubernetes.io/projected/d56da28c-c09d-4fff-b73e-c3b5c787c300-kube-api-access-rh9rx\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.178912 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-registry-certificates\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.178943 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179058 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-oauth-serving-cert\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179132 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179165 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7827a543-d8b2-460b-aee5-212ea1208c0d-console-serving-cert\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179228 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/983001a9-70eb-40e1-895f-5e3fc80f538e-metrics-tls\") pod \"dns-operator-744455d44c-9vsxq\" (UID: \"983001a9-70eb-40e1-895f-5e3fc80f538e\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vsxq" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179250 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/797ba833-0538-46ba-b3f1-4a3b2415c7f1-signing-cabundle\") pod \"service-ca-9c57cc56f-2tbqn\" (UID: \"797ba833-0538-46ba-b3f1-4a3b2415c7f1\") " pod="openshift-service-ca/service-ca-9c57cc56f-2tbqn" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179287 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a138cfa4-f8ff-4627-b3da-f72aeb474795-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179323 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fwfx\" (UniqueName: \"kubernetes.io/projected/9d7615b4-8f9f-4200-9e55-1baa01a6b14d-kube-api-access-7fwfx\") pod \"console-operator-58897d9998-qnrc8\" (UID: \"9d7615b4-8f9f-4200-9e55-1baa01a6b14d\") " pod="openshift-console-operator/console-operator-58897d9998-qnrc8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179362 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5wsv\" (UniqueName: \"kubernetes.io/projected/7a0f2e0e-60dc-472e-ad6b-fc17442a65ed-kube-api-access-g5wsv\") pod \"service-ca-operator-777779d784-hxsrd\" (UID: \"7a0f2e0e-60dc-472e-ad6b-fc17442a65ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179391 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b739a9e-64b9-4415-bc45-dc9307aa49d3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bhj55\" (UID: \"7b739a9e-64b9-4415-bc45-dc9307aa49d3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179417 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t9r5\" (UniqueName: \"kubernetes.io/projected/7b739a9e-64b9-4415-bc45-dc9307aa49d3-kube-api-access-6t9r5\") pod \"ingress-operator-5b745b69d9-bhj55\" (UID: \"7b739a9e-64b9-4415-bc45-dc9307aa49d3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179443 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/577c9773-972c-42bb-99c1-b3d44cf21403-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xwx4g\" (UID: \"577c9773-972c-42bb-99c1-b3d44cf21403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179481 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtg5n\" (UniqueName: \"kubernetes.io/projected/982fe064-1be6-4ff0-b6ba-6f04ee269140-kube-api-access-rtg5n\") pod \"kube-storage-version-migrator-operator-b67b599dd-dkz9c\" (UID: \"982fe064-1be6-4ff0-b6ba-6f04ee269140\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179501 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b739a9e-64b9-4415-bc45-dc9307aa49d3-trusted-ca\") pod \"ingress-operator-5b745b69d9-bhj55\" (UID: \"7b739a9e-64b9-4415-bc45-dc9307aa49d3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179538 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-trusted-ca\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179560 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.179606 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-h4k77\" (UID: \"cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" Mar 10 09:47:54 crc kubenswrapper[4794]: W0310 09:47:54.194391 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod180c50b5_f178_4b42_922c_edfd1deb91d4.slice/crio-e9d33cba18fdc611e604a7c004eeb1327d0f52abd1a53351e1de54430e16f8d0 WatchSource:0}: Error finding container e9d33cba18fdc611e604a7c004eeb1327d0f52abd1a53351e1de54430e16f8d0: Status 404 returned error can't find the container with id e9d33cba18fdc611e604a7c004eeb1327d0f52abd1a53351e1de54430e16f8d0 Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.239748 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.280522 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.280926 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9j4m\" (UniqueName: \"kubernetes.io/projected/7827a543-d8b2-460b-aee5-212ea1208c0d-kube-api-access-s9j4m\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.280950 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a39fe093-da97-48ba-bdf3-a566eefc5208-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n2b4p\" (UID: \"a39fe093-da97-48ba-bdf3-a566eefc5208\") " pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.280967 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fd49366-785d-4530-a7d2-4a5daf70ea0f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7cm9\" (UID: \"6fd49366-785d-4530-a7d2-4a5daf70ea0f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.280995 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-registry-tls\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281012 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eece3bac-ab7c-4a16-82ae-35775eef8806-config-volume\") pod \"collect-profiles-29552265-smnpm\" (UID: \"eece3bac-ab7c-4a16-82ae-35775eef8806\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281032 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh9rx\" (UniqueName: \"kubernetes.io/projected/d56da28c-c09d-4fff-b73e-c3b5c787c300-kube-api-access-rh9rx\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281048 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/797ba833-0538-46ba-b3f1-4a3b2415c7f1-signing-key\") pod \"service-ca-9c57cc56f-2tbqn\" (UID: \"797ba833-0538-46ba-b3f1-4a3b2415c7f1\") " pod="openshift-service-ca/service-ca-9c57cc56f-2tbqn" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281064 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45vcn\" (UniqueName: \"kubernetes.io/projected/cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa-kube-api-access-45vcn\") pod \"machine-config-controller-84d6567774-h4k77\" (UID: \"cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281087 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d7615b4-8f9f-4200-9e55-1baa01a6b14d-serving-cert\") pod \"console-operator-58897d9998-qnrc8\" (UID: \"9d7615b4-8f9f-4200-9e55-1baa01a6b14d\") " pod="openshift-console-operator/console-operator-58897d9998-qnrc8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281113 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281138 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-registry-certificates\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281153 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-oauth-serving-cert\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281174 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-plugins-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281188 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt8vh\" (UniqueName: \"kubernetes.io/projected/d4be505c-4811-4a7a-a7b5-3141574f1ee0-kube-api-access-rt8vh\") pod \"multus-admission-controller-857f4d67dd-qk8nd\" (UID: \"d4be505c-4811-4a7a-a7b5-3141574f1ee0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qk8nd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281208 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78sbs\" (UniqueName: \"kubernetes.io/projected/a4cfdaaf-4265-4c7d-b58d-4538905360a2-kube-api-access-78sbs\") pod \"ingress-canary-rkl2f\" (UID: \"a4cfdaaf-4265-4c7d-b58d-4538905360a2\") " pod="openshift-ingress-canary/ingress-canary-rkl2f" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281233 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281248 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7827a543-d8b2-460b-aee5-212ea1208c0d-console-serving-cert\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281265 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9214e776-ae19-4f0c-9008-9cd8a4ece399-certs\") pod \"machine-config-server-sl6l8\" (UID: \"9214e776-ae19-4f0c-9008-9cd8a4ece399\") " pod="openshift-machine-config-operator/machine-config-server-sl6l8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281278 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5ndx\" (UniqueName: \"kubernetes.io/projected/eb441ca2-702b-4848-906b-9f02a8ff65ee-kube-api-access-d5ndx\") pod \"machine-config-operator-74547568cd-vjgc6\" (UID: \"eb441ca2-702b-4848-906b-9f02a8ff65ee\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281293 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g9hr\" (UniqueName: \"kubernetes.io/projected/c2bd53f1-d95b-43dd-91d4-8adc12d6971a-kube-api-access-6g9hr\") pod \"package-server-manager-789f6589d5-gg8jk\" (UID: \"c2bd53f1-d95b-43dd-91d4-8adc12d6971a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281320 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9214e776-ae19-4f0c-9008-9cd8a4ece399-node-bootstrap-token\") pod \"machine-config-server-sl6l8\" (UID: \"9214e776-ae19-4f0c-9008-9cd8a4ece399\") " pod="openshift-machine-config-operator/machine-config-server-sl6l8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281351 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/270526dc-92ee-4d56-93cf-b4ee2df197fa-metrics-tls\") pod \"dns-default-2lthx\" (UID: \"270526dc-92ee-4d56-93cf-b4ee2df197fa\") " pod="openshift-dns/dns-default-2lthx" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281375 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/983001a9-70eb-40e1-895f-5e3fc80f538e-metrics-tls\") pod \"dns-operator-744455d44c-9vsxq\" (UID: \"983001a9-70eb-40e1-895f-5e3fc80f538e\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vsxq" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281390 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eb441ca2-702b-4848-906b-9f02a8ff65ee-images\") pod \"machine-config-operator-74547568cd-vjgc6\" (UID: 
\"eb441ca2-702b-4848-906b-9f02a8ff65ee\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281415 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a138cfa4-f8ff-4627-b3da-f72aeb474795-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281428 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/797ba833-0538-46ba-b3f1-4a3b2415c7f1-signing-cabundle\") pod \"service-ca-9c57cc56f-2tbqn\" (UID: \"797ba833-0538-46ba-b3f1-4a3b2415c7f1\") " pod="openshift-service-ca/service-ca-9c57cc56f-2tbqn" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281443 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb441ca2-702b-4848-906b-9f02a8ff65ee-proxy-tls\") pod \"machine-config-operator-74547568cd-vjgc6\" (UID: \"eb441ca2-702b-4848-906b-9f02a8ff65ee\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281480 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fwfx\" (UniqueName: \"kubernetes.io/projected/9d7615b4-8f9f-4200-9e55-1baa01a6b14d-kube-api-access-7fwfx\") pod \"console-operator-58897d9998-qnrc8\" (UID: \"9d7615b4-8f9f-4200-9e55-1baa01a6b14d\") " pod="openshift-console-operator/console-operator-58897d9998-qnrc8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281510 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b739a9e-64b9-4415-bc45-dc9307aa49d3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bhj55\" (UID: \"7b739a9e-64b9-4415-bc45-dc9307aa49d3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281525 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5wsv\" (UniqueName: \"kubernetes.io/projected/7a0f2e0e-60dc-472e-ad6b-fc17442a65ed-kube-api-access-g5wsv\") pod \"service-ca-operator-777779d784-hxsrd\" (UID: \"7a0f2e0e-60dc-472e-ad6b-fc17442a65ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281550 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t9r5\" (UniqueName: \"kubernetes.io/projected/7b739a9e-64b9-4415-bc45-dc9307aa49d3-kube-api-access-6t9r5\") pod \"ingress-operator-5b745b69d9-bhj55\" (UID: \"7b739a9e-64b9-4415-bc45-dc9307aa49d3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281588 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/577c9773-972c-42bb-99c1-b3d44cf21403-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xwx4g\" (UID: \"577c9773-972c-42bb-99c1-b3d44cf21403\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281621 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqjnf\" (UniqueName: \"kubernetes.io/projected/9214e776-ae19-4f0c-9008-9cd8a4ece399-kube-api-access-jqjnf\") pod \"machine-config-server-sl6l8\" (UID: \"9214e776-ae19-4f0c-9008-9cd8a4ece399\") " pod="openshift-machine-config-operator/machine-config-server-sl6l8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281646 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtg5n\" (UniqueName: \"kubernetes.io/projected/982fe064-1be6-4ff0-b6ba-6f04ee269140-kube-api-access-rtg5n\") pod \"kube-storage-version-migrator-operator-b67b599dd-dkz9c\" (UID: \"982fe064-1be6-4ff0-b6ba-6f04ee269140\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281662 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd49366-785d-4530-a7d2-4a5daf70ea0f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7cm9\" (UID: \"6fd49366-785d-4530-a7d2-4a5daf70ea0f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281676 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4cfdaaf-4265-4c7d-b58d-4538905360a2-cert\") pod \"ingress-canary-rkl2f\" (UID: \"a4cfdaaf-4265-4c7d-b58d-4538905360a2\") " pod="openshift-ingress-canary/ingress-canary-rkl2f" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281702 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b739a9e-64b9-4415-bc45-dc9307aa49d3-trusted-ca\") pod \"ingress-operator-5b745b69d9-bhj55\" (UID: \"7b739a9e-64b9-4415-bc45-dc9307aa49d3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281734 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-trusted-ca\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281750 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281767 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9e528446-c8eb-4103-b2ef-994eef6ecac5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zpwhc\" (UID: \"9e528446-c8eb-4103-b2ef-994eef6ecac5\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281800 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-h4k77\" (UID: \"cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281815 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-trusted-ca-bundle\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281831 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcs2g\" (UniqueName: \"kubernetes.io/projected/2d3943fb-fdf7-495d-9a69-f04362b8b319-kube-api-access-pcs2g\") pod \"openshift-apiserver-operator-796bbdcf4f-zzsr8\" (UID: \"2d3943fb-fdf7-495d-9a69-f04362b8b319\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281858 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rplrp\" (UniqueName: \"kubernetes.io/projected/eece3bac-ab7c-4a16-82ae-35775eef8806-kube-api-access-rplrp\") pod \"collect-profiles-29552265-smnpm\" (UID: \"eece3bac-ab7c-4a16-82ae-35775eef8806\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281873 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a0f2e0e-60dc-472e-ad6b-fc17442a65ed-serving-cert\") pod \"service-ca-operator-777779d784-hxsrd\" (UID: \"7a0f2e0e-60dc-472e-ad6b-fc17442a65ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281898 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d3943fb-fdf7-495d-9a69-f04362b8b319-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zzsr8\" (UID: \"2d3943fb-fdf7-495d-9a69-f04362b8b319\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281913 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.281931 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/577c9773-972c-42bb-99c1-b3d44cf21403-config\") pod \"kube-apiserver-operator-766d6c64bb-xwx4g\" (UID: \"577c9773-972c-42bb-99c1-b3d44cf21403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g" Mar 10 09:47:54 
crc kubenswrapper[4794]: I0310 09:47:54.281958 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.282013 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a138cfa4-f8ff-4627-b3da-f72aeb474795-service-ca-bundle\") pod \"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.282031 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2dvh\" (UniqueName: \"kubernetes.io/projected/a8b37b88-1555-420f-a9c1-f7e48046f160-kube-api-access-m2dvh\") pod \"downloads-7954f5f757-bknsm\" (UID: \"a8b37b88-1555-420f-a9c1-f7e48046f160\") " pod="openshift-console/downloads-7954f5f757-bknsm" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.282048 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-registration-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" Mar 10 09:47:54 crc kubenswrapper[4794]: E0310 09:47:54.282110 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:54.782077163 +0000 UTC m=+223.538247981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.283775 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.284569 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-h4k77\" (UID: \"cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.285462 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a138cfa4-f8ff-4627-b3da-f72aeb474795-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.285469 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/797ba833-0538-46ba-b3f1-4a3b2415c7f1-signing-cabundle\") pod \"service-ca-9c57cc56f-2tbqn\" (UID: \"797ba833-0538-46ba-b3f1-4a3b2415c7f1\") " pod="openshift-service-ca/service-ca-9c57cc56f-2tbqn" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.285769 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d3943fb-fdf7-495d-9a69-f04362b8b319-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zzsr8\" (UID: \"2d3943fb-fdf7-495d-9a69-f04362b8b319\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.285812 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ab22632-d678-4997-996e-bc26acff8c46-config\") pod \"machine-approver-56656f9798-m5mvt\" (UID: \"5ab22632-d678-4997-996e-bc26acff8c46\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.285841 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.285904 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lx8pl\" (UniqueName: \"kubernetes.io/projected/a138cfa4-f8ff-4627-b3da-f72aeb474795-kube-api-access-lx8pl\") pod \"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.285921 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.285940 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.285958 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc5xb\" (UniqueName: \"kubernetes.io/projected/9e528446-c8eb-4103-b2ef-994eef6ecac5-kube-api-access-wc5xb\") pod \"olm-operator-6b444d44fb-zpwhc\" (UID: \"9e528446-c8eb-4103-b2ef-994eef6ecac5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.285987 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d3943fb-fdf7-495d-9a69-f04362b8b319-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zzsr8\" (UID: \"2d3943fb-fdf7-495d-9a69-f04362b8b319\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286002 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/22ead435-4c45-43ad-a499-fe930a626c52-srv-cert\") pod \"catalog-operator-68c6474976-p8ppb\" (UID: \"22ead435-4c45-43ad-a499-fe930a626c52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286066 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286084 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0f2e0e-60dc-472e-ad6b-fc17442a65ed-config\") pod \"service-ca-operator-777779d784-hxsrd\" (UID: \"7a0f2e0e-60dc-472e-ad6b-fc17442a65ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286101 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk6wc\" (UniqueName: \"kubernetes.io/projected/270526dc-92ee-4d56-93cf-b4ee2df197fa-kube-api-access-kk6wc\") pod \"dns-default-2lthx\" (UID: \"270526dc-92ee-4d56-93cf-b4ee2df197fa\") " pod="openshift-dns/dns-default-2lthx" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286120 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d4be505c-4811-4a7a-a7b5-3141574f1ee0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qk8nd\" (UID: \"d4be505c-4811-4a7a-a7b5-3141574f1ee0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qk8nd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286127 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a138cfa4-f8ff-4627-b3da-f72aeb474795-service-ca-bundle\") pod \"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286224 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ab22632-d678-4997-996e-bc26acff8c46-auth-proxy-config\") pod \"machine-approver-56656f9798-m5mvt\" (UID: \"5ab22632-d678-4997-996e-bc26acff8c46\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286254 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-mountpoint-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286270 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/270526dc-92ee-4d56-93cf-b4ee2df197fa-config-volume\") pod \"dns-default-2lthx\" (UID: \"270526dc-92ee-4d56-93cf-b4ee2df197fa\") " pod="openshift-dns/dns-default-2lthx" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286300 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-socket-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286355 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-console-config\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286383 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d7615b4-8f9f-4200-9e55-1baa01a6b14d-trusted-ca\") pod \"console-operator-58897d9998-qnrc8\" (UID: 
\"9d7615b4-8f9f-4200-9e55-1baa01a6b14d\") " pod="openshift-console-operator/console-operator-58897d9998-qnrc8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286399 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5ab22632-d678-4997-996e-bc26acff8c46-machine-approver-tls\") pod \"machine-approver-56656f9798-m5mvt\" (UID: \"5ab22632-d678-4997-996e-bc26acff8c46\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286437 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-audit-policies\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286455 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bgv2\" (UniqueName: \"kubernetes.io/projected/797ba833-0538-46ba-b3f1-4a3b2415c7f1-kube-api-access-5bgv2\") pod \"service-ca-9c57cc56f-2tbqn\" (UID: \"797ba833-0538-46ba-b3f1-4a3b2415c7f1\") " pod="openshift-service-ca/service-ca-9c57cc56f-2tbqn" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286472 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982fe064-1be6-4ff0-b6ba-6f04ee269140-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dkz9c\" (UID: \"982fe064-1be6-4ff0-b6ba-6f04ee269140\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286488 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhkn5\" (UniqueName: \"kubernetes.io/projected/22ead435-4c45-43ad-a499-fe930a626c52-kube-api-access-mhkn5\") pod \"catalog-operator-68c6474976-p8ppb\" (UID: \"22ead435-4c45-43ad-a499-fe930a626c52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286505 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286523 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/982fe064-1be6-4ff0-b6ba-6f04ee269140-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dkz9c\" (UID: \"982fe064-1be6-4ff0-b6ba-6f04ee269140\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286540 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-csi-data-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: 
\"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286559 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b739a9e-64b9-4415-bc45-dc9307aa49d3-metrics-tls\") pod \"ingress-operator-5b745b69d9-bhj55\" (UID: \"7b739a9e-64b9-4415-bc45-dc9307aa49d3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286584 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a39fe093-da97-48ba-bdf3-a566eefc5208-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n2b4p\" (UID: \"a39fe093-da97-48ba-bdf3-a566eefc5208\") " pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286603 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxz59\" (UniqueName: \"kubernetes.io/projected/a39fe093-da97-48ba-bdf3-a566eefc5208-kube-api-access-bxz59\") pod \"marketplace-operator-79b997595-n2b4p\" (UID: \"a39fe093-da97-48ba-bdf3-a566eefc5208\") " pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286570 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-oauth-serving-cert\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.286622 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2bd53f1-d95b-43dd-91d4-8adc12d6971a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gg8jk\" (UID: \"c2bd53f1-d95b-43dd-91d4-8adc12d6971a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287174 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287207 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287246 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-service-ca\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 
crc kubenswrapper[4794]: I0310 09:47:54.287264 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa-proxy-tls\") pod \"machine-config-controller-84d6567774-h4k77\" (UID: \"cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287297 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwjlz\" (UniqueName: \"kubernetes.io/projected/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-kube-api-access-dwjlz\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287338 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287379 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrgvc\" (UniqueName: \"kubernetes.io/projected/983001a9-70eb-40e1-895f-5e3fc80f538e-kube-api-access-lrgvc\") pod \"dns-operator-744455d44c-9vsxq\" (UID: \"983001a9-70eb-40e1-895f-5e3fc80f538e\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vsxq" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287396 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d7615b4-8f9f-4200-9e55-1baa01a6b14d-config\") pod \"console-operator-58897d9998-qnrc8\" (UID: \"9d7615b4-8f9f-4200-9e55-1baa01a6b14d\") " pod="openshift-console-operator/console-operator-58897d9998-qnrc8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287427 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/22ead435-4c45-43ad-a499-fe930a626c52-profile-collector-cert\") pod \"catalog-operator-68c6474976-p8ppb\" (UID: \"22ead435-4c45-43ad-a499-fe930a626c52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287448 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-bound-sa-token\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287465 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxn5t\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-kube-api-access-wxn5t\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287491 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/eece3bac-ab7c-4a16-82ae-35775eef8806-secret-volume\") pod \"collect-profiles-29552265-smnpm\" (UID: \"eece3bac-ab7c-4a16-82ae-35775eef8806\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287508 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a138cfa4-f8ff-4627-b3da-f72aeb474795-serving-cert\") pod \"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287523 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d56da28c-c09d-4fff-b73e-c3b5c787c300-audit-dir\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287541 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287558 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zppzf\" (UniqueName: \"kubernetes.io/projected/f12f506f-5226-41a3-9643-260415a884a5-kube-api-access-zppzf\") pod \"auto-csr-approver-29552266-9tldc\" (UID: \"f12f506f-5226-41a3-9643-260415a884a5\") " pod="openshift-infra/auto-csr-approver-29552266-9tldc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287575 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9e528446-c8eb-4103-b2ef-994eef6ecac5-srv-cert\") pod \"olm-operator-6b444d44fb-zpwhc\" (UID: \"9e528446-c8eb-4103-b2ef-994eef6ecac5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287593 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7827a543-d8b2-460b-aee5-212ea1208c0d-console-oauth-config\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287608 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fd49366-785d-4530-a7d2-4a5daf70ea0f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7cm9\" (UID: \"6fd49366-785d-4530-a7d2-4a5daf70ea0f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287629 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eb441ca2-702b-4848-906b-9f02a8ff65ee-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-vjgc6\" (UID: \"eb441ca2-702b-4848-906b-9f02a8ff65ee\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287731 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7827a543-d8b2-460b-aee5-212ea1208c0d-console-serving-cert\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.287878 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.288179 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b739a9e-64b9-4415-bc45-dc9307aa49d3-trusted-ca\") pod \"ingress-operator-5b745b69d9-bhj55\" (UID: \"7b739a9e-64b9-4415-bc45-dc9307aa49d3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.288617 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-service-ca\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.289592 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a0f2e0e-60dc-472e-ad6b-fc17442a65ed-serving-cert\") pod \"service-ca-operator-777779d784-hxsrd\" (UID: \"7a0f2e0e-60dc-472e-ad6b-fc17442a65ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.290001 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0f2e0e-60dc-472e-ad6b-fc17442a65ed-config\") pod \"service-ca-operator-777779d784-hxsrd\" (UID: \"7a0f2e0e-60dc-472e-ad6b-fc17442a65ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.290102 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/577c9773-972c-42bb-99c1-b3d44cf21403-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xwx4g\" (UID: \"577c9773-972c-42bb-99c1-b3d44cf21403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.290125 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a138cfa4-f8ff-4627-b3da-f72aeb474795-config\") pod \"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.290152 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh2j8\" (UniqueName: \"kubernetes.io/projected/5ab22632-d678-4997-996e-bc26acff8c46-kube-api-access-jh2j8\") pod \"machine-approver-56656f9798-m5mvt\" (UID: \"5ab22632-d678-4997-996e-bc26acff8c46\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" Mar 10 09:47:54 crc kubenswrapper[4794]: E0310 09:47:54.290443 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:54.790428755 +0000 UTC m=+223.546599573 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.290822 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d56da28c-c09d-4fff-b73e-c3b5c787c300-audit-dir\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.291426 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/577c9773-972c-42bb-99c1-b3d44cf21403-config\") pod \"kube-apiserver-operator-766d6c64bb-xwx4g\" (UID: \"577c9773-972c-42bb-99c1-b3d44cf21403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.291558 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d7615b4-8f9f-4200-9e55-1baa01a6b14d-config\") pod \"console-operator-58897d9998-qnrc8\" (UID: \"9d7615b4-8f9f-4200-9e55-1baa01a6b14d\") " pod="openshift-console-operator/console-operator-58897d9998-qnrc8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.301496 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-console-config\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.302370 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-trusted-ca-bundle\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.303156 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/983001a9-70eb-40e1-895f-5e3fc80f538e-metrics-tls\") pod \"dns-operator-744455d44c-9vsxq\" (UID: \"983001a9-70eb-40e1-895f-5e3fc80f538e\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vsxq" Mar 10 09:47:54 crc 
kubenswrapper[4794]: I0310 09:47:54.303170 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/577c9773-972c-42bb-99c1-b3d44cf21403-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xwx4g\" (UID: \"577c9773-972c-42bb-99c1-b3d44cf21403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.303252 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-registry-tls\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.303463 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a138cfa4-f8ff-4627-b3da-f72aeb474795-config\") pod \"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.303621 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.303703 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.303717 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d7615b4-8f9f-4200-9e55-1baa01a6b14d-serving-cert\") pod \"console-operator-58897d9998-qnrc8\" (UID: \"9d7615b4-8f9f-4200-9e55-1baa01a6b14d\") " pod="openshift-console-operator/console-operator-58897d9998-qnrc8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.303764 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a138cfa4-f8ff-4627-b3da-f72aeb474795-serving-cert\") pod \"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.303946 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-audit-policies\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.304102 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d7615b4-8f9f-4200-9e55-1baa01a6b14d-trusted-ca\") pod \"console-operator-58897d9998-qnrc8\" (UID: \"9d7615b4-8f9f-4200-9e55-1baa01a6b14d\") " pod="openshift-console-operator/console-operator-58897d9998-qnrc8" Mar 10 09:47:54 crc 
kubenswrapper[4794]: I0310 09:47:54.304168 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.304428 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.304617 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982fe064-1be6-4ff0-b6ba-6f04ee269140-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dkz9c\" (UID: \"982fe064-1be6-4ff0-b6ba-6f04ee269140\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.304702 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b739a9e-64b9-4415-bc45-dc9307aa49d3-metrics-tls\") pod \"ingress-operator-5b745b69d9-bhj55\" (UID: \"7b739a9e-64b9-4415-bc45-dc9307aa49d3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.304776 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-registry-certificates\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.304947 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/982fe064-1be6-4ff0-b6ba-6f04ee269140-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dkz9c\" (UID: \"982fe064-1be6-4ff0-b6ba-6f04ee269140\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.305448 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.306013 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-trusted-ca\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.307280 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.309629 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.318522 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.318849 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.319911 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7827a543-d8b2-460b-aee5-212ea1208c0d-console-oauth-config\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.321721 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.322658 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d3943fb-fdf7-495d-9a69-f04362b8b319-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zzsr8\" (UID: \"2d3943fb-fdf7-495d-9a69-f04362b8b319\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.322702 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/797ba833-0538-46ba-b3f1-4a3b2415c7f1-signing-key\") pod \"service-ca-9c57cc56f-2tbqn\" (UID: \"797ba833-0538-46ba-b3f1-4a3b2415c7f1\") " pod="openshift-service-ca/service-ca-9c57cc56f-2tbqn" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.322706 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.323293 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.324966 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.325406 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa-proxy-tls\") pod \"machine-config-controller-84d6567774-h4k77\" (UID: \"cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.327979 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcs2g\" (UniqueName: \"kubernetes.io/projected/2d3943fb-fdf7-495d-9a69-f04362b8b319-kube-api-access-pcs2g\") pod \"openshift-apiserver-operator-796bbdcf4f-zzsr8\" (UID: \"2d3943fb-fdf7-495d-9a69-f04362b8b319\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.339744 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5wsv\" (UniqueName: \"kubernetes.io/projected/7a0f2e0e-60dc-472e-ad6b-fc17442a65ed-kube-api-access-g5wsv\") pod \"service-ca-operator-777779d784-hxsrd\" (UID: \"7a0f2e0e-60dc-472e-ad6b-fc17442a65ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.360192 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t9r5\" (UniqueName: \"kubernetes.io/projected/7b739a9e-64b9-4415-bc45-dc9307aa49d3-kube-api-access-6t9r5\") pod \"ingress-operator-5b745b69d9-bhj55\" (UID: \"7b739a9e-64b9-4415-bc45-dc9307aa49d3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.383505 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45vcn\" (UniqueName: \"kubernetes.io/projected/cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa-kube-api-access-45vcn\") pod \"machine-config-controller-84d6567774-h4k77\" (UID: \"cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391163 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391387 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/22ead435-4c45-43ad-a499-fe930a626c52-srv-cert\") pod 
\"catalog-operator-68c6474976-p8ppb\" (UID: \"22ead435-4c45-43ad-a499-fe930a626c52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391408 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk6wc\" (UniqueName: \"kubernetes.io/projected/270526dc-92ee-4d56-93cf-b4ee2df197fa-kube-api-access-kk6wc\") pod \"dns-default-2lthx\" (UID: \"270526dc-92ee-4d56-93cf-b4ee2df197fa\") " pod="openshift-dns/dns-default-2lthx" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391424 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d4be505c-4811-4a7a-a7b5-3141574f1ee0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qk8nd\" (UID: \"d4be505c-4811-4a7a-a7b5-3141574f1ee0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qk8nd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391439 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ab22632-d678-4997-996e-bc26acff8c46-auth-proxy-config\") pod \"machine-approver-56656f9798-m5mvt\" (UID: \"5ab22632-d678-4997-996e-bc26acff8c46\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" Mar 10 09:47:54 crc kubenswrapper[4794]: E0310 09:47:54.391502 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:54.891476332 +0000 UTC m=+223.647647150 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391559 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-mountpoint-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391580 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/270526dc-92ee-4d56-93cf-b4ee2df197fa-config-volume\") pod \"dns-default-2lthx\" (UID: \"270526dc-92ee-4d56-93cf-b4ee2df197fa\") " pod="openshift-dns/dns-default-2lthx" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391739 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-socket-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391793 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhkn5\" (UniqueName: \"kubernetes.io/projected/22ead435-4c45-43ad-a499-fe930a626c52-kube-api-access-mhkn5\") pod \"catalog-operator-68c6474976-p8ppb\" (UID: \"22ead435-4c45-43ad-a499-fe930a626c52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391810 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5ab22632-d678-4997-996e-bc26acff8c46-machine-approver-tls\") pod \"machine-approver-56656f9798-m5mvt\" (UID: \"5ab22632-d678-4997-996e-bc26acff8c46\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391827 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-csi-data-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391847 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a39fe093-da97-48ba-bdf3-a566eefc5208-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n2b4p\" (UID: \"a39fe093-da97-48ba-bdf3-a566eefc5208\") " pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391866 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxz59\" (UniqueName: 
\"kubernetes.io/projected/a39fe093-da97-48ba-bdf3-a566eefc5208-kube-api-access-bxz59\") pod \"marketplace-operator-79b997595-n2b4p\" (UID: \"a39fe093-da97-48ba-bdf3-a566eefc5208\") " pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391883 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2bd53f1-d95b-43dd-91d4-8adc12d6971a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gg8jk\" (UID: \"c2bd53f1-d95b-43dd-91d4-8adc12d6971a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391927 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwjlz\" (UniqueName: \"kubernetes.io/projected/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-kube-api-access-dwjlz\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.391973 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/22ead435-4c45-43ad-a499-fe930a626c52-profile-collector-cert\") pod \"catalog-operator-68c6474976-p8ppb\" (UID: \"22ead435-4c45-43ad-a499-fe930a626c52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392003 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eece3bac-ab7c-4a16-82ae-35775eef8806-secret-volume\") pod \"collect-profiles-29552265-smnpm\" (UID: \"eece3bac-ab7c-4a16-82ae-35775eef8806\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392020 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zppzf\" (UniqueName: \"kubernetes.io/projected/f12f506f-5226-41a3-9643-260415a884a5-kube-api-access-zppzf\") pod \"auto-csr-approver-29552266-9tldc\" (UID: \"f12f506f-5226-41a3-9643-260415a884a5\") " pod="openshift-infra/auto-csr-approver-29552266-9tldc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392026 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ab22632-d678-4997-996e-bc26acff8c46-auth-proxy-config\") pod \"machine-approver-56656f9798-m5mvt\" (UID: \"5ab22632-d678-4997-996e-bc26acff8c46\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392037 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9e528446-c8eb-4103-b2ef-994eef6ecac5-srv-cert\") pod \"olm-operator-6b444d44fb-zpwhc\" (UID: \"9e528446-c8eb-4103-b2ef-994eef6ecac5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392054 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fd49366-785d-4530-a7d2-4a5daf70ea0f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7cm9\" (UID: 
\"6fd49366-785d-4530-a7d2-4a5daf70ea0f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392074 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eb441ca2-702b-4848-906b-9f02a8ff65ee-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vjgc6\" (UID: \"eb441ca2-702b-4848-906b-9f02a8ff65ee\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392095 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh2j8\" (UniqueName: \"kubernetes.io/projected/5ab22632-d678-4997-996e-bc26acff8c46-kube-api-access-jh2j8\") pod \"machine-approver-56656f9798-m5mvt\" (UID: \"5ab22632-d678-4997-996e-bc26acff8c46\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392121 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a39fe093-da97-48ba-bdf3-a566eefc5208-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n2b4p\" (UID: \"a39fe093-da97-48ba-bdf3-a566eefc5208\") " pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392135 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fd49366-785d-4530-a7d2-4a5daf70ea0f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7cm9\" (UID: \"6fd49366-785d-4530-a7d2-4a5daf70ea0f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392161 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eece3bac-ab7c-4a16-82ae-35775eef8806-config-volume\") pod \"collect-profiles-29552265-smnpm\" (UID: \"eece3bac-ab7c-4a16-82ae-35775eef8806\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392186 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-plugins-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392201 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt8vh\" (UniqueName: \"kubernetes.io/projected/d4be505c-4811-4a7a-a7b5-3141574f1ee0-kube-api-access-rt8vh\") pod \"multus-admission-controller-857f4d67dd-qk8nd\" (UID: \"d4be505c-4811-4a7a-a7b5-3141574f1ee0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qk8nd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392226 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9214e776-ae19-4f0c-9008-9cd8a4ece399-certs\") pod \"machine-config-server-sl6l8\" (UID: \"9214e776-ae19-4f0c-9008-9cd8a4ece399\") " pod="openshift-machine-config-operator/machine-config-server-sl6l8" Mar 10 
09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392242 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5ndx\" (UniqueName: \"kubernetes.io/projected/eb441ca2-702b-4848-906b-9f02a8ff65ee-kube-api-access-d5ndx\") pod \"machine-config-operator-74547568cd-vjgc6\" (UID: \"eb441ca2-702b-4848-906b-9f02a8ff65ee\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392260 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78sbs\" (UniqueName: \"kubernetes.io/projected/a4cfdaaf-4265-4c7d-b58d-4538905360a2-kube-api-access-78sbs\") pod \"ingress-canary-rkl2f\" (UID: \"a4cfdaaf-4265-4c7d-b58d-4538905360a2\") " pod="openshift-ingress-canary/ingress-canary-rkl2f" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392282 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9214e776-ae19-4f0c-9008-9cd8a4ece399-node-bootstrap-token\") pod \"machine-config-server-sl6l8\" (UID: \"9214e776-ae19-4f0c-9008-9cd8a4ece399\") " pod="openshift-machine-config-operator/machine-config-server-sl6l8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392301 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g9hr\" (UniqueName: \"kubernetes.io/projected/c2bd53f1-d95b-43dd-91d4-8adc12d6971a-kube-api-access-6g9hr\") pod \"package-server-manager-789f6589d5-gg8jk\" (UID: \"c2bd53f1-d95b-43dd-91d4-8adc12d6971a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392315 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/270526dc-92ee-4d56-93cf-b4ee2df197fa-metrics-tls\") pod \"dns-default-2lthx\" (UID: \"270526dc-92ee-4d56-93cf-b4ee2df197fa\") " pod="openshift-dns/dns-default-2lthx" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392370 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eb441ca2-702b-4848-906b-9f02a8ff65ee-images\") pod \"machine-config-operator-74547568cd-vjgc6\" (UID: \"eb441ca2-702b-4848-906b-9f02a8ff65ee\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392402 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb441ca2-702b-4848-906b-9f02a8ff65ee-proxy-tls\") pod \"machine-config-operator-74547568cd-vjgc6\" (UID: \"eb441ca2-702b-4848-906b-9f02a8ff65ee\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392421 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/270526dc-92ee-4d56-93cf-b4ee2df197fa-config-volume\") pod \"dns-default-2lthx\" (UID: \"270526dc-92ee-4d56-93cf-b4ee2df197fa\") " pod="openshift-dns/dns-default-2lthx" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392442 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqjnf\" (UniqueName: 
\"kubernetes.io/projected/9214e776-ae19-4f0c-9008-9cd8a4ece399-kube-api-access-jqjnf\") pod \"machine-config-server-sl6l8\" (UID: \"9214e776-ae19-4f0c-9008-9cd8a4ece399\") " pod="openshift-machine-config-operator/machine-config-server-sl6l8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392466 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd49366-785d-4530-a7d2-4a5daf70ea0f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7cm9\" (UID: \"6fd49366-785d-4530-a7d2-4a5daf70ea0f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392483 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4cfdaaf-4265-4c7d-b58d-4538905360a2-cert\") pod \"ingress-canary-rkl2f\" (UID: \"a4cfdaaf-4265-4c7d-b58d-4538905360a2\") " pod="openshift-ingress-canary/ingress-canary-rkl2f" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392498 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-mountpoint-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392512 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9e528446-c8eb-4103-b2ef-994eef6ecac5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zpwhc\" (UID: \"9e528446-c8eb-4103-b2ef-994eef6ecac5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392585 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rplrp\" (UniqueName: \"kubernetes.io/projected/eece3bac-ab7c-4a16-82ae-35775eef8806-kube-api-access-rplrp\") pod \"collect-profiles-29552265-smnpm\" (UID: \"eece3bac-ab7c-4a16-82ae-35775eef8806\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392620 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2dvh\" (UniqueName: \"kubernetes.io/projected/a8b37b88-1555-420f-a9c1-f7e48046f160-kube-api-access-m2dvh\") pod \"downloads-7954f5f757-bknsm\" (UID: \"a8b37b88-1555-420f-a9c1-f7e48046f160\") " pod="openshift-console/downloads-7954f5f757-bknsm" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392635 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-registration-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392651 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ab22632-d678-4997-996e-bc26acff8c46-config\") pod \"machine-approver-56656f9798-m5mvt\" (UID: \"5ab22632-d678-4997-996e-bc26acff8c46\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" Mar 10 09:47:54 crc kubenswrapper[4794]: 
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392669 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.392693 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc5xb\" (UniqueName: \"kubernetes.io/projected/9e528446-c8eb-4103-b2ef-994eef6ecac5-kube-api-access-wc5xb\") pod \"olm-operator-6b444d44fb-zpwhc\" (UID: \"9e528446-c8eb-4103-b2ef-994eef6ecac5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.393002 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-socket-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.393887 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-csi-data-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.396237 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eece3bac-ab7c-4a16-82ae-35775eef8806-config-volume\") pod \"collect-profiles-29552265-smnpm\" (UID: \"eece3bac-ab7c-4a16-82ae-35775eef8806\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.396893 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d4be505c-4811-4a7a-a7b5-3141574f1ee0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qk8nd\" (UID: \"d4be505c-4811-4a7a-a7b5-3141574f1ee0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qk8nd"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.396904 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9214e776-ae19-4f0c-9008-9cd8a4ece399-certs\") pod \"machine-config-server-sl6l8\" (UID: \"9214e776-ae19-4f0c-9008-9cd8a4ece399\") " pod="openshift-machine-config-operator/machine-config-server-sl6l8"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.396975 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-plugins-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.398560 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9214e776-ae19-4f0c-9008-9cd8a4ece399-node-bootstrap-token\") pod \"machine-config-server-sl6l8\" (UID: \"9214e776-ae19-4f0c-9008-9cd8a4ece399\") " pod="openshift-machine-config-operator/machine-config-server-sl6l8"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.399175 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/22ead435-4c45-43ad-a499-fe930a626c52-srv-cert\") pod \"catalog-operator-68c6474976-p8ppb\" (UID: \"22ead435-4c45-43ad-a499-fe930a626c52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.399634 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-registration-dir\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc"
Mar 10 09:47:54 crc kubenswrapper[4794]: E0310 09:47:54.399774 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:54.899756582 +0000 UTC m=+223.655927500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.400403 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9e528446-c8eb-4103-b2ef-994eef6ecac5-srv-cert\") pod \"olm-operator-6b444d44fb-zpwhc\" (UID: \"9e528446-c8eb-4103-b2ef-994eef6ecac5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.400236 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ab22632-d678-4997-996e-bc26acff8c46-config\") pod \"machine-approver-56656f9798-m5mvt\" (UID: \"5ab22632-d678-4997-996e-bc26acff8c46\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.400862 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a39fe093-da97-48ba-bdf3-a566eefc5208-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n2b4p\" (UID: \"a39fe093-da97-48ba-bdf3-a566eefc5208\") " pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.401122 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eb441ca2-702b-4848-906b-9f02a8ff65ee-images\") pod \"machine-config-operator-74547568cd-vjgc6\" (UID: \"eb441ca2-702b-4848-906b-9f02a8ff65ee\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.401174 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd49366-785d-4530-a7d2-4a5daf70ea0f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7cm9\" (UID: \"6fd49366-785d-4530-a7d2-4a5daf70ea0f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.401630 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eb441ca2-702b-4848-906b-9f02a8ff65ee-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vjgc6\" (UID: \"eb441ca2-702b-4848-906b-9f02a8ff65ee\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.401981 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/270526dc-92ee-4d56-93cf-b4ee2df197fa-metrics-tls\") pod \"dns-default-2lthx\" (UID: \"270526dc-92ee-4d56-93cf-b4ee2df197fa\") " pod="openshift-dns/dns-default-2lthx"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.403240 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fd49366-785d-4530-a7d2-4a5daf70ea0f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7cm9\" (UID: \"6fd49366-785d-4530-a7d2-4a5daf70ea0f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.403742 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4cfdaaf-4265-4c7d-b58d-4538905360a2-cert\") pod \"ingress-canary-rkl2f\" (UID: \"a4cfdaaf-4265-4c7d-b58d-4538905360a2\") " pod="openshift-ingress-canary/ingress-canary-rkl2f"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.404105 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a39fe093-da97-48ba-bdf3-a566eefc5208-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n2b4p\" (UID: \"a39fe093-da97-48ba-bdf3-a566eefc5208\") " pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.404491 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/22ead435-4c45-43ad-a499-fe930a626c52-profile-collector-cert\") pod \"catalog-operator-68c6474976-p8ppb\" (UID: \"22ead435-4c45-43ad-a499-fe930a626c52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.405472 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5ab22632-d678-4997-996e-bc26acff8c46-machine-approver-tls\") pod \"machine-approver-56656f9798-m5mvt\" (UID: \"5ab22632-d678-4997-996e-bc26acff8c46\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.407043 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb441ca2-702b-4848-906b-9f02a8ff65ee-proxy-tls\") pod \"machine-config-operator-74547568cd-vjgc6\" (UID: \"eb441ca2-702b-4848-906b-9f02a8ff65ee\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.408188 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eece3bac-ab7c-4a16-82ae-35775eef8806-secret-volume\") pod \"collect-profiles-29552265-smnpm\" (UID: \"eece3bac-ab7c-4a16-82ae-35775eef8806\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.408639 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2bd53f1-d95b-43dd-91d4-8adc12d6971a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gg8jk\" (UID: \"c2bd53f1-d95b-43dd-91d4-8adc12d6971a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.408975 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9e528446-c8eb-4103-b2ef-994eef6ecac5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zpwhc\" (UID: \"9e528446-c8eb-4103-b2ef-994eef6ecac5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.410941 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.416426 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fwfx\" (UniqueName: \"kubernetes.io/projected/9d7615b4-8f9f-4200-9e55-1baa01a6b14d-kube-api-access-7fwfx\") pod \"console-operator-58897d9998-qnrc8\" (UID: \"9d7615b4-8f9f-4200-9e55-1baa01a6b14d\") " pod="openshift-console-operator/console-operator-58897d9998-qnrc8"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.417048 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b739a9e-64b9-4415-bc45-dc9307aa49d3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bhj55\" (UID: \"7b739a9e-64b9-4415-bc45-dc9307aa49d3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.423598 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77"
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.449472 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh9rx\" (UniqueName: \"kubernetes.io/projected/d56da28c-c09d-4fff-b73e-c3b5c787c300-kube-api-access-rh9rx\") pod \"oauth-openshift-558db77b4-n94r9\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.463603 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxn5t\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-kube-api-access-wxn5t\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.481856 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-bound-sa-token\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.493212 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:54 crc kubenswrapper[4794]: E0310 09:47:54.493728 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:54.993713325 +0000 UTC m=+223.749884143 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.496616 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.500852 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp"] Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.502961 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q8jhf"] Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.503387 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrgvc\" (UniqueName: \"kubernetes.io/projected/983001a9-70eb-40e1-895f-5e3fc80f538e-kube-api-access-lrgvc\") pod \"dns-operator-744455d44c-9vsxq\" (UID: \"983001a9-70eb-40e1-895f-5e3fc80f538e\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vsxq" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.509373 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct"] Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.521371 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9j4m\" (UniqueName: \"kubernetes.io/projected/7827a543-d8b2-460b-aee5-212ea1208c0d-kube-api-access-s9j4m\") pod \"console-f9d7485db-skqw6\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.542077 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm"] Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.546441 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx8pl\" (UniqueName: \"kubernetes.io/projected/a138cfa4-f8ff-4627-b3da-f72aeb474795-kube-api-access-lx8pl\") pod \"authentication-operator-69f744f599-xtppb\" (UID: \"a138cfa4-f8ff-4627-b3da-f72aeb474795\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: W0310 09:47:54.551931 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb70fe17f_97eb_4abf_b88e_0e24e5d01c48.slice/crio-6d351322381c05cf7758622453516d58b6facfd84aaf45d4a9617f11f9383e34 WatchSource:0}: Error finding container 6d351322381c05cf7758622453516d58b6facfd84aaf45d4a9617f11f9383e34: Status 404 returned error can't find the container with id 6d351322381c05cf7758622453516d58b6facfd84aaf45d4a9617f11f9383e34 Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.580411 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.582952 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q62bc"] Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.583608 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtg5n\" (UniqueName: \"kubernetes.io/projected/982fe064-1be6-4ff0-b6ba-6f04ee269140-kube-api-access-rtg5n\") pod \"kube-storage-version-migrator-operator-b67b599dd-dkz9c\" (UID: \"982fe064-1be6-4ff0-b6ba-6f04ee269140\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.584902 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-txkm8"] Mar 10 09:47:54 crc kubenswrapper[4794]: W0310 09:47:54.592282 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb31e03a_1724_4272_bb5f_d0a7b6b80059.slice/crio-f07dd8a579be341a3f19de0fffcfa0b7a15cb49f4e2a5d47c7a5f42aecf40a93 WatchSource:0}: Error finding container f07dd8a579be341a3f19de0fffcfa0b7a15cb49f4e2a5d47c7a5f42aecf40a93: Status 404 returned error can't find the container with id f07dd8a579be341a3f19de0fffcfa0b7a15cb49f4e2a5d47c7a5f42aecf40a93 Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.594385 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: E0310 09:47:54.594801 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:55.094787793 +0000 UTC m=+223.850958611 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.599838 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9vsxq" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.602912 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/577c9773-972c-42bb-99c1-b3d44cf21403-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xwx4g\" (UID: \"577c9773-972c-42bb-99c1-b3d44cf21403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.621679 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bgv2\" (UniqueName: \"kubernetes.io/projected/797ba833-0538-46ba-b3f1-4a3b2415c7f1-kube-api-access-5bgv2\") pod \"service-ca-9c57cc56f-2tbqn\" (UID: \"797ba833-0538-46ba-b3f1-4a3b2415c7f1\") " pod="openshift-service-ca/service-ca-9c57cc56f-2tbqn" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.627613 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qnrc8" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.641792 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd"] Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.642841 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc5xb\" (UniqueName: \"kubernetes.io/projected/9e528446-c8eb-4103-b2ef-994eef6ecac5-kube-api-access-wc5xb\") pod \"olm-operator-6b444d44fb-zpwhc\" (UID: \"9e528446-c8eb-4103-b2ef-994eef6ecac5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.667849 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt8vh\" (UniqueName: \"kubernetes.io/projected/d4be505c-4811-4a7a-a7b5-3141574f1ee0-kube-api-access-rt8vh\") pod \"multus-admission-controller-857f4d67dd-qk8nd\" (UID: \"d4be505c-4811-4a7a-a7b5-3141574f1ee0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qk8nd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.670102 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.679044 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77"] Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.682979 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhkn5\" (UniqueName: \"kubernetes.io/projected/22ead435-4c45-43ad-a499-fe930a626c52-kube-api-access-mhkn5\") pod \"catalog-operator-68c6474976-p8ppb\" (UID: \"22ead435-4c45-43ad-a499-fe930a626c52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.695275 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:54 crc kubenswrapper[4794]: E0310 09:47:54.695432 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:55.195404726 +0000 UTC m=+223.951575544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.695529 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:54 crc kubenswrapper[4794]: E0310 09:47:54.696032 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:55.196021575 +0000 UTC m=+223.952192383 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.697266 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.702071 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zppzf\" (UniqueName: \"kubernetes.io/projected/f12f506f-5226-41a3-9643-260415a884a5-kube-api-access-zppzf\") pod \"auto-csr-approver-29552266-9tldc\" (UID: \"f12f506f-5226-41a3-9643-260415a884a5\") " pod="openshift-infra/auto-csr-approver-29552266-9tldc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.703843 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.724457 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8"] Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.723320 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fd49366-785d-4530-a7d2-4a5daf70ea0f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7cm9\" (UID: \"6fd49366-785d-4530-a7d2-4a5daf70ea0f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.728402 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx"] Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.730642 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-2tbqn" Mar 10 09:47:54 crc kubenswrapper[4794]: W0310 09:47:54.735830 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd3da9ba_0a30_4cd5_a02b_47c78ab2b9aa.slice/crio-808dd332beb722493e47f78a9d15742204fa692c44003a0de8e94e3f2a295148 WatchSource:0}: Error finding container 808dd332beb722493e47f78a9d15742204fa692c44003a0de8e94e3f2a295148: Status 404 returned error can't find the container with id 808dd332beb722493e47f78a9d15742204fa692c44003a0de8e94e3f2a295148 Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.740943 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh2j8\" (UniqueName: \"kubernetes.io/projected/5ab22632-d678-4997-996e-bc26acff8c46-kube-api-access-jh2j8\") pod \"machine-approver-56656f9798-m5mvt\" (UID: \"5ab22632-d678-4997-996e-bc26acff8c46\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" Mar 10 09:47:54 crc kubenswrapper[4794]: W0310 09:47:54.746027 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d3943fb_fdf7_495d_9a69_f04362b8b319.slice/crio-b3eef02fa9b2367bf0d5bfd9bd0b14881872b931df464a04bfd1c185c188d0e6 WatchSource:0}: Error finding container b3eef02fa9b2367bf0d5bfd9bd0b14881872b931df464a04bfd1c185c188d0e6: Status 404 returned error can't find the container with id b3eef02fa9b2367bf0d5bfd9bd0b14881872b931df464a04bfd1c185c188d0e6 Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.752368 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-qk8nd" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.760760 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.763375 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5ndx\" (UniqueName: \"kubernetes.io/projected/eb441ca2-702b-4848-906b-9f02a8ff65ee-kube-api-access-d5ndx\") pod \"machine-config-operator-74547568cd-vjgc6\" (UID: \"eb441ca2-702b-4848-906b-9f02a8ff65ee\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.774639 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.776531 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2"] Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.780729 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78sbs\" (UniqueName: \"kubernetes.io/projected/a4cfdaaf-4265-4c7d-b58d-4538905360a2-kube-api-access-78sbs\") pod \"ingress-canary-rkl2f\" (UID: \"a4cfdaaf-4265-4c7d-b58d-4538905360a2\") " pod="openshift-ingress-canary/ingress-canary-rkl2f" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.797391 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.798447 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:54 crc kubenswrapper[4794]: E0310 09:47:54.798856 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:55.298836915 +0000 UTC m=+224.055007733 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.811361 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552266-9tldc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.812056 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p"] Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.814290 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.814849 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.826193 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk6wc\" (UniqueName: \"kubernetes.io/projected/270526dc-92ee-4d56-93cf-b4ee2df197fa-kube-api-access-kk6wc\") pod \"dns-default-2lthx\" (UID: \"270526dc-92ee-4d56-93cf-b4ee2df197fa\") " pod="openshift-dns/dns-default-2lthx" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.836525 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.843118 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxz59\" (UniqueName: \"kubernetes.io/projected/a39fe093-da97-48ba-bdf3-a566eefc5208-kube-api-access-bxz59\") pod \"marketplace-operator-79b997595-n2b4p\" (UID: \"a39fe093-da97-48ba-bdf3-a566eefc5208\") " pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.848988 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g9hr\" (UniqueName: \"kubernetes.io/projected/c2bd53f1-d95b-43dd-91d4-8adc12d6971a-kube-api-access-6g9hr\") pod \"package-server-manager-789f6589d5-gg8jk\" (UID: \"c2bd53f1-d95b-43dd-91d4-8adc12d6971a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.860200 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rkl2f" Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.871962 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-2lthx"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.875781 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2dvh\" (UniqueName: \"kubernetes.io/projected/a8b37b88-1555-420f-a9c1-f7e48046f160-kube-api-access-m2dvh\") pod \"downloads-7954f5f757-bknsm\" (UID: \"a8b37b88-1555-420f-a9c1-f7e48046f160\") " pod="openshift-console/downloads-7954f5f757-bknsm"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.899676 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:47:54 crc kubenswrapper[4794]: E0310 09:47:54.900234 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:55.400223733 +0000 UTC m=+224.156394551 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.906441 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqjnf\" (UniqueName: \"kubernetes.io/projected/9214e776-ae19-4f0c-9008-9cd8a4ece399-kube-api-access-jqjnf\") pod \"machine-config-server-sl6l8\" (UID: \"9214e776-ae19-4f0c-9008-9cd8a4ece399\") " pod="openshift-machine-config-operator/machine-config-server-sl6l8"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.908257 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rplrp\" (UniqueName: \"kubernetes.io/projected/eece3bac-ab7c-4a16-82ae-35775eef8806-kube-api-access-rplrp\") pod \"collect-profiles-29552265-smnpm\" (UID: \"eece3bac-ab7c-4a16-82ae-35775eef8806\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm"
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.910120 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9vsxq"]
Mar 10 09:47:54 crc kubenswrapper[4794]: I0310 09:47:54.921273 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwjlz\" (UniqueName: \"kubernetes.io/projected/f97a286b-f0b0-4309-a3e4-33eea0aea5f8-kube-api-access-dwjlz\") pod \"csi-hostpathplugin-ckvtc\" (UID: \"f97a286b-f0b0-4309-a3e4-33eea0aea5f8\") " pod="hostpath-provisioner/csi-hostpathplugin-ckvtc"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.000612 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:47:55 crc kubenswrapper[4794]: E0310 09:47:55.000743 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:55.500726113 +0000 UTC m=+224.256896941 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.000834 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:47:55 crc kubenswrapper[4794]: E0310 09:47:55.001102 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:55.501094414 +0000 UTC m=+224.257265232 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.038161 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.047593 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.065882 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.083680 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" event={"ID":"cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa","Type":"ContainerStarted","Data":"808dd332beb722493e47f78a9d15742204fa692c44003a0de8e94e3f2a295148"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.085176 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd" event={"ID":"7a0f2e0e-60dc-472e-ad6b-fc17442a65ed","Type":"ContainerStarted","Data":"6d3238030635b9ac5fe4639f0752cb2ac487cf85c00f25dca2b209beba348be5"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.086976 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tlkl5" event={"ID":"180c50b5-f178-4b42-922c-edfd1deb91d4","Type":"ContainerStarted","Data":"c8f7c75921ff47e2f0c40ea989458dc26a84fc4e59911c65603d4ef47f4e79a3"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.087000 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tlkl5" event={"ID":"180c50b5-f178-4b42-922c-edfd1deb91d4","Type":"ContainerStarted","Data":"e9d33cba18fdc611e604a7c004eeb1327d0f52abd1a53351e1de54430e16f8d0"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.091760 4794 generic.go:334] "Generic (PLEG): container finished" podID="bff3d354-0064-4a96-8945-51df3cd2d7e7" containerID="b44642a5faa7007b9de85dc5849a53e519bc911fac9aedf82e7d556b531b2089" exitCode=0
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.092028 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4jppl" event={"ID":"bff3d354-0064-4a96-8945-51df3cd2d7e7","Type":"ContainerDied","Data":"b44642a5faa7007b9de85dc5849a53e519bc911fac9aedf82e7d556b531b2089"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.102389 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:47:55 crc kubenswrapper[4794]: E0310 09:47:55.102640 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:55.602620566 +0000 UTC m=+224.358791384 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.107322 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q62bc" event={"ID":"eb31e03a-1724-4272-bb5f-d0a7b6b80059","Type":"ContainerStarted","Data":"a06622570303fe2cc2b125dc731c87d42ef9225320fb15c85dc03de9692ed618"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.107385 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q62bc" event={"ID":"eb31e03a-1724-4272-bb5f-d0a7b6b80059","Type":"ContainerStarted","Data":"f07dd8a579be341a3f19de0fffcfa0b7a15cb49f4e2a5d47c7a5f42aecf40a93"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.107397 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8" event={"ID":"2d3943fb-fdf7-495d-9a69-f04362b8b319","Type":"ContainerStarted","Data":"b3eef02fa9b2367bf0d5bfd9bd0b14881872b931df464a04bfd1c185c188d0e6"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.110486 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-m2lbn" podStartSLOduration=152.110472712 podStartE2EDuration="2m32.110472712s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:55.101806941 +0000 UTC m=+223.857977759" watchObservedRunningTime="2026-03-10 09:47:55.110472712 +0000 UTC m=+223.866643530"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.111416 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" event={"ID":"02821716-8fb0-46bc-9c95-4c7ca46500b4","Type":"ContainerStarted","Data":"0fdbc6aea5410e167044bf5caeee4427a16acfd4ebbea46a9b6019e14174d93e"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.111740 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.113922 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9vsxq" event={"ID":"983001a9-70eb-40e1-895f-5e3fc80f538e","Type":"ContainerStarted","Data":"92ab295205c92f240937f72689d27cb01bfa67a60323f671c9c6c5c18d8f9ff9"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.114229 4794 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-jscb4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body=
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.114261 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" podUID="02821716-8fb0-46bc-9c95-4c7ca46500b4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.118110 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" event={"ID":"b70fe17f-97eb-4abf-b88e-0e24e5d01c48","Type":"ContainerStarted","Data":"0027914ee980b5f53458d3fdd51ad21ced468d24e63cee485a85ed3c6bba8939"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.118152 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" event={"ID":"b70fe17f-97eb-4abf-b88e-0e24e5d01c48","Type":"ContainerStarted","Data":"6d351322381c05cf7758622453516d58b6facfd84aaf45d4a9617f11f9383e34"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.121666 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qnrc8"]
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.122590 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.130007 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-bknsm"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.135322 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q8jhf" event={"ID":"531c4a6c-a67a-4629-bfeb-a89565d11497","Type":"ContainerStarted","Data":"7513afa3525cd1339598f106100aff835f32e2c23534da98a7f16e9b4ab37605"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.135982 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q8jhf" event={"ID":"531c4a6c-a67a-4629-bfeb-a89565d11497","Type":"ContainerStarted","Data":"87a2eb98144883399d5acf33c0db266120e4da9778b0fbde4294413d191d1c3b"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.138192 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" event={"ID":"3e0afe5a-17db-4dd6-b871-792a4e0f81a9","Type":"ContainerStarted","Data":"b01c1f8e482a1ff99b5bbc46687347d6f47f4924b0bdc9aeadd60e41e72580b9"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.138248 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" event={"ID":"3e0afe5a-17db-4dd6-b871-792a4e0f81a9","Type":"ContainerStarted","Data":"da5b2185c5771ccb3714517021cd595a56d44872adf12851bb256badf27aafdb"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.145609 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-sl6l8"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.147166 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p" event={"ID":"c55bf603-9ac8-4c71-8540-4a3ea6958d0f","Type":"ContainerStarted","Data":"4482edb791fe5b6063eed36d0ff47e0e909ad335043d958380569da64dc0e58b"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.148874 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" event={"ID":"3abe2691-57ff-4b50-8b21-23d7125489bb","Type":"ContainerStarted","Data":"d3ff56483aca30de36ea2e01f2b084a483a70480d78e686770803b055fd0a777"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.150430 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx" event={"ID":"5ef589c7-76d3-4f7e-8221-50e60876c39f","Type":"ContainerStarted","Data":"09ccef74f27b388a6f2d3774da9823104f4bf328d2f5709bd614e1433fe33af0"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.155495 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n94r9"]
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.161966 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ckvtc"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.165364 4794 generic.go:334] "Generic (PLEG): container finished" podID="32cb5682-0c0e-4f56-8fcc-cd73067c41c7" containerID="81c1c06d12d6337a18bb2ca1ffaa7a5b8979a6cff4a8f0ec602acec60ffa9ed2" exitCode=0
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.165451 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" event={"ID":"32cb5682-0c0e-4f56-8fcc-cd73067c41c7","Type":"ContainerDied","Data":"81c1c06d12d6337a18bb2ca1ffaa7a5b8979a6cff4a8f0ec602acec60ffa9ed2"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.169551 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" event={"ID":"4f8f1872-3b59-424c-ad20-9332a4ffc4b2","Type":"ContainerStarted","Data":"3cd236007709b629e1e6fa46b15d31f11a3be17885f7bbfdaa0185cac525953d"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.169583 4794 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-n64xh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.169609 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" podUID="19ba7a1d-a381-49f2-aa2e-6463336559fe" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.169586 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" event={"ID":"4f8f1872-3b59-424c-ad20-9332a4ffc4b2","Type":"ContainerStarted","Data":"ffcd6786a1dd7a7f1782c559b9a9453390e335e057e6d3f15fe53ce4d2a3e297"}
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.169756 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.174727 4794 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rjjct container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.174800 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" podUID="4f8f1872-3b59-424c-ad20-9332a4ffc4b2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.207928 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:47:55 crc kubenswrapper[4794]: E0310 09:47:55.208615 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:55.708598601 +0000 UTC m=+224.464769419 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:55 crc kubenswrapper[4794]: W0310 09:47:55.254705 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d7615b4_8f9f_4200_9e55_1baa01a6b14d.slice/crio-5ff91a73a1dac8ff28600773b11e129065f4caef4ede92e3dbf2e1e49c12de1e WatchSource:0}: Error finding container 5ff91a73a1dac8ff28600773b11e129065f4caef4ede92e3dbf2e1e49c12de1e: Status 404 returned error can't find the container with id 5ff91a73a1dac8ff28600773b11e129065f4caef4ede92e3dbf2e1e49c12de1e
Mar 10 09:47:55 crc kubenswrapper[4794]: W0310 09:47:55.270387 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd56da28c_c09d_4fff_b73e_c3b5c787c300.slice/crio-51b0cd56f9858210411a8a4e7888d9ea1eb9d991527e4543e17eea8f744550b6 WatchSource:0}: Error finding container 51b0cd56f9858210411a8a4e7888d9ea1eb9d991527e4543e17eea8f744550b6: Status 404 returned error can't find the container with id 51b0cd56f9858210411a8a4e7888d9ea1eb9d991527e4543e17eea8f744550b6
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.300626 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" podStartSLOduration=152.300601955 podStartE2EDuration="2m32.300601955s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:55.300264355 +0000 UTC m=+224.056435173" watchObservedRunningTime="2026-03-10 09:47:55.300601955 +0000 UTC m=+224.056772783"
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.309079 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:47:55 crc kubenswrapper[4794]: E0310 09:47:55.309349 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:55.809314018 +0000 UTC m=+224.565484836 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.309623 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:47:55 crc kubenswrapper[4794]: E0310 09:47:55.314119 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:55.814101982 +0000 UTC m=+224.570272920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.378945 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2tbqn"]
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.402864 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55"]
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.410695 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:47:55 crc kubenswrapper[4794]: E0310 09:47:55.411121 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:55.911095297 +0000 UTC m=+224.667266115 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.414059 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:47:55 crc kubenswrapper[4794]: E0310 09:47:55.414520 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:55.91450307 +0000 UTC m=+224.670673888 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.501369 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qk8nd"]
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.501770 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552266-9tldc"]
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.514730 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:47:55 crc kubenswrapper[4794]: E0310 09:47:55.514859 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:56.014837325 +0000 UTC m=+224.771008143 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.514937 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:47:55 crc kubenswrapper[4794]: E0310 09:47:55.515274 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:56.015263988 +0000 UTC m=+224.771434806 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:55 crc kubenswrapper[4794]: W0310 09:47:55.533783 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b739a9e_64b9_4415_bc45_dc9307aa49d3.slice/crio-2e7cacc26f0d059f9eaa92659cb97eae5cc7ac19e05bb3b05b7919d32dfb2327 WatchSource:0}: Error finding container 2e7cacc26f0d059f9eaa92659cb97eae5cc7ac19e05bb3b05b7919d32dfb2327: Status 404 returned error can't find the container with id 2e7cacc26f0d059f9eaa92659cb97eae5cc7ac19e05bb3b05b7919d32dfb2327
Mar 10 09:47:55 crc kubenswrapper[4794]: W0310 09:47:55.560963 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod797ba833_0538_46ba_b3f1_4a3b2415c7f1.slice/crio-8dedd52c2a4a46dbe76730c47ead13288b6b7e17bed4af7b3af228b5fcd12cdf WatchSource:0}: Error finding container 8dedd52c2a4a46dbe76730c47ead13288b6b7e17bed4af7b3af228b5fcd12cdf: Status 404 returned error can't find the container with id 8dedd52c2a4a46dbe76730c47ead13288b6b7e17bed4af7b3af228b5fcd12cdf
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.591090 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc"]
Mar 10 09:47:55 crc kubenswrapper[4794]: W0310 09:47:55.592285 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9214e776_ae19_4f0c_9008_9cd8a4ece399.slice/crio-5d90c3415791761153039ac8d367b96d0ddd56a3a7046371cc5f743822685c9e WatchSource:0}: Error finding container 5d90c3415791761153039ac8d367b96d0ddd56a3a7046371cc5f743822685c9e: Status 404 returned error can't find the container with id 5d90c3415791761153039ac8d367b96d0ddd56a3a7046371cc5f743822685c9e
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.616972 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:47:55 crc kubenswrapper[4794]: E0310 09:47:55.617430 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:56.117412219 +0000 UTC m=+224.873583037 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.645073 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 09:47:55 crc kubenswrapper[4794]: W0310 09:47:55.668463 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e528446_c8eb_4103_b2ef_994eef6ecac5.slice/crio-11696377e4e2b6fc04a8d5723b2be35926be0ed97996ac75986a2c50ba23aa0a WatchSource:0}: Error finding container 11696377e4e2b6fc04a8d5723b2be35926be0ed97996ac75986a2c50ba23aa0a: Status 404 returned error can't find the container with id 11696377e4e2b6fc04a8d5723b2be35926be0ed97996ac75986a2c50ba23aa0a
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.718452 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:47:55 crc kubenswrapper[4794]: E0310 09:47:55.718911 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:56.218894058 +0000 UTC m=+224.975064876 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.820106 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:47:55 crc kubenswrapper[4794]: E0310 09:47:55.820712 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:56.320697958 +0000 UTC m=+225.076868776 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.820910 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-skqw6"]
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.887798 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9"]
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.896150 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g"]
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.900465 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk"]
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.904266 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c"]
Mar 10 09:47:55 crc kubenswrapper[4794]: I0310 09:47:55.924142 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:47:55 crc kubenswrapper[4794]: E0310 09:47:55.924507 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:56.424494228 +0000 UTC m=+225.180665046 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:56 crc kubenswrapper[4794]: W0310 09:47:56.019067 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7827a543_d8b2_460b_aee5_212ea1208c0d.slice/crio-7be1444afeeb28da0d5ec0820c7bf898af23aea3cfa1fc5668516d44d4782b6c WatchSource:0}: Error finding container 7be1444afeeb28da0d5ec0820c7bf898af23aea3cfa1fc5668516d44d4782b6c: Status 404 returned error can't find the container with id 7be1444afeeb28da0d5ec0820c7bf898af23aea3cfa1fc5668516d44d4782b6c
Mar 10 09:47:56 crc kubenswrapper[4794]: W0310 09:47:56.022116 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fd49366_785d_4530_a7d2_4a5daf70ea0f.slice/crio-4474ae6f644c1548c23af8f50cc0dcc400e170b756ebf8c974d2661655ce4d95 WatchSource:0}: Error finding container 4474ae6f644c1548c23af8f50cc0dcc400e170b756ebf8c974d2661655ce4d95: Status 404 returned error can't find the container with id 4474ae6f644c1548c23af8f50cc0dcc400e170b756ebf8c974d2661655ce4d95
Mar 10 09:47:56 crc kubenswrapper[4794]: W0310 09:47:56.026524 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod982fe064_1be6_4ff0_b6ba_6f04ee269140.slice/crio-f9d1d9aed3aba3083114cd37e53440d01f87208e5ee1cf8686ddc88f98494f01 WatchSource:0}: Error finding container f9d1d9aed3aba3083114cd37e53440d01f87208e5ee1cf8686ddc88f98494f01: Status 404 returned error can't find the container with id f9d1d9aed3aba3083114cd37e53440d01f87208e5ee1cf8686ddc88f98494f01
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.027849 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:47:56 crc kubenswrapper[4794]: E0310 09:47:56.028609 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:56.528588737 +0000 UTC m=+225.284759545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.055091 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb"]
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.062959 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2lthx"]
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.088814 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-tlkl5"
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.091926 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rkl2f"]
Mar 10 09:47:56 crc kubenswrapper[4794]: W0310 09:47:56.096646 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod577c9773_972c_42bb_99c1_b3d44cf21403.slice/crio-f0b6a36a9d5c46f7813948f61913173559cb2792b291a1668329c92843c4f6c2 WatchSource:0}: Error finding container f0b6a36a9d5c46f7813948f61913173559cb2792b291a1668329c92843c4f6c2: Status 404 returned error can't find the container with id f0b6a36a9d5c46f7813948f61913173559cb2792b291a1668329c92843c4f6c2
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.104645 4794 patch_prober.go:28] interesting pod/router-default-5444994796-tlkl5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 09:47:56 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld
Mar 10 09:47:56 crc kubenswrapper[4794]: [+]process-running ok
Mar 10 09:47:56 crc kubenswrapper[4794]: healthz check failed
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.104686 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tlkl5" podUID="180c50b5-f178-4b42-922c-edfd1deb91d4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.129956 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:47:56 crc kubenswrapper[4794]: E0310 09:47:56.130395 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:56.630380026 +0000 UTC m=+225.386550854 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.130896 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xtppb"]
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.132427 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm"]
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.150375 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n2b4p"]
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.170895 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6"]
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.183209 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g" event={"ID":"577c9773-972c-42bb-99c1-b3d44cf21403","Type":"ContainerStarted","Data":"f0b6a36a9d5c46f7813948f61913173559cb2792b291a1668329c92843c4f6c2"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.187630 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c" event={"ID":"982fe064-1be6-4ff0-b6ba-6f04ee269140","Type":"ContainerStarted","Data":"f9d1d9aed3aba3083114cd37e53440d01f87208e5ee1cf8686ddc88f98494f01"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.194161 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd" event={"ID":"7a0f2e0e-60dc-472e-ad6b-fc17442a65ed","Type":"ContainerStarted","Data":"e2a86f618b5fdf5d631a05aef77332fdc1e58abfaa8a8db7ae802195c8b6b862"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.195495 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc" event={"ID":"9e528446-c8eb-4103-b2ef-994eef6ecac5","Type":"ContainerStarted","Data":"11696377e4e2b6fc04a8d5723b2be35926be0ed97996ac75986a2c50ba23aa0a"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.196383 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8" event={"ID":"2d3943fb-fdf7-495d-9a69-f04362b8b319","Type":"ContainerStarted","Data":"91f3515c7a2ef16d683c309d8cd05997eb65ae7b50d8c9c668a450e15db02ee8"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.197468 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p" event={"ID":"c55bf603-9ac8-4c71-8540-4a3ea6958d0f","Type":"ContainerStarted","Data":"04ee04df8ecbef130a5c208a55e2df6ef145f1e10096c055ae9b779f117a1643"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.218425 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" event={"ID":"3abe2691-57ff-4b50-8b21-23d7125489bb","Type":"ContainerStarted","Data":"b33e2ffd91214f7bd895eb2e0337d199551aa72007d91a9024a2e9d1b65a95dc"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.231589 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:47:56 crc kubenswrapper[4794]: E0310 09:47:56.233105 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:56.733084363 +0000 UTC m=+225.489255181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.233225 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2lthx" event={"ID":"270526dc-92ee-4d56-93cf-b4ee2df197fa","Type":"ContainerStarted","Data":"6c27d474c2fc139e6c7bcb3720d165bb8f067e424abafa4bf500fa18960643bc"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.248765 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q8jhf" event={"ID":"531c4a6c-a67a-4629-bfeb-a89565d11497","Type":"ContainerStarted","Data":"b0fdfb2a624597e84255ad936f6f64a7f3f51dc5fda56b43268e4ce614c2a9f9"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.250324 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bknsm"]
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.272239 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qnrc8" event={"ID":"9d7615b4-8f9f-4200-9e55-1baa01a6b14d","Type":"ContainerStarted","Data":"8f66dbb28c2da0cea60412a36311624ff5c8a8a7e11627873830f314c6c1cccb"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.272291 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qnrc8" event={"ID":"9d7615b4-8f9f-4200-9e55-1baa01a6b14d","Type":"ContainerStarted","Data":"5ff91a73a1dac8ff28600773b11e129065f4caef4ede92e3dbf2e1e49c12de1e"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.273244 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-qnrc8"
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.277155 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ckvtc"]
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.293000 4794 patch_prober.go:28] interesting pod/console-operator-58897d9998-qnrc8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body=
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.293047 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qnrc8" podUID="9d7615b4-8f9f-4200-9e55-1baa01a6b14d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused"
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.295594 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2" event={"ID":"e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc","Type":"ContainerStarted","Data":"784ce291e0a1ef608a8319d9e3bf7f533f618b23fe809b306e12212f15cbfcf3"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.295727 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2" event={"ID":"e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc","Type":"ContainerStarted","Data":"32e4f3d0d58bca95d4492d75b955cc4b2dd5f787b352b4ad384ba845d7dc6f46"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.297398 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" event={"ID":"d56da28c-c09d-4fff-b73e-c3b5c787c300","Type":"ContainerStarted","Data":"51b0cd56f9858210411a8a4e7888d9ea1eb9d991527e4543e17eea8f744550b6"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.309723 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk" event={"ID":"c2bd53f1-d95b-43dd-91d4-8adc12d6971a","Type":"ContainerStarted","Data":"0cddfe3470037f75e7607cf5c661b8f3fd9882f0b0fbf21c550686a6458ccafb"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.313133 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9vsxq" event={"ID":"983001a9-70eb-40e1-895f-5e3fc80f538e","Type":"ContainerStarted","Data":"0a1109af954ec858a5d50f15a90674feb006386200b6247ceb1f2f013c1ba5cc"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.319202 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rkl2f" event={"ID":"a4cfdaaf-4265-4c7d-b58d-4538905360a2","Type":"ContainerStarted","Data":"d543ae94679629eb7fcdca7482384df766d7365f48468e666734ba155127fe5e"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.329551 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" event={"ID":"5ab22632-d678-4997-996e-bc26acff8c46","Type":"ContainerStarted","Data":"6c2d9e3d284efef58b883847e2f0dcea09619284282163abe6749260ca34739b"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.333403 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" event={"ID":"7b739a9e-64b9-4415-bc45-dc9307aa49d3","Type":"ContainerStarted","Data":"2e7cacc26f0d059f9eaa92659cb97eae5cc7ac19e05bb3b05b7919d32dfb2327"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.339969 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-skqw6" event={"ID":"7827a543-d8b2-460b-aee5-212ea1208c0d","Type":"ContainerStarted","Data":"7be1444afeeb28da0d5ec0820c7bf898af23aea3cfa1fc5668516d44d4782b6c"}
Mar 10 09:47:56 crc kubenswrapper[4794]: E0310 09:47:56.342998 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:56.842980706 +0000 UTC m=+225.599151524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.340547 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.356856 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552266-9tldc" event={"ID":"f12f506f-5226-41a3-9643-260415a884a5","Type":"ContainerStarted","Data":"9e0453e81bc0ab73cfa7f712f3ba1c831eee5cbc4d731d219a3532cd1c964f7e"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.420777 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-2tbqn" event={"ID":"797ba833-0538-46ba-b3f1-4a3b2415c7f1","Type":"ContainerStarted","Data":"8dedd52c2a4a46dbe76730c47ead13288b6b7e17bed4af7b3af228b5fcd12cdf"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.444910 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:47:56 crc kubenswrapper[4794]: E0310 09:47:56.445048 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:56.945024204 +0000 UTC m=+225.701195022 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.445166 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:47:56 crc kubenswrapper[4794]: E0310 09:47:56.445548 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:56.945534699 +0000 UTC m=+225.701705517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.462684 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q8jhf" podStartSLOduration=153.462666006 podStartE2EDuration="2m33.462666006s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:56.46181399 +0000 UTC m=+225.217984798" watchObservedRunningTime="2026-03-10 09:47:56.462666006 +0000 UTC m=+225.218836824"
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.489490 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qk8nd" event={"ID":"d4be505c-4811-4a7a-a7b5-3141574f1ee0","Type":"ContainerStarted","Data":"c66112bcd15cc4b95ed2393041b729a08adbddb140f2f6c4ffa7d6c600bc8d03"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.510947 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm" event={"ID":"eece3bac-ab7c-4a16-82ae-35775eef8806","Type":"ContainerStarted","Data":"26cb0fb5707601e2aa6a31b3b01be61d23126ea93b3e1b921e2e9abbf3f84244"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.524455 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" event={"ID":"cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa","Type":"ContainerStarted","Data":"ff85e6a01283381cc37b1a6b52e7768a45d13800eeba1a67436922785fd1ca00"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.540155 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxsrd" podStartSLOduration=153.540136722 podStartE2EDuration="2m33.540136722s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:56.497585478 +0000 UTC m=+225.253756296" watchObservedRunningTime="2026-03-10 09:47:56.540136722 +0000 UTC m=+225.296307530"
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.540468 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glg2p" podStartSLOduration=153.540464062 podStartE2EDuration="2m33.540464062s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:56.538284276 +0000 UTC m=+225.294455104" watchObservedRunningTime="2026-03-10 09:47:56.540464062 +0000 UTC m=+225.296634880"
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.551156 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" event={"ID":"22ead435-4c45-43ad-a499-fe930a626c52","Type":"ContainerStarted","Data":"acd7b1633f91f8d9e6824e78beb1047e26997b999456c578c0540edb363b30fe"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.551508 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:47:56 crc kubenswrapper[4794]: E0310 09:47:56.552492 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:57.052474954 +0000 UTC m=+225.808645772 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.556971 4794 generic.go:334] "Generic (PLEG): container finished" podID="b70fe17f-97eb-4abf-b88e-0e24e5d01c48" containerID="0027914ee980b5f53458d3fdd51ad21ced468d24e63cee485a85ed3c6bba8939" exitCode=0
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.557748 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" event={"ID":"b70fe17f-97eb-4abf-b88e-0e24e5d01c48","Type":"ContainerDied","Data":"0027914ee980b5f53458d3fdd51ad21ced468d24e63cee485a85ed3c6bba8939"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.559452 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx" event={"ID":"5ef589c7-76d3-4f7e-8221-50e60876c39f","Type":"ContainerStarted","Data":"b5cc8764fdc176b2f36590f6a0dcade14ded3da11528fb95f5f02f059469947d"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.560531 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9" event={"ID":"6fd49366-785d-4530-a7d2-4a5daf70ea0f","Type":"ContainerStarted","Data":"4474ae6f644c1548c23af8f50cc0dcc400e170b756ebf8c974d2661655ce4d95"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.564196 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-sl6l8" event={"ID":"9214e776-ae19-4f0c-9008-9cd8a4ece399","Type":"ContainerStarted","Data":"5d90c3415791761153039ac8d367b96d0ddd56a3a7046371cc5f743822685c9e"}
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.566937 4794 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rjjct container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.566982 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" podUID="4f8f1872-3b59-424c-ad20-9332a4ffc4b2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.576367 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4"
Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.652827 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10
09:47:56 crc kubenswrapper[4794]: E0310 09:47:56.656592 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:57.156579272 +0000 UTC m=+225.912750090 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.678875 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-tlkl5" podStartSLOduration=153.678857014 podStartE2EDuration="2m33.678857014s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:56.677561925 +0000 UTC m=+225.433732743" watchObservedRunningTime="2026-03-10 09:47:56.678857014 +0000 UTC m=+225.435027832" Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.679746 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-qnrc8" podStartSLOduration=153.679738981 podStartE2EDuration="2m33.679738981s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:56.616513045 +0000 UTC m=+225.372683883" watchObservedRunningTime="2026-03-10 09:47:56.679738981 +0000 UTC m=+225.435909809" Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.742097 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q62bc" podStartSLOduration=153.742079031 podStartE2EDuration="2m33.742079031s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:56.711785758 +0000 UTC m=+225.467956576" watchObservedRunningTime="2026-03-10 09:47:56.742079031 +0000 UTC m=+225.498249849" Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.742746 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnffm" podStartSLOduration=153.742739311 podStartE2EDuration="2m33.742739311s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:56.74270437 +0000 UTC m=+225.498875188" watchObservedRunningTime="2026-03-10 09:47:56.742739311 +0000 UTC m=+225.498910129" Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.755466 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:56 crc kubenswrapper[4794]: E0310 09:47:56.755844 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:57.255812085 +0000 UTC m=+226.011982903 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.796579 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzsr8" podStartSLOduration=153.796554543 podStartE2EDuration="2m33.796554543s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:56.789315815 +0000 UTC m=+225.545486633" watchObservedRunningTime="2026-03-10 09:47:56.796554543 +0000 UTC m=+225.552725361" Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.856897 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:56 crc kubenswrapper[4794]: E0310 09:47:56.857274 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:57.357261484 +0000 UTC m=+226.113432302 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.960652 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:56 crc kubenswrapper[4794]: E0310 09:47:56.961521 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 09:47:57.461504958 +0000 UTC m=+226.217675766 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.977131 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" podStartSLOduration=153.977113058 podStartE2EDuration="2m33.977113058s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:56.914134639 +0000 UTC m=+225.670305457" watchObservedRunningTime="2026-03-10 09:47:56.977113058 +0000 UTC m=+225.733283886" Mar 10 09:47:56 crc kubenswrapper[4794]: I0310 09:47:56.978113 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" podStartSLOduration=153.978105578 podStartE2EDuration="2m33.978105578s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:56.975603002 +0000 UTC m=+225.731773830" watchObservedRunningTime="2026-03-10 09:47:56.978105578 +0000 UTC m=+225.734276396" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.036386 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-txkm8" podStartSLOduration=154.036372135 podStartE2EDuration="2m34.036372135s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:57.034385234 +0000 UTC m=+225.790556052" watchObservedRunningTime="2026-03-10 09:47:57.036372135 +0000 UTC m=+225.792542953" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.063954 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:57 crc kubenswrapper[4794]: E0310 09:47:57.064352 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:57.564318478 +0000 UTC m=+226.320489296 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.096424 4794 patch_prober.go:28] interesting pod/router-default-5444994796-tlkl5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:47:57 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Mar 10 09:47:57 crc kubenswrapper[4794]: [+]process-running ok Mar 10 09:47:57 crc kubenswrapper[4794]: healthz check failed Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.096481 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tlkl5" podUID="180c50b5-f178-4b42-922c-edfd1deb91d4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.164717 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:57 crc kubenswrapper[4794]: E0310 09:47:57.165372 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:57.665353474 +0000 UTC m=+226.421524302 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.168076 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sqlbx" podStartSLOduration=154.168058416 podStartE2EDuration="2m34.168058416s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:57.146959349 +0000 UTC m=+225.903130167" watchObservedRunningTime="2026-03-10 09:47:57.168058416 +0000 UTC m=+225.924229234" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.183398 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-sl6l8" podStartSLOduration=6.183371747 podStartE2EDuration="6.183371747s" podCreationTimestamp="2026-03-10 09:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:57.168572422 +0000 UTC m=+225.924743240" watchObservedRunningTime="2026-03-10 09:47:57.183371747 +0000 UTC m=+225.939542575" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.197061 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-2tbqn" podStartSLOduration=154.197042699 podStartE2EDuration="2m34.197042699s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:57.196676319 +0000 UTC m=+225.952847137" watchObservedRunningTime="2026-03-10 09:47:57.197042699 +0000 UTC m=+225.953213517" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.251108 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" podStartSLOduration=154.251089419 podStartE2EDuration="2m34.251089419s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:57.220032003 +0000 UTC m=+225.976202821" watchObservedRunningTime="2026-03-10 09:47:57.251089419 +0000 UTC m=+226.007260237" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.266084 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:57 crc kubenswrapper[4794]: E0310 09:47:57.266448 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-03-10 09:47:57.766435602 +0000 UTC m=+226.522606420 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.371840 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:57 crc kubenswrapper[4794]: E0310 09:47:57.372466 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:57.872452069 +0000 UTC m=+226.628622887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.473092 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:57 crc kubenswrapper[4794]: E0310 09:47:57.473472 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:57.973460675 +0000 UTC m=+226.729631493 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.573588 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:57 crc kubenswrapper[4794]: E0310 09:47:57.573911 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:58.073895953 +0000 UTC m=+226.830066761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.587530 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc" event={"ID":"9e528446-c8eb-4103-b2ef-994eef6ecac5","Type":"ContainerStarted","Data":"d54f72706f0b2d63d6fbfe8b4d9925c8b5c35ac360336c153eb60af211f49dce"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.588615 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.590085 4794 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zpwhc container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.590116 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc" podUID="9e528446-c8eb-4103-b2ef-994eef6ecac5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.601316 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4k77" event={"ID":"cd3da9ba-0a30-4cd5-a02b-47c78ab2b9aa","Type":"ContainerStarted","Data":"c694321ba0b4c0ea6771b5d1ddfd5f1aa13f10eba72808627df3332ee10cc6f0"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.610502 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" 
event={"ID":"f97a286b-f0b0-4309-a3e4-33eea0aea5f8","Type":"ContainerStarted","Data":"68559c5d10bd4d30b53960e9dc48cfdc68948fb9fc0b5573e8183604f60885ef"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.611697 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" event={"ID":"5ab22632-d678-4997-996e-bc26acff8c46","Type":"ContainerStarted","Data":"37c18ecd4e270bfbedc69b714a9673f4f682c8740a56c7c05031f917f09ee4aa"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.611721 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" event={"ID":"5ab22632-d678-4997-996e-bc26acff8c46","Type":"ContainerStarted","Data":"36bcf5276e5720f67bc0da226afe9d1fb87fc22f93cb2f06961c6b167b88ee04"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.616713 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk" event={"ID":"c2bd53f1-d95b-43dd-91d4-8adc12d6971a","Type":"ContainerStarted","Data":"e389a33b96c383a0e56c95a31d1d275e1ef9f197931cdc698ee755737f21a678"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.616742 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk" event={"ID":"c2bd53f1-d95b-43dd-91d4-8adc12d6971a","Type":"ContainerStarted","Data":"724c201b5d8d36a47ff2f308cefeb4e24200f625fbe5c1465b31c6d41529e3fb"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.617146 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.644505 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4jppl" event={"ID":"bff3d354-0064-4a96-8945-51df3cd2d7e7","Type":"ContainerStarted","Data":"57d34cf26443725a23620bb07ab6f8e4f7f4d91b4d8c14d922d04fdf26c3915b"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.644549 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4jppl" event={"ID":"bff3d354-0064-4a96-8945-51df3cd2d7e7","Type":"ContainerStarted","Data":"7f3fbe8c49af2de7e7676f179f8062dd141c740108d5059efc9ce77333eee66d"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.647097 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2lthx" event={"ID":"270526dc-92ee-4d56-93cf-b4ee2df197fa","Type":"ContainerStarted","Data":"428627e506b59ac240956ac0e7c51aa0797289e91deff4d885501f1edadd1e44"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.648875 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm" event={"ID":"eece3bac-ab7c-4a16-82ae-35775eef8806","Type":"ContainerStarted","Data":"9359a9c9fbf9b27a25c133709f0cca4798f0af917d86e842a532f8a026f6b7c7"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.654281 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-sl6l8" event={"ID":"9214e776-ae19-4f0c-9008-9cd8a4ece399","Type":"ContainerStarted","Data":"b2fe739bc16f28dbe03f1c2be5a38bd760eedf2ff20de8835d78045a2b727e57"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.665766 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-9c57cc56f-2tbqn" event={"ID":"797ba833-0538-46ba-b3f1-4a3b2415c7f1","Type":"ContainerStarted","Data":"1e6b1423fb9e2b11edc4d09f2874ed8f94e0ada39c59a48f22429ce1505107b8"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.672456 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" event={"ID":"a138cfa4-f8ff-4627-b3da-f72aeb474795","Type":"ContainerStarted","Data":"380934d4834902369e169def355370875babc0034791f23d14ecef76ed912de8"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.672492 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" event={"ID":"a138cfa4-f8ff-4627-b3da-f72aeb474795","Type":"ContainerStarted","Data":"ba68fddadd327796f235791539e8496c4a139c274911dac718a3e48de3b1845c"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.675248 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:57 crc kubenswrapper[4794]: E0310 09:47:57.676390 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:58.176379363 +0000 UTC m=+226.932550181 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.689138 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g" event={"ID":"577c9773-972c-42bb-99c1-b3d44cf21403","Type":"ContainerStarted","Data":"6deaeef5d7072cba8c1996026b20ef49f59282239dbcdaff2cd27e2c26cfc421"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.690754 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" event={"ID":"22ead435-4c45-43ad-a499-fe930a626c52","Type":"ContainerStarted","Data":"cdbbf64ad4cca124396993238d98a3749be33228686e9536f827b86dbf46cc85"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.691323 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.692426 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-skqw6" event={"ID":"7827a543-d8b2-460b-aee5-212ea1208c0d","Type":"ContainerStarted","Data":"ac23f35647b7d6a9a513d7618e602b0baae643c80748e46291a1562c1ffa3ac9"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.694493 4794 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-p8ppb 
container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.694524 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" podUID="22ead435-4c45-43ad-a499-fe930a626c52" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.695081 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rkl2f" event={"ID":"a4cfdaaf-4265-4c7d-b58d-4538905360a2","Type":"ContainerStarted","Data":"a6fd00cc04fb683b7098c53b705f00a9ced8d78e162fb3ac5020a24c82928bd0"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.723044 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" event={"ID":"d56da28c-c09d-4fff-b73e-c3b5c787c300","Type":"ContainerStarted","Data":"a2f0ca38cf8561e292249a48d1b06ca578f4f8715cafb1c749fea0b2be397ce6"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.723683 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.728191 4794 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-n94r9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.728253 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" podUID="d56da28c-c09d-4fff-b73e-c3b5c787c300" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.739915 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" event={"ID":"32cb5682-0c0e-4f56-8fcc-cd73067c41c7","Type":"ContainerStarted","Data":"83f39f94ac5cc4a8d3161a93fdead9c83c37fe8ce8c350491fa77f3f5a37a831"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.748948 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" event={"ID":"b70fe17f-97eb-4abf-b88e-0e24e5d01c48","Type":"ContainerStarted","Data":"1f2e2ec169f3909ebe5e6afb452ebb600a80a33a12f861a3e20b659b042003b0"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.749548 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.752648 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" event={"ID":"a39fe093-da97-48ba-bdf3-a566eefc5208","Type":"ContainerStarted","Data":"670ae2fdad1919a87f46024509bc20c149bdc0af6ab62019439f2b21a599393d"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.752677 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" event={"ID":"a39fe093-da97-48ba-bdf3-a566eefc5208","Type":"ContainerStarted","Data":"6f6471a789932655f56daead01973640ab8e411e76596cce7f6d130170f0f020"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.753289 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.756068 4794 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-n2b4p container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.756105 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" podUID="a39fe093-da97-48ba-bdf3-a566eefc5208" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.761059 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bknsm" event={"ID":"a8b37b88-1555-420f-a9c1-f7e48046f160","Type":"ContainerStarted","Data":"879077a32cdc3eca6dfcc7cdc4f1bb1cae212c964d6d0812b2ddf4f409529a36"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.761106 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bknsm" event={"ID":"a8b37b88-1555-420f-a9c1-f7e48046f160","Type":"ContainerStarted","Data":"ce0029eb6c7e8c6d24f9566d380402da6bdfbc4bbe4da3dd7f25e5e0294b88df"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.761363 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-bknsm" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.762696 4794 patch_prober.go:28] interesting pod/downloads-7954f5f757-bknsm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.762736 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bknsm" podUID="a8b37b88-1555-420f-a9c1-f7e48046f160" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.765620 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2" event={"ID":"e5fe5900-c0c7-4b99-9dcb-bcb3bd8da1fc","Type":"ContainerStarted","Data":"3059b7f9becff3a164e77fcc2da7699511cb8ab6f64d7ba4dcd9e5dcf8676ceb"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.785409 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:57 crc kubenswrapper[4794]: E0310 09:47:57.787022 4794 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:58.287006929 +0000 UTC m=+227.043177747 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.806591 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6" event={"ID":"eb441ca2-702b-4848-906b-9f02a8ff65ee","Type":"ContainerStarted","Data":"7e8583226bc36a90c4ec708dba82a3de6b2fa12f104958be9e4dd026553b9b14"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.806639 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6" event={"ID":"eb441ca2-702b-4848-906b-9f02a8ff65ee","Type":"ContainerStarted","Data":"9132cbece61c93c0e47521dd9575e5c30ef756b71be9107fc09c97e37e122905"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.806650 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6" event={"ID":"eb441ca2-702b-4848-906b-9f02a8ff65ee","Type":"ContainerStarted","Data":"80e8ca44f8e39dcfd1bff6e097fa8e27bfa0a9861f8309691b696dffc9c5d778"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.815076 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9" event={"ID":"6fd49366-785d-4530-a7d2-4a5daf70ea0f","Type":"ContainerStarted","Data":"71cc874f62fa843734055f7de4d36b3d48727af7cdd69f656d73248c9b714dd1"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.825290 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qk8nd" event={"ID":"d4be505c-4811-4a7a-a7b5-3141574f1ee0","Type":"ContainerStarted","Data":"589e1c0a3409746075540f74edb4eb573b908401ab11200198f2b687b2a5ac3a"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.829629 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9vsxq" event={"ID":"983001a9-70eb-40e1-895f-5e3fc80f538e","Type":"ContainerStarted","Data":"87e1bd66941b85b767172612b5e8cc1b478b4d069575130e8b301956fbbcc068"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.831988 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" event={"ID":"7b739a9e-64b9-4415-bc45-dc9307aa49d3","Type":"ContainerStarted","Data":"e20f29cb81543db9f3c1c07a600c0d0da9b7dce89fc38c87fae837aad4b5ff8d"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.832015 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" event={"ID":"7b739a9e-64b9-4415-bc45-dc9307aa49d3","Type":"ContainerStarted","Data":"5a6678d5917b1f24f27ff5b0b305c6c0711051b32b5fb067afa2bf7d148a06cd"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.836515 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c" event={"ID":"982fe064-1be6-4ff0-b6ba-6f04ee269140","Type":"ContainerStarted","Data":"08e96f17b63cc2bf8715e3c3b0623bc536ff1362876fad1b57378374a06d5f7a"} Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.844960 4794 patch_prober.go:28] interesting pod/console-operator-58897d9998-qnrc8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.845019 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qnrc8" podUID="9d7615b4-8f9f-4200-9e55-1baa01a6b14d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.889474 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:57 crc kubenswrapper[4794]: E0310 09:47:57.895407 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:58.395393097 +0000 UTC m=+227.151563915 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.962546 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-bknsm" podStartSLOduration=154.962530042 podStartE2EDuration="2m34.962530042s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:57.908728439 +0000 UTC m=+226.664899267" watchObservedRunningTime="2026-03-10 09:47:57.962530042 +0000 UTC m=+226.718700860" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.963749 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-4jppl" podStartSLOduration=154.963742519 podStartE2EDuration="2m34.963742519s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:57.961228002 +0000 UTC m=+226.717398820" watchObservedRunningTime="2026-03-10 09:47:57.963742519 +0000 UTC m=+226.719913327" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.975940 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7cm9" podStartSLOduration=154.975926386 podStartE2EDuration="2m34.975926386s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:57.974682229 +0000 UTC m=+226.730853047" watchObservedRunningTime="2026-03-10 09:47:57.975926386 +0000 UTC m=+226.732097204" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.978569 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rjjct" Mar 10 09:47:57 crc kubenswrapper[4794]: I0310 09:47:57.991723 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:57 crc kubenswrapper[4794]: E0310 09:47:57.992326 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:58.492309769 +0000 UTC m=+227.248480577 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.043489 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vjgc6" podStartSLOduration=155.043471652 podStartE2EDuration="2m35.043471652s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.006206168 +0000 UTC m=+226.762376986" watchObservedRunningTime="2026-03-10 09:47:58.043471652 +0000 UTC m=+226.799642470" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.097465 4794 patch_prober.go:28] interesting pod/router-default-5444994796-tlkl5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:47:58 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Mar 10 09:47:58 crc kubenswrapper[4794]: [+]process-running ok Mar 10 09:47:58 crc kubenswrapper[4794]: healthz check failed Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.097574 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tlkl5" podUID="180c50b5-f178-4b42-922c-edfd1deb91d4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.098435 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" podStartSLOduration=155.098416899 podStartE2EDuration="2m35.098416899s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.045925126 +0000 UTC m=+226.802095944" watchObservedRunningTime="2026-03-10 09:47:58.098416899 +0000 UTC m=+226.854587707" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.098850 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:58 crc kubenswrapper[4794]: E0310 09:47:58.099152 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:58.599142851 +0000 UTC m=+227.355313669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.142926 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jqtj2" podStartSLOduration=155.142907271 podStartE2EDuration="2m35.142907271s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.104159512 +0000 UTC m=+226.860330330" watchObservedRunningTime="2026-03-10 09:47:58.142907271 +0000 UTC m=+226.899078089" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.169650 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9vsxq" podStartSLOduration=155.169616396 podStartE2EDuration="2m35.169616396s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.168574264 +0000 UTC m=+226.924745092" watchObservedRunningTime="2026-03-10 09:47:58.169616396 +0000 UTC m=+226.925787204" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.170546 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-qk8nd" podStartSLOduration=155.170541784 podStartE2EDuration="2m35.170541784s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.144926942 +0000 UTC m=+226.901097760" watchObservedRunningTime="2026-03-10 09:47:58.170541784 +0000 UTC m=+226.926712592" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.208156 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:58 crc kubenswrapper[4794]: E0310 09:47:58.208541 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:58.708519519 +0000 UTC m=+227.464690337 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.244118 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bhj55" podStartSLOduration=155.244103502 podStartE2EDuration="2m35.244103502s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.207998733 +0000 UTC m=+226.964169561" watchObservedRunningTime="2026-03-10 09:47:58.244103502 +0000 UTC m=+227.000274320" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.244218 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dkz9c" podStartSLOduration=155.244212855 podStartE2EDuration="2m35.244212855s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.243597187 +0000 UTC m=+226.999768005" watchObservedRunningTime="2026-03-10 09:47:58.244212855 +0000 UTC m=+227.000383673" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.312779 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:58 crc kubenswrapper[4794]: E0310 09:47:58.313187 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:58.813174445 +0000 UTC m=+227.569345263 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.391986 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm" podStartSLOduration=155.391965041 podStartE2EDuration="2m35.391965041s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.311456223 +0000 UTC m=+227.067627051" watchObservedRunningTime="2026-03-10 09:47:58.391965041 +0000 UTC m=+227.148135869" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.413886 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:58 crc kubenswrapper[4794]: E0310 09:47:58.414175 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:58.91414622 +0000 UTC m=+227.670317038 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.414776 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:58 crc kubenswrapper[4794]: E0310 09:47:58.415167 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:58.91515888 +0000 UTC m=+227.671329698 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.445897 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc" podStartSLOduration=155.445878436 podStartE2EDuration="2m35.445878436s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.39360442 +0000 UTC m=+227.149775248" watchObservedRunningTime="2026-03-10 09:47:58.445878436 +0000 UTC m=+227.202049264" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.493603 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.493644 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-4jppl" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.495544 4794 patch_prober.go:28] interesting pod/apiserver-76f77b778f-4jppl container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.495611 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-4jppl" podUID="bff3d354-0064-4a96-8945-51df3cd2d7e7" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.504863 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m5mvt" podStartSLOduration=155.504844844 podStartE2EDuration="2m35.504844844s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.453519217 +0000 UTC m=+227.209690035" watchObservedRunningTime="2026-03-10 09:47:58.504844844 +0000 UTC m=+227.261015662" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.506848 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" podStartSLOduration=155.506841755 podStartE2EDuration="2m35.506841755s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.503653259 +0000 UTC m=+227.259824097" watchObservedRunningTime="2026-03-10 09:47:58.506841755 +0000 UTC m=+227.263012573" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.515694 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:58 crc kubenswrapper[4794]: E0310 09:47:58.515899 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:59.015870457 +0000 UTC m=+227.772041285 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.515968 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:58 crc kubenswrapper[4794]: E0310 09:47:58.516356 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:59.01630578 +0000 UTC m=+227.772476598 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.517324 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.517382 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.548678 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-xtppb" podStartSLOduration=155.548663186 podStartE2EDuration="2m35.548663186s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.540685005 +0000 UTC m=+227.296855823" watchObservedRunningTime="2026-03-10 09:47:58.548663186 +0000 UTC m=+227.304833994" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.617046 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:58 crc kubenswrapper[4794]: E0310 09:47:58.617219 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:59.117193192 +0000 UTC m=+227.873364010 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.617652 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:58 crc kubenswrapper[4794]: E0310 09:47:58.618036 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:59.118022227 +0000 UTC m=+227.874193045 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.692061 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" podStartSLOduration=155.692042759 podStartE2EDuration="2m35.692042759s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.622583715 +0000 UTC m=+227.378754523" watchObservedRunningTime="2026-03-10 09:47:58.692042759 +0000 UTC m=+227.448213567" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.692709 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" podStartSLOduration=155.692704949 podStartE2EDuration="2m35.692704949s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.687909605 +0000 UTC m=+227.444080423" watchObservedRunningTime="2026-03-10 09:47:58.692704949 +0000 UTC m=+227.448875767" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.718863 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:58 crc kubenswrapper[4794]: E0310 09:47:58.719022 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:59.218997982 +0000 UTC m=+227.975168800 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.719117 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:58 crc kubenswrapper[4794]: E0310 09:47:58.719463 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:59.219449766 +0000 UTC m=+227.975620584 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.727466 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xwx4g" podStartSLOduration=155.727447177 podStartE2EDuration="2m35.727447177s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.726076205 +0000 UTC m=+227.482247043" watchObservedRunningTime="2026-03-10 09:47:58.727447177 +0000 UTC m=+227.483617995" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.790453 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" podStartSLOduration=155.790437476 podStartE2EDuration="2m35.790437476s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.787782736 +0000 UTC m=+227.543953554" watchObservedRunningTime="2026-03-10 09:47:58.790437476 +0000 UTC m=+227.546608294" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.820582 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:58 crc kubenswrapper[4794]: E0310 09:47:58.820941 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:59.320926895 +0000 UTC m=+228.077097713 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.828113 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk" podStartSLOduration=155.828094112 podStartE2EDuration="2m35.828094112s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.82639144 +0000 UTC m=+227.582562258" watchObservedRunningTime="2026-03-10 09:47:58.828094112 +0000 UTC m=+227.584264930" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.841124 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2lthx" event={"ID":"270526dc-92ee-4d56-93cf-b4ee2df197fa","Type":"ContainerStarted","Data":"68c9c01dbac35301a8a064442e803e2eb13c4a4637281cd217a11487007ca833"} Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.841895 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-2lthx" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.855143 4794 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-p8ppb container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.855196 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" podUID="22ead435-4c45-43ad-a499-fe930a626c52" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.855432 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qk8nd" event={"ID":"d4be505c-4811-4a7a-a7b5-3141574f1ee0","Type":"ContainerStarted","Data":"ed2c0ce7dfacdf3beeec170df3c2d5358f5e16f5f40375246fa6922a8edc809d"} Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.856314 4794 patch_prober.go:28] interesting pod/console-operator-58897d9998-qnrc8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.856370 4794 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-n2b4p container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 10 09:47:58 crc 
kubenswrapper[4794]: I0310 09:47:58.856380 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qnrc8" podUID="9d7615b4-8f9f-4200-9e55-1baa01a6b14d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.856392 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" podUID="a39fe093-da97-48ba-bdf3-a566eefc5208" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.856998 4794 patch_prober.go:28] interesting pod/downloads-7954f5f757-bknsm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.857034 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bknsm" podUID="a8b37b88-1555-420f-a9c1-f7e48046f160" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.862730 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zpwhc" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.865670 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rkl2f" podStartSLOduration=7.865658524 podStartE2EDuration="7.865658524s" podCreationTimestamp="2026-03-10 09:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.865204071 +0000 UTC m=+227.621374889" watchObservedRunningTime="2026-03-10 09:47:58.865658524 +0000 UTC m=+227.621829342" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.907507 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-skqw6" podStartSLOduration=155.907492155 podStartE2EDuration="2m35.907492155s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.905891057 +0000 UTC m=+227.662061875" watchObservedRunningTime="2026-03-10 09:47:58.907492155 +0000 UTC m=+227.663662973" Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.923096 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:58 crc kubenswrapper[4794]: E0310 09:47:58.926919 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 09:47:59.42684897 +0000 UTC m=+228.183019788 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:58 crc kubenswrapper[4794]: I0310 09:47:58.972734 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2lthx" podStartSLOduration=7.972717272 podStartE2EDuration="7.972717272s" podCreationTimestamp="2026-03-10 09:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:47:58.935944983 +0000 UTC m=+227.692115811" watchObservedRunningTime="2026-03-10 09:47:58.972717272 +0000 UTC m=+227.728888090" Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.027966 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:59 crc kubenswrapper[4794]: E0310 09:47:59.028366 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:59.52834948 +0000 UTC m=+228.284520308 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.103559 4794 patch_prober.go:28] interesting pod/router-default-5444994796-tlkl5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:47:59 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Mar 10 09:47:59 crc kubenswrapper[4794]: [+]process-running ok Mar 10 09:47:59 crc kubenswrapper[4794]: healthz check failed Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.103616 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tlkl5" podUID="180c50b5-f178-4b42-922c-edfd1deb91d4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.129050 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:59 crc kubenswrapper[4794]: E0310 09:47:59.129397 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:59.629384126 +0000 UTC m=+228.385554944 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.230111 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:59 crc kubenswrapper[4794]: E0310 09:47:59.230258 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:59.730240608 +0000 UTC m=+228.486411426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.230302 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:59 crc kubenswrapper[4794]: E0310 09:47:59.230611 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:59.730602879 +0000 UTC m=+228.486773697 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.330952 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:59 crc kubenswrapper[4794]: E0310 09:47:59.331364 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:47:59.831347186 +0000 UTC m=+228.587518084 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.432610 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:59 crc kubenswrapper[4794]: E0310 09:47:59.432946 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:47:59.93293461 +0000 UTC m=+228.689105428 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.473710 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.533477 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:59 crc kubenswrapper[4794]: E0310 09:47:59.533966 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:00.033947166 +0000 UTC m=+228.790117984 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.639154 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:59 crc kubenswrapper[4794]: E0310 09:47:59.639481 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:00.139466507 +0000 UTC m=+228.895637325 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.654102 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.740112 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:59 crc kubenswrapper[4794]: E0310 09:47:59.740582 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:00.240561766 +0000 UTC m=+228.996732584 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.842172 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:47:59 crc kubenswrapper[4794]: E0310 09:47:59.842543 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:00.34252743 +0000 UTC m=+229.098698238 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.874707 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" event={"ID":"f97a286b-f0b0-4309-a3e4-33eea0aea5f8","Type":"ContainerStarted","Data":"fcfdb1d14301df87d183e02f3a5dc71d32bb6f1516419409e64734dfb9283537"} Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.876053 4794 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-n2b4p container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.876129 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" podUID="a39fe093-da97-48ba-bdf3-a566eefc5208" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.886800 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cxgqz" Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.887086 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p8ppb" Mar 10 09:47:59 crc kubenswrapper[4794]: I0310 09:47:59.945254 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:47:59 crc kubenswrapper[4794]: E0310 09:47:59.946536 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:00.446514856 +0000 UTC m=+229.202685674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.047668 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:00 crc kubenswrapper[4794]: E0310 09:48:00.048089 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:00.548073698 +0000 UTC m=+229.304244516 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.076591 4794 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-l4mhp container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.076656 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" podUID="b70fe17f-97eb-4abf-b88e-0e24e5d01c48" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.076945 4794 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-l4mhp container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.076973 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp" podUID="b70fe17f-97eb-4abf-b88e-0e24e5d01c48" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.095122 4794 patch_prober.go:28] interesting pod/router-default-5444994796-tlkl5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:48:00 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Mar 10 09:48:00 crc kubenswrapper[4794]: [+]process-running ok Mar 10 09:48:00 crc kubenswrapper[4794]: healthz check failed Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.095179 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tlkl5" podUID="180c50b5-f178-4b42-922c-edfd1deb91d4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.145777 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552268-gzdlm"] Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.146534 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552268-gzdlm" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.148646 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:48:00 crc kubenswrapper[4794]: E0310 09:48:00.149087 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:00.649072124 +0000 UTC m=+229.405242942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.160086 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.163920 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552268-gzdlm"] Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.251040 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.251104 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmbjk\" (UniqueName: \"kubernetes.io/projected/75d151a9-5d22-4241-9177-7856740702e4-kube-api-access-tmbjk\") pod \"auto-csr-approver-29552268-gzdlm\" (UID: \"75d151a9-5d22-4241-9177-7856740702e4\") " pod="openshift-infra/auto-csr-approver-29552268-gzdlm" Mar 10 09:48:00 crc kubenswrapper[4794]: E0310 09:48:00.251397 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:00.751385638 +0000 UTC m=+229.507556456 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.352005 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:48:00 crc kubenswrapper[4794]: E0310 09:48:00.352181 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:00.852148777 +0000 UTC m=+229.608319585 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.352305 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.352385 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmbjk\" (UniqueName: \"kubernetes.io/projected/75d151a9-5d22-4241-9177-7856740702e4-kube-api-access-tmbjk\") pod \"auto-csr-approver-29552268-gzdlm\" (UID: \"75d151a9-5d22-4241-9177-7856740702e4\") " pod="openshift-infra/auto-csr-approver-29552268-gzdlm" Mar 10 09:48:00 crc kubenswrapper[4794]: E0310 09:48:00.352606 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:00.852598511 +0000 UTC m=+229.608769319 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.405109 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmbjk\" (UniqueName: \"kubernetes.io/projected/75d151a9-5d22-4241-9177-7856740702e4-kube-api-access-tmbjk\") pod \"auto-csr-approver-29552268-gzdlm\" (UID: \"75d151a9-5d22-4241-9177-7856740702e4\") " pod="openshift-infra/auto-csr-approver-29552268-gzdlm" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.453267 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.453294 4794 ???:1] "http: TLS handshake error from 192.168.126.11:56100: no serving certificate available for the kubelet" Mar 10 09:48:00 crc kubenswrapper[4794]: E0310 09:48:00.453409 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:00.95339131 +0000 UTC m=+229.709562118 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.453573 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:00 crc kubenswrapper[4794]: E0310 09:48:00.453858 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:00.953851354 +0000 UTC m=+229.710022172 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.461486 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552268-gzdlm" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.539169 4794 ???:1] "http: TLS handshake error from 192.168.126.11:56116: no serving certificate available for the kubelet" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.554870 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:48:00 crc kubenswrapper[4794]: E0310 09:48:00.555004 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:01.054986874 +0000 UTC m=+229.811157682 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.555141 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:00 crc kubenswrapper[4794]: E0310 09:48:00.555585 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:01.055578072 +0000 UTC m=+229.811748890 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.633457 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n64xh"] Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.633673 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" podUID="19ba7a1d-a381-49f2-aa2e-6463336559fe" containerName="controller-manager" containerID="cri-o://3beee2cde0164bfcb074685175ca919eab419dc359bf571b0f008bc511270c73" gracePeriod=30 Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.636963 4794 ???:1] "http: TLS handshake error from 192.168.126.11:56118: no serving certificate available for the kubelet" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.641892 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.656376 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:48:00 crc kubenswrapper[4794]: E0310 09:48:00.657711 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:01.1576946 +0000 UTC m=+229.913865418 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.674001 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4"] Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.674195 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" podUID="02821716-8fb0-46bc-9c95-4c7ca46500b4" containerName="route-controller-manager" containerID="cri-o://0fdbc6aea5410e167044bf5caeee4427a16acfd4ebbea46a9b6019e14174d93e" gracePeriod=30 Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.734647 4794 ???:1] "http: TLS handshake error from 192.168.126.11:56124: no serving certificate available for the kubelet" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.762211 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:00 crc kubenswrapper[4794]: E0310 09:48:00.762570 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:01.262555113 +0000 UTC m=+230.018725931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.835962 4794 ???:1] "http: TLS handshake error from 192.168.126.11:56136: no serving certificate available for the kubelet" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.864574 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:48:00 crc kubenswrapper[4794]: E0310 09:48:00.865015 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:01.364996781 +0000 UTC m=+230.121167599 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.885281 4794 generic.go:334] "Generic (PLEG): container finished" podID="02821716-8fb0-46bc-9c95-4c7ca46500b4" containerID="0fdbc6aea5410e167044bf5caeee4427a16acfd4ebbea46a9b6019e14174d93e" exitCode=0 Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.885660 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" event={"ID":"02821716-8fb0-46bc-9c95-4c7ca46500b4","Type":"ContainerDied","Data":"0fdbc6aea5410e167044bf5caeee4427a16acfd4ebbea46a9b6019e14174d93e"} Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.888937 4794 generic.go:334] "Generic (PLEG): container finished" podID="19ba7a1d-a381-49f2-aa2e-6463336559fe" containerID="3beee2cde0164bfcb074685175ca919eab419dc359bf571b0f008bc511270c73" exitCode=0 Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.889059 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" event={"ID":"19ba7a1d-a381-49f2-aa2e-6463336559fe","Type":"ContainerDied","Data":"3beee2cde0164bfcb074685175ca919eab419dc359bf571b0f008bc511270c73"} Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.891177 4794 generic.go:334] "Generic (PLEG): container finished" podID="eece3bac-ab7c-4a16-82ae-35775eef8806" containerID="9359a9c9fbf9b27a25c133709f0cca4798f0af917d86e842a532f8a026f6b7c7" exitCode=0 Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.891317 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm" event={"ID":"eece3bac-ab7c-4a16-82ae-35775eef8806","Type":"ContainerDied","Data":"9359a9c9fbf9b27a25c133709f0cca4798f0af917d86e842a532f8a026f6b7c7"} Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.954194 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552268-gzdlm"] Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.958754 4794 ???:1] "http: TLS handshake error from 192.168.126.11:56144: no serving certificate available for the kubelet" Mar 10 09:48:00 crc kubenswrapper[4794]: I0310 09:48:00.968687 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:00 crc kubenswrapper[4794]: E0310 09:48:00.969847 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:01.469832003 +0000 UTC m=+230.226002821 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.071891 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.072228 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:01.572210849 +0000 UTC m=+230.328381667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.091556 4794 patch_prober.go:28] interesting pod/router-default-5444994796-tlkl5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:48:01 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Mar 10 09:48:01 crc kubenswrapper[4794]: [+]process-running ok Mar 10 09:48:01 crc kubenswrapper[4794]: healthz check failed Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.091605 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tlkl5" podUID="180c50b5-f178-4b42-922c-edfd1deb91d4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.145701 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.147413 4794 ???:1] "http: TLS handshake error from 192.168.126.11:56152: no serving certificate available for the kubelet" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.163607 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.173113 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.173386 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:01.67337524 +0000 UTC m=+230.429546048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.278615 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19ba7a1d-a381-49f2-aa2e-6463336559fe-serving-cert\") pod \"19ba7a1d-a381-49f2-aa2e-6463336559fe\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.278671 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-proxy-ca-bundles\") pod \"19ba7a1d-a381-49f2-aa2e-6463336559fe\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.278803 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.278843 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02821716-8fb0-46bc-9c95-4c7ca46500b4-client-ca\") pod \"02821716-8fb0-46bc-9c95-4c7ca46500b4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.278865 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02821716-8fb0-46bc-9c95-4c7ca46500b4-config\") pod \"02821716-8fb0-46bc-9c95-4c7ca46500b4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.278890 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02821716-8fb0-46bc-9c95-4c7ca46500b4-serving-cert\") pod \"02821716-8fb0-46bc-9c95-4c7ca46500b4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 
09:48:01.278909 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97qwh\" (UniqueName: \"kubernetes.io/projected/19ba7a1d-a381-49f2-aa2e-6463336559fe-kube-api-access-97qwh\") pod \"19ba7a1d-a381-49f2-aa2e-6463336559fe\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.278952 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-config\") pod \"19ba7a1d-a381-49f2-aa2e-6463336559fe\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.278989 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:01.778962684 +0000 UTC m=+230.535133502 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.279029 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhffj\" (UniqueName: \"kubernetes.io/projected/02821716-8fb0-46bc-9c95-4c7ca46500b4-kube-api-access-hhffj\") pod \"02821716-8fb0-46bc-9c95-4c7ca46500b4\" (UID: \"02821716-8fb0-46bc-9c95-4c7ca46500b4\") " Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.279088 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-client-ca\") pod \"19ba7a1d-a381-49f2-aa2e-6463336559fe\" (UID: \"19ba7a1d-a381-49f2-aa2e-6463336559fe\") " Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.279304 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.279476 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "19ba7a1d-a381-49f2-aa2e-6463336559fe" (UID: "19ba7a1d-a381-49f2-aa2e-6463336559fe"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.279671 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.279686 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-config" (OuterVolumeSpecName: "config") pod "19ba7a1d-a381-49f2-aa2e-6463336559fe" (UID: "19ba7a1d-a381-49f2-aa2e-6463336559fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.279758 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:01.779750548 +0000 UTC m=+230.535921356 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.280163 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-client-ca" (OuterVolumeSpecName: "client-ca") pod "19ba7a1d-a381-49f2-aa2e-6463336559fe" (UID: "19ba7a1d-a381-49f2-aa2e-6463336559fe"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.280205 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02821716-8fb0-46bc-9c95-4c7ca46500b4-client-ca" (OuterVolumeSpecName: "client-ca") pod "02821716-8fb0-46bc-9c95-4c7ca46500b4" (UID: "02821716-8fb0-46bc-9c95-4c7ca46500b4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.281818 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02821716-8fb0-46bc-9c95-4c7ca46500b4-config" (OuterVolumeSpecName: "config") pod "02821716-8fb0-46bc-9c95-4c7ca46500b4" (UID: "02821716-8fb0-46bc-9c95-4c7ca46500b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.289326 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02821716-8fb0-46bc-9c95-4c7ca46500b4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "02821716-8fb0-46bc-9c95-4c7ca46500b4" (UID: "02821716-8fb0-46bc-9c95-4c7ca46500b4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.289403 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02821716-8fb0-46bc-9c95-4c7ca46500b4-kube-api-access-hhffj" (OuterVolumeSpecName: "kube-api-access-hhffj") pod "02821716-8fb0-46bc-9c95-4c7ca46500b4" (UID: "02821716-8fb0-46bc-9c95-4c7ca46500b4"). InnerVolumeSpecName "kube-api-access-hhffj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.290273 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ba7a1d-a381-49f2-aa2e-6463336559fe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "19ba7a1d-a381-49f2-aa2e-6463336559fe" (UID: "19ba7a1d-a381-49f2-aa2e-6463336559fe"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.290418 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ba7a1d-a381-49f2-aa2e-6463336559fe-kube-api-access-97qwh" (OuterVolumeSpecName: "kube-api-access-97qwh") pod "19ba7a1d-a381-49f2-aa2e-6463336559fe" (UID: "19ba7a1d-a381-49f2-aa2e-6463336559fe"). InnerVolumeSpecName "kube-api-access-97qwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.347276 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bnvgq"] Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.347491 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ba7a1d-a381-49f2-aa2e-6463336559fe" containerName="controller-manager" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.347503 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ba7a1d-a381-49f2-aa2e-6463336559fe" containerName="controller-manager" Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.347511 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02821716-8fb0-46bc-9c95-4c7ca46500b4" containerName="route-controller-manager" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.347517 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="02821716-8fb0-46bc-9c95-4c7ca46500b4" containerName="route-controller-manager" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.347639 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ba7a1d-a381-49f2-aa2e-6463336559fe" containerName="controller-manager" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.347655 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="02821716-8fb0-46bc-9c95-4c7ca46500b4" containerName="route-controller-manager" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.348361 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.350813 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.368273 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bnvgq"] Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.442391 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.442899 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:01.94265795 +0000 UTC m=+230.698828768 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.442948 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88245dbf-bf6b-4051-9a3c-91da5a183538-utilities\") pod \"certified-operators-bnvgq\" (UID: \"88245dbf-bf6b-4051-9a3c-91da5a183538\") " pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.443009 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.443499 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:01.943491856 +0000 UTC m=+230.699662674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.443675 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwx4g\" (UniqueName: \"kubernetes.io/projected/88245dbf-bf6b-4051-9a3c-91da5a183538-kube-api-access-fwx4g\") pod \"certified-operators-bnvgq\" (UID: \"88245dbf-bf6b-4051-9a3c-91da5a183538\") " pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.443707 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88245dbf-bf6b-4051-9a3c-91da5a183538-catalog-content\") pod \"certified-operators-bnvgq\" (UID: \"88245dbf-bf6b-4051-9a3c-91da5a183538\") " pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.443836 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhffj\" (UniqueName: \"kubernetes.io/projected/02821716-8fb0-46bc-9c95-4c7ca46500b4-kube-api-access-hhffj\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.443865 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.443874 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19ba7a1d-a381-49f2-aa2e-6463336559fe-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.443886 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02821716-8fb0-46bc-9c95-4c7ca46500b4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.443895 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02821716-8fb0-46bc-9c95-4c7ca46500b4-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.443904 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97qwh\" (UniqueName: \"kubernetes.io/projected/19ba7a1d-a381-49f2-aa2e-6463336559fe-kube-api-access-97qwh\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.443913 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02821716-8fb0-46bc-9c95-4c7ca46500b4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.443921 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19ba7a1d-a381-49f2-aa2e-6463336559fe-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.469864 4794 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.470412 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.471907 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.472196 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.477753 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.503134 4794 ???:1] "http: TLS handshake error from 192.168.126.11:56160: no serving certificate available for the kubelet" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.544950 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.545218 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:02.045198772 +0000 UTC m=+230.801369590 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.545392 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwx4g\" (UniqueName: \"kubernetes.io/projected/88245dbf-bf6b-4051-9a3c-91da5a183538-kube-api-access-fwx4g\") pod \"certified-operators-bnvgq\" (UID: \"88245dbf-bf6b-4051-9a3c-91da5a183538\") " pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.545441 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88245dbf-bf6b-4051-9a3c-91da5a183538-catalog-content\") pod \"certified-operators-bnvgq\" (UID: \"88245dbf-bf6b-4051-9a3c-91da5a183538\") " pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.545476 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58dac5f1-539f-4540-a1f8-9ec492a7e505-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"58dac5f1-539f-4540-a1f8-9ec492a7e505\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.545605 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88245dbf-bf6b-4051-9a3c-91da5a183538-utilities\") pod \"certified-operators-bnvgq\" (UID: \"88245dbf-bf6b-4051-9a3c-91da5a183538\") " pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.545631 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.545696 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58dac5f1-539f-4540-a1f8-9ec492a7e505-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"58dac5f1-539f-4540-a1f8-9ec492a7e505\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.546007 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88245dbf-bf6b-4051-9a3c-91da5a183538-catalog-content\") pod \"certified-operators-bnvgq\" (UID: \"88245dbf-bf6b-4051-9a3c-91da5a183538\") " pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.546016 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-03-10 09:48:02.046000946 +0000 UTC m=+230.802171764 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.550651 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88245dbf-bf6b-4051-9a3c-91da5a183538-utilities\") pod \"certified-operators-bnvgq\" (UID: \"88245dbf-bf6b-4051-9a3c-91da5a183538\") " pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.557513 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zffn2"] Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.558535 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zffn2" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.563859 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.568657 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zffn2"] Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.572480 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwx4g\" (UniqueName: \"kubernetes.io/projected/88245dbf-bf6b-4051-9a3c-91da5a183538-kube-api-access-fwx4g\") pod \"certified-operators-bnvgq\" (UID: \"88245dbf-bf6b-4051-9a3c-91da5a183538\") " pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.646505 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.646657 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:02.14662637 +0000 UTC m=+230.902797188 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.646703 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58dac5f1-539f-4540-a1f8-9ec492a7e505-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"58dac5f1-539f-4540-a1f8-9ec492a7e505\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.646857 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58dac5f1-539f-4540-a1f8-9ec492a7e505-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"58dac5f1-539f-4540-a1f8-9ec492a7e505\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.646898 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869965fc-c355-4c93-9776-dc1a070c926e-catalog-content\") pod \"community-operators-zffn2\" (UID: \"869965fc-c355-4c93-9776-dc1a070c926e\") " pod="openshift-marketplace/community-operators-zffn2" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.647131 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58dac5f1-539f-4540-a1f8-9ec492a7e505-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"58dac5f1-539f-4540-a1f8-9ec492a7e505\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.647212 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869965fc-c355-4c93-9776-dc1a070c926e-utilities\") pod \"community-operators-zffn2\" (UID: \"869965fc-c355-4c93-9776-dc1a070c926e\") " pod="openshift-marketplace/community-operators-zffn2" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.647287 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmrnm\" (UniqueName: \"kubernetes.io/projected/869965fc-c355-4c93-9776-dc1a070c926e-kube-api-access-bmrnm\") pod \"community-operators-zffn2\" (UID: \"869965fc-c355-4c93-9776-dc1a070c926e\") " pod="openshift-marketplace/community-operators-zffn2" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.647424 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.647737 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:02.147723524 +0000 UTC m=+230.903894342 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.662526 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.679401 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58dac5f1-539f-4540-a1f8-9ec492a7e505-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"58dac5f1-539f-4540-a1f8-9ec492a7e505\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.743065 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4d8mk"] Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.743976 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.748926 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.749076 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-utilities\") pod \"certified-operators-4d8mk\" (UID: \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\") " pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.749119 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869965fc-c355-4c93-9776-dc1a070c926e-catalog-content\") pod \"community-operators-zffn2\" (UID: \"869965fc-c355-4c93-9776-dc1a070c926e\") " pod="openshift-marketplace/community-operators-zffn2" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.749166 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng2zz\" (UniqueName: \"kubernetes.io/projected/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-kube-api-access-ng2zz\") pod \"certified-operators-4d8mk\" (UID: \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\") " pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.749214 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869965fc-c355-4c93-9776-dc1a070c926e-utilities\") pod \"community-operators-zffn2\" (UID: \"869965fc-c355-4c93-9776-dc1a070c926e\") " 
pod="openshift-marketplace/community-operators-zffn2" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.749238 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmrnm\" (UniqueName: \"kubernetes.io/projected/869965fc-c355-4c93-9776-dc1a070c926e-kube-api-access-bmrnm\") pod \"community-operators-zffn2\" (UID: \"869965fc-c355-4c93-9776-dc1a070c926e\") " pod="openshift-marketplace/community-operators-zffn2" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.749262 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-catalog-content\") pod \"certified-operators-4d8mk\" (UID: \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\") " pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.750170 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869965fc-c355-4c93-9776-dc1a070c926e-catalog-content\") pod \"community-operators-zffn2\" (UID: \"869965fc-c355-4c93-9776-dc1a070c926e\") " pod="openshift-marketplace/community-operators-zffn2" Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.750263 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:02.250233615 +0000 UTC m=+231.006404443 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.752268 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869965fc-c355-4c93-9776-dc1a070c926e-utilities\") pod \"community-operators-zffn2\" (UID: \"869965fc-c355-4c93-9776-dc1a070c926e\") " pod="openshift-marketplace/community-operators-zffn2" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.757878 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4d8mk"] Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.775536 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmrnm\" (UniqueName: \"kubernetes.io/projected/869965fc-c355-4c93-9776-dc1a070c926e-kube-api-access-bmrnm\") pod \"community-operators-zffn2\" (UID: \"869965fc-c355-4c93-9776-dc1a070c926e\") " pod="openshift-marketplace/community-operators-zffn2" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.791822 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.851678 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng2zz\" (UniqueName: \"kubernetes.io/projected/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-kube-api-access-ng2zz\") pod \"certified-operators-4d8mk\" (UID: \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\") " pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.851740 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-catalog-content\") pod \"certified-operators-4d8mk\" (UID: \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\") " pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.851767 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.851843 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-utilities\") pod \"certified-operators-4d8mk\" (UID: \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\") " pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.852551 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-catalog-content\") pod \"certified-operators-4d8mk\" (UID: \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\") " pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.852615 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-utilities\") pod \"certified-operators-4d8mk\" (UID: \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\") " pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.852870 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:02.352832008 +0000 UTC m=+231.109002826 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.862941 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bnvgq"]
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.867903 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng2zz\" (UniqueName: \"kubernetes.io/projected/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-kube-api-access-ng2zz\") pod \"certified-operators-4d8mk\" (UID: \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\") " pod="openshift-marketplace/certified-operators-4d8mk"
Mar 10 09:48:01 crc kubenswrapper[4794]: W0310 09:48:01.875985 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88245dbf_bf6b_4051_9a3c_91da5a183538.slice/crio-1d1868622a62b82c40e3713c2ddc58846a44ce60f8e18319d20f62a5caf4b4b7 WatchSource:0}: Error finding container 1d1868622a62b82c40e3713c2ddc58846a44ce60f8e18319d20f62a5caf4b4b7: Status 404 returned error can't find the container with id 1d1868622a62b82c40e3713c2ddc58846a44ce60f8e18319d20f62a5caf4b4b7
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.896497 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zffn2"
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.907652 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh" event={"ID":"19ba7a1d-a381-49f2-aa2e-6463336559fe","Type":"ContainerDied","Data":"ecae1904760a9d8f6c89da64b4468cca3e743d52c67ecd3df1d1217cd808866a"}
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.907698 4794 scope.go:117] "RemoveContainer" containerID="3beee2cde0164bfcb074685175ca919eab419dc359bf571b0f008bc511270c73"
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.907663 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-n64xh"
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.919635 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" event={"ID":"f97a286b-f0b0-4309-a3e4-33eea0aea5f8","Type":"ContainerStarted","Data":"71edc60778332b260adac76f64815ce928aae34f2e32c155f120349bb75ed978"}
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.923512 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnvgq" event={"ID":"88245dbf-bf6b-4051-9a3c-91da5a183538","Type":"ContainerStarted","Data":"1d1868622a62b82c40e3713c2ddc58846a44ce60f8e18319d20f62a5caf4b4b7"}
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.925947 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552268-gzdlm" event={"ID":"75d151a9-5d22-4241-9177-7856740702e4","Type":"ContainerStarted","Data":"1a3133abef39b11152825f2379cbf7ea797502cb21786c566d3cab3f67715af4"}
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.942975 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4"
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.944473 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4" event={"ID":"02821716-8fb0-46bc-9c95-4c7ca46500b4","Type":"ContainerDied","Data":"f27f1f6a52bdb3a32252853fe633bf32a368271ef3f581f4f0cdfe063462c49f"}
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.948114 4794 scope.go:117] "RemoveContainer" containerID="0fdbc6aea5410e167044bf5caeee4427a16acfd4ebbea46a9b6019e14174d93e"
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.957770 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:48:01 crc kubenswrapper[4794]: E0310 09:48:01.959287 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:02.459264637 +0000 UTC m=+231.215435455 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.986066 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n64xh"]
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.986114 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n64xh"]
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.989593 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b52gv"]
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.990459 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b52gv"
Mar 10 09:48:01 crc kubenswrapper[4794]: I0310 09:48:01.991277 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b52gv"]
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.056816 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ba7a1d-a381-49f2-aa2e-6463336559fe" path="/var/lib/kubelet/pods/19ba7a1d-a381-49f2-aa2e-6463336559fe/volumes"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.057508 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4"]
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.057530 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.057542 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jscb4"]
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.063210 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c891e53-bebe-462e-a924-5073338f2ac1-utilities\") pod \"community-operators-b52gv\" (UID: \"2c891e53-bebe-462e-a924-5073338f2ac1\") " pod="openshift-marketplace/community-operators-b52gv"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.063265 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.063303 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c891e53-bebe-462e-a924-5073338f2ac1-catalog-content\") pod \"community-operators-b52gv\" (UID: \"2c891e53-bebe-462e-a924-5073338f2ac1\") " pod="openshift-marketplace/community-operators-b52gv"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.063375 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftt2l\" (UniqueName: \"kubernetes.io/projected/2c891e53-bebe-462e-a924-5073338f2ac1-kube-api-access-ftt2l\") pod \"community-operators-b52gv\" (UID: \"2c891e53-bebe-462e-a924-5073338f2ac1\") " pod="openshift-marketplace/community-operators-b52gv"
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.063679 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:02.563667746 +0000 UTC m=+231.319838564 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: W0310 09:48:02.068821 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod58dac5f1_539f_4540_a1f8_9ec492a7e505.slice/crio-6c03e78ae11a854cff642e89ac19ef3bf6cc7a5a3dfac0dd2c85a04f3902ea3d WatchSource:0}: Error finding container 6c03e78ae11a854cff642e89ac19ef3bf6cc7a5a3dfac0dd2c85a04f3902ea3d: Status 404 returned error can't find the container with id 6c03e78ae11a854cff642e89ac19ef3bf6cc7a5a3dfac0dd2c85a04f3902ea3d
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.077705 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4d8mk"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.091589 4794 patch_prober.go:28] interesting pod/router-default-5444994796-tlkl5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 09:48:02 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld
Mar 10 09:48:02 crc kubenswrapper[4794]: [+]process-running ok
Mar 10 09:48:02 crc kubenswrapper[4794]: healthz check failed
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.091700 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tlkl5" podUID="180c50b5-f178-4b42-922c-edfd1deb91d4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.163900 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.164090 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:02.664063403 +0000 UTC m=+231.420234221 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.164280 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c891e53-bebe-462e-a924-5073338f2ac1-utilities\") pod \"community-operators-b52gv\" (UID: \"2c891e53-bebe-462e-a924-5073338f2ac1\") " pod="openshift-marketplace/community-operators-b52gv"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.164322 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.164386 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c891e53-bebe-462e-a924-5073338f2ac1-catalog-content\") pod \"community-operators-b52gv\" (UID: \"2c891e53-bebe-462e-a924-5073338f2ac1\") " pod="openshift-marketplace/community-operators-b52gv"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.164420 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftt2l\" (UniqueName: \"kubernetes.io/projected/2c891e53-bebe-462e-a924-5073338f2ac1-kube-api-access-ftt2l\") pod \"community-operators-b52gv\" (UID: \"2c891e53-bebe-462e-a924-5073338f2ac1\") " pod="openshift-marketplace/community-operators-b52gv"
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.164697 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:02.664679802 +0000 UTC m=+231.420850620 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.165036 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c891e53-bebe-462e-a924-5073338f2ac1-utilities\") pod \"community-operators-b52gv\" (UID: \"2c891e53-bebe-462e-a924-5073338f2ac1\") " pod="openshift-marketplace/community-operators-b52gv"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.165081 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c891e53-bebe-462e-a924-5073338f2ac1-catalog-content\") pod \"community-operators-b52gv\" (UID: \"2c891e53-bebe-462e-a924-5073338f2ac1\") " pod="openshift-marketplace/community-operators-b52gv"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.168400 4794 ???:1] "http: TLS handshake error from 192.168.126.11:56176: no serving certificate available for the kubelet"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.184849 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftt2l\" (UniqueName: \"kubernetes.io/projected/2c891e53-bebe-462e-a924-5073338f2ac1-kube-api-access-ftt2l\") pod \"community-operators-b52gv\" (UID: \"2c891e53-bebe-462e-a924-5073338f2ac1\") " pod="openshift-marketplace/community-operators-b52gv"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.204266 4794 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.265362 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.265559 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:02.765529393 +0000 UTC m=+231.521700211 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.265853 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.266249 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:02.766238674 +0000 UTC m=+231.522409492 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.309304 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b52gv"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.313573 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.366748 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.367108 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:02.867090645 +0000 UTC m=+231.623261463 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.451302 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zffn2"]
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.467601 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eece3bac-ab7c-4a16-82ae-35775eef8806-secret-volume\") pod \"eece3bac-ab7c-4a16-82ae-35775eef8806\" (UID: \"eece3bac-ab7c-4a16-82ae-35775eef8806\") "
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.467683 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eece3bac-ab7c-4a16-82ae-35775eef8806-config-volume\") pod \"eece3bac-ab7c-4a16-82ae-35775eef8806\" (UID: \"eece3bac-ab7c-4a16-82ae-35775eef8806\") "
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.467837 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rplrp\" (UniqueName: \"kubernetes.io/projected/eece3bac-ab7c-4a16-82ae-35775eef8806-kube-api-access-rplrp\") pod \"eece3bac-ab7c-4a16-82ae-35775eef8806\" (UID: \"eece3bac-ab7c-4a16-82ae-35775eef8806\") "
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.467948 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.468472 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:02.968456391 +0000 UTC m=+231.724627229 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.468560 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eece3bac-ab7c-4a16-82ae-35775eef8806-config-volume" (OuterVolumeSpecName: "config-volume") pod "eece3bac-ab7c-4a16-82ae-35775eef8806" (UID: "eece3bac-ab7c-4a16-82ae-35775eef8806"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.473535 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eece3bac-ab7c-4a16-82ae-35775eef8806-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eece3bac-ab7c-4a16-82ae-35775eef8806" (UID: "eece3bac-ab7c-4a16-82ae-35775eef8806"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.477119 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eece3bac-ab7c-4a16-82ae-35775eef8806-kube-api-access-rplrp" (OuterVolumeSpecName: "kube-api-access-rplrp") pod "eece3bac-ab7c-4a16-82ae-35775eef8806" (UID: "eece3bac-ab7c-4a16-82ae-35775eef8806"). InnerVolumeSpecName "kube-api-access-rplrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:48:02 crc kubenswrapper[4794]: W0310 09:48:02.484267 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod869965fc_c355_4c93_9776_dc1a070c926e.slice/crio-0390edaa2c4571644faf4767f082b91047ffe17e250b5e8f9c9e444e2cd89902 WatchSource:0}: Error finding container 0390edaa2c4571644faf4767f082b91047ffe17e250b5e8f9c9e444e2cd89902: Status 404 returned error can't find the container with id 0390edaa2c4571644faf4767f082b91047ffe17e250b5e8f9c9e444e2cd89902
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.523966 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4d8mk"]
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.569520 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b52gv"]
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.569679 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.570132 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rplrp\" (UniqueName: \"kubernetes.io/projected/eece3bac-ab7c-4a16-82ae-35775eef8806-kube-api-access-rplrp\") on node \"crc\" DevicePath \"\""
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.570149 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eece3bac-ab7c-4a16-82ae-35775eef8806-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.570160 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eece3bac-ab7c-4a16-82ae-35775eef8806-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.570229 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:03.07021278 +0000 UTC m=+231.826383598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.584192 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-68c767b974-7nlsr"]
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.584428 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eece3bac-ab7c-4a16-82ae-35775eef8806" containerName="collect-profiles"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.584457 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="eece3bac-ab7c-4a16-82ae-35775eef8806" containerName="collect-profiles"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.584552 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="eece3bac-ab7c-4a16-82ae-35775eef8806" containerName="collect-profiles"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.584817 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"]
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.585215 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.585586 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.597896 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.598222 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.598350 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.598553 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.598672 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.598785 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.598868 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.598934 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.599017 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.599153 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.599261 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.599403 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.602884 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68c767b974-7nlsr"]
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.607179 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.608493 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"]
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.671493 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.671811 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:03.171798863 +0000 UTC m=+231.927969681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.772262 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.772746 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:03.272718806 +0000 UTC m=+232.028889624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.772829 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-serving-cert\") pod \"route-controller-manager-5c8b7bdcb-jdhjs\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.772873 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrpgj\" (UniqueName: \"kubernetes.io/projected/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-kube-api-access-nrpgj\") pod \"route-controller-manager-5c8b7bdcb-jdhjs\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.772911 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-config\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.773012 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-client-ca\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.773036 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9528\" (UniqueName: \"kubernetes.io/projected/80fc46da-4f15-419e-bc48-64a9ad936c6a-kube-api-access-r9528\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.773217 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-proxy-ca-bundles\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.773247 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80fc46da-4f15-419e-bc48-64a9ad936c6a-serving-cert\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.773291 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.773325 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-client-ca\") pod \"route-controller-manager-5c8b7bdcb-jdhjs\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.773376 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-config\") pod \"route-controller-manager-5c8b7bdcb-jdhjs\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.773690 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:03.273678545 +0000 UTC m=+232.029849363 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.874010 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.874120 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-client-ca\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.874151 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9528\" (UniqueName: \"kubernetes.io/projected/80fc46da-4f15-419e-bc48-64a9ad936c6a-kube-api-access-r9528\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.874204 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:03.374183346 +0000 UTC m=+232.130354164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.874250 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-proxy-ca-bundles\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.874275 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80fc46da-4f15-419e-bc48-64a9ad936c6a-serving-cert\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.874298 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-client-ca\") pod \"route-controller-manager-5c8b7bdcb-jdhjs\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.874315 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.874349 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-config\") pod \"route-controller-manager-5c8b7bdcb-jdhjs\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.874415 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-serving-cert\") pod \"route-controller-manager-5c8b7bdcb-jdhjs\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.874436 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrpgj\" (UniqueName: \"kubernetes.io/projected/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-kube-api-access-nrpgj\") pod \"route-controller-manager-5c8b7bdcb-jdhjs\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.874461 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-config\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.874948 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:03.374928948 +0000 UTC m=+232.131099846 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.875767 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-config\") pod \"route-controller-manager-5c8b7bdcb-jdhjs\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.877135 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-client-ca\") pod \"route-controller-manager-5c8b7bdcb-jdhjs\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.877863 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-client-ca\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.878112 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-proxy-ca-bundles\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.882307 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-config\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.882946 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80fc46da-4f15-419e-bc48-64a9ad936c6a-serving-cert\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.888174 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-serving-cert\") pod \"route-controller-manager-5c8b7bdcb-jdhjs\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.890015 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrpgj\" (UniqueName: \"kubernetes.io/projected/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-kube-api-access-nrpgj\") pod \"route-controller-manager-5c8b7bdcb-jdhjs\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.891378 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9528\" (UniqueName: \"kubernetes.io/projected/80fc46da-4f15-419e-bc48-64a9ad936c6a-kube-api-access-r9528\") pod \"controller-manager-68c767b974-7nlsr\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") " pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.941028 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.964236 4794 generic.go:334] "Generic (PLEG): container finished" podID="88245dbf-bf6b-4051-9a3c-91da5a183538" containerID="9a1950959578cf24faa2d48b936bcd50e5c83a87c9e318cdc2c43e24cb40dbdc" exitCode=0
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.964301 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnvgq" event={"ID":"88245dbf-bf6b-4051-9a3c-91da5a183538","Type":"ContainerDied","Data":"9a1950959578cf24faa2d48b936bcd50e5c83a87c9e318cdc2c43e24cb40dbdc"}
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.973309 4794 generic.go:334] "Generic (PLEG): container finished" podID="2c891e53-bebe-462e-a924-5073338f2ac1" containerID="04fb598ca8708bd8249618aed32ed8191b905b395b2871f5b4d7d57e0a7cbceb" exitCode=0
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.973443 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.973743 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b52gv" event={"ID":"2c891e53-bebe-462e-a924-5073338f2ac1","Type":"ContainerDied","Data":"04fb598ca8708bd8249618aed32ed8191b905b395b2871f5b4d7d57e0a7cbceb"}
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.973766 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b52gv" event={"ID":"2c891e53-bebe-462e-a924-5073338f2ac1","Type":"ContainerStarted","Data":"3cabe3c0f008fd383116e851a9bb8a57b46f76458599dae3cdf888ad301adf70"}
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.975167 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.975271 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:03.475254743 +0000 UTC m=+232.231425561 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.976231 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:48:02 crc kubenswrapper[4794]: E0310 09:48:02.976608 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:03.476598424 +0000 UTC m=+232.232769242 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.977323 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.977503 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm" event={"ID":"eece3bac-ab7c-4a16-82ae-35775eef8806","Type":"ContainerDied","Data":"26cb0fb5707601e2aa6a31b3b01be61d23126ea93b3e1b921e2e9abbf3f84244"}
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.977554 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26cb0fb5707601e2aa6a31b3b01be61d23126ea93b3e1b921e2e9abbf3f84244"
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.989026 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"58dac5f1-539f-4540-a1f8-9ec492a7e505","Type":"ContainerStarted","Data":"3c250664bb899118e6f7184632b77dc441bd62ef01ea58d769336590c68d4319"}
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.989067 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"58dac5f1-539f-4540-a1f8-9ec492a7e505","Type":"ContainerStarted","Data":"6c03e78ae11a854cff642e89ac19ef3bf6cc7a5a3dfac0dd2c85a04f3902ea3d"}
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.991095 4794 generic.go:334] "Generic (PLEG): container finished" podID="dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e" containerID="97bb64ed647073b0a6b564113a9baad591fa4a2e5eb7904028e056bf5b4cc925" exitCode=0
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.991149 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4d8mk" event={"ID":"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e","Type":"ContainerDied","Data":"97bb64ed647073b0a6b564113a9baad591fa4a2e5eb7904028e056bf5b4cc925"}
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.991171 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4d8mk" event={"ID":"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e","Type":"ContainerStarted","Data":"f37bec18dcebcae034b8717b756e3dde951466d81babf4babaeba83aeb10a777"}
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.997113 4794 generic.go:334] "Generic (PLEG): container finished" podID="869965fc-c355-4c93-9776-dc1a070c926e" containerID="123bbb1f871d38f1241f71600926c356bbe1dc7c610499df38a8a78ee1ac4de0" exitCode=0
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.997281 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zffn2" event={"ID":"869965fc-c355-4c93-9776-dc1a070c926e","Type":"ContainerDied","Data":"123bbb1f871d38f1241f71600926c356bbe1dc7c610499df38a8a78ee1ac4de0"}
Mar 10 09:48:02 crc kubenswrapper[4794]: I0310 09:48:02.997478 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zffn2" event={"ID":"869965fc-c355-4c93-9776-dc1a070c926e","Type":"ContainerStarted","Data":"0390edaa2c4571644faf4767f082b91047ffe17e250b5e8f9c9e444e2cd89902"}
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.005308 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" event={"ID":"f97a286b-f0b0-4309-a3e4-33eea0aea5f8","Type":"ContainerStarted","Data":"db6d03aed0055b37dc49cc0940120f17795a236b0c15abab7eac7de0d6fdf6b5"}
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.005363 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" event={"ID":"f97a286b-f0b0-4309-a3e4-33eea0aea5f8","Type":"ContainerStarted","Data":"1b2515e92e75e07fd2ef41d112cf51315fa912f179e7a1ce18dfdc9cb55e71fa"}
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.058774 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.058756932 podStartE2EDuration="2.058756932s" podCreationTimestamp="2026-03-10 09:48:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:48:03.058146093 +0000 UTC m=+231.814316931" watchObservedRunningTime="2026-03-10 09:48:03.058756932 +0000 UTC m=+231.814927750"
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.070630 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4mhp"
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.082847 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:48:03 crc kubenswrapper[4794]: E0310 09:48:03.084248 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:48:03.584229579 +0000 UTC m=+232.340400397 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.097713 4794 patch_prober.go:28] interesting pod/router-default-5444994796-tlkl5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 09:48:03 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld
Mar 10 09:48:03 crc kubenswrapper[4794]: [+]process-running ok
Mar 10 09:48:03 crc kubenswrapper[4794]: healthz check failed
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.097760 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tlkl5" podUID="180c50b5-f178-4b42-922c-edfd1deb91d4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.120427 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" podStartSLOduration=12.12040467 podStartE2EDuration="12.12040467s" podCreationTimestamp="2026-03-10 09:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:48:03.083620681 +0000 UTC m=+231.839791499" watchObservedRunningTime="2026-03-10 09:48:03.12040467 +0000 UTC m=+231.876575488"
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.184694 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:48:03 crc kubenswrapper[4794]: E0310 09:48:03.185005 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:48:03.684994138 +0000 UTC m=+232.441164956 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zp5hg" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.193238 4794 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-10T09:48:02.204295286Z","Handler":null,"Name":""}
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.195453 4794 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.195477 4794 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.286289 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.291680 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.387705 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.392250 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.392293 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.423963 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zp5hg\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg"
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.464858 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"]
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.469705 4794 ???:1] "http: TLS handshake error from 192.168.126.11:39188: no serving certificate available for the kubelet"
Mar 10 09:48:03 crc kubenswrapper[4794]: W0310 09:48:03.472022 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbc27c9b_724b_4f9c_aa65_cca9e4af56e3.slice/crio-eceb5a7fbfd9fd57df456c96b15796723e2008f4b06d1ba34506dd43335bb3b8 WatchSource:0}: Error finding container eceb5a7fbfd9fd57df456c96b15796723e2008f4b06d1ba34506dd43335bb3b8: Status 404 returned error can't find the container with id eceb5a7fbfd9fd57df456c96b15796723e2008f4b06d1ba34506dd43335bb3b8
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.490820 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68c767b974-7nlsr"]
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.504077 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-4jppl"
Mar 10 09:48:03 crc kubenswrapper[4794]: W0310 09:48:03.513644 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80fc46da_4f15_419e_bc48_64a9ad936c6a.slice/crio-d22cc678a94b49d1258a07c232af2df04595a7452da5e191386b6171fadf13d5 WatchSource:0}: Error finding container d22cc678a94b49d1258a07c232af2df04595a7452da5e191386b6171fadf13d5: Status 404 returned error can't find the container with id d22cc678a94b49d1258a07c232af2df04595a7452da5e191386b6171fadf13d5
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.514567 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-4jppl"
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.545976 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q7js2"]
Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.547470 4794 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.550818 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.559565 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7js2"] Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.609533 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.695502 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvqmm\" (UniqueName: \"kubernetes.io/projected/0cbd0a5e-393e-4539-86ef-559427986faa-kube-api-access-mvqmm\") pod \"redhat-marketplace-q7js2\" (UID: \"0cbd0a5e-393e-4539-86ef-559427986faa\") " pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.695566 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cbd0a5e-393e-4539-86ef-559427986faa-utilities\") pod \"redhat-marketplace-q7js2\" (UID: \"0cbd0a5e-393e-4539-86ef-559427986faa\") " pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.695710 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cbd0a5e-393e-4539-86ef-559427986faa-catalog-content\") pod \"redhat-marketplace-q7js2\" (UID: \"0cbd0a5e-393e-4539-86ef-559427986faa\") " pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.796954 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cbd0a5e-393e-4539-86ef-559427986faa-catalog-content\") pod \"redhat-marketplace-q7js2\" (UID: \"0cbd0a5e-393e-4539-86ef-559427986faa\") " pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.797232 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvqmm\" (UniqueName: \"kubernetes.io/projected/0cbd0a5e-393e-4539-86ef-559427986faa-kube-api-access-mvqmm\") pod \"redhat-marketplace-q7js2\" (UID: \"0cbd0a5e-393e-4539-86ef-559427986faa\") " pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.797268 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cbd0a5e-393e-4539-86ef-559427986faa-utilities\") pod \"redhat-marketplace-q7js2\" (UID: \"0cbd0a5e-393e-4539-86ef-559427986faa\") " pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.798228 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cbd0a5e-393e-4539-86ef-559427986faa-utilities\") pod \"redhat-marketplace-q7js2\" (UID: \"0cbd0a5e-393e-4539-86ef-559427986faa\") " pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.798229 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cbd0a5e-393e-4539-86ef-559427986faa-catalog-content\") pod \"redhat-marketplace-q7js2\" (UID: \"0cbd0a5e-393e-4539-86ef-559427986faa\") " pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.817667 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvqmm\" (UniqueName: \"kubernetes.io/projected/0cbd0a5e-393e-4539-86ef-559427986faa-kube-api-access-mvqmm\") pod \"redhat-marketplace-q7js2\" (UID: \"0cbd0a5e-393e-4539-86ef-559427986faa\") " pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.910605 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.976821 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zp5hg"] Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.982381 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6hcq9"] Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.983404 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:03 crc kubenswrapper[4794]: I0310 09:48:03.984425 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6hcq9"] Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.015907 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02821716-8fb0-46bc-9c95-4c7ca46500b4" path="/var/lib/kubelet/pods/02821716-8fb0-46bc-9c95-4c7ca46500b4/volumes" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.023624 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.039503 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.040313 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.048227 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.048481 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.085043 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.087835 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.096539 4794 patch_prober.go:28] interesting pod/router-default-5444994796-tlkl5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:48:04 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Mar 10 09:48:04 crc kubenswrapper[4794]: [+]process-running ok Mar 10 09:48:04 crc kubenswrapper[4794]: healthz check failed Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.096600 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tlkl5" podUID="180c50b5-f178-4b42-922c-edfd1deb91d4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.097143 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs" event={"ID":"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3","Type":"ContainerStarted","Data":"86f311bbd7c592b22f2343e859a7ea0ed6d28c6264f5016c0ee4169a1456a72f"} Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.097179 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs" event={"ID":"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3","Type":"ContainerStarted","Data":"eceb5a7fbfd9fd57df456c96b15796723e2008f4b06d1ba34506dd43335bb3b8"} Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.098120 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.104305 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-catalog-content\") pod \"redhat-marketplace-6hcq9\" (UID: \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\") " pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.104352 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp9vs\" (UniqueName: \"kubernetes.io/projected/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-kube-api-access-jp9vs\") pod \"redhat-marketplace-6hcq9\" (UID: \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\") " pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.104382 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-utilities\") pod \"redhat-marketplace-6hcq9\" (UID: \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\") " pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.125685 4794 generic.go:334] "Generic (PLEG): container finished" podID="58dac5f1-539f-4540-a1f8-9ec492a7e505" containerID="3c250664bb899118e6f7184632b77dc441bd62ef01ea58d769336590c68d4319" exitCode=0 Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.125848 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"58dac5f1-539f-4540-a1f8-9ec492a7e505","Type":"ContainerDied","Data":"3c250664bb899118e6f7184632b77dc441bd62ef01ea58d769336590c68d4319"} Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.130308 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr" event={"ID":"80fc46da-4f15-419e-bc48-64a9ad936c6a","Type":"ContainerStarted","Data":"777d81d158e9bcbb458e27bcf95486b42a0096f7ddf5f15dd8654731bb62a776"} Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.130359 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr" event={"ID":"80fc46da-4f15-419e-bc48-64a9ad936c6a","Type":"ContainerStarted","Data":"d22cc678a94b49d1258a07c232af2df04595a7452da5e191386b6171fadf13d5"} Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.131202 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.134020 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" event={"ID":"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3","Type":"ContainerStarted","Data":"ca2739fb9fb4d9f8c71b5b3a76a829b57c173d46fa0313e8d90d9780fe755a71"} Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.137847 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs" podStartSLOduration=3.137834259 podStartE2EDuration="3.137834259s" podCreationTimestamp="2026-03-10 09:48:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:48:04.134758466 +0000 UTC m=+232.890929284" watchObservedRunningTime="2026-03-10 09:48:04.137834259 +0000 UTC m=+232.894005077" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.205657 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d8b333f-f7f3-4af3-bd55-3645d7081b77-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7d8b333f-f7f3-4af3-bd55-3645d7081b77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.206041 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-catalog-content\") pod \"redhat-marketplace-6hcq9\" (UID: \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\") " pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.206060 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d8b333f-f7f3-4af3-bd55-3645d7081b77-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7d8b333f-f7f3-4af3-bd55-3645d7081b77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.206106 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp9vs\" (UniqueName: \"kubernetes.io/projected/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-kube-api-access-jp9vs\") pod \"redhat-marketplace-6hcq9\" (UID: \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\") " pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.206152 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-utilities\") pod \"redhat-marketplace-6hcq9\" (UID: \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\") " pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.206603 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-utilities\") pod \"redhat-marketplace-6hcq9\" (UID: \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\") " pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.207506 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-catalog-content\") pod \"redhat-marketplace-6hcq9\" (UID: \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\") " pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.217869 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr" podStartSLOduration=3.217820221 podStartE2EDuration="3.217820221s" podCreationTimestamp="2026-03-10 09:48:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:48:04.212790039 +0000 UTC m=+232.968960857" watchObservedRunningTime="2026-03-10 09:48:04.217820221 +0000 UTC m=+232.973991039" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.248769 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp9vs\" (UniqueName: \"kubernetes.io/projected/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-kube-api-access-jp9vs\") pod \"redhat-marketplace-6hcq9\" (UID: \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\") " pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.284932 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.299586 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.307069 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d8b333f-f7f3-4af3-bd55-3645d7081b77-kube-api-access\") pod 
\"revision-pruner-8-crc\" (UID: \"7d8b333f-f7f3-4af3-bd55-3645d7081b77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.307207 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d8b333f-f7f3-4af3-bd55-3645d7081b77-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7d8b333f-f7f3-4af3-bd55-3645d7081b77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.313855 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d8b333f-f7f3-4af3-bd55-3645d7081b77-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7d8b333f-f7f3-4af3-bd55-3645d7081b77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.320580 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.387931 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d8b333f-f7f3-4af3-bd55-3645d7081b77-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7d8b333f-f7f3-4af3-bd55-3645d7081b77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.388284 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.552774 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cx8l7"] Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.554295 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.556846 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.565976 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cx8l7"] Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.629597 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7js2"] Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.637708 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-qnrc8" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.722944 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j49q6\" (UniqueName: \"kubernetes.io/projected/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-kube-api-access-j49q6\") pod \"redhat-operators-cx8l7\" (UID: \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\") " pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.723027 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-catalog-content\") pod \"redhat-operators-cx8l7\" (UID: \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\") " pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.723089 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-utilities\") pod \"redhat-operators-cx8l7\" (UID: \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\") " pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.749765 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6hcq9"] Mar 10 09:48:04 crc kubenswrapper[4794]: W0310 09:48:04.784175 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86b8b5f3_501a_44fa_8f78_ceaefcaecfdd.slice/crio-c933f8aa9063d6e53255ad958bf467e9afa76762c930f66fb36f795a71bc0a0c WatchSource:0}: Error finding container c933f8aa9063d6e53255ad958bf467e9afa76762c930f66fb36f795a71bc0a0c: Status 404 returned error can't find the container with id c933f8aa9063d6e53255ad958bf467e9afa76762c930f66fb36f795a71bc0a0c Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.816290 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.816417 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.821754 4794 patch_prober.go:28] interesting pod/console-f9d7485db-skqw6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.821809 4794 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-console/console-f9d7485db-skqw6" podUID="7827a543-d8b2-460b-aee5-212ea1208c0d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.826099 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-utilities\") pod \"redhat-operators-cx8l7\" (UID: \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\") " pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.826211 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j49q6\" (UniqueName: \"kubernetes.io/projected/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-kube-api-access-j49q6\") pod \"redhat-operators-cx8l7\" (UID: \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\") " pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.826287 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-catalog-content\") pod \"redhat-operators-cx8l7\" (UID: \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\") " pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.826716 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-catalog-content\") pod \"redhat-operators-cx8l7\" (UID: \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\") " pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.826959 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-utilities\") pod \"redhat-operators-cx8l7\" (UID: \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\") " pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.848909 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j49q6\" (UniqueName: \"kubernetes.io/projected/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-kube-api-access-j49q6\") pod \"redhat-operators-cx8l7\" (UID: \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\") " pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.890657 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.960191 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rq6sq"] Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.970232 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rq6sq"] Mar 10 09:48:04 crc kubenswrapper[4794]: I0310 09:48:04.970576 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.072081 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.091864 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.127558 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.130406 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj8sg\" (UniqueName: \"kubernetes.io/projected/6840b6f1-4520-4c02-9f69-b238ac692ae5-kube-api-access-fj8sg\") pod \"redhat-operators-rq6sq\" (UID: \"6840b6f1-4520-4c02-9f69-b238ac692ae5\") " pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.130531 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6840b6f1-4520-4c02-9f69-b238ac692ae5-utilities\") pod \"redhat-operators-rq6sq\" (UID: \"6840b6f1-4520-4c02-9f69-b238ac692ae5\") " pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.130774 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6840b6f1-4520-4c02-9f69-b238ac692ae5-catalog-content\") pod \"redhat-operators-rq6sq\" (UID: \"6840b6f1-4520-4c02-9f69-b238ac692ae5\") " pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.131151 4794 patch_prober.go:28] interesting pod/downloads-7954f5f757-bknsm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.131160 4794 patch_prober.go:28] interesting pod/downloads-7954f5f757-bknsm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.131197 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bknsm" podUID="a8b37b88-1555-420f-a9c1-f7e48046f160" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.131188 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-bknsm" podUID="a8b37b88-1555-420f-a9c1-f7e48046f160" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.172537 4794 generic.go:334] "Generic (PLEG): container finished" podID="0cbd0a5e-393e-4539-86ef-559427986faa" containerID="2f94243ea035d2b447baf542f6a149184fb1e3808fe881e9512042eedb1f2534" exitCode=0 Mar 10 09:48:05 crc 
kubenswrapper[4794]: I0310 09:48:05.172601 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7js2" event={"ID":"0cbd0a5e-393e-4539-86ef-559427986faa","Type":"ContainerDied","Data":"2f94243ea035d2b447baf542f6a149184fb1e3808fe881e9512042eedb1f2534"} Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.172627 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7js2" event={"ID":"0cbd0a5e-393e-4539-86ef-559427986faa","Type":"ContainerStarted","Data":"2d721804b25840827005cd1f692bc5cbc597518d2292b478ec625aa65a262b34"} Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.176454 4794 generic.go:334] "Generic (PLEG): container finished" podID="86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" containerID="e82861a1fc8afc141de36c880aa783497a205a03654eb6ba92f3cbf4cf6aab1a" exitCode=0 Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.176509 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hcq9" event={"ID":"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd","Type":"ContainerDied","Data":"e82861a1fc8afc141de36c880aa783497a205a03654eb6ba92f3cbf4cf6aab1a"} Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.176529 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hcq9" event={"ID":"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd","Type":"ContainerStarted","Data":"c933f8aa9063d6e53255ad958bf467e9afa76762c930f66fb36f795a71bc0a0c"} Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.225431 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" event={"ID":"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3","Type":"ContainerStarted","Data":"b10df9598b701f6239dcd28f244b86e29218a74a71906050e3f86265aedddb14"} Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.226537 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.240187 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-tlkl5" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.241126 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj8sg\" (UniqueName: \"kubernetes.io/projected/6840b6f1-4520-4c02-9f69-b238ac692ae5-kube-api-access-fj8sg\") pod \"redhat-operators-rq6sq\" (UID: \"6840b6f1-4520-4c02-9f69-b238ac692ae5\") " pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.247031 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" podStartSLOduration=162.247012725 podStartE2EDuration="2m42.247012725s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:48:05.24321709 +0000 UTC m=+233.999387908" watchObservedRunningTime="2026-03-10 09:48:05.247012725 +0000 UTC m=+234.003183553" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.257102 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6840b6f1-4520-4c02-9f69-b238ac692ae5-utilities\") pod \"redhat-operators-rq6sq\" (UID: \"6840b6f1-4520-4c02-9f69-b238ac692ae5\") 
" pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.257213 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6840b6f1-4520-4c02-9f69-b238ac692ae5-utilities\") pod \"redhat-operators-rq6sq\" (UID: \"6840b6f1-4520-4c02-9f69-b238ac692ae5\") " pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.258140 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6840b6f1-4520-4c02-9f69-b238ac692ae5-catalog-content\") pod \"redhat-operators-rq6sq\" (UID: \"6840b6f1-4520-4c02-9f69-b238ac692ae5\") " pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.257731 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6840b6f1-4520-4c02-9f69-b238ac692ae5-catalog-content\") pod \"redhat-operators-rq6sq\" (UID: \"6840b6f1-4520-4c02-9f69-b238ac692ae5\") " pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.293829 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj8sg\" (UniqueName: \"kubernetes.io/projected/6840b6f1-4520-4c02-9f69-b238ac692ae5-kube-api-access-fj8sg\") pod \"redhat-operators-rq6sq\" (UID: \"6840b6f1-4520-4c02-9f69-b238ac692ae5\") " pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.339698 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:05 crc kubenswrapper[4794]: I0310 09:48:05.466255 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cx8l7"] Mar 10 09:48:06 crc kubenswrapper[4794]: I0310 09:48:06.055036 4794 ???:1] "http: TLS handshake error from 192.168.126.11:39192: no serving certificate available for the kubelet" Mar 10 09:48:07 crc kubenswrapper[4794]: I0310 09:48:07.931278 4794 ???:1] "http: TLS handshake error from 192.168.126.11:39204: no serving certificate available for the kubelet" Mar 10 09:48:09 crc kubenswrapper[4794]: I0310 09:48:09.876073 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2lthx" Mar 10 09:48:11 crc kubenswrapper[4794]: I0310 09:48:11.209969 4794 ???:1] "http: TLS handshake error from 192.168.126.11:39208: no serving certificate available for the kubelet" Mar 10 09:48:12 crc kubenswrapper[4794]: W0310 09:48:12.815953 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7d8b333f_f7f3_4af3_bd55_3645d7081b77.slice/crio-eca5a79dd61d26b5c7285d605ae7a8c7f7f2ffa292877983e5906e90dfed519a WatchSource:0}: Error finding container eca5a79dd61d26b5c7285d605ae7a8c7f7f2ffa292877983e5906e90dfed519a: Status 404 returned error can't find the container with id eca5a79dd61d26b5c7285d605ae7a8c7f7f2ffa292877983e5906e90dfed519a Mar 10 09:48:12 crc kubenswrapper[4794]: W0310 09:48:12.818736 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fe60fc6_df84_4c3b_a73a_0d5a64aeb61a.slice/crio-1161b228ee1af342bca4f1d7c6009c98d690de7d323211060ad9ff2f517ebcb6 WatchSource:0}: Error finding container 
1161b228ee1af342bca4f1d7c6009c98d690de7d323211060ad9ff2f517ebcb6: Status 404 returned error can't find the container with id 1161b228ee1af342bca4f1d7c6009c98d690de7d323211060ad9ff2f517ebcb6 Mar 10 09:48:12 crc kubenswrapper[4794]: I0310 09:48:12.867747 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:48:13 crc kubenswrapper[4794]: I0310 09:48:13.018791 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58dac5f1-539f-4540-a1f8-9ec492a7e505-kubelet-dir\") pod \"58dac5f1-539f-4540-a1f8-9ec492a7e505\" (UID: \"58dac5f1-539f-4540-a1f8-9ec492a7e505\") " Mar 10 09:48:13 crc kubenswrapper[4794]: I0310 09:48:13.018843 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58dac5f1-539f-4540-a1f8-9ec492a7e505-kube-api-access\") pod \"58dac5f1-539f-4540-a1f8-9ec492a7e505\" (UID: \"58dac5f1-539f-4540-a1f8-9ec492a7e505\") " Mar 10 09:48:13 crc kubenswrapper[4794]: I0310 09:48:13.018857 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58dac5f1-539f-4540-a1f8-9ec492a7e505-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "58dac5f1-539f-4540-a1f8-9ec492a7e505" (UID: "58dac5f1-539f-4540-a1f8-9ec492a7e505"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:48:13 crc kubenswrapper[4794]: I0310 09:48:13.019070 4794 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58dac5f1-539f-4540-a1f8-9ec492a7e505-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:13 crc kubenswrapper[4794]: I0310 09:48:13.025084 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58dac5f1-539f-4540-a1f8-9ec492a7e505-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "58dac5f1-539f-4540-a1f8-9ec492a7e505" (UID: "58dac5f1-539f-4540-a1f8-9ec492a7e505"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:13 crc kubenswrapper[4794]: I0310 09:48:13.121048 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58dac5f1-539f-4540-a1f8-9ec492a7e505-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:13 crc kubenswrapper[4794]: I0310 09:48:13.288601 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7d8b333f-f7f3-4af3-bd55-3645d7081b77","Type":"ContainerStarted","Data":"eca5a79dd61d26b5c7285d605ae7a8c7f7f2ffa292877983e5906e90dfed519a"} Mar 10 09:48:13 crc kubenswrapper[4794]: I0310 09:48:13.290103 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"58dac5f1-539f-4540-a1f8-9ec492a7e505","Type":"ContainerDied","Data":"6c03e78ae11a854cff642e89ac19ef3bf6cc7a5a3dfac0dd2c85a04f3902ea3d"} Mar 10 09:48:13 crc kubenswrapper[4794]: I0310 09:48:13.290129 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c03e78ae11a854cff642e89ac19ef3bf6cc7a5a3dfac0dd2c85a04f3902ea3d" Mar 10 09:48:13 crc kubenswrapper[4794]: I0310 09:48:13.290135 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:48:13 crc kubenswrapper[4794]: I0310 09:48:13.290903 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx8l7" event={"ID":"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a","Type":"ContainerStarted","Data":"1161b228ee1af342bca4f1d7c6009c98d690de7d323211060ad9ff2f517ebcb6"} Mar 10 09:48:13 crc kubenswrapper[4794]: I0310 09:48:13.829708 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs\") pod \"network-metrics-daemon-jl52w\" (UID: \"befc934b-d5ba-4fb4-afc6-97614b624ebc\") " pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:48:13 crc kubenswrapper[4794]: I0310 09:48:13.833452 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 09:48:13 crc kubenswrapper[4794]: I0310 09:48:13.846519 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/befc934b-d5ba-4fb4-afc6-97614b624ebc-metrics-certs\") pod \"network-metrics-daemon-jl52w\" (UID: \"befc934b-d5ba-4fb4-afc6-97614b624ebc\") " pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:48:14 crc kubenswrapper[4794]: I0310 09:48:14.029922 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 09:48:14 crc kubenswrapper[4794]: I0310 09:48:14.038261 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl52w" Mar 10 09:48:14 crc kubenswrapper[4794]: I0310 09:48:14.815500 4794 patch_prober.go:28] interesting pod/console-f9d7485db-skqw6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 10 09:48:14 crc kubenswrapper[4794]: I0310 09:48:14.815580 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-skqw6" podUID="7827a543-d8b2-460b-aee5-212ea1208c0d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 10 09:48:15 crc kubenswrapper[4794]: I0310 09:48:15.136156 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-bknsm" Mar 10 09:48:20 crc kubenswrapper[4794]: I0310 09:48:20.343964 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-68c767b974-7nlsr"] Mar 10 09:48:20 crc kubenswrapper[4794]: I0310 09:48:20.344882 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr" podUID="80fc46da-4f15-419e-bc48-64a9ad936c6a" containerName="controller-manager" containerID="cri-o://777d81d158e9bcbb458e27bcf95486b42a0096f7ddf5f15dd8654731bb62a776" gracePeriod=30 Mar 10 09:48:20 crc kubenswrapper[4794]: I0310 09:48:20.376796 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"] Mar 10 09:48:20 crc kubenswrapper[4794]: I0310 09:48:20.377210 4794 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs" podUID="dbc27c9b-724b-4f9c-aa65-cca9e4af56e3" containerName="route-controller-manager" containerID="cri-o://86f311bbd7c592b22f2343e859a7ea0ed6d28c6264f5016c0ee4169a1456a72f" gracePeriod=30 Mar 10 09:48:22 crc kubenswrapper[4794]: I0310 09:48:22.345144 4794 generic.go:334] "Generic (PLEG): container finished" podID="dbc27c9b-724b-4f9c-aa65-cca9e4af56e3" containerID="86f311bbd7c592b22f2343e859a7ea0ed6d28c6264f5016c0ee4169a1456a72f" exitCode=0 Mar 10 09:48:22 crc kubenswrapper[4794]: I0310 09:48:22.345244 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs" event={"ID":"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3","Type":"ContainerDied","Data":"86f311bbd7c592b22f2343e859a7ea0ed6d28c6264f5016c0ee4169a1456a72f"} Mar 10 09:48:22 crc kubenswrapper[4794]: I0310 09:48:22.348393 4794 generic.go:334] "Generic (PLEG): container finished" podID="80fc46da-4f15-419e-bc48-64a9ad936c6a" containerID="777d81d158e9bcbb458e27bcf95486b42a0096f7ddf5f15dd8654731bb62a776" exitCode=0 Mar 10 09:48:22 crc kubenswrapper[4794]: I0310 09:48:22.348459 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr" event={"ID":"80fc46da-4f15-419e-bc48-64a9ad936c6a","Type":"ContainerDied","Data":"777d81d158e9bcbb458e27bcf95486b42a0096f7ddf5f15dd8654731bb62a776"} Mar 10 09:48:22 crc kubenswrapper[4794]: I0310 09:48:22.942240 4794 patch_prober.go:28] interesting pod/route-controller-manager-5c8b7bdcb-jdhjs container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 10 09:48:22 crc kubenswrapper[4794]: I0310 09:48:22.942304 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs" podUID="dbc27c9b-724b-4f9c-aa65-cca9e4af56e3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 10 09:48:22 crc kubenswrapper[4794]: I0310 09:48:22.967824 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:48:22 crc kubenswrapper[4794]: I0310 09:48:22.968125 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:48:22 crc kubenswrapper[4794]: I0310 09:48:22.974137 4794 patch_prober.go:28] interesting pod/controller-manager-68c767b974-7nlsr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 10 09:48:22 crc kubenswrapper[4794]: I0310 09:48:22.974202 4794 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr" podUID="80fc46da-4f15-419e-bc48-64a9ad936c6a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 10 09:48:23 crc kubenswrapper[4794]: I0310 09:48:23.615062 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:48:23 crc kubenswrapper[4794]: E0310 09:48:23.972388 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 10 09:48:23 crc kubenswrapper[4794]: E0310 09:48:23.972544 4794 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:48:23 crc kubenswrapper[4794]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 10 09:48:23 crc kubenswrapper[4794]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zppzf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29552266-9tldc_openshift-infra(f12f506f-5226-41a3-9643-260415a884a5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 10 09:48:23 crc kubenswrapper[4794]: > logger="UnhandledError" Mar 10 09:48:23 crc kubenswrapper[4794]: E0310 09:48:23.974643 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29552266-9tldc" podUID="f12f506f-5226-41a3-9643-260415a884a5" Mar 10 09:48:24 crc kubenswrapper[4794]: E0310 09:48:24.360361 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29552266-9tldc" podUID="f12f506f-5226-41a3-9643-260415a884a5" Mar 10 09:48:24 crc kubenswrapper[4794]: I0310 09:48:24.957140 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:48:24 crc kubenswrapper[4794]: I0310 09:48:24.965320 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-skqw6" Mar 10 09:48:25 crc kubenswrapper[4794]: E0310 09:48:25.066083 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 10 09:48:25 crc kubenswrapper[4794]: E0310 09:48:25.066205 4794 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:48:25 crc kubenswrapper[4794]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 10 09:48:25 crc kubenswrapper[4794]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tmbjk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29552268-gzdlm_openshift-infra(75d151a9-5d22-4241-9177-7856740702e4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 10 09:48:25 crc kubenswrapper[4794]: > logger="UnhandledError" Mar 10 09:48:25 crc kubenswrapper[4794]: E0310 09:48:25.067749 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29552268-gzdlm" podUID="75d151a9-5d22-4241-9177-7856740702e4" Mar 10 09:48:25 crc kubenswrapper[4794]: E0310 09:48:25.365301 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29552268-gzdlm" podUID="75d151a9-5d22-4241-9177-7856740702e4" Mar 10 09:48:26 crc kubenswrapper[4794]: E0310 09:48:26.259945 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 10 09:48:26 crc kubenswrapper[4794]: E0310 09:48:26.260302 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
Mar 10 09:48:26 crc kubenswrapper[4794]: E0310 09:48:26.259945 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 10 09:48:26 crc kubenswrapper[4794]: E0310 09:48:26.260302 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fwx4g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bnvgq_openshift-marketplace(88245dbf-bf6b-4051-9a3c-91da5a183538): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 10 09:48:26 crc kubenswrapper[4794]: E0310 09:48:26.261759 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bnvgq" podUID="88245dbf-bf6b-4051-9a3c-91da5a183538"
Mar 10 09:48:26 crc kubenswrapper[4794]: E0310 09:48:26.267995 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 10 09:48:26 crc kubenswrapper[4794]: E0310 09:48:26.268161 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ng2zz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-4d8mk_openshift-marketplace(dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 10 09:48:26 crc kubenswrapper[4794]: E0310 09:48:26.269451 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-4d8mk" podUID="dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e"
Mar 10 09:48:27 crc kubenswrapper[4794]: E0310 09:48:27.540170 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bnvgq" podUID="88245dbf-bf6b-4051-9a3c-91da5a183538"
Mar 10 09:48:27 crc kubenswrapper[4794]: E0310 09:48:27.540232 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-4d8mk" podUID="dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.598624 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
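[editor's note] Both certified-operators catalog pods have now entered the ErrImagePull -> ImagePullBackOff cycle for the same index image. A few ways to confirm the back-off state and see the registry error directly (pod name taken from the entries above; crictl must be run on the node itself, and pulling from registry.redhat.io requires a valid pull secret):

    # Current phase of the marketplace catalog pods
    oc -n openshift-marketplace get pods

    # The Events section repeats the ErrImagePull / Back-off lines seen in this log
    oc -n openshift-marketplace describe pod certified-operators-bnvgq

    # Retry the pull by hand on the node to see the registry's error directly
    sudo crictl pull registry.redhat.io/redhat/certified-operator-index:v4.18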
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.604625 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:27 crc kubenswrapper[4794]: E0310 09:48:27.629837 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 10 09:48:27 crc kubenswrapper[4794]: E0310 09:48:27.629971 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bmrnm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-zffn2_openshift-marketplace(869965fc-c355-4c93-9776-dc1a070c926e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 10 09:48:27 crc kubenswrapper[4794]: E0310 09:48:27.631273 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zffn2" podUID="869965fc-c355-4c93-9776-dc1a070c926e"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.633840 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"]
Mar 10 09:48:27 crc kubenswrapper[4794]: E0310 09:48:27.634051 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc27c9b-724b-4f9c-aa65-cca9e4af56e3" containerName="route-controller-manager"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.634067 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc27c9b-724b-4f9c-aa65-cca9e4af56e3" containerName="route-controller-manager"
Mar 10 09:48:27 crc kubenswrapper[4794]: E0310 09:48:27.634082 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80fc46da-4f15-419e-bc48-64a9ad936c6a" containerName="controller-manager"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.634090 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="80fc46da-4f15-419e-bc48-64a9ad936c6a" containerName="controller-manager"
Mar 10 09:48:27 crc kubenswrapper[4794]: E0310 09:48:27.634103 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58dac5f1-539f-4540-a1f8-9ec492a7e505" containerName="pruner"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.634109 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="58dac5f1-539f-4540-a1f8-9ec492a7e505" containerName="pruner"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.634195 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="80fc46da-4f15-419e-bc48-64a9ad936c6a" containerName="controller-manager"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.634206 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc27c9b-724b-4f9c-aa65-cca9e4af56e3" containerName="route-controller-manager"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.634215 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="58dac5f1-539f-4540-a1f8-9ec492a7e505" containerName="pruner"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.634600 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.635808 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"]
Mar 10 09:48:27 crc kubenswrapper[4794]: E0310 09:48:27.639494 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 10 09:48:27 crc kubenswrapper[4794]: E0310 09:48:27.639637 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ftt2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-b52gv_openshift-marketplace(2c891e53-bebe-462e-a924-5073338f2ac1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 10 09:48:27 crc kubenswrapper[4794]: E0310 09:48:27.640820 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-b52gv" podUID="2c891e53-bebe-462e-a924-5073338f2ac1"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.727850 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9528\" (UniqueName: \"kubernetes.io/projected/80fc46da-4f15-419e-bc48-64a9ad936c6a-kube-api-access-r9528\") pod \"80fc46da-4f15-419e-bc48-64a9ad936c6a\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") "
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.728139 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrpgj\" (UniqueName: \"kubernetes.io/projected/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-kube-api-access-nrpgj\") pod \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") "
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.728189 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-config\") pod \"80fc46da-4f15-419e-bc48-64a9ad936c6a\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") "
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.728218 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-serving-cert\") pod \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") "
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.728234 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80fc46da-4f15-419e-bc48-64a9ad936c6a-serving-cert\") pod \"80fc46da-4f15-419e-bc48-64a9ad936c6a\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") "
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.728252 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-client-ca\") pod \"80fc46da-4f15-419e-bc48-64a9ad936c6a\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") "
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.728276 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-proxy-ca-bundles\") pod \"80fc46da-4f15-419e-bc48-64a9ad936c6a\" (UID: \"80fc46da-4f15-419e-bc48-64a9ad936c6a\") "
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.728295 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-config\") pod \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") "
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.728348 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-client-ca\") pod \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\" (UID: \"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3\") "
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.728450 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-config\") pod \"route-controller-manager-76c9db8f4f-cmvvb\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.728548 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-client-ca\") pod \"route-controller-manager-76c9db8f4f-cmvvb\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.728574 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-serving-cert\") pod \"route-controller-manager-76c9db8f4f-cmvvb\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.728589 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6j7x\" (UniqueName: \"kubernetes.io/projected/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-kube-api-access-j6j7x\") pod \"route-controller-manager-76c9db8f4f-cmvvb\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.729192 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-client-ca" (OuterVolumeSpecName: "client-ca") pod "dbc27c9b-724b-4f9c-aa65-cca9e4af56e3" (UID: "dbc27c9b-724b-4f9c-aa65-cca9e4af56e3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.729446 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-client-ca" (OuterVolumeSpecName: "client-ca") pod "80fc46da-4f15-419e-bc48-64a9ad936c6a" (UID: "80fc46da-4f15-419e-bc48-64a9ad936c6a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.729839 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "80fc46da-4f15-419e-bc48-64a9ad936c6a" (UID: "80fc46da-4f15-419e-bc48-64a9ad936c6a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.729846 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-config" (OuterVolumeSpecName: "config") pod "dbc27c9b-724b-4f9c-aa65-cca9e4af56e3" (UID: "dbc27c9b-724b-4f9c-aa65-cca9e4af56e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.730116 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-config" (OuterVolumeSpecName: "config") pod "80fc46da-4f15-419e-bc48-64a9ad936c6a" (UID: "80fc46da-4f15-419e-bc48-64a9ad936c6a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.738533 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dbc27c9b-724b-4f9c-aa65-cca9e4af56e3" (UID: "dbc27c9b-724b-4f9c-aa65-cca9e4af56e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.738546 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-kube-api-access-nrpgj" (OuterVolumeSpecName: "kube-api-access-nrpgj") pod "dbc27c9b-724b-4f9c-aa65-cca9e4af56e3" (UID: "dbc27c9b-724b-4f9c-aa65-cca9e4af56e3"). InnerVolumeSpecName "kube-api-access-nrpgj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.738610 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80fc46da-4f15-419e-bc48-64a9ad936c6a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "80fc46da-4f15-419e-bc48-64a9ad936c6a" (UID: "80fc46da-4f15-419e-bc48-64a9ad936c6a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.739573 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80fc46da-4f15-419e-bc48-64a9ad936c6a-kube-api-access-r9528" (OuterVolumeSpecName: "kube-api-access-r9528") pod "80fc46da-4f15-419e-bc48-64a9ad936c6a" (UID: "80fc46da-4f15-419e-bc48-64a9ad936c6a"). InnerVolumeSpecName "kube-api-access-r9528". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.829613 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-config\") pod \"route-controller-manager-76c9db8f4f-cmvvb\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.829736 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-client-ca\") pod \"route-controller-manager-76c9db8f4f-cmvvb\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.829769 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-serving-cert\") pod \"route-controller-manager-76c9db8f4f-cmvvb\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.829793 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6j7x\" (UniqueName: \"kubernetes.io/projected/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-kube-api-access-j6j7x\") pod \"route-controller-manager-76c9db8f4f-cmvvb\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.829849 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.829863 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.829875 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.829889 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9528\" (UniqueName: \"kubernetes.io/projected/80fc46da-4f15-419e-bc48-64a9ad936c6a-kube-api-access-r9528\") on node \"crc\" DevicePath \"\""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.829902 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrpgj\" (UniqueName: \"kubernetes.io/projected/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-kube-api-access-nrpgj\") on node \"crc\" DevicePath \"\""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.829912 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.829922 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.829933 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80fc46da-4f15-419e-bc48-64a9ad936c6a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.829943 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80fc46da-4f15-419e-bc48-64a9ad936c6a-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.830979 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-config\") pod \"route-controller-manager-76c9db8f4f-cmvvb\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.831720 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-client-ca\") pod \"route-controller-manager-76c9db8f4f-cmvvb\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.833429 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-serving-cert\") pod \"route-controller-manager-76c9db8f4f-cmvvb\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.846451 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6j7x\" (UniqueName: \"kubernetes.io/projected/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-kube-api-access-j6j7x\") pod \"route-controller-manager-76c9db8f4f-cmvvb\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:27 crc kubenswrapper[4794]: I0310 09:48:27.974043 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:28 crc kubenswrapper[4794]: I0310 09:48:28.378004 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"
Mar 10 09:48:28 crc kubenswrapper[4794]: I0310 09:48:28.378003 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs" event={"ID":"dbc27c9b-724b-4f9c-aa65-cca9e4af56e3","Type":"ContainerDied","Data":"eceb5a7fbfd9fd57df456c96b15796723e2008f4b06d1ba34506dd43335bb3b8"}
Mar 10 09:48:28 crc kubenswrapper[4794]: I0310 09:48:28.378376 4794 scope.go:117] "RemoveContainer" containerID="86f311bbd7c592b22f2343e859a7ea0ed6d28c6264f5016c0ee4169a1456a72f"
Mar 10 09:48:28 crc kubenswrapper[4794]: I0310 09:48:28.379371 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr" event={"ID":"80fc46da-4f15-419e-bc48-64a9ad936c6a","Type":"ContainerDied","Data":"d22cc678a94b49d1258a07c232af2df04595a7452da5e191386b6171fadf13d5"}
Mar 10 09:48:28 crc kubenswrapper[4794]: I0310 09:48:28.379429 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68c767b974-7nlsr"
Mar 10 09:48:28 crc kubenswrapper[4794]: I0310 09:48:28.429889 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"]
Mar 10 09:48:28 crc kubenswrapper[4794]: I0310 09:48:28.435466 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8b7bdcb-jdhjs"]
Mar 10 09:48:28 crc kubenswrapper[4794]: I0310 09:48:28.439120 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-68c767b974-7nlsr"]
Mar 10 09:48:28 crc kubenswrapper[4794]: I0310 09:48:28.442199 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-68c767b974-7nlsr"]
Mar 10 09:48:29 crc kubenswrapper[4794]: E0310 09:48:29.031824 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-b52gv" podUID="2c891e53-bebe-462e-a924-5073338f2ac1"
Mar 10 09:48:29 crc kubenswrapper[4794]: E0310 09:48:29.031860 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zffn2" podUID="869965fc-c355-4c93-9776-dc1a070c926e"
Mar 10 09:48:29 crc kubenswrapper[4794]: I0310 09:48:29.144317 4794 scope.go:117] "RemoveContainer" containerID="777d81d158e9bcbb458e27bcf95486b42a0096f7ddf5f15dd8654731bb62a776"
Mar 10 09:48:29 crc kubenswrapper[4794]: I0310 09:48:29.397770 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hcq9" event={"ID":"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd","Type":"ContainerStarted","Data":"b53b9ac8b80c41da591dcbf3aad299c79c4bef06655be38b313281d853503d83"}
Mar 10 09:48:29 crc kubenswrapper[4794]: I0310 09:48:29.405454 4794 generic.go:334] "Generic (PLEG): container finished" podID="1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" containerID="98ee9b77d2b1c7e0683d83e11e018be3037e3f647628e34b0989b07c2096c4f3" exitCode=0
Mar 10 09:48:29 crc kubenswrapper[4794]: I0310 09:48:29.405513 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx8l7" event={"ID":"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a","Type":"ContainerDied","Data":"98ee9b77d2b1c7e0683d83e11e018be3037e3f647628e34b0989b07c2096c4f3"}
Mar 10 09:48:29 crc kubenswrapper[4794]: I0310 09:48:29.410976 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"]
Mar 10 09:48:29 crc kubenswrapper[4794]: I0310 09:48:29.453592 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jl52w"]
Mar 10 09:48:29 crc kubenswrapper[4794]: W0310 09:48:29.461059 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbefc934b_d5ba_4fb4_afc6_97614b624ebc.slice/crio-37e0988587b4dd26975fc26feee785871320f3e4bbc7586db5c261ca673435fe WatchSource:0}: Error finding container 37e0988587b4dd26975fc26feee785871320f3e4bbc7586db5c261ca673435fe: Status 404 returned error can't find the container with id 37e0988587b4dd26975fc26feee785871320f3e4bbc7586db5c261ca673435fe
Mar 10 09:48:29 crc kubenswrapper[4794]: I0310 09:48:29.524019 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rq6sq"]
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.006274 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80fc46da-4f15-419e-bc48-64a9ad936c6a" path="/var/lib/kubelet/pods/80fc46da-4f15-419e-bc48-64a9ad936c6a/volumes"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.007193 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbc27c9b-724b-4f9c-aa65-cca9e4af56e3" path="/var/lib/kubelet/pods/dbc27c9b-724b-4f9c-aa65-cca9e4af56e3/volumes"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.411676 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jl52w" event={"ID":"befc934b-d5ba-4fb4-afc6-97614b624ebc","Type":"ContainerStarted","Data":"af075be2db7f5bee1af9cef54d7324031dbf80fa2168a6766ddadf3a709ca469"}
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.412732 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jl52w" event={"ID":"befc934b-d5ba-4fb4-afc6-97614b624ebc","Type":"ContainerStarted","Data":"f67ca065485be1e9232adbd647e654e86f7f28f677d72f0a03685acf46dc26b4"}
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.412750 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jl52w" event={"ID":"befc934b-d5ba-4fb4-afc6-97614b624ebc","Type":"ContainerStarted","Data":"37e0988587b4dd26975fc26feee785871320f3e4bbc7586db5c261ca673435fe"}
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.418873 4794 generic.go:334] "Generic (PLEG): container finished" podID="86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" containerID="b53b9ac8b80c41da591dcbf3aad299c79c4bef06655be38b313281d853503d83" exitCode=0
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.418945 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hcq9" event={"ID":"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd","Type":"ContainerDied","Data":"b53b9ac8b80c41da591dcbf3aad299c79c4bef06655be38b313281d853503d83"}
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.421936 4794 generic.go:334] "Generic (PLEG): container finished" podID="7d8b333f-f7f3-4af3-bd55-3645d7081b77" containerID="df2350821f96bdadd78c001254ab9f4a5b6dbc85370c58ec39c84b290c9ec53e" exitCode=0
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.422000 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7d8b333f-f7f3-4af3-bd55-3645d7081b77","Type":"ContainerDied","Data":"df2350821f96bdadd78c001254ab9f4a5b6dbc85370c58ec39c84b290c9ec53e"}
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.427908 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jl52w" podStartSLOduration=187.427886437 podStartE2EDuration="3m7.427886437s" podCreationTimestamp="2026-03-10 09:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:48:30.424451114 +0000 UTC m=+259.180621932" watchObservedRunningTime="2026-03-10 09:48:30.427886437 +0000 UTC m=+259.184057265"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.447933 4794 generic.go:334] "Generic (PLEG): container finished" podID="6840b6f1-4520-4c02-9f69-b238ac692ae5" containerID="f349060de8d52fc556856175ffbaa35343781ed37addfad8d27136bb86351d17" exitCode=0
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.449578 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rq6sq" event={"ID":"6840b6f1-4520-4c02-9f69-b238ac692ae5","Type":"ContainerDied","Data":"f349060de8d52fc556856175ffbaa35343781ed37addfad8d27136bb86351d17"}
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.449792 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rq6sq" event={"ID":"6840b6f1-4520-4c02-9f69-b238ac692ae5","Type":"ContainerStarted","Data":"3ecf087b67f87a48241f472f7540e4d812b711a2b66ba25a3c2034d5ca0aeaa6"}
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.453457 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb" event={"ID":"73161eb6-50b2-4b4b-b52e-40b38b3efd9a","Type":"ContainerStarted","Data":"7e52f162adcaf84ec7763e3c776f6090231dc147a2c14e4bfe3dab84a49fd623"}
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.453494 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb" event={"ID":"73161eb6-50b2-4b4b-b52e-40b38b3efd9a","Type":"ContainerStarted","Data":"c22c720da28dbc0047b8e2404215d49b187dde17c7887269725c072bf7449c74"}
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.454054 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.460832 4794 generic.go:334] "Generic (PLEG): container finished" podID="0cbd0a5e-393e-4539-86ef-559427986faa" containerID="4cccfbb55efed4efdee22b1df0454599adf64b78e912c51336c4bec0ca666c88" exitCode=0
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.461782 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7js2" event={"ID":"0cbd0a5e-393e-4539-86ef-559427986faa","Type":"ContainerDied","Data":"4cccfbb55efed4efdee22b1df0454599adf64b78e912c51336c4bec0ca666c88"}
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.464218 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.521945 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb" podStartSLOduration=10.521927603 podStartE2EDuration="10.521927603s" podCreationTimestamp="2026-03-10 09:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:48:30.520111337 +0000 UTC m=+259.276282165" watchObservedRunningTime="2026-03-10 09:48:30.521927603 +0000 UTC m=+259.278098441"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.597786 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"]
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.599958 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.607032 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.607181 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.608218 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.609195 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.609383 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.611219 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.623485 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.625321 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"]
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.675142 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw58v\" (UniqueName: \"kubernetes.io/projected/32fdf926-a589-4496-8773-c28effa19c31-kube-api-access-nw58v\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.675199 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-config\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.675276 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32fdf926-a589-4496-8773-c28effa19c31-serving-cert\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.675294 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-client-ca\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.675326 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-proxy-ca-bundles\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.776571 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-proxy-ca-bundles\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.776620 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw58v\" (UniqueName: \"kubernetes.io/projected/32fdf926-a589-4496-8773-c28effa19c31-kube-api-access-nw58v\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.776680 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-config\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.776777 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32fdf926-a589-4496-8773-c28effa19c31-serving-cert\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.777110 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-client-ca\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.778070 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-client-ca\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.778085 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-config\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.780557 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-proxy-ca-bundles\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.785624 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32fdf926-a589-4496-8773-c28effa19c31-serving-cert\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.793077 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw58v\" (UniqueName: \"kubernetes.io/projected/32fdf926-a589-4496-8773-c28effa19c31-kube-api-access-nw58v\") pod \"controller-manager-6987fd4cbd-27sjp\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:30 crc kubenswrapper[4794]: I0310 09:48:30.926108 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.101946 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"]
Mar 10 09:48:31 crc kubenswrapper[4794]: W0310 09:48:31.109558 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32fdf926_a589_4496_8773_c28effa19c31.slice/crio-7ffa4384c6b4ab1f235350e80708b247f6f8bfc665e4667b304abc080f745542 WatchSource:0}: Error finding container 7ffa4384c6b4ab1f235350e80708b247f6f8bfc665e4667b304abc080f745542: Status 404 returned error can't find the container with id 7ffa4384c6b4ab1f235350e80708b247f6f8bfc665e4667b304abc080f745542
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.467476 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp" event={"ID":"32fdf926-a589-4496-8773-c28effa19c31","Type":"ContainerStarted","Data":"b457de1513341efbbdf27804d4325587b1ca45e637fb46cc1afa7d62da652291"}
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.467912 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp" event={"ID":"32fdf926-a589-4496-8773-c28effa19c31","Type":"ContainerStarted","Data":"7ffa4384c6b4ab1f235350e80708b247f6f8bfc665e4667b304abc080f745542"}
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.467933 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.469958 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7js2" event={"ID":"0cbd0a5e-393e-4539-86ef-559427986faa","Type":"ContainerStarted","Data":"1edd73cb5575e96ed483f8bfdcf30f5d909505595954c18a9a542f61e3dc57b2"}
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.472233 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hcq9" event={"ID":"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd","Type":"ContainerStarted","Data":"3df217ab7a77920b1c6587cba3d2add14b8ad14f788e2c4c0d7b52837dfffba2"}
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.477879 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.482491 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp" podStartSLOduration=11.482478041 podStartE2EDuration="11.482478041s" podCreationTimestamp="2026-03-10 09:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:48:31.479853406 +0000 UTC m=+260.236024224" watchObservedRunningTime="2026-03-10 09:48:31.482478041 +0000 UTC m=+260.238648859"
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.526511 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6hcq9" podStartSLOduration=10.053994552 podStartE2EDuration="28.526488212s" podCreationTimestamp="2026-03-10 09:48:03 +0000 UTC" firstStartedPulling="2026-03-10 09:48:12.367304627 +0000 UTC m=+241.123475445" lastFinishedPulling="2026-03-10 09:48:30.839798287 +0000 UTC m=+259.595969105" observedRunningTime="2026-03-10 09:48:31.523165172 +0000 UTC m=+260.279335990" watchObservedRunningTime="2026-03-10 09:48:31.526488212 +0000 UTC m=+260.282659030"
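[editor's note] The startup-latency tracker logs two durations for redhat-marketplace-6hcq9: podStartE2EDuration is wall-clock time from pod creation to the observed running state, while podStartSLOduration excludes the image-pull window. The numbers in the entry above check out exactly when computed from the monotonic m=+ offsets:

    # podStartSLOduration = E2E duration - (lastFinishedPulling - firstStartedPulling)
    awk 'BEGIN {
      e2e  = 28.526488212                   # podStartE2EDuration
      pull = 259.595969105 - 241.123475445  # pull window from the m=+ offsets
      printf "%.9f\n", e2e - pull           # prints 10.053994552 = podStartSLOduration
    }'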
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.542584 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q7js2" podStartSLOduration=10.035585423 podStartE2EDuration="28.542563083s" podCreationTimestamp="2026-03-10 09:48:03 +0000 UTC" firstStartedPulling="2026-03-10 09:48:12.36740162 +0000 UTC m=+241.123572438" lastFinishedPulling="2026-03-10 09:48:30.87437927 +0000 UTC m=+259.630550098" observedRunningTime="2026-03-10 09:48:31.539959589 +0000 UTC m=+260.296130397" watchObservedRunningTime="2026-03-10 09:48:31.542563083 +0000 UTC m=+260.298733901"
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.716965 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.720867 4794 ???:1] "http: TLS handshake error from 192.168.126.11:36334: no serving certificate available for the kubelet"
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.895187 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d8b333f-f7f3-4af3-bd55-3645d7081b77-kubelet-dir\") pod \"7d8b333f-f7f3-4af3-bd55-3645d7081b77\" (UID: \"7d8b333f-f7f3-4af3-bd55-3645d7081b77\") "
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.895235 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d8b333f-f7f3-4af3-bd55-3645d7081b77-kube-api-access\") pod \"7d8b333f-f7f3-4af3-bd55-3645d7081b77\" (UID: \"7d8b333f-f7f3-4af3-bd55-3645d7081b77\") "
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.895345 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d8b333f-f7f3-4af3-bd55-3645d7081b77-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7d8b333f-f7f3-4af3-bd55-3645d7081b77" (UID: "7d8b333f-f7f3-4af3-bd55-3645d7081b77"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.895468 4794 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d8b333f-f7f3-4af3-bd55-3645d7081b77-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.911008 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d8b333f-f7f3-4af3-bd55-3645d7081b77-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7d8b333f-f7f3-4af3-bd55-3645d7081b77" (UID: "7d8b333f-f7f3-4af3-bd55-3645d7081b77"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
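[editor's note] The "???:1" TLS handshake entry above is the kubelet refusing an inbound connection on its HTTPS endpoint because it has no serving certificate yet, consistent with kubelet-serving CSRs sitting unapproved while the auto-csr-approver CronJob is stuck in ImagePullBackOff. Pending serving requests can be listed as below and then approved with the same one-liner shown earlier:

    # Pending CSRs show an empty CONDITION column; the kubelet's serving
    # requests carry the kubernetes.io/kubelet-serving signer
    oc get csr --sort-by=.metadata.creationTimestamp | grep kubelet-serving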
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:31 crc kubenswrapper[4794]: I0310 09:48:31.997304 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d8b333f-f7f3-4af3-bd55-3645d7081b77-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:32 crc kubenswrapper[4794]: I0310 09:48:32.481039 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7d8b333f-f7f3-4af3-bd55-3645d7081b77","Type":"ContainerDied","Data":"eca5a79dd61d26b5c7285d605ae7a8c7f7f2ffa292877983e5906e90dfed519a"} Mar 10 09:48:32 crc kubenswrapper[4794]: I0310 09:48:32.481348 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eca5a79dd61d26b5c7285d605ae7a8c7f7f2ffa292877983e5906e90dfed519a" Mar 10 09:48:32 crc kubenswrapper[4794]: I0310 09:48:32.481058 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:48:33 crc kubenswrapper[4794]: I0310 09:48:33.911865 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:33 crc kubenswrapper[4794]: I0310 09:48:33.912181 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:34 crc kubenswrapper[4794]: I0310 09:48:34.045082 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:34 crc kubenswrapper[4794]: I0310 09:48:34.321327 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:34 crc kubenswrapper[4794]: I0310 09:48:34.321388 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:34 crc kubenswrapper[4794]: I0310 09:48:34.365459 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:35 crc kubenswrapper[4794]: I0310 09:48:35.043128 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gg8jk" Mar 10 09:48:38 crc kubenswrapper[4794]: I0310 09:48:38.430172 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 09:48:38 crc kubenswrapper[4794]: E0310 09:48:38.430634 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8b333f-f7f3-4af3-bd55-3645d7081b77" containerName="pruner" Mar 10 09:48:38 crc kubenswrapper[4794]: I0310 09:48:38.430647 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8b333f-f7f3-4af3-bd55-3645d7081b77" containerName="pruner" Mar 10 09:48:38 crc kubenswrapper[4794]: I0310 09:48:38.430758 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d8b333f-f7f3-4af3-bd55-3645d7081b77" containerName="pruner" Mar 10 09:48:38 crc kubenswrapper[4794]: I0310 09:48:38.431102 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:48:38 crc kubenswrapper[4794]: I0310 09:48:38.433804 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 09:48:38 crc kubenswrapper[4794]: I0310 09:48:38.435602 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 09:48:38 crc kubenswrapper[4794]: I0310 09:48:38.447667 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 09:48:38 crc kubenswrapper[4794]: I0310 09:48:38.491715 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3bfb465f-2aa5-4e4b-93ca-774f6cb12f28-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3bfb465f-2aa5-4e4b-93ca-774f6cb12f28\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:48:38 crc kubenswrapper[4794]: I0310 09:48:38.491849 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3bfb465f-2aa5-4e4b-93ca-774f6cb12f28-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3bfb465f-2aa5-4e4b-93ca-774f6cb12f28\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:48:38 crc kubenswrapper[4794]: I0310 09:48:38.593053 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3bfb465f-2aa5-4e4b-93ca-774f6cb12f28-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3bfb465f-2aa5-4e4b-93ca-774f6cb12f28\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:48:38 crc kubenswrapper[4794]: I0310 09:48:38.593152 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3bfb465f-2aa5-4e4b-93ca-774f6cb12f28-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3bfb465f-2aa5-4e4b-93ca-774f6cb12f28\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:48:38 crc kubenswrapper[4794]: I0310 09:48:38.593874 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3bfb465f-2aa5-4e4b-93ca-774f6cb12f28-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3bfb465f-2aa5-4e4b-93ca-774f6cb12f28\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:48:38 crc kubenswrapper[4794]: I0310 09:48:38.613136 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3bfb465f-2aa5-4e4b-93ca-774f6cb12f28-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3bfb465f-2aa5-4e4b-93ca-774f6cb12f28\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:48:38 crc kubenswrapper[4794]: I0310 09:48:38.757029 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:48:39 crc kubenswrapper[4794]: I0310 09:48:39.171983 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 09:48:39 crc kubenswrapper[4794]: W0310 09:48:39.180502 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3bfb465f_2aa5_4e4b_93ca_774f6cb12f28.slice/crio-d66d1251c9f402452ac247257e1486d97a054e3c21e5f48f242d4b58017c4fa6 WatchSource:0}: Error finding container d66d1251c9f402452ac247257e1486d97a054e3c21e5f48f242d4b58017c4fa6: Status 404 returned error can't find the container with id d66d1251c9f402452ac247257e1486d97a054e3c21e5f48f242d4b58017c4fa6 Mar 10 09:48:39 crc kubenswrapper[4794]: I0310 09:48:39.521837 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx8l7" event={"ID":"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a","Type":"ContainerStarted","Data":"e0c23d7e79ef5eac72c4b0ab3717e1faf359bf880623ab9990dd8488d38e240b"} Mar 10 09:48:39 crc kubenswrapper[4794]: I0310 09:48:39.525635 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rq6sq" event={"ID":"6840b6f1-4520-4c02-9f69-b238ac692ae5","Type":"ContainerStarted","Data":"df76e4024f94af4d9ef41e07bbfdaf7e85fa0ff7446fdb21859d29906b43a317"} Mar 10 09:48:39 crc kubenswrapper[4794]: I0310 09:48:39.526862 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3bfb465f-2aa5-4e4b-93ca-774f6cb12f28","Type":"ContainerStarted","Data":"f2942662db9da0d348d5ea9660d19f108ce262b813097d9a9d9362927ec4ddf7"} Mar 10 09:48:39 crc kubenswrapper[4794]: I0310 09:48:39.526897 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3bfb465f-2aa5-4e4b-93ca-774f6cb12f28","Type":"ContainerStarted","Data":"d66d1251c9f402452ac247257e1486d97a054e3c21e5f48f242d4b58017c4fa6"} Mar 10 09:48:39 crc kubenswrapper[4794]: I0310 09:48:39.528282 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552268-gzdlm" event={"ID":"75d151a9-5d22-4241-9177-7856740702e4","Type":"ContainerStarted","Data":"c264f2b0994e995bf93906c45f1565d45f2ea144f2807b0b2316545f9b3c7131"} Mar 10 09:48:39 crc kubenswrapper[4794]: I0310 09:48:39.556820 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552268-gzdlm" podStartSLOduration=1.6540696110000002 podStartE2EDuration="39.556799571s" podCreationTimestamp="2026-03-10 09:48:00 +0000 UTC" firstStartedPulling="2026-03-10 09:48:01.018298234 +0000 UTC m=+229.774469062" lastFinishedPulling="2026-03-10 09:48:38.921028214 +0000 UTC m=+267.677199022" observedRunningTime="2026-03-10 09:48:39.553470121 +0000 UTC m=+268.309640939" watchObservedRunningTime="2026-03-10 09:48:39.556799571 +0000 UTC m=+268.312970379" Mar 10 09:48:39 crc kubenswrapper[4794]: I0310 09:48:39.570494 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.5704790960000001 podStartE2EDuration="1.570479096s" podCreationTimestamp="2026-03-10 09:48:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:48:39.566872355 +0000 UTC m=+268.323043173" watchObservedRunningTime="2026-03-10 
09:48:39.570479096 +0000 UTC m=+268.326649914" Mar 10 09:48:39 crc kubenswrapper[4794]: I0310 09:48:39.613548 4794 csr.go:261] certificate signing request csr-rclkv is approved, waiting to be issued Mar 10 09:48:39 crc kubenswrapper[4794]: I0310 09:48:39.618785 4794 csr.go:257] certificate signing request csr-rclkv is issued Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.334574 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"] Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.334774 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp" podUID="32fdf926-a589-4496-8773-c28effa19c31" containerName="controller-manager" containerID="cri-o://b457de1513341efbbdf27804d4325587b1ca45e637fb46cc1afa7d62da652291" gracePeriod=30 Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.436088 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"] Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.436278 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb" podUID="73161eb6-50b2-4b4b-b52e-40b38b3efd9a" containerName="route-controller-manager" containerID="cri-o://7e52f162adcaf84ec7763e3c776f6090231dc147a2c14e4bfe3dab84a49fd623" gracePeriod=30 Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.534111 4794 generic.go:334] "Generic (PLEG): container finished" podID="75d151a9-5d22-4241-9177-7856740702e4" containerID="c264f2b0994e995bf93906c45f1565d45f2ea144f2807b0b2316545f9b3c7131" exitCode=0 Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.534160 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552268-gzdlm" event={"ID":"75d151a9-5d22-4241-9177-7856740702e4","Type":"ContainerDied","Data":"c264f2b0994e995bf93906c45f1565d45f2ea144f2807b0b2316545f9b3c7131"} Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.536858 4794 generic.go:334] "Generic (PLEG): container finished" podID="dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e" containerID="3e150bfb91adedfa4aa0f3b8935534f2fd60a447dd207906377db0be8f5854ce" exitCode=0 Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.536886 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4d8mk" event={"ID":"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e","Type":"ContainerDied","Data":"3e150bfb91adedfa4aa0f3b8935534f2fd60a447dd207906377db0be8f5854ce"} Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.538371 4794 generic.go:334] "Generic (PLEG): container finished" podID="1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" containerID="e0c23d7e79ef5eac72c4b0ab3717e1faf359bf880623ab9990dd8488d38e240b" exitCode=0 Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.538401 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx8l7" event={"ID":"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a","Type":"ContainerDied","Data":"e0c23d7e79ef5eac72c4b0ab3717e1faf359bf880623ab9990dd8488d38e240b"} Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.542728 4794 generic.go:334] "Generic (PLEG): container finished" podID="6840b6f1-4520-4c02-9f69-b238ac692ae5" containerID="df76e4024f94af4d9ef41e07bbfdaf7e85fa0ff7446fdb21859d29906b43a317" exitCode=0 Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 
09:48:40.542779 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rq6sq" event={"ID":"6840b6f1-4520-4c02-9f69-b238ac692ae5","Type":"ContainerDied","Data":"df76e4024f94af4d9ef41e07bbfdaf7e85fa0ff7446fdb21859d29906b43a317"} Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.544229 4794 generic.go:334] "Generic (PLEG): container finished" podID="3bfb465f-2aa5-4e4b-93ca-774f6cb12f28" containerID="f2942662db9da0d348d5ea9660d19f108ce262b813097d9a9d9362927ec4ddf7" exitCode=0 Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.544282 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3bfb465f-2aa5-4e4b-93ca-774f6cb12f28","Type":"ContainerDied","Data":"f2942662db9da0d348d5ea9660d19f108ce262b813097d9a9d9362927ec4ddf7"} Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.550879 4794 generic.go:334] "Generic (PLEG): container finished" podID="88245dbf-bf6b-4051-9a3c-91da5a183538" containerID="b9ef099180e73d9bde1e4a27716c3151e3e2a880cec2afb1d00afd9253ad5771" exitCode=0 Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.550914 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnvgq" event={"ID":"88245dbf-bf6b-4051-9a3c-91da5a183538","Type":"ContainerDied","Data":"b9ef099180e73d9bde1e4a27716c3151e3e2a880cec2afb1d00afd9253ad5771"} Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.619922 4794 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-17 00:16:07.40869475 +0000 UTC Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.619969 4794 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6038h27m26.788730298s for next certificate rotation Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.927589 4794 patch_prober.go:28] interesting pod/controller-manager-6987fd4cbd-27sjp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Mar 10 09:48:40 crc kubenswrapper[4794]: I0310 09:48:40.927653 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp" podUID="32fdf926-a589-4496-8773-c28effa19c31" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.508584 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.525443 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.536923 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-669b859c78-wk8bw"] Mar 10 09:48:41 crc kubenswrapper[4794]: E0310 09:48:41.537143 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32fdf926-a589-4496-8773-c28effa19c31" containerName="controller-manager" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.537154 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fdf926-a589-4496-8773-c28effa19c31" containerName="controller-manager" Mar 10 09:48:41 crc kubenswrapper[4794]: E0310 09:48:41.537165 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73161eb6-50b2-4b4b-b52e-40b38b3efd9a" containerName="route-controller-manager" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.537170 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="73161eb6-50b2-4b4b-b52e-40b38b3efd9a" containerName="route-controller-manager" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.537262 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="32fdf926-a589-4496-8773-c28effa19c31" containerName="controller-manager" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.537278 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="73161eb6-50b2-4b4b-b52e-40b38b3efd9a" containerName="route-controller-manager" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.537654 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.552080 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-669b859c78-wk8bw"] Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.563411 4794 generic.go:334] "Generic (PLEG): container finished" podID="73161eb6-50b2-4b4b-b52e-40b38b3efd9a" containerID="7e52f162adcaf84ec7763e3c776f6090231dc147a2c14e4bfe3dab84a49fd623" exitCode=0 Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.563526 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb" event={"ID":"73161eb6-50b2-4b4b-b52e-40b38b3efd9a","Type":"ContainerDied","Data":"7e52f162adcaf84ec7763e3c776f6090231dc147a2c14e4bfe3dab84a49fd623"} Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.563554 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb" event={"ID":"73161eb6-50b2-4b4b-b52e-40b38b3efd9a","Type":"ContainerDied","Data":"c22c720da28dbc0047b8e2404215d49b187dde17c7887269725c072bf7449c74"} Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.563570 4794 scope.go:117] "RemoveContainer" containerID="7e52f162adcaf84ec7763e3c776f6090231dc147a2c14e4bfe3dab84a49fd623" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.563716 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.565799 4794 generic.go:334] "Generic (PLEG): container finished" podID="32fdf926-a589-4496-8773-c28effa19c31" containerID="b457de1513341efbbdf27804d4325587b1ca45e637fb46cc1afa7d62da652291" exitCode=0 Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.566057 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.566097 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp" event={"ID":"32fdf926-a589-4496-8773-c28effa19c31","Type":"ContainerDied","Data":"b457de1513341efbbdf27804d4325587b1ca45e637fb46cc1afa7d62da652291"} Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.566177 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6987fd4cbd-27sjp" event={"ID":"32fdf926-a589-4496-8773-c28effa19c31","Type":"ContainerDied","Data":"7ffa4384c6b4ab1f235350e80708b247f6f8bfc665e4667b304abc080f745542"} Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.620243 4794 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-12 16:55:50.541912733 +0000 UTC Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.620577 4794 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6655h7m8.921339713s for next certificate rotation Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.631516 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-client-ca\") pod \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.631563 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-serving-cert\") pod \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.631597 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6j7x\" (UniqueName: \"kubernetes.io/projected/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-kube-api-access-j6j7x\") pod \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.631633 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-config\") pod \"32fdf926-a589-4496-8773-c28effa19c31\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.631656 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-proxy-ca-bundles\") pod \"32fdf926-a589-4496-8773-c28effa19c31\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.631685 4794 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nw58v\" (UniqueName: \"kubernetes.io/projected/32fdf926-a589-4496-8773-c28effa19c31-kube-api-access-nw58v\") pod \"32fdf926-a589-4496-8773-c28effa19c31\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.631736 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32fdf926-a589-4496-8773-c28effa19c31-serving-cert\") pod \"32fdf926-a589-4496-8773-c28effa19c31\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.631773 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-config\") pod \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\" (UID: \"73161eb6-50b2-4b4b-b52e-40b38b3efd9a\") " Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.632863 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-client-ca" (OuterVolumeSpecName: "client-ca") pod "73161eb6-50b2-4b4b-b52e-40b38b3efd9a" (UID: "73161eb6-50b2-4b4b-b52e-40b38b3efd9a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.633664 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-client-ca\") pod \"32fdf926-a589-4496-8773-c28effa19c31\" (UID: \"32fdf926-a589-4496-8773-c28effa19c31\") " Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.633959 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.635908 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "32fdf926-a589-4496-8773-c28effa19c31" (UID: "32fdf926-a589-4496-8773-c28effa19c31"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.636010 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-config" (OuterVolumeSpecName: "config") pod "32fdf926-a589-4496-8773-c28effa19c31" (UID: "32fdf926-a589-4496-8773-c28effa19c31"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.636405 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-client-ca" (OuterVolumeSpecName: "client-ca") pod "32fdf926-a589-4496-8773-c28effa19c31" (UID: "32fdf926-a589-4496-8773-c28effa19c31"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.638599 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "73161eb6-50b2-4b4b-b52e-40b38b3efd9a" (UID: "73161eb6-50b2-4b4b-b52e-40b38b3efd9a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.639001 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32fdf926-a589-4496-8773-c28effa19c31-kube-api-access-nw58v" (OuterVolumeSpecName: "kube-api-access-nw58v") pod "32fdf926-a589-4496-8773-c28effa19c31" (UID: "32fdf926-a589-4496-8773-c28effa19c31"). InnerVolumeSpecName "kube-api-access-nw58v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.639305 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-config" (OuterVolumeSpecName: "config") pod "73161eb6-50b2-4b4b-b52e-40b38b3efd9a" (UID: "73161eb6-50b2-4b4b-b52e-40b38b3efd9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.639684 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-kube-api-access-j6j7x" (OuterVolumeSpecName: "kube-api-access-j6j7x") pod "73161eb6-50b2-4b4b-b52e-40b38b3efd9a" (UID: "73161eb6-50b2-4b4b-b52e-40b38b3efd9a"). InnerVolumeSpecName "kube-api-access-j6j7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.640953 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fdf926-a589-4496-8773-c28effa19c31-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "32fdf926-a589-4496-8773-c28effa19c31" (UID: "32fdf926-a589-4496-8773-c28effa19c31"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.735429 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-proxy-ca-bundles\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.735495 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmvwl\" (UniqueName: \"kubernetes.io/projected/24eecd7d-54cc-4497-b27e-2cc495e920dc-kube-api-access-kmvwl\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.735521 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24eecd7d-54cc-4497-b27e-2cc495e920dc-serving-cert\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.735559 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-config\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.735709 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-client-ca\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.736109 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.736127 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6j7x\" (UniqueName: \"kubernetes.io/projected/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-kube-api-access-j6j7x\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.736137 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.736146 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.736156 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw58v\" (UniqueName: 
\"kubernetes.io/projected/32fdf926-a589-4496-8773-c28effa19c31-kube-api-access-nw58v\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.736164 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32fdf926-a589-4496-8773-c28effa19c31-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.736175 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73161eb6-50b2-4b4b-b52e-40b38b3efd9a-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.736184 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32fdf926-a589-4496-8773-c28effa19c31-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.747735 4794 scope.go:117] "RemoveContainer" containerID="7e52f162adcaf84ec7763e3c776f6090231dc147a2c14e4bfe3dab84a49fd623" Mar 10 09:48:41 crc kubenswrapper[4794]: E0310 09:48:41.748101 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e52f162adcaf84ec7763e3c776f6090231dc147a2c14e4bfe3dab84a49fd623\": container with ID starting with 7e52f162adcaf84ec7763e3c776f6090231dc147a2c14e4bfe3dab84a49fd623 not found: ID does not exist" containerID="7e52f162adcaf84ec7763e3c776f6090231dc147a2c14e4bfe3dab84a49fd623" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.748126 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e52f162adcaf84ec7763e3c776f6090231dc147a2c14e4bfe3dab84a49fd623"} err="failed to get container status \"7e52f162adcaf84ec7763e3c776f6090231dc147a2c14e4bfe3dab84a49fd623\": rpc error: code = NotFound desc = could not find container \"7e52f162adcaf84ec7763e3c776f6090231dc147a2c14e4bfe3dab84a49fd623\": container with ID starting with 7e52f162adcaf84ec7763e3c776f6090231dc147a2c14e4bfe3dab84a49fd623 not found: ID does not exist" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.748147 4794 scope.go:117] "RemoveContainer" containerID="b457de1513341efbbdf27804d4325587b1ca45e637fb46cc1afa7d62da652291" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.837315 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-client-ca\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.837391 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-proxy-ca-bundles\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.837437 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmvwl\" (UniqueName: \"kubernetes.io/projected/24eecd7d-54cc-4497-b27e-2cc495e920dc-kube-api-access-kmvwl\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " 
pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.837460 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24eecd7d-54cc-4497-b27e-2cc495e920dc-serving-cert\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.837507 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-config\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.838146 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-client-ca\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.838551 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-config\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.838557 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-proxy-ca-bundles\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.841278 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24eecd7d-54cc-4497-b27e-2cc495e920dc-serving-cert\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.842520 4794 scope.go:117] "RemoveContainer" containerID="b457de1513341efbbdf27804d4325587b1ca45e637fb46cc1afa7d62da652291" Mar 10 09:48:41 crc kubenswrapper[4794]: E0310 09:48:41.842872 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b457de1513341efbbdf27804d4325587b1ca45e637fb46cc1afa7d62da652291\": container with ID starting with b457de1513341efbbdf27804d4325587b1ca45e637fb46cc1afa7d62da652291 not found: ID does not exist" containerID="b457de1513341efbbdf27804d4325587b1ca45e637fb46cc1afa7d62da652291" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.842905 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b457de1513341efbbdf27804d4325587b1ca45e637fb46cc1afa7d62da652291"} err="failed to get container status \"b457de1513341efbbdf27804d4325587b1ca45e637fb46cc1afa7d62da652291\": rpc error: code = NotFound desc = could not find 
container \"b457de1513341efbbdf27804d4325587b1ca45e637fb46cc1afa7d62da652291\": container with ID starting with b457de1513341efbbdf27804d4325587b1ca45e637fb46cc1afa7d62da652291 not found: ID does not exist" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.852381 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmvwl\" (UniqueName: \"kubernetes.io/projected/24eecd7d-54cc-4497-b27e-2cc495e920dc-kube-api-access-kmvwl\") pod \"controller-manager-669b859c78-wk8bw\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.859528 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.866848 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.871475 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552268-gzdlm" Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.920400 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"] Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.925000 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6987fd4cbd-27sjp"] Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.929975 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"] Mar 10 09:48:41 crc kubenswrapper[4794]: I0310 09:48:41.936814 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76c9db8f4f-cmvvb"] Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.006568 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32fdf926-a589-4496-8773-c28effa19c31" path="/var/lib/kubelet/pods/32fdf926-a589-4496-8773-c28effa19c31/volumes" Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.007050 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73161eb6-50b2-4b4b-b52e-40b38b3efd9a" path="/var/lib/kubelet/pods/73161eb6-50b2-4b4b-b52e-40b38b3efd9a/volumes" Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.039920 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3bfb465f-2aa5-4e4b-93ca-774f6cb12f28-kubelet-dir\") pod \"3bfb465f-2aa5-4e4b-93ca-774f6cb12f28\" (UID: \"3bfb465f-2aa5-4e4b-93ca-774f6cb12f28\") " Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.040060 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3bfb465f-2aa5-4e4b-93ca-774f6cb12f28-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3bfb465f-2aa5-4e4b-93ca-774f6cb12f28" (UID: "3bfb465f-2aa5-4e4b-93ca-774f6cb12f28"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.040202 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmbjk\" (UniqueName: \"kubernetes.io/projected/75d151a9-5d22-4241-9177-7856740702e4-kube-api-access-tmbjk\") pod \"75d151a9-5d22-4241-9177-7856740702e4\" (UID: \"75d151a9-5d22-4241-9177-7856740702e4\") " Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.040271 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3bfb465f-2aa5-4e4b-93ca-774f6cb12f28-kube-api-access\") pod \"3bfb465f-2aa5-4e4b-93ca-774f6cb12f28\" (UID: \"3bfb465f-2aa5-4e4b-93ca-774f6cb12f28\") " Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.040549 4794 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3bfb465f-2aa5-4e4b-93ca-774f6cb12f28-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.042726 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bfb465f-2aa5-4e4b-93ca-774f6cb12f28-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3bfb465f-2aa5-4e4b-93ca-774f6cb12f28" (UID: "3bfb465f-2aa5-4e4b-93ca-774f6cb12f28"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.043120 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d151a9-5d22-4241-9177-7856740702e4-kube-api-access-tmbjk" (OuterVolumeSpecName: "kube-api-access-tmbjk") pod "75d151a9-5d22-4241-9177-7856740702e4" (UID: "75d151a9-5d22-4241-9177-7856740702e4"). InnerVolumeSpecName "kube-api-access-tmbjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.141317 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmbjk\" (UniqueName: \"kubernetes.io/projected/75d151a9-5d22-4241-9177-7856740702e4-kube-api-access-tmbjk\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.141418 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3bfb465f-2aa5-4e4b-93ca-774f6cb12f28-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.572707 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552268-gzdlm" event={"ID":"75d151a9-5d22-4241-9177-7856740702e4","Type":"ContainerDied","Data":"1a3133abef39b11152825f2379cbf7ea797502cb21786c566d3cab3f67715af4"} Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.572742 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a3133abef39b11152825f2379cbf7ea797502cb21786c566d3cab3f67715af4" Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.572827 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552268-gzdlm" Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.574633 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3bfb465f-2aa5-4e4b-93ca-774f6cb12f28","Type":"ContainerDied","Data":"d66d1251c9f402452ac247257e1486d97a054e3c21e5f48f242d4b58017c4fa6"} Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.574643 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:48:42 crc kubenswrapper[4794]: I0310 09:48:42.574660 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d66d1251c9f402452ac247257e1486d97a054e3c21e5f48f242d4b58017c4fa6" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.018173 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-669b859c78-wk8bw"] Mar 10 09:48:43 crc kubenswrapper[4794]: W0310 09:48:43.023571 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24eecd7d_54cc_4497_b27e_2cc495e920dc.slice/crio-39c6c10c8106392f34859a650b62de9c8c57c7afcebe8d88ec75238c5bb102ab WatchSource:0}: Error finding container 39c6c10c8106392f34859a650b62de9c8c57c7afcebe8d88ec75238c5bb102ab: Status 404 returned error can't find the container with id 39c6c10c8106392f34859a650b62de9c8c57c7afcebe8d88ec75238c5bb102ab Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.588658 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" event={"ID":"24eecd7d-54cc-4497-b27e-2cc495e920dc","Type":"ContainerStarted","Data":"9dacf379e2d82e2d5de83578f26f292442a8ee8a6dced36feb94a12262d565cd"} Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.588890 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" event={"ID":"24eecd7d-54cc-4497-b27e-2cc495e920dc","Type":"ContainerStarted","Data":"39c6c10c8106392f34859a650b62de9c8c57c7afcebe8d88ec75238c5bb102ab"} Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.588906 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.592790 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4d8mk" event={"ID":"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e","Type":"ContainerStarted","Data":"0dc9d68e9e86dea775383b389ff07056b9d9668f7fbb7c1a316d24644f0897bb"} Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.594969 4794 generic.go:334] "Generic (PLEG): container finished" podID="f12f506f-5226-41a3-9643-260415a884a5" containerID="63117909e244ec7b2c31dfd6a3ef41360076d2e72b8e908ca37fe4233235e47f" exitCode=0 Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.595074 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552266-9tldc" event={"ID":"f12f506f-5226-41a3-9643-260415a884a5","Type":"ContainerDied","Data":"63117909e244ec7b2c31dfd6a3ef41360076d2e72b8e908ca37fe4233235e47f"} Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.597571 4794 generic.go:334] "Generic (PLEG): container finished" podID="869965fc-c355-4c93-9776-dc1a070c926e" 
containerID="c369267a010db33c54459e5703bd75c5d1efea77f7e7614ca282912df8de29d7" exitCode=0 Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.597664 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zffn2" event={"ID":"869965fc-c355-4c93-9776-dc1a070c926e","Type":"ContainerDied","Data":"c369267a010db33c54459e5703bd75c5d1efea77f7e7614ca282912df8de29d7"} Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.600035 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx8l7" event={"ID":"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a","Type":"ContainerStarted","Data":"0d2f487e9a5fcf36206ba7fd5b76cc51ce7328a16051ac52831a122bfc158a54"} Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.604299 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rq6sq" event={"ID":"6840b6f1-4520-4c02-9f69-b238ac692ae5","Type":"ContainerStarted","Data":"aaa17f102fefbb220bf3e023f4365c32450404156e29043b0a751280ebfe94b2"} Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.606986 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnvgq" event={"ID":"88245dbf-bf6b-4051-9a3c-91da5a183538","Type":"ContainerStarted","Data":"063c82c36304ffc8762f2979da18d291ed2033e28095ef4691b3ed815dfc3f7e"} Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.609409 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm"] Mar 10 09:48:43 crc kubenswrapper[4794]: E0310 09:48:43.609577 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bfb465f-2aa5-4e4b-93ca-774f6cb12f28" containerName="pruner" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.609591 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfb465f-2aa5-4e4b-93ca-774f6cb12f28" containerName="pruner" Mar 10 09:48:43 crc kubenswrapper[4794]: E0310 09:48:43.609608 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d151a9-5d22-4241-9177-7856740702e4" containerName="oc" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.609615 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d151a9-5d22-4241-9177-7856740702e4" containerName="oc" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.609705 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bfb465f-2aa5-4e4b-93ca-774f6cb12f28" containerName="pruner" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.609717 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d151a9-5d22-4241-9177-7856740702e4" containerName="oc" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.610024 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.611947 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.612173 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.612258 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.612620 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.612718 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.613372 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.613634 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.625023 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm"] Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.630961 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" podStartSLOduration=3.630935188 podStartE2EDuration="3.630935188s" podCreationTimestamp="2026-03-10 09:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:48:43.628584943 +0000 UTC m=+272.384755761" watchObservedRunningTime="2026-03-10 09:48:43.630935188 +0000 UTC m=+272.387106006" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.648991 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cx8l7" podStartSLOduration=26.360515046 podStartE2EDuration="39.648978871s" podCreationTimestamp="2026-03-10 09:48:04 +0000 UTC" firstStartedPulling="2026-03-10 09:48:29.409269142 +0000 UTC m=+258.165439960" lastFinishedPulling="2026-03-10 09:48:42.697732967 +0000 UTC m=+271.453903785" observedRunningTime="2026-03-10 09:48:43.644098405 +0000 UTC m=+272.400269223" watchObservedRunningTime="2026-03-10 09:48:43.648978871 +0000 UTC m=+272.405149689" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.666110 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5527923f-a4af-4a62-833c-b45d251e581e-serving-cert\") pod \"route-controller-manager-86f487c7f-b8zhm\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.666201 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7mzb\" (UniqueName: 
\"kubernetes.io/projected/5527923f-a4af-4a62-833c-b45d251e581e-kube-api-access-c7mzb\") pod \"route-controller-manager-86f487c7f-b8zhm\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.666254 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5527923f-a4af-4a62-833c-b45d251e581e-config\") pod \"route-controller-manager-86f487c7f-b8zhm\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.666278 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5527923f-a4af-4a62-833c-b45d251e581e-client-ca\") pod \"route-controller-manager-86f487c7f-b8zhm\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.711015 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bnvgq" podStartSLOduration=3.108937289 podStartE2EDuration="42.710997203s" podCreationTimestamp="2026-03-10 09:48:01 +0000 UTC" firstStartedPulling="2026-03-10 09:48:02.967730026 +0000 UTC m=+231.723900844" lastFinishedPulling="2026-03-10 09:48:42.56978994 +0000 UTC m=+271.325960758" observedRunningTime="2026-03-10 09:48:43.708924898 +0000 UTC m=+272.465095716" watchObservedRunningTime="2026-03-10 09:48:43.710997203 +0000 UTC m=+272.467168011" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.741648 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rq6sq" podStartSLOduration=27.445848291 podStartE2EDuration="39.741629921s" podCreationTimestamp="2026-03-10 09:48:04 +0000 UTC" firstStartedPulling="2026-03-10 09:48:30.451512869 +0000 UTC m=+259.207683727" lastFinishedPulling="2026-03-10 09:48:42.747294539 +0000 UTC m=+271.503465357" observedRunningTime="2026-03-10 09:48:43.740099895 +0000 UTC m=+272.496270713" watchObservedRunningTime="2026-03-10 09:48:43.741629921 +0000 UTC m=+272.497800739" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.756743 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4d8mk" podStartSLOduration=2.893895922 podStartE2EDuration="42.756721796s" podCreationTimestamp="2026-03-10 09:48:01 +0000 UTC" firstStartedPulling="2026-03-10 09:48:02.996250586 +0000 UTC m=+231.752421404" lastFinishedPulling="2026-03-10 09:48:42.85907646 +0000 UTC m=+271.615247278" observedRunningTime="2026-03-10 09:48:43.755415559 +0000 UTC m=+272.511586377" watchObservedRunningTime="2026-03-10 09:48:43.756721796 +0000 UTC m=+272.512892614" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.767900 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5527923f-a4af-4a62-833c-b45d251e581e-config\") pod \"route-controller-manager-86f487c7f-b8zhm\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.767952 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5527923f-a4af-4a62-833c-b45d251e581e-client-ca\") pod \"route-controller-manager-86f487c7f-b8zhm\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.768060 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5527923f-a4af-4a62-833c-b45d251e581e-serving-cert\") pod \"route-controller-manager-86f487c7f-b8zhm\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.768093 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7mzb\" (UniqueName: \"kubernetes.io/projected/5527923f-a4af-4a62-833c-b45d251e581e-kube-api-access-c7mzb\") pod \"route-controller-manager-86f487c7f-b8zhm\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.768943 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5527923f-a4af-4a62-833c-b45d251e581e-client-ca\") pod \"route-controller-manager-86f487c7f-b8zhm\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.769119 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5527923f-a4af-4a62-833c-b45d251e581e-config\") pod \"route-controller-manager-86f487c7f-b8zhm\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.774484 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5527923f-a4af-4a62-833c-b45d251e581e-serving-cert\") pod \"route-controller-manager-86f487c7f-b8zhm\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.799388 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7mzb\" (UniqueName: \"kubernetes.io/projected/5527923f-a4af-4a62-833c-b45d251e581e-kube-api-access-c7mzb\") pod \"route-controller-manager-86f487c7f-b8zhm\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.927760 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:43 crc kubenswrapper[4794]: I0310 09:48:43.990816 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q7js2" Mar 10 09:48:44 crc kubenswrapper[4794]: I0310 09:48:44.370683 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:44 crc kubenswrapper[4794]: I0310 09:48:44.401281 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm"] Mar 10 09:48:44 crc kubenswrapper[4794]: W0310 09:48:44.411146 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5527923f_a4af_4a62_833c_b45d251e581e.slice/crio-b6344670af9be7d2f10ed1ba90fb834873e68427df5366412328475b91faed38 WatchSource:0}: Error finding container b6344670af9be7d2f10ed1ba90fb834873e68427df5366412328475b91faed38: Status 404 returned error can't find the container with id b6344670af9be7d2f10ed1ba90fb834873e68427df5366412328475b91faed38 Mar 10 09:48:44 crc kubenswrapper[4794]: I0310 09:48:44.617341 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zffn2" event={"ID":"869965fc-c355-4c93-9776-dc1a070c926e","Type":"ContainerStarted","Data":"71d967cd3d765afe905206227b4a185610ee4a8d660a82bc475e7aac4f1aa414"} Mar 10 09:48:44 crc kubenswrapper[4794]: I0310 09:48:44.619094 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" event={"ID":"5527923f-a4af-4a62-833c-b45d251e581e","Type":"ContainerStarted","Data":"aa220af4b82dfcfafe4c7f4c0a728b752c841a821bce65022f675cbe0e6e5a5e"} Mar 10 09:48:44 crc kubenswrapper[4794]: I0310 09:48:44.619119 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" event={"ID":"5527923f-a4af-4a62-833c-b45d251e581e","Type":"ContainerStarted","Data":"b6344670af9be7d2f10ed1ba90fb834873e68427df5366412328475b91faed38"} Mar 10 09:48:44 crc kubenswrapper[4794]: I0310 09:48:44.638216 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zffn2" podStartSLOduration=2.654420772 podStartE2EDuration="43.638201888s" podCreationTimestamp="2026-03-10 09:48:01 +0000 UTC" firstStartedPulling="2026-03-10 09:48:03.006430623 +0000 UTC m=+231.762601441" lastFinishedPulling="2026-03-10 09:48:43.990211729 +0000 UTC m=+272.746382557" observedRunningTime="2026-03-10 09:48:44.636985454 +0000 UTC m=+273.393156272" watchObservedRunningTime="2026-03-10 09:48:44.638201888 +0000 UTC m=+273.394372706" Mar 10 09:48:44 crc kubenswrapper[4794]: I0310 09:48:44.655350 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" podStartSLOduration=4.655311476 podStartE2EDuration="4.655311476s" podCreationTimestamp="2026-03-10 09:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:48:44.653848914 +0000 UTC m=+273.410019732" watchObservedRunningTime="2026-03-10 09:48:44.655311476 +0000 UTC m=+273.411482294" Mar 10 09:48:44 crc kubenswrapper[4794]: I0310 09:48:44.891136 
4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:44 crc kubenswrapper[4794]: I0310 09:48:44.891192 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:44 crc kubenswrapper[4794]: I0310 09:48:44.959537 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552266-9tldc" Mar 10 09:48:44 crc kubenswrapper[4794]: I0310 09:48:44.984496 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zppzf\" (UniqueName: \"kubernetes.io/projected/f12f506f-5226-41a3-9643-260415a884a5-kube-api-access-zppzf\") pod \"f12f506f-5226-41a3-9643-260415a884a5\" (UID: \"f12f506f-5226-41a3-9643-260415a884a5\") " Mar 10 09:48:44 crc kubenswrapper[4794]: I0310 09:48:44.990966 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f12f506f-5226-41a3-9643-260415a884a5-kube-api-access-zppzf" (OuterVolumeSpecName: "kube-api-access-zppzf") pod "f12f506f-5226-41a3-9643-260415a884a5" (UID: "f12f506f-5226-41a3-9643-260415a884a5"). InnerVolumeSpecName "kube-api-access-zppzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.086152 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zppzf\" (UniqueName: \"kubernetes.io/projected/f12f506f-5226-41a3-9643-260415a884a5-kube-api-access-zppzf\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.341463 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.341515 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.625005 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552266-9tldc" event={"ID":"f12f506f-5226-41a3-9643-260415a884a5","Type":"ContainerDied","Data":"9e0453e81bc0ab73cfa7f712f3ba1c831eee5cbc4d731d219a3532cd1c964f7e"} Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.625042 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e0453e81bc0ab73cfa7f712f3ba1c831eee5cbc4d731d219a3532cd1c964f7e" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.625036 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552266-9tldc" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.628020 4794 generic.go:334] "Generic (PLEG): container finished" podID="2c891e53-bebe-462e-a924-5073338f2ac1" containerID="bb53f72d4d5cd4f613068d335dcc9e3823324ab901f9447640f0080d7ef21d06" exitCode=0 Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.628043 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b52gv" event={"ID":"2c891e53-bebe-462e-a924-5073338f2ac1","Type":"ContainerDied","Data":"bb53f72d4d5cd4f613068d335dcc9e3823324ab901f9447640f0080d7ef21d06"} Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.628634 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.637691 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.825486 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 09:48:45 crc kubenswrapper[4794]: E0310 09:48:45.825691 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12f506f-5226-41a3-9643-260415a884a5" containerName="oc" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.825705 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12f506f-5226-41a3-9643-260415a884a5" containerName="oc" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.825820 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12f506f-5226-41a3-9643-260415a884a5" containerName="oc" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.826295 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.830697 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.830774 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.836948 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.894270 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8be88dc0-2383-4590-8731-7c19146fcd2b-kube-api-access\") pod \"installer-9-crc\" (UID: \"8be88dc0-2383-4590-8731-7c19146fcd2b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.894445 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8be88dc0-2383-4590-8731-7c19146fcd2b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8be88dc0-2383-4590-8731-7c19146fcd2b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.894529 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8be88dc0-2383-4590-8731-7c19146fcd2b-var-lock\") pod \"installer-9-crc\" (UID: \"8be88dc0-2383-4590-8731-7c19146fcd2b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.933564 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cx8l7" podUID="1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" containerName="registry-server" probeResult="failure" output=< Mar 10 09:48:45 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 09:48:45 crc kubenswrapper[4794]: > Mar 10 09:48:45 crc kubenswrapper[4794]: I0310 09:48:45.999563 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8be88dc0-2383-4590-8731-7c19146fcd2b-kube-api-access\") pod \"installer-9-crc\" (UID: \"8be88dc0-2383-4590-8731-7c19146fcd2b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:48:46 crc kubenswrapper[4794]: I0310 09:48:45.999660 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8be88dc0-2383-4590-8731-7c19146fcd2b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8be88dc0-2383-4590-8731-7c19146fcd2b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:48:46 crc kubenswrapper[4794]: I0310 09:48:45.999701 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8be88dc0-2383-4590-8731-7c19146fcd2b-var-lock\") pod \"installer-9-crc\" (UID: \"8be88dc0-2383-4590-8731-7c19146fcd2b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:48:46 crc kubenswrapper[4794]: I0310 09:48:45.999830 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8be88dc0-2383-4590-8731-7c19146fcd2b-var-lock\") pod 
\"installer-9-crc\" (UID: \"8be88dc0-2383-4590-8731-7c19146fcd2b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:48:46 crc kubenswrapper[4794]: I0310 09:48:46.000009 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8be88dc0-2383-4590-8731-7c19146fcd2b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8be88dc0-2383-4590-8731-7c19146fcd2b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:48:46 crc kubenswrapper[4794]: I0310 09:48:46.018713 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8be88dc0-2383-4590-8731-7c19146fcd2b-kube-api-access\") pod \"installer-9-crc\" (UID: \"8be88dc0-2383-4590-8731-7c19146fcd2b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:48:46 crc kubenswrapper[4794]: I0310 09:48:46.139665 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:48:46 crc kubenswrapper[4794]: I0310 09:48:46.379724 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rq6sq" podUID="6840b6f1-4520-4c02-9f69-b238ac692ae5" containerName="registry-server" probeResult="failure" output=< Mar 10 09:48:46 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 09:48:46 crc kubenswrapper[4794]: > Mar 10 09:48:46 crc kubenswrapper[4794]: I0310 09:48:46.594911 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 09:48:46 crc kubenswrapper[4794]: W0310 09:48:46.603526 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8be88dc0_2383_4590_8731_7c19146fcd2b.slice/crio-06257be372456d311406456cf45b1816fa4e85174e9491c6c04661749041b5c3 WatchSource:0}: Error finding container 06257be372456d311406456cf45b1816fa4e85174e9491c6c04661749041b5c3: Status 404 returned error can't find the container with id 06257be372456d311406456cf45b1816fa4e85174e9491c6c04661749041b5c3 Mar 10 09:48:46 crc kubenswrapper[4794]: I0310 09:48:46.634036 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8be88dc0-2383-4590-8731-7c19146fcd2b","Type":"ContainerStarted","Data":"06257be372456d311406456cf45b1816fa4e85174e9491c6c04661749041b5c3"} Mar 10 09:48:46 crc kubenswrapper[4794]: I0310 09:48:46.638224 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b52gv" event={"ID":"2c891e53-bebe-462e-a924-5073338f2ac1","Type":"ContainerStarted","Data":"c46b369ac1122e647a4d3c490b09786e921f9680d9fb069180811f496c1c4656"} Mar 10 09:48:46 crc kubenswrapper[4794]: I0310 09:48:46.654796 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b52gv" podStartSLOduration=2.469371997 podStartE2EDuration="45.65477053s" podCreationTimestamp="2026-03-10 09:48:01 +0000 UTC" firstStartedPulling="2026-03-10 09:48:02.975542692 +0000 UTC m=+231.731713510" lastFinishedPulling="2026-03-10 09:48:46.160941225 +0000 UTC m=+274.917112043" observedRunningTime="2026-03-10 09:48:46.653979672 +0000 UTC m=+275.410150490" watchObservedRunningTime="2026-03-10 09:48:46.65477053 +0000 UTC m=+275.410941348" Mar 10 09:48:46 crc kubenswrapper[4794]: I0310 09:48:46.811675 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-6hcq9"] Mar 10 09:48:46 crc kubenswrapper[4794]: I0310 09:48:46.811894 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6hcq9" podUID="86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" containerName="registry-server" containerID="cri-o://3df217ab7a77920b1c6587cba3d2add14b8ad14f788e2c4c0d7b52837dfffba2" gracePeriod=2 Mar 10 09:48:47 crc kubenswrapper[4794]: I0310 09:48:47.644280 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8be88dc0-2383-4590-8731-7c19146fcd2b","Type":"ContainerStarted","Data":"7838560755eca50c1338f0a51378c57ab2950728571720689d1a07e386bf0d5e"} Mar 10 09:48:47 crc kubenswrapper[4794]: I0310 09:48:47.645863 4794 generic.go:334] "Generic (PLEG): container finished" podID="86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" containerID="3df217ab7a77920b1c6587cba3d2add14b8ad14f788e2c4c0d7b52837dfffba2" exitCode=0 Mar 10 09:48:47 crc kubenswrapper[4794]: I0310 09:48:47.646222 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hcq9" event={"ID":"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd","Type":"ContainerDied","Data":"3df217ab7a77920b1c6587cba3d2add14b8ad14f788e2c4c0d7b52837dfffba2"} Mar 10 09:48:47 crc kubenswrapper[4794]: I0310 09:48:47.663395 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.6633777480000003 podStartE2EDuration="2.663377748s" podCreationTimestamp="2026-03-10 09:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:48:47.662303589 +0000 UTC m=+276.418474407" watchObservedRunningTime="2026-03-10 09:48:47.663377748 +0000 UTC m=+276.419548566" Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.006960 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.122633 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp9vs\" (UniqueName: \"kubernetes.io/projected/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-kube-api-access-jp9vs\") pod \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\" (UID: \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\") " Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.122683 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-catalog-content\") pod \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\" (UID: \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\") " Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.122759 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-utilities\") pod \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\" (UID: \"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd\") " Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.123536 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-utilities" (OuterVolumeSpecName: "utilities") pod "86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" (UID: "86b8b5f3-501a-44fa-8f78-ceaefcaecfdd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.131102 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-kube-api-access-jp9vs" (OuterVolumeSpecName: "kube-api-access-jp9vs") pod "86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" (UID: "86b8b5f3-501a-44fa-8f78-ceaefcaecfdd"). InnerVolumeSpecName "kube-api-access-jp9vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.159842 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" (UID: "86b8b5f3-501a-44fa-8f78-ceaefcaecfdd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.224254 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.224283 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp9vs\" (UniqueName: \"kubernetes.io/projected/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-kube-api-access-jp9vs\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.224293 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.653225 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hcq9" event={"ID":"86b8b5f3-501a-44fa-8f78-ceaefcaecfdd","Type":"ContainerDied","Data":"c933f8aa9063d6e53255ad958bf467e9afa76762c930f66fb36f795a71bc0a0c"} Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.653315 4794 scope.go:117] "RemoveContainer" containerID="3df217ab7a77920b1c6587cba3d2add14b8ad14f788e2c4c0d7b52837dfffba2" Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.653249 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6hcq9" Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.668638 4794 scope.go:117] "RemoveContainer" containerID="b53b9ac8b80c41da591dcbf3aad299c79c4bef06655be38b313281d853503d83" Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.686896 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6hcq9"] Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.686959 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6hcq9"] Mar 10 09:48:48 crc kubenswrapper[4794]: I0310 09:48:48.696466 4794 scope.go:117] "RemoveContainer" containerID="e82861a1fc8afc141de36c880aa783497a205a03654eb6ba92f3cbf4cf6aab1a" Mar 10 09:48:50 crc kubenswrapper[4794]: I0310 09:48:50.005093 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" path="/var/lib/kubelet/pods/86b8b5f3-501a-44fa-8f78-ceaefcaecfdd/volumes" Mar 10 09:48:51 crc kubenswrapper[4794]: I0310 09:48:51.663230 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:51 crc kubenswrapper[4794]: I0310 09:48:51.663555 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:51 crc kubenswrapper[4794]: I0310 09:48:51.737817 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:51 crc kubenswrapper[4794]: I0310 09:48:51.898770 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zffn2" Mar 10 09:48:51 crc kubenswrapper[4794]: I0310 09:48:51.898837 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zffn2" Mar 10 09:48:51 crc kubenswrapper[4794]: I0310 09:48:51.966034 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zffn2" Mar 10 09:48:52 crc kubenswrapper[4794]: I0310 09:48:52.078056 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:52 crc kubenswrapper[4794]: I0310 09:48:52.078109 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:52 crc kubenswrapper[4794]: I0310 09:48:52.153083 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:52 crc kubenswrapper[4794]: I0310 09:48:52.309967 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b52gv" Mar 10 09:48:52 crc kubenswrapper[4794]: I0310 09:48:52.310017 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b52gv" Mar 10 09:48:52 crc kubenswrapper[4794]: I0310 09:48:52.356895 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b52gv" Mar 10 09:48:52 crc kubenswrapper[4794]: I0310 09:48:52.749407 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:52 crc kubenswrapper[4794]: I0310 
09:48:52.750019 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bnvgq" Mar 10 09:48:52 crc kubenswrapper[4794]: I0310 09:48:52.756299 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b52gv" Mar 10 09:48:52 crc kubenswrapper[4794]: I0310 09:48:52.767784 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zffn2" Mar 10 09:48:52 crc kubenswrapper[4794]: I0310 09:48:52.967811 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:48:52 crc kubenswrapper[4794]: I0310 09:48:52.967904 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:48:54 crc kubenswrapper[4794]: I0310 09:48:54.613528 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4d8mk"] Mar 10 09:48:54 crc kubenswrapper[4794]: I0310 09:48:54.692221 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4d8mk" podUID="dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e" containerName="registry-server" containerID="cri-o://0dc9d68e9e86dea775383b389ff07056b9d9668f7fbb7c1a316d24644f0897bb" gracePeriod=2 Mar 10 09:48:54 crc kubenswrapper[4794]: I0310 09:48:54.946698 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.025536 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cx8l7" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.218258 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b52gv"] Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.218842 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b52gv" podUID="2c891e53-bebe-462e-a924-5073338f2ac1" containerName="registry-server" containerID="cri-o://c46b369ac1122e647a4d3c490b09786e921f9680d9fb069180811f496c1c4656" gracePeriod=2 Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.233783 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.406864 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.430464 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-utilities\") pod \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\" (UID: \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\") " Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.430532 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-catalog-content\") pod \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\" (UID: \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\") " Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.430590 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng2zz\" (UniqueName: \"kubernetes.io/projected/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-kube-api-access-ng2zz\") pod \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\" (UID: \"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e\") " Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.431487 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-utilities" (OuterVolumeSpecName: "utilities") pod "dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e" (UID: "dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.439244 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-kube-api-access-ng2zz" (OuterVolumeSpecName: "kube-api-access-ng2zz") pod "dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e" (UID: "dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e"). InnerVolumeSpecName "kube-api-access-ng2zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.451968 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.484129 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e" (UID: "dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.531986 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.532024 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.532037 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng2zz\" (UniqueName: \"kubernetes.io/projected/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e-kube-api-access-ng2zz\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.636038 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b52gv" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.700492 4794 generic.go:334] "Generic (PLEG): container finished" podID="dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e" containerID="0dc9d68e9e86dea775383b389ff07056b9d9668f7fbb7c1a316d24644f0897bb" exitCode=0 Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.700576 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4d8mk" event={"ID":"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e","Type":"ContainerDied","Data":"0dc9d68e9e86dea775383b389ff07056b9d9668f7fbb7c1a316d24644f0897bb"} Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.700600 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4d8mk" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.700624 4794 scope.go:117] "RemoveContainer" containerID="0dc9d68e9e86dea775383b389ff07056b9d9668f7fbb7c1a316d24644f0897bb" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.700610 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4d8mk" event={"ID":"dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e","Type":"ContainerDied","Data":"f37bec18dcebcae034b8717b756e3dde951466d81babf4babaeba83aeb10a777"} Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.704753 4794 generic.go:334] "Generic (PLEG): container finished" podID="2c891e53-bebe-462e-a924-5073338f2ac1" containerID="c46b369ac1122e647a4d3c490b09786e921f9680d9fb069180811f496c1c4656" exitCode=0 Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.704789 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b52gv" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.704815 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b52gv" event={"ID":"2c891e53-bebe-462e-a924-5073338f2ac1","Type":"ContainerDied","Data":"c46b369ac1122e647a4d3c490b09786e921f9680d9fb069180811f496c1c4656"} Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.705123 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b52gv" event={"ID":"2c891e53-bebe-462e-a924-5073338f2ac1","Type":"ContainerDied","Data":"3cabe3c0f008fd383116e851a9bb8a57b46f76458599dae3cdf888ad301adf70"} Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.717601 4794 scope.go:117] "RemoveContainer" containerID="3e150bfb91adedfa4aa0f3b8935534f2fd60a447dd207906377db0be8f5854ce" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.742017 4794 scope.go:117] "RemoveContainer" containerID="97bb64ed647073b0a6b564113a9baad591fa4a2e5eb7904028e056bf5b4cc925" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.747296 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4d8mk"] Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.751532 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4d8mk"] Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.756619 4794 scope.go:117] "RemoveContainer" containerID="0dc9d68e9e86dea775383b389ff07056b9d9668f7fbb7c1a316d24644f0897bb" Mar 10 09:48:55 crc kubenswrapper[4794]: E0310 09:48:55.757035 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc9d68e9e86dea775383b389ff07056b9d9668f7fbb7c1a316d24644f0897bb\": container with ID starting with 0dc9d68e9e86dea775383b389ff07056b9d9668f7fbb7c1a316d24644f0897bb not found: ID does not exist" containerID="0dc9d68e9e86dea775383b389ff07056b9d9668f7fbb7c1a316d24644f0897bb" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.757097 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dc9d68e9e86dea775383b389ff07056b9d9668f7fbb7c1a316d24644f0897bb"} err="failed to get container status \"0dc9d68e9e86dea775383b389ff07056b9d9668f7fbb7c1a316d24644f0897bb\": rpc error: code = NotFound desc = could not find container \"0dc9d68e9e86dea775383b389ff07056b9d9668f7fbb7c1a316d24644f0897bb\": container with ID starting with 0dc9d68e9e86dea775383b389ff07056b9d9668f7fbb7c1a316d24644f0897bb not found: ID does not exist" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.757125 4794 scope.go:117] "RemoveContainer" containerID="3e150bfb91adedfa4aa0f3b8935534f2fd60a447dd207906377db0be8f5854ce" Mar 10 09:48:55 crc kubenswrapper[4794]: E0310 09:48:55.757461 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e150bfb91adedfa4aa0f3b8935534f2fd60a447dd207906377db0be8f5854ce\": container with ID starting with 3e150bfb91adedfa4aa0f3b8935534f2fd60a447dd207906377db0be8f5854ce not found: ID does not exist" containerID="3e150bfb91adedfa4aa0f3b8935534f2fd60a447dd207906377db0be8f5854ce" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.757506 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e150bfb91adedfa4aa0f3b8935534f2fd60a447dd207906377db0be8f5854ce"} err="failed to 
get container status \"3e150bfb91adedfa4aa0f3b8935534f2fd60a447dd207906377db0be8f5854ce\": rpc error: code = NotFound desc = could not find container \"3e150bfb91adedfa4aa0f3b8935534f2fd60a447dd207906377db0be8f5854ce\": container with ID starting with 3e150bfb91adedfa4aa0f3b8935534f2fd60a447dd207906377db0be8f5854ce not found: ID does not exist" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.757545 4794 scope.go:117] "RemoveContainer" containerID="97bb64ed647073b0a6b564113a9baad591fa4a2e5eb7904028e056bf5b4cc925" Mar 10 09:48:55 crc kubenswrapper[4794]: E0310 09:48:55.757852 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97bb64ed647073b0a6b564113a9baad591fa4a2e5eb7904028e056bf5b4cc925\": container with ID starting with 97bb64ed647073b0a6b564113a9baad591fa4a2e5eb7904028e056bf5b4cc925 not found: ID does not exist" containerID="97bb64ed647073b0a6b564113a9baad591fa4a2e5eb7904028e056bf5b4cc925" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.757883 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97bb64ed647073b0a6b564113a9baad591fa4a2e5eb7904028e056bf5b4cc925"} err="failed to get container status \"97bb64ed647073b0a6b564113a9baad591fa4a2e5eb7904028e056bf5b4cc925\": rpc error: code = NotFound desc = could not find container \"97bb64ed647073b0a6b564113a9baad591fa4a2e5eb7904028e056bf5b4cc925\": container with ID starting with 97bb64ed647073b0a6b564113a9baad591fa4a2e5eb7904028e056bf5b4cc925 not found: ID does not exist" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.757907 4794 scope.go:117] "RemoveContainer" containerID="c46b369ac1122e647a4d3c490b09786e921f9680d9fb069180811f496c1c4656" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.768834 4794 scope.go:117] "RemoveContainer" containerID="bb53f72d4d5cd4f613068d335dcc9e3823324ab901f9447640f0080d7ef21d06" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.792611 4794 scope.go:117] "RemoveContainer" containerID="04fb598ca8708bd8249618aed32ed8191b905b395b2871f5b4d7d57e0a7cbceb" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.811005 4794 scope.go:117] "RemoveContainer" containerID="c46b369ac1122e647a4d3c490b09786e921f9680d9fb069180811f496c1c4656" Mar 10 09:48:55 crc kubenswrapper[4794]: E0310 09:48:55.811358 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c46b369ac1122e647a4d3c490b09786e921f9680d9fb069180811f496c1c4656\": container with ID starting with c46b369ac1122e647a4d3c490b09786e921f9680d9fb069180811f496c1c4656 not found: ID does not exist" containerID="c46b369ac1122e647a4d3c490b09786e921f9680d9fb069180811f496c1c4656" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.811393 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c46b369ac1122e647a4d3c490b09786e921f9680d9fb069180811f496c1c4656"} err="failed to get container status \"c46b369ac1122e647a4d3c490b09786e921f9680d9fb069180811f496c1c4656\": rpc error: code = NotFound desc = could not find container \"c46b369ac1122e647a4d3c490b09786e921f9680d9fb069180811f496c1c4656\": container with ID starting with c46b369ac1122e647a4d3c490b09786e921f9680d9fb069180811f496c1c4656 not found: ID does not exist" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.811419 4794 scope.go:117] "RemoveContainer" containerID="bb53f72d4d5cd4f613068d335dcc9e3823324ab901f9447640f0080d7ef21d06" Mar 10 09:48:55 
crc kubenswrapper[4794]: E0310 09:48:55.811661 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb53f72d4d5cd4f613068d335dcc9e3823324ab901f9447640f0080d7ef21d06\": container with ID starting with bb53f72d4d5cd4f613068d335dcc9e3823324ab901f9447640f0080d7ef21d06 not found: ID does not exist" containerID="bb53f72d4d5cd4f613068d335dcc9e3823324ab901f9447640f0080d7ef21d06" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.811686 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb53f72d4d5cd4f613068d335dcc9e3823324ab901f9447640f0080d7ef21d06"} err="failed to get container status \"bb53f72d4d5cd4f613068d335dcc9e3823324ab901f9447640f0080d7ef21d06\": rpc error: code = NotFound desc = could not find container \"bb53f72d4d5cd4f613068d335dcc9e3823324ab901f9447640f0080d7ef21d06\": container with ID starting with bb53f72d4d5cd4f613068d335dcc9e3823324ab901f9447640f0080d7ef21d06 not found: ID does not exist" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.811703 4794 scope.go:117] "RemoveContainer" containerID="04fb598ca8708bd8249618aed32ed8191b905b395b2871f5b4d7d57e0a7cbceb" Mar 10 09:48:55 crc kubenswrapper[4794]: E0310 09:48:55.811951 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04fb598ca8708bd8249618aed32ed8191b905b395b2871f5b4d7d57e0a7cbceb\": container with ID starting with 04fb598ca8708bd8249618aed32ed8191b905b395b2871f5b4d7d57e0a7cbceb not found: ID does not exist" containerID="04fb598ca8708bd8249618aed32ed8191b905b395b2871f5b4d7d57e0a7cbceb" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.811971 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04fb598ca8708bd8249618aed32ed8191b905b395b2871f5b4d7d57e0a7cbceb"} err="failed to get container status \"04fb598ca8708bd8249618aed32ed8191b905b395b2871f5b4d7d57e0a7cbceb\": rpc error: code = NotFound desc = could not find container \"04fb598ca8708bd8249618aed32ed8191b905b395b2871f5b4d7d57e0a7cbceb\": container with ID starting with 04fb598ca8708bd8249618aed32ed8191b905b395b2871f5b4d7d57e0a7cbceb not found: ID does not exist" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.834521 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c891e53-bebe-462e-a924-5073338f2ac1-utilities\") pod \"2c891e53-bebe-462e-a924-5073338f2ac1\" (UID: \"2c891e53-bebe-462e-a924-5073338f2ac1\") " Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.834617 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftt2l\" (UniqueName: \"kubernetes.io/projected/2c891e53-bebe-462e-a924-5073338f2ac1-kube-api-access-ftt2l\") pod \"2c891e53-bebe-462e-a924-5073338f2ac1\" (UID: \"2c891e53-bebe-462e-a924-5073338f2ac1\") " Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.834667 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c891e53-bebe-462e-a924-5073338f2ac1-catalog-content\") pod \"2c891e53-bebe-462e-a924-5073338f2ac1\" (UID: \"2c891e53-bebe-462e-a924-5073338f2ac1\") " Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.836983 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2c891e53-bebe-462e-a924-5073338f2ac1-kube-api-access-ftt2l" (OuterVolumeSpecName: "kube-api-access-ftt2l") pod "2c891e53-bebe-462e-a924-5073338f2ac1" (UID: "2c891e53-bebe-462e-a924-5073338f2ac1"). InnerVolumeSpecName "kube-api-access-ftt2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.840934 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c891e53-bebe-462e-a924-5073338f2ac1-utilities" (OuterVolumeSpecName: "utilities") pod "2c891e53-bebe-462e-a924-5073338f2ac1" (UID: "2c891e53-bebe-462e-a924-5073338f2ac1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.897957 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c891e53-bebe-462e-a924-5073338f2ac1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c891e53-bebe-462e-a924-5073338f2ac1" (UID: "2c891e53-bebe-462e-a924-5073338f2ac1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.936932 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c891e53-bebe-462e-a924-5073338f2ac1-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.936985 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftt2l\" (UniqueName: \"kubernetes.io/projected/2c891e53-bebe-462e-a924-5073338f2ac1-kube-api-access-ftt2l\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:55 crc kubenswrapper[4794]: I0310 09:48:55.937006 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c891e53-bebe-462e-a924-5073338f2ac1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:56 crc kubenswrapper[4794]: I0310 09:48:56.008145 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e" path="/var/lib/kubelet/pods/dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e/volumes" Mar 10 09:48:56 crc kubenswrapper[4794]: I0310 09:48:56.047313 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b52gv"] Mar 10 09:48:56 crc kubenswrapper[4794]: I0310 09:48:56.057002 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b52gv"] Mar 10 09:48:57 crc kubenswrapper[4794]: I0310 09:48:57.615379 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rq6sq"] Mar 10 09:48:57 crc kubenswrapper[4794]: I0310 09:48:57.616204 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rq6sq" podUID="6840b6f1-4520-4c02-9f69-b238ac692ae5" containerName="registry-server" containerID="cri-o://aaa17f102fefbb220bf3e023f4365c32450404156e29043b0a751280ebfe94b2" gracePeriod=2 Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.007178 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c891e53-bebe-462e-a924-5073338f2ac1" path="/var/lib/kubelet/pods/2c891e53-bebe-462e-a924-5073338f2ac1/volumes" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.052936 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.167747 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj8sg\" (UniqueName: \"kubernetes.io/projected/6840b6f1-4520-4c02-9f69-b238ac692ae5-kube-api-access-fj8sg\") pod \"6840b6f1-4520-4c02-9f69-b238ac692ae5\" (UID: \"6840b6f1-4520-4c02-9f69-b238ac692ae5\") " Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.168041 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6840b6f1-4520-4c02-9f69-b238ac692ae5-catalog-content\") pod \"6840b6f1-4520-4c02-9f69-b238ac692ae5\" (UID: \"6840b6f1-4520-4c02-9f69-b238ac692ae5\") " Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.168212 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6840b6f1-4520-4c02-9f69-b238ac692ae5-utilities\") pod \"6840b6f1-4520-4c02-9f69-b238ac692ae5\" (UID: \"6840b6f1-4520-4c02-9f69-b238ac692ae5\") " Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.169042 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6840b6f1-4520-4c02-9f69-b238ac692ae5-utilities" (OuterVolumeSpecName: "utilities") pod "6840b6f1-4520-4c02-9f69-b238ac692ae5" (UID: "6840b6f1-4520-4c02-9f69-b238ac692ae5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.169441 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6840b6f1-4520-4c02-9f69-b238ac692ae5-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.173263 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6840b6f1-4520-4c02-9f69-b238ac692ae5-kube-api-access-fj8sg" (OuterVolumeSpecName: "kube-api-access-fj8sg") pod "6840b6f1-4520-4c02-9f69-b238ac692ae5" (UID: "6840b6f1-4520-4c02-9f69-b238ac692ae5"). InnerVolumeSpecName "kube-api-access-fj8sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.270830 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj8sg\" (UniqueName: \"kubernetes.io/projected/6840b6f1-4520-4c02-9f69-b238ac692ae5-kube-api-access-fj8sg\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.362062 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6840b6f1-4520-4c02-9f69-b238ac692ae5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6840b6f1-4520-4c02-9f69-b238ac692ae5" (UID: "6840b6f1-4520-4c02-9f69-b238ac692ae5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.372064 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6840b6f1-4520-4c02-9f69-b238ac692ae5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.732995 4794 generic.go:334] "Generic (PLEG): container finished" podID="6840b6f1-4520-4c02-9f69-b238ac692ae5" containerID="aaa17f102fefbb220bf3e023f4365c32450404156e29043b0a751280ebfe94b2" exitCode=0 Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.733122 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rq6sq" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.733137 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rq6sq" event={"ID":"6840b6f1-4520-4c02-9f69-b238ac692ae5","Type":"ContainerDied","Data":"aaa17f102fefbb220bf3e023f4365c32450404156e29043b0a751280ebfe94b2"} Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.733637 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rq6sq" event={"ID":"6840b6f1-4520-4c02-9f69-b238ac692ae5","Type":"ContainerDied","Data":"3ecf087b67f87a48241f472f7540e4d812b711a2b66ba25a3c2034d5ca0aeaa6"} Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.733694 4794 scope.go:117] "RemoveContainer" containerID="aaa17f102fefbb220bf3e023f4365c32450404156e29043b0a751280ebfe94b2" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.760106 4794 scope.go:117] "RemoveContainer" containerID="df76e4024f94af4d9ef41e07bbfdaf7e85fa0ff7446fdb21859d29906b43a317" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.790722 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rq6sq"] Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.798076 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rq6sq"] Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.812258 4794 scope.go:117] "RemoveContainer" containerID="f349060de8d52fc556856175ffbaa35343781ed37addfad8d27136bb86351d17" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.832137 4794 scope.go:117] "RemoveContainer" containerID="aaa17f102fefbb220bf3e023f4365c32450404156e29043b0a751280ebfe94b2" Mar 10 09:48:58 crc kubenswrapper[4794]: E0310 09:48:58.832755 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaa17f102fefbb220bf3e023f4365c32450404156e29043b0a751280ebfe94b2\": container with ID starting with aaa17f102fefbb220bf3e023f4365c32450404156e29043b0a751280ebfe94b2 not found: ID does not exist" containerID="aaa17f102fefbb220bf3e023f4365c32450404156e29043b0a751280ebfe94b2" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.832900 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaa17f102fefbb220bf3e023f4365c32450404156e29043b0a751280ebfe94b2"} err="failed to get container status \"aaa17f102fefbb220bf3e023f4365c32450404156e29043b0a751280ebfe94b2\": rpc error: code = NotFound desc = could not find container \"aaa17f102fefbb220bf3e023f4365c32450404156e29043b0a751280ebfe94b2\": container with ID starting with aaa17f102fefbb220bf3e023f4365c32450404156e29043b0a751280ebfe94b2 not found: ID does not exist" Mar 10 09:48:58 crc 
kubenswrapper[4794]: I0310 09:48:58.833006 4794 scope.go:117] "RemoveContainer" containerID="df76e4024f94af4d9ef41e07bbfdaf7e85fa0ff7446fdb21859d29906b43a317" Mar 10 09:48:58 crc kubenswrapper[4794]: E0310 09:48:58.833486 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df76e4024f94af4d9ef41e07bbfdaf7e85fa0ff7446fdb21859d29906b43a317\": container with ID starting with df76e4024f94af4d9ef41e07bbfdaf7e85fa0ff7446fdb21859d29906b43a317 not found: ID does not exist" containerID="df76e4024f94af4d9ef41e07bbfdaf7e85fa0ff7446fdb21859d29906b43a317" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.833558 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df76e4024f94af4d9ef41e07bbfdaf7e85fa0ff7446fdb21859d29906b43a317"} err="failed to get container status \"df76e4024f94af4d9ef41e07bbfdaf7e85fa0ff7446fdb21859d29906b43a317\": rpc error: code = NotFound desc = could not find container \"df76e4024f94af4d9ef41e07bbfdaf7e85fa0ff7446fdb21859d29906b43a317\": container with ID starting with df76e4024f94af4d9ef41e07bbfdaf7e85fa0ff7446fdb21859d29906b43a317 not found: ID does not exist" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.833601 4794 scope.go:117] "RemoveContainer" containerID="f349060de8d52fc556856175ffbaa35343781ed37addfad8d27136bb86351d17" Mar 10 09:48:58 crc kubenswrapper[4794]: E0310 09:48:58.833952 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f349060de8d52fc556856175ffbaa35343781ed37addfad8d27136bb86351d17\": container with ID starting with f349060de8d52fc556856175ffbaa35343781ed37addfad8d27136bb86351d17 not found: ID does not exist" containerID="f349060de8d52fc556856175ffbaa35343781ed37addfad8d27136bb86351d17" Mar 10 09:48:58 crc kubenswrapper[4794]: I0310 09:48:58.834062 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f349060de8d52fc556856175ffbaa35343781ed37addfad8d27136bb86351d17"} err="failed to get container status \"f349060de8d52fc556856175ffbaa35343781ed37addfad8d27136bb86351d17\": rpc error: code = NotFound desc = could not find container \"f349060de8d52fc556856175ffbaa35343781ed37addfad8d27136bb86351d17\": container with ID starting with f349060de8d52fc556856175ffbaa35343781ed37addfad8d27136bb86351d17 not found: ID does not exist" Mar 10 09:49:00 crc kubenswrapper[4794]: I0310 09:49:00.008829 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6840b6f1-4520-4c02-9f69-b238ac692ae5" path="/var/lib/kubelet/pods/6840b6f1-4520-4c02-9f69-b238ac692ae5/volumes" Mar 10 09:49:00 crc kubenswrapper[4794]: I0310 09:49:00.362741 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-669b859c78-wk8bw"] Mar 10 09:49:00 crc kubenswrapper[4794]: I0310 09:49:00.363014 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" podUID="24eecd7d-54cc-4497-b27e-2cc495e920dc" containerName="controller-manager" containerID="cri-o://9dacf379e2d82e2d5de83578f26f292442a8ee8a6dced36feb94a12262d565cd" gracePeriod=30 Mar 10 09:49:00 crc kubenswrapper[4794]: I0310 09:49:00.372943 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm"] Mar 10 09:49:00 crc kubenswrapper[4794]: I0310 09:49:00.373190 4794 
Mar 10 09:49:00 crc kubenswrapper[4794]: I0310 09:49:00.747693 4794 generic.go:334] "Generic (PLEG): container finished" podID="5527923f-a4af-4a62-833c-b45d251e581e" containerID="aa220af4b82dfcfafe4c7f4c0a728b752c841a821bce65022f675cbe0e6e5a5e" exitCode=0
Mar 10 09:49:00 crc kubenswrapper[4794]: I0310 09:49:00.747847 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" event={"ID":"5527923f-a4af-4a62-833c-b45d251e581e","Type":"ContainerDied","Data":"aa220af4b82dfcfafe4c7f4c0a728b752c841a821bce65022f675cbe0e6e5a5e"}
Mar 10 09:49:00 crc kubenswrapper[4794]: I0310 09:49:00.750963 4794 generic.go:334] "Generic (PLEG): container finished" podID="24eecd7d-54cc-4497-b27e-2cc495e920dc" containerID="9dacf379e2d82e2d5de83578f26f292442a8ee8a6dced36feb94a12262d565cd" exitCode=0
Mar 10 09:49:00 crc kubenswrapper[4794]: I0310 09:49:00.751018 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" event={"ID":"24eecd7d-54cc-4497-b27e-2cc495e920dc","Type":"ContainerDied","Data":"9dacf379e2d82e2d5de83578f26f292442a8ee8a6dced36feb94a12262d565cd"}
Mar 10 09:49:00 crc kubenswrapper[4794]: I0310 09:49:00.865741 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm"
Mar 10 09:49:00 crc kubenswrapper[4794]: I0310 09:49:00.986748 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw"
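Note: "Killing container with a grace period ... gracePeriod=30" at 09:49:00.36, followed ~0.4s later by "container finished ... exitCode=0", shows both controller managers exiting cleanly on SIGTERM well inside the 30-second grace window, so the kubelet never had to escalate to SIGKILL. A standalone Go sketch of the same term-then-kill pattern applied to a local child process (the sleep command and the 3-second grace period are illustrative choices, not values from this log):

    package main

    import (
        "fmt"
        "os/exec"
        "syscall"
        "time"
    )

    func main() {
        cmd := exec.Command("sleep", "60")
        if err := cmd.Start(); err != nil {
            panic(err)
        }

        done := make(chan error, 1)
        go func() { done <- cmd.Wait() }()

        // Ask politely first, as the kubelet does at the start of the grace period.
        cmd.Process.Signal(syscall.SIGTERM)

        select {
        case err := <-done:
            fmt.Println("exited within grace period:", err)
        case <-time.After(3 * time.Second): // stand-in for gracePeriod=30
            cmd.Process.Kill() // escalate to SIGKILL
            fmt.Println("grace period expired; killed:", <-done)
        }
    }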
Need to start a new one" pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.012895 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7mzb\" (UniqueName: \"kubernetes.io/projected/5527923f-a4af-4a62-833c-b45d251e581e-kube-api-access-c7mzb\") pod \"5527923f-a4af-4a62-833c-b45d251e581e\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.013119 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5527923f-a4af-4a62-833c-b45d251e581e-client-ca\") pod \"5527923f-a4af-4a62-833c-b45d251e581e\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.013188 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5527923f-a4af-4a62-833c-b45d251e581e-config\") pod \"5527923f-a4af-4a62-833c-b45d251e581e\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.013256 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5527923f-a4af-4a62-833c-b45d251e581e-serving-cert\") pod \"5527923f-a4af-4a62-833c-b45d251e581e\" (UID: \"5527923f-a4af-4a62-833c-b45d251e581e\") " Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.013914 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5527923f-a4af-4a62-833c-b45d251e581e-client-ca" (OuterVolumeSpecName: "client-ca") pod "5527923f-a4af-4a62-833c-b45d251e581e" (UID: "5527923f-a4af-4a62-833c-b45d251e581e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.013933 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5527923f-a4af-4a62-833c-b45d251e581e-config" (OuterVolumeSpecName: "config") pod "5527923f-a4af-4a62-833c-b45d251e581e" (UID: "5527923f-a4af-4a62-833c-b45d251e581e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.014092 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5527923f-a4af-4a62-833c-b45d251e581e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.014125 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5527923f-a4af-4a62-833c-b45d251e581e-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.020908 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5527923f-a4af-4a62-833c-b45d251e581e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5527923f-a4af-4a62-833c-b45d251e581e" (UID: "5527923f-a4af-4a62-833c-b45d251e581e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.021508 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5527923f-a4af-4a62-833c-b45d251e581e-kube-api-access-c7mzb" (OuterVolumeSpecName: "kube-api-access-c7mzb") pod "5527923f-a4af-4a62-833c-b45d251e581e" (UID: "5527923f-a4af-4a62-833c-b45d251e581e"). InnerVolumeSpecName "kube-api-access-c7mzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.115045 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmvwl\" (UniqueName: \"kubernetes.io/projected/24eecd7d-54cc-4497-b27e-2cc495e920dc-kube-api-access-kmvwl\") pod \"24eecd7d-54cc-4497-b27e-2cc495e920dc\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.115123 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24eecd7d-54cc-4497-b27e-2cc495e920dc-serving-cert\") pod \"24eecd7d-54cc-4497-b27e-2cc495e920dc\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.115219 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-proxy-ca-bundles\") pod \"24eecd7d-54cc-4497-b27e-2cc495e920dc\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.116054 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "24eecd7d-54cc-4497-b27e-2cc495e920dc" (UID: "24eecd7d-54cc-4497-b27e-2cc495e920dc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.116098 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-client-ca\") pod \"24eecd7d-54cc-4497-b27e-2cc495e920dc\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.116130 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-config\") pod \"24eecd7d-54cc-4497-b27e-2cc495e920dc\" (UID: \"24eecd7d-54cc-4497-b27e-2cc495e920dc\") " Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.116188 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-client-ca" (OuterVolumeSpecName: "client-ca") pod "24eecd7d-54cc-4497-b27e-2cc495e920dc" (UID: "24eecd7d-54cc-4497-b27e-2cc495e920dc"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.116602 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.116633 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.116648 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5527923f-a4af-4a62-833c-b45d251e581e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.116660 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7mzb\" (UniqueName: \"kubernetes.io/projected/5527923f-a4af-4a62-833c-b45d251e581e-kube-api-access-c7mzb\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.116911 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-config" (OuterVolumeSpecName: "config") pod "24eecd7d-54cc-4497-b27e-2cc495e920dc" (UID: "24eecd7d-54cc-4497-b27e-2cc495e920dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.118057 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24eecd7d-54cc-4497-b27e-2cc495e920dc-kube-api-access-kmvwl" (OuterVolumeSpecName: "kube-api-access-kmvwl") pod "24eecd7d-54cc-4497-b27e-2cc495e920dc" (UID: "24eecd7d-54cc-4497-b27e-2cc495e920dc"). InnerVolumeSpecName "kube-api-access-kmvwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.118548 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24eecd7d-54cc-4497-b27e-2cc495e920dc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "24eecd7d-54cc-4497-b27e-2cc495e920dc" (UID: "24eecd7d-54cc-4497-b27e-2cc495e920dc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.217431 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24eecd7d-54cc-4497-b27e-2cc495e920dc-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.217492 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmvwl\" (UniqueName: \"kubernetes.io/projected/24eecd7d-54cc-4497-b27e-2cc495e920dc-kube-api-access-kmvwl\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.217520 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24eecd7d-54cc-4497-b27e-2cc495e920dc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.629796 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"] Mar 10 09:49:01 crc kubenswrapper[4794]: E0310 09:49:01.630141 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e" containerName="extract-utilities" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630167 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e" containerName="extract-utilities" Mar 10 09:49:01 crc kubenswrapper[4794]: E0310 09:49:01.630195 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" containerName="extract-content" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630209 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" containerName="extract-content" Mar 10 09:49:01 crc kubenswrapper[4794]: E0310 09:49:01.630229 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c891e53-bebe-462e-a924-5073338f2ac1" containerName="extract-content" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630243 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c891e53-bebe-462e-a924-5073338f2ac1" containerName="extract-content" Mar 10 09:49:01 crc kubenswrapper[4794]: E0310 09:49:01.630263 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6840b6f1-4520-4c02-9f69-b238ac692ae5" containerName="extract-content" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630275 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6840b6f1-4520-4c02-9f69-b238ac692ae5" containerName="extract-content" Mar 10 09:49:01 crc kubenswrapper[4794]: E0310 09:49:01.630293 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e" containerName="extract-content" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630305 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e" containerName="extract-content" Mar 10 09:49:01 crc kubenswrapper[4794]: E0310 09:49:01.630322 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6840b6f1-4520-4c02-9f69-b238ac692ae5" containerName="registry-server" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630363 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6840b6f1-4520-4c02-9f69-b238ac692ae5" containerName="registry-server" Mar 10 09:49:01 crc kubenswrapper[4794]: E0310 09:49:01.630382 4794 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6840b6f1-4520-4c02-9f69-b238ac692ae5" containerName="extract-utilities" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630394 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6840b6f1-4520-4c02-9f69-b238ac692ae5" containerName="extract-utilities" Mar 10 09:49:01 crc kubenswrapper[4794]: E0310 09:49:01.630409 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c891e53-bebe-462e-a924-5073338f2ac1" containerName="registry-server" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630424 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c891e53-bebe-462e-a924-5073338f2ac1" containerName="registry-server" Mar 10 09:49:01 crc kubenswrapper[4794]: E0310 09:49:01.630440 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" containerName="extract-utilities" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630455 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" containerName="extract-utilities" Mar 10 09:49:01 crc kubenswrapper[4794]: E0310 09:49:01.630473 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c891e53-bebe-462e-a924-5073338f2ac1" containerName="extract-utilities" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630486 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c891e53-bebe-462e-a924-5073338f2ac1" containerName="extract-utilities" Mar 10 09:49:01 crc kubenswrapper[4794]: E0310 09:49:01.630505 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5527923f-a4af-4a62-833c-b45d251e581e" containerName="route-controller-manager" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630518 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5527923f-a4af-4a62-833c-b45d251e581e" containerName="route-controller-manager" Mar 10 09:49:01 crc kubenswrapper[4794]: E0310 09:49:01.630539 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e" containerName="registry-server" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630551 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc6e5973-dae4-42b4-aae1-44fd0f6e1c3e" containerName="registry-server" Mar 10 09:49:01 crc kubenswrapper[4794]: E0310 09:49:01.630563 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" containerName="registry-server" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630575 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" containerName="registry-server" Mar 10 09:49:01 crc kubenswrapper[4794]: E0310 09:49:01.630589 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24eecd7d-54cc-4497-b27e-2cc495e920dc" containerName="controller-manager" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630601 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="24eecd7d-54cc-4497-b27e-2cc495e920dc" containerName="controller-manager" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630789 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b8b5f3-501a-44fa-8f78-ceaefcaecfdd" containerName="registry-server" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630809 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="24eecd7d-54cc-4497-b27e-2cc495e920dc" containerName="controller-manager" Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630845 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5527923f-a4af-4a62-833c-b45d251e581e" containerName="route-controller-manager"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630863 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c891e53-bebe-462e-a924-5073338f2ac1" containerName="registry-server"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.630882 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6840b6f1-4520-4c02-9f69-b238ac692ae5" containerName="registry-server"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.631507 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.634960 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-fc4875744-s7tsn"]
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.635883 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.655368 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"]
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.692899 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fc4875744-s7tsn"]
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.723843 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-client-ca\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.723895 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-config\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.724043 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f42205af-8717-4d47-88a2-846e3cf4d48f-serving-cert\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.724100 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwlzr\" (UniqueName: \"kubernetes.io/projected/f42205af-8717-4d47-88a2-846e3cf4d48f-kube-api-access-vwlzr\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.724123 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-proxy-ca-bundles\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.757398 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm" event={"ID":"5527923f-a4af-4a62-833c-b45d251e581e","Type":"ContainerDied","Data":"b6344670af9be7d2f10ed1ba90fb834873e68427df5366412328475b91faed38"}
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.757468 4794 scope.go:117] "RemoveContainer" containerID="aa220af4b82dfcfafe4c7f4c0a728b752c841a821bce65022f675cbe0e6e5a5e"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.757594 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.764695 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw" event={"ID":"24eecd7d-54cc-4497-b27e-2cc495e920dc","Type":"ContainerDied","Data":"39c6c10c8106392f34859a650b62de9c8c57c7afcebe8d88ec75238c5bb102ab"}
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.764779 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-669b859c78-wk8bw"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.787781 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm"]
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.793726 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86f487c7f-b8zhm"]
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.793840 4794 scope.go:117] "RemoveContainer" containerID="9dacf379e2d82e2d5de83578f26f292442a8ee8a6dced36feb94a12262d565cd"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.798349 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-669b859c78-wk8bw"]
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.802712 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-669b859c78-wk8bw"]
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.824651 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwlzr\" (UniqueName: \"kubernetes.io/projected/f42205af-8717-4d47-88a2-846e3cf4d48f-kube-api-access-vwlzr\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.824697 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-proxy-ca-bundles\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.824757 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-client-ca\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.824781 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0cdbf97-982e-4ec5-876f-81c34f1233bd-config\") pod \"route-controller-manager-6f5cd6dbf6-8fz7v\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.824798 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-config\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.824815 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0cdbf97-982e-4ec5-876f-81c34f1233bd-serving-cert\") pod \"route-controller-manager-6f5cd6dbf6-8fz7v\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.824862 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg4q9\" (UniqueName: \"kubernetes.io/projected/c0cdbf97-982e-4ec5-876f-81c34f1233bd-kube-api-access-qg4q9\") pod \"route-controller-manager-6f5cd6dbf6-8fz7v\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.824882 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f42205af-8717-4d47-88a2-846e3cf4d48f-serving-cert\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.824900 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0cdbf97-982e-4ec5-876f-81c34f1233bd-client-ca\") pod \"route-controller-manager-6f5cd6dbf6-8fz7v\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.826122 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-proxy-ca-bundles\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.826232 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-config\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.826632 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-client-ca\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.840437 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f42205af-8717-4d47-88a2-846e3cf4d48f-serving-cert\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.846748 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwlzr\" (UniqueName: \"kubernetes.io/projected/f42205af-8717-4d47-88a2-846e3cf4d48f-kube-api-access-vwlzr\") pod \"controller-manager-fc4875744-s7tsn\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.926385 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0cdbf97-982e-4ec5-876f-81c34f1233bd-client-ca\") pod \"route-controller-manager-6f5cd6dbf6-8fz7v\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.926525 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0cdbf97-982e-4ec5-876f-81c34f1233bd-config\") pod \"route-controller-manager-6f5cd6dbf6-8fz7v\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.926555 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0cdbf97-982e-4ec5-876f-81c34f1233bd-serving-cert\") pod \"route-controller-manager-6f5cd6dbf6-8fz7v\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.926620 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg4q9\" (UniqueName: \"kubernetes.io/projected/c0cdbf97-982e-4ec5-876f-81c34f1233bd-kube-api-access-qg4q9\") pod \"route-controller-manager-6f5cd6dbf6-8fz7v\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.927877 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0cdbf97-982e-4ec5-876f-81c34f1233bd-client-ca\") pod \"route-controller-manager-6f5cd6dbf6-8fz7v\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.929172 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0cdbf97-982e-4ec5-876f-81c34f1233bd-config\") pod \"route-controller-manager-6f5cd6dbf6-8fz7v\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.934309 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0cdbf97-982e-4ec5-876f-81c34f1233bd-serving-cert\") pod \"route-controller-manager-6f5cd6dbf6-8fz7v\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.951537 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg4q9\" (UniqueName: \"kubernetes.io/projected/c0cdbf97-982e-4ec5-876f-81c34f1233bd-kube-api-access-qg4q9\") pod \"route-controller-manager-6f5cd6dbf6-8fz7v\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.962299 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:01 crc kubenswrapper[4794]: I0310 09:49:01.973838 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn"
Need to start a new one" pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn" Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.004290 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24eecd7d-54cc-4497-b27e-2cc495e920dc" path="/var/lib/kubelet/pods/24eecd7d-54cc-4497-b27e-2cc495e920dc/volumes" Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.005028 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5527923f-a4af-4a62-833c-b45d251e581e" path="/var/lib/kubelet/pods/5527923f-a4af-4a62-833c-b45d251e581e/volumes" Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.144087 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"] Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.204771 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fc4875744-s7tsn"] Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.772237 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v" event={"ID":"c0cdbf97-982e-4ec5-876f-81c34f1233bd","Type":"ContainerStarted","Data":"8b2b83c5135b4c86f50d23fecbfe18a99d36b7c57ddad61bcfc75ab45b8ce0c5"} Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.772584 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v" event={"ID":"c0cdbf97-982e-4ec5-876f-81c34f1233bd","Type":"ContainerStarted","Data":"77e756f68ebfeeccec059b34a443d931d75dba505212b104e8e9f89adf3b3e5f"} Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.772609 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v" Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.777078 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn" event={"ID":"f42205af-8717-4d47-88a2-846e3cf4d48f","Type":"ContainerStarted","Data":"49f7887bb8bee694f2122a2edc1e9d86d6e28a93be04605113654c393dfd403e"} Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.777137 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn" event={"ID":"f42205af-8717-4d47-88a2-846e3cf4d48f","Type":"ContainerStarted","Data":"2e22afaeef82f3512a11753f05a9c85ba137dd8dd208dad23e3fcfc10db1f871"} Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.777417 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn" Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.782923 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn" Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.826404 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn" podStartSLOduration=2.826383691 podStartE2EDuration="2.826383691s" podCreationTimestamp="2026-03-10 09:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:49:02.824025466 +0000 UTC m=+291.580196294" watchObservedRunningTime="2026-03-10 
Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.827016 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v" podStartSLOduration=2.827009084 podStartE2EDuration="2.827009084s" podCreationTimestamp="2026-03-10 09:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:49:02.801620896 +0000 UTC m=+291.557791734" watchObservedRunningTime="2026-03-10 09:49:02.827009084 +0000 UTC m=+291.583179922"
Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.859810 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"
Mar 10 09:49:02 crc kubenswrapper[4794]: I0310 09:49:02.895224 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n94r9"]
Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.336844 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-fc4875744-s7tsn"]
Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.337811 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn" podUID="f42205af-8717-4d47-88a2-846e3cf4d48f" containerName="controller-manager" containerID="cri-o://49f7887bb8bee694f2122a2edc1e9d86d6e28a93be04605113654c393dfd403e" gracePeriod=30
Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.439163 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"]
Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.439494 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v" podUID="c0cdbf97-982e-4ec5-876f-81c34f1233bd" containerName="route-controller-manager" containerID="cri-o://8b2b83c5135b4c86f50d23fecbfe18a99d36b7c57ddad61bcfc75ab45b8ce0c5" gracePeriod=30
Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.879834 4794 generic.go:334] "Generic (PLEG): container finished" podID="f42205af-8717-4d47-88a2-846e3cf4d48f" containerID="49f7887bb8bee694f2122a2edc1e9d86d6e28a93be04605113654c393dfd403e" exitCode=0
Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.879962 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn" event={"ID":"f42205af-8717-4d47-88a2-846e3cf4d48f","Type":"ContainerDied","Data":"49f7887bb8bee694f2122a2edc1e9d86d6e28a93be04605113654c393dfd403e"}
Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.882135 4794 generic.go:334] "Generic (PLEG): container finished" podID="c0cdbf97-982e-4ec5-876f-81c34f1233bd" containerID="8b2b83c5135b4c86f50d23fecbfe18a99d36b7c57ddad61bcfc75ab45b8ce0c5" exitCode=0
Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.882195 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v" event={"ID":"c0cdbf97-982e-4ec5-876f-81c34f1233bd","Type":"ContainerDied","Data":"8b2b83c5135b4c86f50d23fecbfe18a99d36b7c57ddad61bcfc75ab45b8ce0c5"}
Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.882232 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v" event={"ID":"c0cdbf97-982e-4ec5-876f-81c34f1233bd","Type":"ContainerDied","Data":"77e756f68ebfeeccec059b34a443d931d75dba505212b104e8e9f89adf3b3e5f"}
for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v" event={"ID":"c0cdbf97-982e-4ec5-876f-81c34f1233bd","Type":"ContainerDied","Data":"77e756f68ebfeeccec059b34a443d931d75dba505212b104e8e9f89adf3b3e5f"} Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.882253 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77e756f68ebfeeccec059b34a443d931d75dba505212b104e8e9f89adf3b3e5f" Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.912470 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v" Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.921405 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn" Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.979796 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-config\") pod \"f42205af-8717-4d47-88a2-846e3cf4d48f\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.979845 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-proxy-ca-bundles\") pod \"f42205af-8717-4d47-88a2-846e3cf4d48f\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.979884 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0cdbf97-982e-4ec5-876f-81c34f1233bd-client-ca\") pod \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.979899 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0cdbf97-982e-4ec5-876f-81c34f1233bd-config\") pod \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.979928 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0cdbf97-982e-4ec5-876f-81c34f1233bd-serving-cert\") pod \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.979946 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f42205af-8717-4d47-88a2-846e3cf4d48f-serving-cert\") pod \"f42205af-8717-4d47-88a2-846e3cf4d48f\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.979972 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-client-ca\") pod \"f42205af-8717-4d47-88a2-846e3cf4d48f\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.980016 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwlzr\" (UniqueName: 
\"kubernetes.io/projected/f42205af-8717-4d47-88a2-846e3cf4d48f-kube-api-access-vwlzr\") pod \"f42205af-8717-4d47-88a2-846e3cf4d48f\" (UID: \"f42205af-8717-4d47-88a2-846e3cf4d48f\") " Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.980037 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg4q9\" (UniqueName: \"kubernetes.io/projected/c0cdbf97-982e-4ec5-876f-81c34f1233bd-kube-api-access-qg4q9\") pod \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\" (UID: \"c0cdbf97-982e-4ec5-876f-81c34f1233bd\") " Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.980553 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0cdbf97-982e-4ec5-876f-81c34f1233bd-client-ca" (OuterVolumeSpecName: "client-ca") pod "c0cdbf97-982e-4ec5-876f-81c34f1233bd" (UID: "c0cdbf97-982e-4ec5-876f-81c34f1233bd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.980629 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0cdbf97-982e-4ec5-876f-81c34f1233bd-config" (OuterVolumeSpecName: "config") pod "c0cdbf97-982e-4ec5-876f-81c34f1233bd" (UID: "c0cdbf97-982e-4ec5-876f-81c34f1233bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.980998 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f42205af-8717-4d47-88a2-846e3cf4d48f" (UID: "f42205af-8717-4d47-88a2-846e3cf4d48f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.981103 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-client-ca" (OuterVolumeSpecName: "client-ca") pod "f42205af-8717-4d47-88a2-846e3cf4d48f" (UID: "f42205af-8717-4d47-88a2-846e3cf4d48f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.981097 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-config" (OuterVolumeSpecName: "config") pod "f42205af-8717-4d47-88a2-846e3cf4d48f" (UID: "f42205af-8717-4d47-88a2-846e3cf4d48f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.984158 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0cdbf97-982e-4ec5-876f-81c34f1233bd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c0cdbf97-982e-4ec5-876f-81c34f1233bd" (UID: "c0cdbf97-982e-4ec5-876f-81c34f1233bd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.984492 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0cdbf97-982e-4ec5-876f-81c34f1233bd-kube-api-access-qg4q9" (OuterVolumeSpecName: "kube-api-access-qg4q9") pod "c0cdbf97-982e-4ec5-876f-81c34f1233bd" (UID: "c0cdbf97-982e-4ec5-876f-81c34f1233bd"). InnerVolumeSpecName "kube-api-access-qg4q9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.984678 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f42205af-8717-4d47-88a2-846e3cf4d48f-kube-api-access-vwlzr" (OuterVolumeSpecName: "kube-api-access-vwlzr") pod "f42205af-8717-4d47-88a2-846e3cf4d48f" (UID: "f42205af-8717-4d47-88a2-846e3cf4d48f"). InnerVolumeSpecName "kube-api-access-vwlzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:49:20 crc kubenswrapper[4794]: I0310 09:49:20.985370 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f42205af-8717-4d47-88a2-846e3cf4d48f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f42205af-8717-4d47-88a2-846e3cf4d48f" (UID: "f42205af-8717-4d47-88a2-846e3cf4d48f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.081062 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0cdbf97-982e-4ec5-876f-81c34f1233bd-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.081109 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0cdbf97-982e-4ec5-876f-81c34f1233bd-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.081162 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0cdbf97-982e-4ec5-876f-81c34f1233bd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.081193 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f42205af-8717-4d47-88a2-846e3cf4d48f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.081213 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.081236 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwlzr\" (UniqueName: \"kubernetes.io/projected/f42205af-8717-4d47-88a2-846e3cf4d48f-kube-api-access-vwlzr\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.081257 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg4q9\" (UniqueName: \"kubernetes.io/projected/c0cdbf97-982e-4ec5-876f-81c34f1233bd-kube-api-access-qg4q9\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.081298 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.081325 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f42205af-8717-4d47-88a2-846e3cf4d48f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.640823 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b94f44856-jvg8z"] Mar 10 09:49:21 crc kubenswrapper[4794]: 
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.641139 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0cdbf97-982e-4ec5-876f-81c34f1233bd" containerName="route-controller-manager"
Mar 10 09:49:21 crc kubenswrapper[4794]: E0310 09:49:21.641157 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f42205af-8717-4d47-88a2-846e3cf4d48f" containerName="controller-manager"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.641170 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42205af-8717-4d47-88a2-846e3cf4d48f" containerName="controller-manager"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.641285 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0cdbf97-982e-4ec5-876f-81c34f1233bd" containerName="route-controller-manager"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.641301 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f42205af-8717-4d47-88a2-846e3cf4d48f" containerName="controller-manager"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.641718 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.645983 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"]
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.647033 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.654402 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b94f44856-jvg8z"]
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.656652 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"]
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.691848 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9071057f-d75a-4285-aa35-4ed998d69d0b-proxy-ca-bundles\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.692047 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9071057f-d75a-4285-aa35-4ed998d69d0b-config\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.692106 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/559b5de5-db19-431f-b0ef-acffda856134-config\") pod \"route-controller-manager-5c55dc6b44-bdtlg\" (UID: \"559b5de5-db19-431f-b0ef-acffda856134\") " pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.692151 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtv8k\" (UniqueName: \"kubernetes.io/projected/9071057f-d75a-4285-aa35-4ed998d69d0b-kube-api-access-xtv8k\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.692173 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9071057f-d75a-4285-aa35-4ed998d69d0b-client-ca\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.692188 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9071057f-d75a-4285-aa35-4ed998d69d0b-serving-cert\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.692285 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgfj8\" (UniqueName: \"kubernetes.io/projected/559b5de5-db19-431f-b0ef-acffda856134-kube-api-access-wgfj8\") pod \"route-controller-manager-5c55dc6b44-bdtlg\" (UID: \"559b5de5-db19-431f-b0ef-acffda856134\") " pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.692435 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/559b5de5-db19-431f-b0ef-acffda856134-client-ca\") pod \"route-controller-manager-5c55dc6b44-bdtlg\" (UID: \"559b5de5-db19-431f-b0ef-acffda856134\") " pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.692475 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/559b5de5-db19-431f-b0ef-acffda856134-serving-cert\") pod \"route-controller-manager-5c55dc6b44-bdtlg\" (UID: \"559b5de5-db19-431f-b0ef-acffda856134\") " pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.792877 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9071057f-d75a-4285-aa35-4ed998d69d0b-proxy-ca-bundles\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.792971 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9071057f-d75a-4285-aa35-4ed998d69d0b-config\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.793002 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/559b5de5-db19-431f-b0ef-acffda856134-config\") pod \"route-controller-manager-5c55dc6b44-bdtlg\" (UID: \"559b5de5-db19-431f-b0ef-acffda856134\") " pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.793042 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtv8k\" (UniqueName: \"kubernetes.io/projected/9071057f-d75a-4285-aa35-4ed998d69d0b-kube-api-access-xtv8k\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.793066 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9071057f-d75a-4285-aa35-4ed998d69d0b-client-ca\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.793086 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9071057f-d75a-4285-aa35-4ed998d69d0b-serving-cert\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.793120 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgfj8\" (UniqueName: \"kubernetes.io/projected/559b5de5-db19-431f-b0ef-acffda856134-kube-api-access-wgfj8\") pod \"route-controller-manager-5c55dc6b44-bdtlg\" (UID: \"559b5de5-db19-431f-b0ef-acffda856134\") " pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.793152 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/559b5de5-db19-431f-b0ef-acffda856134-client-ca\") pod \"route-controller-manager-5c55dc6b44-bdtlg\" (UID: \"559b5de5-db19-431f-b0ef-acffda856134\") " pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.793212 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/559b5de5-db19-431f-b0ef-acffda856134-serving-cert\") pod \"route-controller-manager-5c55dc6b44-bdtlg\" (UID: \"559b5de5-db19-431f-b0ef-acffda856134\") " pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.793928 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9071057f-d75a-4285-aa35-4ed998d69d0b-client-ca\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.794081 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9071057f-d75a-4285-aa35-4ed998d69d0b-config\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.794473 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9071057f-d75a-4285-aa35-4ed998d69d0b-proxy-ca-bundles\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.794752 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/559b5de5-db19-431f-b0ef-acffda856134-config\") pod \"route-controller-manager-5c55dc6b44-bdtlg\" (UID: \"559b5de5-db19-431f-b0ef-acffda856134\") " pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.794786 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/559b5de5-db19-431f-b0ef-acffda856134-client-ca\") pod \"route-controller-manager-5c55dc6b44-bdtlg\" (UID: \"559b5de5-db19-431f-b0ef-acffda856134\") " pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.799426 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/559b5de5-db19-431f-b0ef-acffda856134-serving-cert\") pod \"route-controller-manager-5c55dc6b44-bdtlg\" (UID: \"559b5de5-db19-431f-b0ef-acffda856134\") " pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.803289 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9071057f-d75a-4285-aa35-4ed998d69d0b-serving-cert\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.814206 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgfj8\" (UniqueName: \"kubernetes.io/projected/559b5de5-db19-431f-b0ef-acffda856134-kube-api-access-wgfj8\") pod \"route-controller-manager-5c55dc6b44-bdtlg\" (UID: \"559b5de5-db19-431f-b0ef-acffda856134\") " pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.816469 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtv8k\" (UniqueName: \"kubernetes.io/projected/9071057f-d75a-4285-aa35-4ed998d69d0b-kube-api-access-xtv8k\") pod \"controller-manager-b94f44856-jvg8z\" (UID: \"9071057f-d75a-4285-aa35-4ed998d69d0b\") " pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z"
Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.892366 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v" Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.892410 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn" event={"ID":"f42205af-8717-4d47-88a2-846e3cf4d48f","Type":"ContainerDied","Data":"2e22afaeef82f3512a11753f05a9c85ba137dd8dd208dad23e3fcfc10db1f871"} Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.892483 4794 scope.go:117] "RemoveContainer" containerID="49f7887bb8bee694f2122a2edc1e9d86d6e28a93be04605113654c393dfd403e" Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.892386 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fc4875744-s7tsn" Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.943501 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"] Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.949462 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5cd6dbf6-8fz7v"] Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.954906 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-fc4875744-s7tsn"] Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.958236 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-fc4875744-s7tsn"] Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.973541 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z" Mar 10 09:49:21 crc kubenswrapper[4794]: I0310 09:49:21.987779 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg" Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.014234 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0cdbf97-982e-4ec5-876f-81c34f1233bd" path="/var/lib/kubelet/pods/c0cdbf97-982e-4ec5-876f-81c34f1233bd/volumes" Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.014939 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f42205af-8717-4d47-88a2-846e3cf4d48f" path="/var/lib/kubelet/pods/f42205af-8717-4d47-88a2-846e3cf4d48f/volumes" Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.256747 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg"] Mar 10 09:49:22 crc kubenswrapper[4794]: W0310 09:49:22.263779 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod559b5de5_db19_431f_b0ef_acffda856134.slice/crio-6910c2d614f112f757337289a81c0c6b31703329e164071f1595e5305813fe5e WatchSource:0}: Error finding container 6910c2d614f112f757337289a81c0c6b31703329e164071f1595e5305813fe5e: Status 404 returned error can't find the container with id 6910c2d614f112f757337289a81c0c6b31703329e164071f1595e5305813fe5e Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.421558 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b94f44856-jvg8z"] Mar 10 09:49:22 crc kubenswrapper[4794]: W0310 09:49:22.429091 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9071057f_d75a_4285_aa35_4ed998d69d0b.slice/crio-c512ae2fd78ff46671d2d5eeb1d1c9f6337bfa0b22fc850a204da82c2fd1ae06 WatchSource:0}: Error finding container c512ae2fd78ff46671d2d5eeb1d1c9f6337bfa0b22fc850a204da82c2fd1ae06: Status 404 returned error can't find the container with id c512ae2fd78ff46671d2d5eeb1d1c9f6337bfa0b22fc850a204da82c2fd1ae06 Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.898897 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg" event={"ID":"559b5de5-db19-431f-b0ef-acffda856134","Type":"ContainerStarted","Data":"70593332a3b31811b527e4ae6b47cda7bc15cf108a1c6d0dbc79368d904c8a37"} Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.898934 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg" event={"ID":"559b5de5-db19-431f-b0ef-acffda856134","Type":"ContainerStarted","Data":"6910c2d614f112f757337289a81c0c6b31703329e164071f1595e5305813fe5e"} Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.899078 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg" Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.900913 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z" event={"ID":"9071057f-d75a-4285-aa35-4ed998d69d0b","Type":"ContainerStarted","Data":"7f25e45dbd5f0cbe9fb3dfefce31b31d7bf5873a1e22b50c773168aca73cf2e9"} Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.900939 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z" 
event={"ID":"9071057f-d75a-4285-aa35-4ed998d69d0b","Type":"ContainerStarted","Data":"c512ae2fd78ff46671d2d5eeb1d1c9f6337bfa0b22fc850a204da82c2fd1ae06"} Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.901177 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z" Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.903152 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg" Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.906415 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z" Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.917106 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c55dc6b44-bdtlg" podStartSLOduration=2.917093357 podStartE2EDuration="2.917093357s" podCreationTimestamp="2026-03-10 09:49:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:49:22.914175233 +0000 UTC m=+311.670346051" watchObservedRunningTime="2026-03-10 09:49:22.917093357 +0000 UTC m=+311.673264175" Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.931284 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b94f44856-jvg8z" podStartSLOduration=2.93127143 podStartE2EDuration="2.93127143s" podCreationTimestamp="2026-03-10 09:49:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:49:22.929976064 +0000 UTC m=+311.686146882" watchObservedRunningTime="2026-03-10 09:49:22.93127143 +0000 UTC m=+311.687442238" Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.968104 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.968171 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.968228 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.968896 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:49:22 crc kubenswrapper[4794]: I0310 09:49:22.968970 4794 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142" gracePeriod=600 Mar 10 09:49:23 crc kubenswrapper[4794]: I0310 09:49:23.909757 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142" exitCode=0 Mar 10 09:49:23 crc kubenswrapper[4794]: I0310 09:49:23.909857 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142"} Mar 10 09:49:23 crc kubenswrapper[4794]: I0310 09:49:23.911958 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"8bd93f36183b19386ef399a63319db0f77fccc38f2efa4fc0d1b62c277727e21"} Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.811209 4794 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.811874 4794 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.811980 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.812139 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd" gracePeriod=15 Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.812203 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5" gracePeriod=15 Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.812218 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2" gracePeriod=15 Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.812226 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822" gracePeriod=15 Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.812246 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" 
containerID="cri-o://de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7" gracePeriod=15 Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814071 4794 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 09:49:24 crc kubenswrapper[4794]: E0310 09:49:24.814244 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814257 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: E0310 09:49:24.814265 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814272 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 09:49:24 crc kubenswrapper[4794]: E0310 09:49:24.814285 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814292 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 09:49:24 crc kubenswrapper[4794]: E0310 09:49:24.814303 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814310 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: E0310 09:49:24.814320 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814326 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 09:49:24 crc kubenswrapper[4794]: E0310 09:49:24.814431 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814439 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: E0310 09:49:24.814445 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814453 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 09:49:24 crc kubenswrapper[4794]: E0310 09:49:24.814463 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814470 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: E0310 09:49:24.814478 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814483 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814584 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814595 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814603 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814610 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814619 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814625 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814635 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 09:49:24 crc kubenswrapper[4794]: E0310 09:49:24.814745 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814762 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814878 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.814890 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.829384 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.829662 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") 
" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.829707 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.829744 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.829763 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.829779 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.829794 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.829813 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.850170 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.930652 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.930708 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.930758 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.930818 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.930844 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.930861 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.930878 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.930902 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.930968 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.931436 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.932267 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.932312 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.932396 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.932398 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.932432 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:24 crc kubenswrapper[4794]: I0310 09:49:24.932459 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.146814 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:49:25 crc kubenswrapper[4794]: E0310 09:49:25.177936 4794 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b71f29fc9b9e9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:49:25.177244137 +0000 UTC m=+313.933414975,LastTimestamp:2026-03-10 09:49:25.177244137 +0000 UTC m=+313.933414975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.939848 4794 generic.go:334] "Generic (PLEG): container finished" podID="8be88dc0-2383-4590-8731-7c19146fcd2b" containerID="7838560755eca50c1338f0a51378c57ab2950728571720689d1a07e386bf0d5e" exitCode=0 Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.939984 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8be88dc0-2383-4590-8731-7c19146fcd2b","Type":"ContainerDied","Data":"7838560755eca50c1338f0a51378c57ab2950728571720689d1a07e386bf0d5e"} Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.940983 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.941433 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.942548 4794 status_manager.go:851] "Failed to get status for pod" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.946794 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.948491 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 09:49:25 
crc kubenswrapper[4794]: I0310 09:49:25.949579 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822" exitCode=0 Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.949660 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5" exitCode=0 Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.949721 4794 scope.go:117] "RemoveContainer" containerID="bd3d37caeb172a681855b5504cd1387cb3798cff0c9ae2c67f9a57207d86f7fe" Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.949736 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2" exitCode=0 Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.949919 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7" exitCode=2 Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.952869 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d788d4a5003d3248803f7857ee8143040c3ba1fcf0389703ac3b7652edacabf8"} Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.952939 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4284be8185e19d90ce47bcff4eff9be86563ee1c8f23ace60081c7498a5bea13"} Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.953838 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.954312 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:25 crc kubenswrapper[4794]: I0310 09:49:25.954954 4794 status_manager.go:851] "Failed to get status for pod" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:26 crc kubenswrapper[4794]: I0310 09:49:26.985625 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.182119 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 
09:49:27.183019 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.184679 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.185829 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.186088 4794 status_manager.go:851] "Failed to get status for pod" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.259524 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.259581 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.259656 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.259688 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.259700 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.259884 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.259915 4794 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.259927 4794 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.307727 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.308554 4794 status_manager.go:851] "Failed to get status for pod" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.308799 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.309061 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.361046 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8be88dc0-2383-4590-8731-7c19146fcd2b-kubelet-dir\") pod \"8be88dc0-2383-4590-8731-7c19146fcd2b\" (UID: \"8be88dc0-2383-4590-8731-7c19146fcd2b\") " Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.361114 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8be88dc0-2383-4590-8731-7c19146fcd2b-var-lock\") pod \"8be88dc0-2383-4590-8731-7c19146fcd2b\" (UID: \"8be88dc0-2383-4590-8731-7c19146fcd2b\") " Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.361158 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8be88dc0-2383-4590-8731-7c19146fcd2b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8be88dc0-2383-4590-8731-7c19146fcd2b" (UID: "8be88dc0-2383-4590-8731-7c19146fcd2b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.361217 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8be88dc0-2383-4590-8731-7c19146fcd2b-var-lock" (OuterVolumeSpecName: "var-lock") pod "8be88dc0-2383-4590-8731-7c19146fcd2b" (UID: "8be88dc0-2383-4590-8731-7c19146fcd2b"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.361251 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8be88dc0-2383-4590-8731-7c19146fcd2b-kube-api-access\") pod \"8be88dc0-2383-4590-8731-7c19146fcd2b\" (UID: \"8be88dc0-2383-4590-8731-7c19146fcd2b\") " Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.361675 4794 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8be88dc0-2383-4590-8731-7c19146fcd2b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.361709 4794 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8be88dc0-2383-4590-8731-7c19146fcd2b-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.361726 4794 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.366761 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be88dc0-2383-4590-8731-7c19146fcd2b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8be88dc0-2383-4590-8731-7c19146fcd2b" (UID: "8be88dc0-2383-4590-8731-7c19146fcd2b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.462875 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8be88dc0-2383-4590-8731-7c19146fcd2b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:27 crc kubenswrapper[4794]: I0310 09:49:27.946087 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" podUID="d56da28c-c09d-4fff-b73e-c3b5c787c300" containerName="oauth-openshift" containerID="cri-o://a2f0ca38cf8561e292249a48d1b06ca578f4f8715cafb1c749fea0b2be397ce6" gracePeriod=15 Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.001932 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.002917 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd" exitCode=0 Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.003087 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.006869 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.008419 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.009784 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8be88dc0-2383-4590-8731-7c19146fcd2b","Type":"ContainerDied","Data":"06257be372456d311406456cf45b1816fa4e85174e9491c6c04661749041b5c3"} Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.009812 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06257be372456d311406456cf45b1816fa4e85174e9491c6c04661749041b5c3" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.009854 4794 scope.go:117] "RemoveContainer" containerID="8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.030405 4794 status_manager.go:851] "Failed to get status for pod" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.030742 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.031254 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.031913 4794 status_manager.go:851] "Failed to get status for pod" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.032250 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.032655 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.032963 4794 scope.go:117] "RemoveContainer" 
containerID="6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.049658 4794 scope.go:117] "RemoveContainer" containerID="27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.082881 4794 scope.go:117] "RemoveContainer" containerID="de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.099975 4794 scope.go:117] "RemoveContainer" containerID="c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.128627 4794 scope.go:117] "RemoveContainer" containerID="712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.149545 4794 scope.go:117] "RemoveContainer" containerID="8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822" Mar 10 09:49:28 crc kubenswrapper[4794]: E0310 09:49:28.152477 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\": container with ID starting with 8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822 not found: ID does not exist" containerID="8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.152535 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822"} err="failed to get container status \"8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\": rpc error: code = NotFound desc = could not find container \"8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822\": container with ID starting with 8f9a5ac235816afb81c6091da3019a5d8be1e29ba1de74f30c6c28ace091f822 not found: ID does not exist" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.152561 4794 scope.go:117] "RemoveContainer" containerID="6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5" Mar 10 09:49:28 crc kubenswrapper[4794]: E0310 09:49:28.152907 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\": container with ID starting with 6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5 not found: ID does not exist" containerID="6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.152926 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5"} err="failed to get container status \"6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\": rpc error: code = NotFound desc = could not find container \"6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5\": container with ID starting with 6a3be3a71e2af20be50ffcbd448d5f28e73bafe2d36819ec5476a260b6f5a8c5 not found: ID does not exist" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.152940 4794 scope.go:117] "RemoveContainer" containerID="27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2" Mar 10 09:49:28 crc kubenswrapper[4794]: E0310 09:49:28.153159 4794 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\": container with ID starting with 27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2 not found: ID does not exist" containerID="27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.153178 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2"} err="failed to get container status \"27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\": rpc error: code = NotFound desc = could not find container \"27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2\": container with ID starting with 27049a35d23ac6ffa5cb459e4ea2c5c55a7b1731c9edff0941d452442cc73af2 not found: ID does not exist" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.153193 4794 scope.go:117] "RemoveContainer" containerID="de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7" Mar 10 09:49:28 crc kubenswrapper[4794]: E0310 09:49:28.153457 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\": container with ID starting with de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7 not found: ID does not exist" containerID="de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.153478 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7"} err="failed to get container status \"de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\": rpc error: code = NotFound desc = could not find container \"de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7\": container with ID starting with de2c346c49fbd94c4328853bed3ffd94ba873a423ff0d99a6ace6fc832bd4da7 not found: ID does not exist" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.153492 4794 scope.go:117] "RemoveContainer" containerID="c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd" Mar 10 09:49:28 crc kubenswrapper[4794]: E0310 09:49:28.153686 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\": container with ID starting with c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd not found: ID does not exist" containerID="c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.153703 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd"} err="failed to get container status \"c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\": rpc error: code = NotFound desc = could not find container \"c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd\": container with ID starting with c7c4ede630df1f139ad8abbc2d1d1e2f0f6cf56a2c29902f6b6879df004834cd not found: ID does not exist" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.153716 4794 scope.go:117] "RemoveContainer" 
containerID="712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f" Mar 10 09:49:28 crc kubenswrapper[4794]: E0310 09:49:28.153907 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\": container with ID starting with 712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f not found: ID does not exist" containerID="712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.153926 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f"} err="failed to get container status \"712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\": rpc error: code = NotFound desc = could not find container \"712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f\": container with ID starting with 712665fa5ac90b706c3ce8eb211703686c18a26f136962df26e3ad7c4396c72f not found: ID does not exist" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.396167 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.396553 4794 status_manager.go:851] "Failed to get status for pod" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.396849 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.397102 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.397377 4794 status_manager.go:851] "Failed to get status for pod" podUID="d56da28c-c09d-4fff-b73e-c3b5c787c300" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-n94r9\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.474563 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-service-ca\") pod \"d56da28c-c09d-4fff-b73e-c3b5c787c300\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.474673 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-cliconfig\") pod \"d56da28c-c09d-4fff-b73e-c3b5c787c300\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.474704 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-error\") pod \"d56da28c-c09d-4fff-b73e-c3b5c787c300\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.474732 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-idp-0-file-data\") pod \"d56da28c-c09d-4fff-b73e-c3b5c787c300\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.474762 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-trusted-ca-bundle\") pod \"d56da28c-c09d-4fff-b73e-c3b5c787c300\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.474794 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d56da28c-c09d-4fff-b73e-c3b5c787c300-audit-dir\") pod \"d56da28c-c09d-4fff-b73e-c3b5c787c300\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.474826 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-router-certs\") pod \"d56da28c-c09d-4fff-b73e-c3b5c787c300\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.474860 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh9rx\" (UniqueName: \"kubernetes.io/projected/d56da28c-c09d-4fff-b73e-c3b5c787c300-kube-api-access-rh9rx\") pod \"d56da28c-c09d-4fff-b73e-c3b5c787c300\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.474906 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-audit-policies\") pod \"d56da28c-c09d-4fff-b73e-c3b5c787c300\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.474932 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-session\") pod \"d56da28c-c09d-4fff-b73e-c3b5c787c300\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.474957 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-serving-cert\") pod \"d56da28c-c09d-4fff-b73e-c3b5c787c300\" (UID: 
\"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.474979 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-provider-selection\") pod \"d56da28c-c09d-4fff-b73e-c3b5c787c300\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.475007 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-login\") pod \"d56da28c-c09d-4fff-b73e-c3b5c787c300\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.475049 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-ocp-branding-template\") pod \"d56da28c-c09d-4fff-b73e-c3b5c787c300\" (UID: \"d56da28c-c09d-4fff-b73e-c3b5c787c300\") " Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.475922 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56da28c-c09d-4fff-b73e-c3b5c787c300-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d56da28c-c09d-4fff-b73e-c3b5c787c300" (UID: "d56da28c-c09d-4fff-b73e-c3b5c787c300"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.476381 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d56da28c-c09d-4fff-b73e-c3b5c787c300" (UID: "d56da28c-c09d-4fff-b73e-c3b5c787c300"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.476463 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d56da28c-c09d-4fff-b73e-c3b5c787c300" (UID: "d56da28c-c09d-4fff-b73e-c3b5c787c300"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.476929 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d56da28c-c09d-4fff-b73e-c3b5c787c300" (UID: "d56da28c-c09d-4fff-b73e-c3b5c787c300"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.477384 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d56da28c-c09d-4fff-b73e-c3b5c787c300" (UID: "d56da28c-c09d-4fff-b73e-c3b5c787c300"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.483516 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d56da28c-c09d-4fff-b73e-c3b5c787c300" (UID: "d56da28c-c09d-4fff-b73e-c3b5c787c300"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.483854 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d56da28c-c09d-4fff-b73e-c3b5c787c300" (UID: "d56da28c-c09d-4fff-b73e-c3b5c787c300"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.485861 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d56da28c-c09d-4fff-b73e-c3b5c787c300" (UID: "d56da28c-c09d-4fff-b73e-c3b5c787c300"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.487513 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d56da28c-c09d-4fff-b73e-c3b5c787c300" (UID: "d56da28c-c09d-4fff-b73e-c3b5c787c300"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.487792 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56da28c-c09d-4fff-b73e-c3b5c787c300-kube-api-access-rh9rx" (OuterVolumeSpecName: "kube-api-access-rh9rx") pod "d56da28c-c09d-4fff-b73e-c3b5c787c300" (UID: "d56da28c-c09d-4fff-b73e-c3b5c787c300"). InnerVolumeSpecName "kube-api-access-rh9rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.488536 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d56da28c-c09d-4fff-b73e-c3b5c787c300" (UID: "d56da28c-c09d-4fff-b73e-c3b5c787c300"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.489709 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d56da28c-c09d-4fff-b73e-c3b5c787c300" (UID: "d56da28c-c09d-4fff-b73e-c3b5c787c300"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.491145 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d56da28c-c09d-4fff-b73e-c3b5c787c300" (UID: "d56da28c-c09d-4fff-b73e-c3b5c787c300"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.491367 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d56da28c-c09d-4fff-b73e-c3b5c787c300" (UID: "d56da28c-c09d-4fff-b73e-c3b5c787c300"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.576204 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh9rx\" (UniqueName: \"kubernetes.io/projected/d56da28c-c09d-4fff-b73e-c3b5c787c300-kube-api-access-rh9rx\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.576262 4794 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.576278 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.576290 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.576306 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.576322 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.576357 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.576377 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.576389 4794 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.576403 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.576414 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.576428 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.576440 4794 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d56da28c-c09d-4fff-b73e-c3b5c787c300-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:28 crc kubenswrapper[4794]: I0310 09:49:28.576452 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d56da28c-c09d-4fff-b73e-c3b5c787c300-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.016651 4794 generic.go:334] "Generic (PLEG): container finished" podID="d56da28c-c09d-4fff-b73e-c3b5c787c300" containerID="a2f0ca38cf8561e292249a48d1b06ca578f4f8715cafb1c749fea0b2be397ce6" exitCode=0 Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.016702 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" event={"ID":"d56da28c-c09d-4fff-b73e-c3b5c787c300","Type":"ContainerDied","Data":"a2f0ca38cf8561e292249a48d1b06ca578f4f8715cafb1c749fea0b2be397ce6"} Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.016732 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.017823 4794 scope.go:117] "RemoveContainer" containerID="a2f0ca38cf8561e292249a48d1b06ca578f4f8715cafb1c749fea0b2be397ce6" Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.017799 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" event={"ID":"d56da28c-c09d-4fff-b73e-c3b5c787c300","Type":"ContainerDied","Data":"51b0cd56f9858210411a8a4e7888d9ea1eb9d991527e4543e17eea8f744550b6"} Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.018507 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.018745 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.019022 4794 status_manager.go:851] "Failed to get status for pod" podUID="d56da28c-c09d-4fff-b73e-c3b5c787c300" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-n94r9\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.019326 4794 status_manager.go:851] "Failed to get status for pod" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.030353 4794 status_manager.go:851] "Failed to get status for pod" podUID="d56da28c-c09d-4fff-b73e-c3b5c787c300" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-n94r9\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.030734 4794 status_manager.go:851] "Failed to get status for pod" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.031021 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.031258 4794 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.034019 4794 scope.go:117] "RemoveContainer" containerID="a2f0ca38cf8561e292249a48d1b06ca578f4f8715cafb1c749fea0b2be397ce6" Mar 10 09:49:29 crc kubenswrapper[4794]: E0310 09:49:29.034595 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2f0ca38cf8561e292249a48d1b06ca578f4f8715cafb1c749fea0b2be397ce6\": container with ID starting with a2f0ca38cf8561e292249a48d1b06ca578f4f8715cafb1c749fea0b2be397ce6 not found: ID does not exist" containerID="a2f0ca38cf8561e292249a48d1b06ca578f4f8715cafb1c749fea0b2be397ce6" Mar 10 09:49:29 crc kubenswrapper[4794]: I0310 09:49:29.034629 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2f0ca38cf8561e292249a48d1b06ca578f4f8715cafb1c749fea0b2be397ce6"} err="failed to get container status \"a2f0ca38cf8561e292249a48d1b06ca578f4f8715cafb1c749fea0b2be397ce6\": rpc error: code = NotFound desc = could not find container \"a2f0ca38cf8561e292249a48d1b06ca578f4f8715cafb1c749fea0b2be397ce6\": container with ID starting with a2f0ca38cf8561e292249a48d1b06ca578f4f8715cafb1c749fea0b2be397ce6 not found: ID does not exist" Mar 10 09:49:32 crc kubenswrapper[4794]: I0310 09:49:32.003641 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:32 crc kubenswrapper[4794]: I0310 09:49:32.004385 4794 status_manager.go:851] "Failed to get status for pod" podUID="d56da28c-c09d-4fff-b73e-c3b5c787c300" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-n94r9\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:32 crc kubenswrapper[4794]: I0310 09:49:32.004914 4794 status_manager.go:851] "Failed to get status for pod" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:32 crc kubenswrapper[4794]: E0310 09:49:32.571319 4794 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:32 crc kubenswrapper[4794]: E0310 09:49:32.572220 4794 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:32 crc kubenswrapper[4794]: E0310 09:49:32.572871 4794 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.65:6443: connect: connection refused" Mar 10 09:49:32 crc kubenswrapper[4794]: E0310 09:49:32.573188 4794 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:32 crc kubenswrapper[4794]: E0310 09:49:32.573639 4794 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:32 crc kubenswrapper[4794]: I0310 09:49:32.573670 4794 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 10 09:49:32 crc kubenswrapper[4794]: E0310 09:49:32.573880 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="200ms" Mar 10 09:49:32 crc kubenswrapper[4794]: E0310 09:49:32.775283 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="400ms" Mar 10 09:49:33 crc kubenswrapper[4794]: E0310 09:49:33.177184 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="800ms" Mar 10 09:49:33 crc kubenswrapper[4794]: E0310 09:49:33.688544 4794 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b71f29fc9b9e9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:49:25.177244137 +0000 UTC m=+313.933414975,LastTimestamp:2026-03-10 09:49:25.177244137 +0000 UTC m=+313.933414975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:49:33 crc kubenswrapper[4794]: E0310 09:49:33.979396 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="1.6s" Mar 10 09:49:35 crc kubenswrapper[4794]: E0310 09:49:35.580434 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="3.2s" Mar 10 09:49:38 crc kubenswrapper[4794]: E0310 09:49:38.781956 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="6.4s" Mar 10 09:49:39 crc kubenswrapper[4794]: I0310 09:49:39.090478 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 10 09:49:39 crc kubenswrapper[4794]: I0310 09:49:39.091945 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 09:49:39 crc kubenswrapper[4794]: I0310 09:49:39.091988 4794 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c2e8a850bc5db9bf7a7e73a9a71933744f371c73ceba41cd04d0b175614309ef" exitCode=1 Mar 10 09:49:39 crc kubenswrapper[4794]: I0310 09:49:39.092019 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c2e8a850bc5db9bf7a7e73a9a71933744f371c73ceba41cd04d0b175614309ef"} Mar 10 09:49:39 crc kubenswrapper[4794]: I0310 09:49:39.092466 4794 scope.go:117] "RemoveContainer" containerID="c2e8a850bc5db9bf7a7e73a9a71933744f371c73ceba41cd04d0b175614309ef" Mar 10 09:49:39 crc kubenswrapper[4794]: I0310 09:49:39.092815 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:39 crc kubenswrapper[4794]: I0310 09:49:39.093409 4794 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:39 crc kubenswrapper[4794]: I0310 09:49:39.093846 4794 status_manager.go:851] "Failed to get status for pod" podUID="d56da28c-c09d-4fff-b73e-c3b5c787c300" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-n94r9\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:39 crc kubenswrapper[4794]: I0310 09:49:39.094140 4794 status_manager.go:851] "Failed to get status for pod" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:39 crc kubenswrapper[4794]: I0310 09:49:39.999112 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.000621 4794 status_manager.go:851] "Failed to get status for pod" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.001429 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.001889 4794 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.002433 4794 status_manager.go:851] "Failed to get status for pod" podUID="d56da28c-c09d-4fff-b73e-c3b5c787c300" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-n94r9\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.021268 4794 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c04f5dfc-8cde-406b-9e01-2e529e0c0f31" Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.021311 4794 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c04f5dfc-8cde-406b-9e01-2e529e0c0f31" Mar 10 09:49:40 crc kubenswrapper[4794]: E0310 09:49:40.021853 4794 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.022442 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:40 crc kubenswrapper[4794]: E0310 09:49:40.027380 4794 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.65:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" volumeName="registry-storage" Mar 10 09:49:40 crc kubenswrapper[4794]: W0310 09:49:40.051837 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-9df8352e757898efd57af4fa298e5d298a486ed71d77998c5eaae8976b4f29f7 WatchSource:0}: Error finding container 9df8352e757898efd57af4fa298e5d298a486ed71d77998c5eaae8976b4f29f7: Status 404 returned error can't find the container with id 9df8352e757898efd57af4fa298e5d298a486ed71d77998c5eaae8976b4f29f7 Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.099920 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9df8352e757898efd57af4fa298e5d298a486ed71d77998c5eaae8976b4f29f7"} Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.103847 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.105185 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.105245 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"96f2ad13681bacf09475a55836ed3e46665b3cd92091c086f3f315efbbfbe9dc"} Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.105991 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.106276 4794 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.106704 4794 status_manager.go:851] "Failed to get status for pod" podUID="d56da28c-c09d-4fff-b73e-c3b5c787c300" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-n94r9\": dial tcp 38.102.83.65:6443: connect: connection 
refused" Mar 10 09:49:40 crc kubenswrapper[4794]: I0310 09:49:40.107007 4794 status_manager.go:851] "Failed to get status for pod" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:41 crc kubenswrapper[4794]: I0310 09:49:41.114426 4794 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c9cf0512b5fc0f060d73f7742c0fc316ec296d9326edc543c3cc1f6ee81e1e49" exitCode=0 Mar 10 09:49:41 crc kubenswrapper[4794]: I0310 09:49:41.114550 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c9cf0512b5fc0f060d73f7742c0fc316ec296d9326edc543c3cc1f6ee81e1e49"} Mar 10 09:49:41 crc kubenswrapper[4794]: I0310 09:49:41.114684 4794 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c04f5dfc-8cde-406b-9e01-2e529e0c0f31" Mar 10 09:49:41 crc kubenswrapper[4794]: I0310 09:49:41.114907 4794 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c04f5dfc-8cde-406b-9e01-2e529e0c0f31" Mar 10 09:49:41 crc kubenswrapper[4794]: I0310 09:49:41.115579 4794 status_manager.go:851] "Failed to get status for pod" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:41 crc kubenswrapper[4794]: E0310 09:49:41.115588 4794 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:41 crc kubenswrapper[4794]: I0310 09:49:41.115933 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:41 crc kubenswrapper[4794]: I0310 09:49:41.116496 4794 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:41 crc kubenswrapper[4794]: I0310 09:49:41.116995 4794 status_manager.go:851] "Failed to get status for pod" podUID="d56da28c-c09d-4fff-b73e-c3b5c787c300" pod="openshift-authentication/oauth-openshift-558db77b4-n94r9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-n94r9\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 10 09:49:41 crc kubenswrapper[4794]: I0310 09:49:41.429627 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Mar 10 09:49:42 crc kubenswrapper[4794]: I0310 09:49:42.126547 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"460219efd19232d3be013c44d52547a2d9fc76d4587e7c1c2164aecde555dcfd"} Mar 10 09:49:42 crc kubenswrapper[4794]: I0310 09:49:42.127004 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f8ab32c7fd7e6b28234ff5448bd59919b5de5058aa68f5ca23d105d16e44fa0b"} Mar 10 09:49:42 crc kubenswrapper[4794]: I0310 09:49:42.127025 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f013965cc1f9bc2237e44fd859c8f6edc4cd23409f24b07b2c4dc72ba8024a82"} Mar 10 09:49:42 crc kubenswrapper[4794]: I0310 09:49:42.861775 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:49:42 crc kubenswrapper[4794]: I0310 09:49:42.861928 4794 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 10 09:49:42 crc kubenswrapper[4794]: I0310 09:49:42.861999 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 10 09:49:43 crc kubenswrapper[4794]: I0310 09:49:43.137146 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cc3fbec7b2828fd9143ac1018e26587e0c37a82cb91f1c603577d96f2592bfdb"} Mar 10 09:49:43 crc kubenswrapper[4794]: I0310 09:49:43.137416 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1abd2e6b782e8fd8ff034898e74931bf8406b66a74ec4cd978d1beff549fb917"} Mar 10 09:49:43 crc kubenswrapper[4794]: I0310 09:49:43.137605 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:43 crc kubenswrapper[4794]: I0310 09:49:43.137722 4794 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c04f5dfc-8cde-406b-9e01-2e529e0c0f31" Mar 10 09:49:43 crc kubenswrapper[4794]: I0310 09:49:43.137768 4794 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c04f5dfc-8cde-406b-9e01-2e529e0c0f31" Mar 10 09:49:45 crc kubenswrapper[4794]: I0310 09:49:45.022938 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:45 crc kubenswrapper[4794]: I0310 09:49:45.023349 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:49:45 crc 
kubenswrapper[4794]: I0310 09:49:45.030648 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:49:45 crc kubenswrapper[4794]: I0310 09:49:45.912010 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:49:45 crc kubenswrapper[4794]: I0310 09:49:45.912525 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:49:45 crc kubenswrapper[4794]: I0310 09:49:45.912594 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:49:45 crc kubenswrapper[4794]: I0310 09:49:45.912633 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:49:45 crc kubenswrapper[4794]: I0310 09:49:45.915731 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 10 09:49:45 crc kubenswrapper[4794]: I0310 09:49:45.915821 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 10 09:49:45 crc kubenswrapper[4794]: I0310 09:49:45.915990 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 10 09:49:45 crc kubenswrapper[4794]: I0310 09:49:45.924409 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:49:45 crc kubenswrapper[4794]: I0310 09:49:45.925006 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 10 09:49:45 crc kubenswrapper[4794]: I0310 09:49:45.930531 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:49:45 crc kubenswrapper[4794]: I0310 09:49:45.938874 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:49:45 crc kubenswrapper[4794]: I0310 09:49:45.943376 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:49:46 crc kubenswrapper[4794]: I0310 09:49:46.118168 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:49:46 crc kubenswrapper[4794]: I0310 09:49:46.137510 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:49:46 crc kubenswrapper[4794]: I0310 09:49:46.146805 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:49:46 crc kubenswrapper[4794]: W0310 09:49:46.589248 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-bc957a5e388827aeb56645f5d4c8ae72d7a32eabc1b888fcd8e2c399e5ef09a0 WatchSource:0}: Error finding container bc957a5e388827aeb56645f5d4c8ae72d7a32eabc1b888fcd8e2c399e5ef09a0: Status 404 returned error can't find the container with id bc957a5e388827aeb56645f5d4c8ae72d7a32eabc1b888fcd8e2c399e5ef09a0
Mar 10 09:49:47 crc kubenswrapper[4794]: I0310 09:49:47.163997 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ac051e6a34c0a7a0143b79b91ee488f5f9aa34599248609b8f54d7f0d909fac4"}
Mar 10 09:49:47 crc kubenswrapper[4794]: I0310 09:49:47.164482 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"bc957a5e388827aeb56645f5d4c8ae72d7a32eabc1b888fcd8e2c399e5ef09a0"}
Mar 10 09:49:47 crc kubenswrapper[4794]: I0310 09:49:47.164728 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:49:47 crc kubenswrapper[4794]: I0310 09:49:47.166784 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"cecc8adfb49287164394828c04e66a7126a748c2924e127a2b2bbacd221da5fb"}
Mar 10 09:49:47 crc kubenswrapper[4794]: I0310 09:49:47.167029 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7789519b2b3fe0996d5baa980ccfe9c580b430c7e1148950b8fe358fff09e8bd"}
Mar 10 09:49:47 crc kubenswrapper[4794]: I0310 09:49:47.169719 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8488bae8d46d9f94a5c7aeaace127e1906d1a8209a2984b53460c070e91a88da"}
Mar 10 09:49:47 crc kubenswrapper[4794]: I0310 09:49:47.169769 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"240fe186965e33874f084f8cf0b3bbba96a9ba00739c107e5720ef48cc3026ab"}
Mar 10 09:49:48 crc kubenswrapper[4794]: I0310 09:49:48.145759 4794 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:49:48 crc kubenswrapper[4794]: I0310 09:49:48.176247 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log"
Mar 10 09:49:48 crc kubenswrapper[4794]: I0310 09:49:48.176306 4794 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="8488bae8d46d9f94a5c7aeaace127e1906d1a8209a2984b53460c070e91a88da" exitCode=255
Mar 10 09:49:48 crc kubenswrapper[4794]: I0310 09:49:48.176371 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"8488bae8d46d9f94a5c7aeaace127e1906d1a8209a2984b53460c070e91a88da"}
Mar 10 09:49:48 crc kubenswrapper[4794]: I0310 09:49:48.176977 4794 scope.go:117] "RemoveContainer" containerID="8488bae8d46d9f94a5c7aeaace127e1906d1a8209a2984b53460c070e91a88da"
Mar 10 09:49:48 crc kubenswrapper[4794]: I0310 09:49:48.177197 4794 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c04f5dfc-8cde-406b-9e01-2e529e0c0f31"
Mar 10 09:49:48 crc kubenswrapper[4794]: I0310 09:49:48.177228 4794 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c04f5dfc-8cde-406b-9e01-2e529e0c0f31"
Mar 10 09:49:48 crc kubenswrapper[4794]: I0310 09:49:48.184602 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:49:48 crc kubenswrapper[4794]: I0310 09:49:48.194662 4794 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6877da11-ac12-4f3a-b1e7-16be8566c6d6"
Mar 10 09:49:49 crc kubenswrapper[4794]: I0310 09:49:49.184991 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log"
Mar 10 09:49:49 crc kubenswrapper[4794]: I0310 09:49:49.185744 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6fa4dafc049b27d0b5fcefb284c29c1554684b8bb25ec30daba5885bed0cc68b"}
Mar 10 09:49:49 crc kubenswrapper[4794]: I0310 09:49:49.186013 4794 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c04f5dfc-8cde-406b-9e01-2e529e0c0f31"
Mar 10 09:49:49 crc kubenswrapper[4794]: I0310 09:49:49.186106 4794 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c04f5dfc-8cde-406b-9e01-2e529e0c0f31"
Mar 10 09:49:50 crc kubenswrapper[4794]: I0310 09:49:50.192952 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log"
Mar 10 09:49:50 crc kubenswrapper[4794]: I0310 09:49:50.194570 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log"
Mar 10 09:49:50 crc kubenswrapper[4794]: I0310 09:49:50.194944 4794 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="6fa4dafc049b27d0b5fcefb284c29c1554684b8bb25ec30daba5885bed0cc68b" exitCode=255
Mar 10 09:49:50 crc kubenswrapper[4794]: I0310 09:49:50.194986 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"6fa4dafc049b27d0b5fcefb284c29c1554684b8bb25ec30daba5885bed0cc68b"}
Mar 10 09:49:50 crc kubenswrapper[4794]: I0310 09:49:50.195033 4794 scope.go:117] "RemoveContainer" containerID="8488bae8d46d9f94a5c7aeaace127e1906d1a8209a2984b53460c070e91a88da"
Mar 10 09:49:50 crc kubenswrapper[4794]: I0310 09:49:50.195697 4794 scope.go:117] "RemoveContainer" containerID="6fa4dafc049b27d0b5fcefb284c29c1554684b8bb25ec30daba5885bed0cc68b"
Mar 10 09:49:50 crc kubenswrapper[4794]: E0310 09:49:50.196114 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:49:51 crc kubenswrapper[4794]: I0310 09:49:51.205378 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log"
Mar 10 09:49:52 crc kubenswrapper[4794]: I0310 09:49:52.020631 4794 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6877da11-ac12-4f3a-b1e7-16be8566c6d6"
Mar 10 09:49:52 crc kubenswrapper[4794]: I0310 09:49:52.861220 4794 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 10 09:49:52 crc kubenswrapper[4794]: I0310 09:49:52.861279 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 10 09:49:57 crc kubenswrapper[4794]: I0310 09:49:57.255834 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 10 09:49:58 crc kubenswrapper[4794]: I0310 09:49:58.297206 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 10 09:49:58 crc kubenswrapper[4794]: I0310 09:49:58.656312 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 10 09:49:58 crc kubenswrapper[4794]: I0310 09:49:58.755654 4794 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 10 09:49:58 crc kubenswrapper[4794]: I0310 09:49:58.788802 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 10 09:49:58 crc kubenswrapper[4794]: I0310 09:49:58.959314 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.115006 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.224128 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.272087 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.443696 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.467509 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.566369 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.643551 4794 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.683754 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.713607 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.726759 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.767865 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.848894 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.863232 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.893125 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 10 09:49:59 crc kubenswrapper[4794]: I0310 09:49:59.992254 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 10 09:50:00 crc kubenswrapper[4794]: I0310 09:50:00.109465 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 10 09:50:00 crc kubenswrapper[4794]: I0310 09:50:00.147785 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 10 09:50:00 crc kubenswrapper[4794]: I0310 09:50:00.185020 4794 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 10 09:50:00 crc kubenswrapper[4794]: I0310 09:50:00.205837 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 10 09:50:00 crc kubenswrapper[4794]: I0310 09:50:00.289568 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 10 09:50:00 crc kubenswrapper[4794]: I0310 09:50:00.297107 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 10 09:50:00 crc kubenswrapper[4794]: I0310 09:50:00.334994 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 10 09:50:00 crc kubenswrapper[4794]: I0310 09:50:00.439001 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 10 09:50:00 crc kubenswrapper[4794]: I0310 09:50:00.521881 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 10 09:50:00 crc kubenswrapper[4794]: I0310 09:50:00.575482 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 10 09:50:00 crc kubenswrapper[4794]: I0310 09:50:00.688240 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 10 09:50:00 crc kubenswrapper[4794]: I0310 09:50:00.834116 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 10 09:50:00 crc kubenswrapper[4794]: I0310 09:50:00.986142 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.077336 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.172281 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.276601 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.397363 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.399761 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.430372 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.443201 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.466629 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.471938 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.481932 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.540511 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.659504 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.778241 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.864860 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.878981 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 09:50:01 crc kubenswrapper[4794]: I0310 09:50:01.981374 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.146565 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.158138 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.174633 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.236974 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.286262 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.298424 4794 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.301764 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.359253 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.502598 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.526734 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.673319 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.737022 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.740948 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.809175 4794 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.866707 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:50:02 crc kubenswrapper[4794]: I0310 09:50:02.873685 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.091655 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.118101 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.146106 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.171084 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.180487 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.202509 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.361156 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.364606 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.405380 4794 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.440898 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.488318 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.526179 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.545771 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.596752 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.607996 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.637087 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.682668 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.750999 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.768621 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.778920 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.821454 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.944519 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.944570 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.994027 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 09:50:03 crc kubenswrapper[4794]: I0310 09:50:03.998751 4794 scope.go:117] "RemoveContainer" containerID="6fa4dafc049b27d0b5fcefb284c29c1554684b8bb25ec30daba5885bed0cc68b" Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.000258 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.152003 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 09:50:04 crc 
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.283698 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log"
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.283743 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6c30f1395ceab6141b0846c7d21550b1f6c727ffafc9a4ab5db494dd6fb17a7c"}
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.397780 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.476040 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.557997 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.585405 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.674247 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.683044 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.761805 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.828805 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.843142 4794 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.861853 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.941561 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.942454 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.956351 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 10 09:50:04 crc kubenswrapper[4794]: I0310 09:50:04.979873 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.019514 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.019588 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.075755 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.091070 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.128714 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.158854 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.250484 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.293037 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.293706 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.293780 4794 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="6c30f1395ceab6141b0846c7d21550b1f6c727ffafc9a4ab5db494dd6fb17a7c" exitCode=255
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.293842 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"6c30f1395ceab6141b0846c7d21550b1f6c727ffafc9a4ab5db494dd6fb17a7c"}
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.293935 4794 scope.go:117] "RemoveContainer" containerID="6fa4dafc049b27d0b5fcefb284c29c1554684b8bb25ec30daba5885bed0cc68b"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.295174 4794 scope.go:117] "RemoveContainer" containerID="6c30f1395ceab6141b0846c7d21550b1f6c727ffafc9a4ab5db494dd6fb17a7c"
Mar 10 09:50:05 crc kubenswrapper[4794]: E0310 09:50:05.295685 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.385951 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.501314 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.536806 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.569484 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.570394 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.576655 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.609516 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.614959 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.660082 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.697653 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.712276 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.779898 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 10 09:50:05 crc kubenswrapper[4794]: I0310 09:50:05.974178 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.136365 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.190915 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.233935 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.235476 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.301722 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.450913 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.501848 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.543090 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.555470 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.601347 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.627351 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.739302 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.773695 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.852814 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.878593 4794 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.879135 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=42.879121623 podStartE2EDuration="42.879121623s" podCreationTimestamp="2026-03-10 09:49:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:49:47.775921257 +0000 UTC m=+336.532092095" watchObservedRunningTime="2026-03-10 09:50:06.879121623 +0000 UTC m=+355.635292441"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.881719 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.881844 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.883023 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-n94r9"]
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.883080 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-6bb78bf599-rfqn5"]
Mar 10 09:50:06 crc kubenswrapper[4794]: E0310 09:50:06.883308 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56da28c-c09d-4fff-b73e-c3b5c787c300" containerName="oauth-openshift"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.883326 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56da28c-c09d-4fff-b73e-c3b5c787c300" containerName="oauth-openshift"
Mar 10 09:50:06 crc kubenswrapper[4794]: E0310 09:50:06.883468 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" containerName="installer"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.883483 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" containerName="installer"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.883550 4794 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c04f5dfc-8cde-406b-9e01-2e529e0c0f31"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.883574 4794 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c04f5dfc-8cde-406b-9e01-2e529e0c0f31"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.883616 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be88dc0-2383-4590-8731-7c19146fcd2b" containerName="installer"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.883641 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56da28c-c09d-4fff-b73e-c3b5c787c300" containerName="oauth-openshift"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.884105 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.888658 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.888839 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.889441 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.889552 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.889589 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.889623 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.889643 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.889947 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.890046 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.890100 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.890296 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.898179 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.898195 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.899955 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.904987 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.915537 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.918263 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.918248804 podStartE2EDuration="18.918248804s" podCreationTimestamp="2026-03-10 09:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:50:06.911132236 +0000 UTC m=+355.667303054" watchObservedRunningTime="2026-03-10 09:50:06.918248804 +0000 UTC m=+355.674419622"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.939829 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.973169 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48gkz\" (UniqueName: \"kubernetes.io/projected/24fc0a18-e72f-49da-8175-18e10b7261c5-kube-api-access-48gkz\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.973221 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24fc0a18-e72f-49da-8175-18e10b7261c5-audit-policies\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.973262 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.973289 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-user-template-error\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5"
Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.973312 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24fc0a18-e72f-49da-8175-18e10b7261c5-audit-dir\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5"
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-session\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.973370 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.973397 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.973437 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.973469 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.973497 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.973542 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.973573 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " 
pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:06 crc kubenswrapper[4794]: I0310 09:50:06.973600 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-user-template-login\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.075057 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.075530 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.075736 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.076005 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.076237 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.076548 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-user-template-login\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.076847 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48gkz\" (UniqueName: \"kubernetes.io/projected/24fc0a18-e72f-49da-8175-18e10b7261c5-kube-api-access-48gkz\") pod \"oauth-openshift-6bb78bf599-rfqn5\" 
(UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.076711 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.076638 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.077108 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24fc0a18-e72f-49da-8175-18e10b7261c5-audit-policies\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.077228 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.077260 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-user-template-error\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.077281 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24fc0a18-e72f-49da-8175-18e10b7261c5-audit-dir\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.077298 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-session\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.077317 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " 
pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.077358 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.077379 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24fc0a18-e72f-49da-8175-18e10b7261c5-audit-dir\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.078658 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.079539 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24fc0a18-e72f-49da-8175-18e10b7261c5-audit-policies\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.082432 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-user-template-error\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.082610 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.082623 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.083793 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-session\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: 
I0310 09:50:07.085476 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.086808 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.088485 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-user-template-login\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.096329 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24fc0a18-e72f-49da-8175-18e10b7261c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.104905 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48gkz\" (UniqueName: \"kubernetes.io/projected/24fc0a18-e72f-49da-8175-18e10b7261c5-kube-api-access-48gkz\") pod \"oauth-openshift-6bb78bf599-rfqn5\" (UID: \"24fc0a18-e72f-49da-8175-18e10b7261c5\") " pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.206241 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.227004 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.256100 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.269632 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.314187 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.343976 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.481358 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.573997 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.574757 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6bb78bf599-rfqn5"] Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.627854 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.756694 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.779671 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 09:50:07 crc kubenswrapper[4794]: I0310 09:50:07.989642 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.006908 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d56da28c-c09d-4fff-b73e-c3b5c787c300" path="/var/lib/kubelet/pods/d56da28c-c09d-4fff-b73e-c3b5c787c300/volumes" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.026412 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.033209 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.083502 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.212843 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.231779 4794 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.311771 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.313788 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.318210 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6bb78bf599-rfqn5_24fc0a18-e72f-49da-8175-18e10b7261c5/oauth-openshift/0.log" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.318261 4794 generic.go:334] "Generic (PLEG): container finished" podID="24fc0a18-e72f-49da-8175-18e10b7261c5" containerID="9f5893289e640248ff10da3f9fca07a6a5146f73b9e749d94bed0abb786f4f88" exitCode=255 Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.318293 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" event={"ID":"24fc0a18-e72f-49da-8175-18e10b7261c5","Type":"ContainerDied","Data":"9f5893289e640248ff10da3f9fca07a6a5146f73b9e749d94bed0abb786f4f88"} Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.318322 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" event={"ID":"24fc0a18-e72f-49da-8175-18e10b7261c5","Type":"ContainerStarted","Data":"889f830964000da018a95980a883d129f88b2a70808e09097ce0ff0066f34e7d"} Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.318746 4794 scope.go:117] "RemoveContainer" containerID="9f5893289e640248ff10da3f9fca07a6a5146f73b9e749d94bed0abb786f4f88" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.399398 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.402140 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.453994 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.517283 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.572453 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.642678 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.685494 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.708992 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.748623 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 
09:50:08.772220 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.797256 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.885375 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.930138 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.975145 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 09:50:08 crc kubenswrapper[4794]: I0310 09:50:08.984058 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.024010 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.149108 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.203785 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.237787 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.245913 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.294758 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.318599 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.324730 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6bb78bf599-rfqn5_24fc0a18-e72f-49da-8175-18e10b7261c5/oauth-openshift/0.log" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.324807 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" event={"ID":"24fc0a18-e72f-49da-8175-18e10b7261c5","Type":"ContainerStarted","Data":"59e77a54d094fe0a6e4dffb90919c9c304aba7c7022884b97d6164cdd9a2de50"} Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.325197 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.329639 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.341452 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6bb78bf599-rfqn5" 
podStartSLOduration=67.34142872 podStartE2EDuration="1m7.34142872s" podCreationTimestamp="2026-03-10 09:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:50:09.341413399 +0000 UTC m=+358.097584227" watchObservedRunningTime="2026-03-10 09:50:09.34142872 +0000 UTC m=+358.097599568" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.385413 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.554625 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.575091 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.642373 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.708388 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.739119 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.949835 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 09:50:09 crc kubenswrapper[4794]: I0310 09:50:09.952379 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 09:50:10 crc kubenswrapper[4794]: I0310 09:50:10.010353 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 09:50:10 crc kubenswrapper[4794]: I0310 09:50:10.186197 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 09:50:10 crc kubenswrapper[4794]: I0310 09:50:10.191064 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 09:50:10 crc kubenswrapper[4794]: I0310 09:50:10.487779 4794 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 09:50:10 crc kubenswrapper[4794]: I0310 09:50:10.488090 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://d788d4a5003d3248803f7857ee8143040c3ba1fcf0389703ac3b7652edacabf8" gracePeriod=5 Mar 10 09:50:10 crc kubenswrapper[4794]: I0310 09:50:10.495179 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 09:50:10 crc kubenswrapper[4794]: I0310 09:50:10.502639 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 09:50:10 crc kubenswrapper[4794]: I0310 09:50:10.604276 4794 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 09:50:10 crc kubenswrapper[4794]: I0310 09:50:10.662585 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 09:50:10 crc kubenswrapper[4794]: I0310 09:50:10.663498 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 09:50:10 crc kubenswrapper[4794]: I0310 09:50:10.673425 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 09:50:11 crc kubenswrapper[4794]: I0310 09:50:11.071102 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 09:50:11 crc kubenswrapper[4794]: I0310 09:50:11.121227 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 09:50:11 crc kubenswrapper[4794]: I0310 09:50:11.341442 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 09:50:11 crc kubenswrapper[4794]: I0310 09:50:11.446606 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 09:50:11 crc kubenswrapper[4794]: I0310 09:50:11.623615 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 09:50:11 crc kubenswrapper[4794]: I0310 09:50:11.640840 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 09:50:11 crc kubenswrapper[4794]: I0310 09:50:11.845129 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 09:50:11 crc kubenswrapper[4794]: I0310 09:50:11.865171 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 09:50:11 crc kubenswrapper[4794]: I0310 09:50:11.948724 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 09:50:12 crc kubenswrapper[4794]: I0310 09:50:12.046744 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 09:50:12 crc kubenswrapper[4794]: I0310 09:50:12.329982 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 09:50:12 crc kubenswrapper[4794]: I0310 09:50:12.447447 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 09:50:12 crc kubenswrapper[4794]: I0310 09:50:12.478524 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 10 09:50:12 crc kubenswrapper[4794]: I0310 09:50:12.593088 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 09:50:12 crc kubenswrapper[4794]: I0310 09:50:12.641726 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 09:50:12 crc kubenswrapper[4794]: I0310 09:50:12.656921 4794 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 09:50:12 crc kubenswrapper[4794]: I0310 09:50:12.672781 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 09:50:12 crc kubenswrapper[4794]: I0310 09:50:12.744992 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 09:50:13 crc kubenswrapper[4794]: I0310 09:50:13.282029 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 09:50:13 crc kubenswrapper[4794]: I0310 09:50:13.283711 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 09:50:13 crc kubenswrapper[4794]: I0310 09:50:13.378310 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 09:50:13 crc kubenswrapper[4794]: I0310 09:50:13.523025 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 09:50:13 crc kubenswrapper[4794]: I0310 09:50:13.600492 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 09:50:13 crc kubenswrapper[4794]: I0310 09:50:13.846407 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 09:50:15 crc kubenswrapper[4794]: I0310 09:50:15.011410 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552270-rq675"] Mar 10 09:50:15 crc kubenswrapper[4794]: E0310 09:50:15.011646 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 09:50:15 crc kubenswrapper[4794]: I0310 09:50:15.011661 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 09:50:15 crc kubenswrapper[4794]: I0310 09:50:15.011782 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 09:50:15 crc kubenswrapper[4794]: I0310 09:50:15.012164 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552270-rq675" Mar 10 09:50:15 crc kubenswrapper[4794]: I0310 09:50:15.020082 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552270-rq675"] Mar 10 09:50:15 crc kubenswrapper[4794]: I0310 09:50:15.025185 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:50:15 crc kubenswrapper[4794]: I0310 09:50:15.025251 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 09:50:15 crc kubenswrapper[4794]: I0310 09:50:15.025636 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:50:15 crc kubenswrapper[4794]: I0310 09:50:15.078518 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdvfp\" (UniqueName: \"kubernetes.io/projected/dec1ce65-776a-4ed7-8441-0779bb0c5d93-kube-api-access-fdvfp\") pod \"auto-csr-approver-29552270-rq675\" (UID: \"dec1ce65-776a-4ed7-8441-0779bb0c5d93\") " pod="openshift-infra/auto-csr-approver-29552270-rq675" Mar 10 09:50:15 crc kubenswrapper[4794]: I0310 09:50:15.179730 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdvfp\" (UniqueName: \"kubernetes.io/projected/dec1ce65-776a-4ed7-8441-0779bb0c5d93-kube-api-access-fdvfp\") pod \"auto-csr-approver-29552270-rq675\" (UID: \"dec1ce65-776a-4ed7-8441-0779bb0c5d93\") " pod="openshift-infra/auto-csr-approver-29552270-rq675" Mar 10 09:50:15 crc kubenswrapper[4794]: I0310 09:50:15.201798 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdvfp\" (UniqueName: \"kubernetes.io/projected/dec1ce65-776a-4ed7-8441-0779bb0c5d93-kube-api-access-fdvfp\") pod \"auto-csr-approver-29552270-rq675\" (UID: \"dec1ce65-776a-4ed7-8441-0779bb0c5d93\") " pod="openshift-infra/auto-csr-approver-29552270-rq675" Mar 10 09:50:15 crc kubenswrapper[4794]: I0310 09:50:15.324300 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552270-rq675" Mar 10 09:50:15 crc kubenswrapper[4794]: I0310 09:50:15.804783 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552270-rq675"] Mar 10 09:50:15 crc kubenswrapper[4794]: W0310 09:50:15.816507 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddec1ce65_776a_4ed7_8441_0779bb0c5d93.slice/crio-796d897c1cac07dcc7a074a8defc3de5e31f63a38c7e34cb21e13aafc7a222c4 WatchSource:0}: Error finding container 796d897c1cac07dcc7a074a8defc3de5e31f63a38c7e34cb21e13aafc7a222c4: Status 404 returned error can't find the container with id 796d897c1cac07dcc7a074a8defc3de5e31f63a38c7e34cb21e13aafc7a222c4 Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.038444 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.038809 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.087759 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.087795 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.087814 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.087848 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.087852 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.087870 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.087911 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.087941 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.087959 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.088375 4794 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.088402 4794 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.088415 4794 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.088428 4794 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.093864 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.126200 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.189197 4794 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.392638 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552270-rq675" event={"ID":"dec1ce65-776a-4ed7-8441-0779bb0c5d93","Type":"ContainerStarted","Data":"796d897c1cac07dcc7a074a8defc3de5e31f63a38c7e34cb21e13aafc7a222c4"} Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.394927 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.394990 4794 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="d788d4a5003d3248803f7857ee8143040c3ba1fcf0389703ac3b7652edacabf8" exitCode=137 Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.395052 4794 scope.go:117] "RemoveContainer" containerID="d788d4a5003d3248803f7857ee8143040c3ba1fcf0389703ac3b7652edacabf8" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.395087 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.413364 4794 scope.go:117] "RemoveContainer" containerID="d788d4a5003d3248803f7857ee8143040c3ba1fcf0389703ac3b7652edacabf8" Mar 10 09:50:16 crc kubenswrapper[4794]: E0310 09:50:16.413837 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d788d4a5003d3248803f7857ee8143040c3ba1fcf0389703ac3b7652edacabf8\": container with ID starting with d788d4a5003d3248803f7857ee8143040c3ba1fcf0389703ac3b7652edacabf8 not found: ID does not exist" containerID="d788d4a5003d3248803f7857ee8143040c3ba1fcf0389703ac3b7652edacabf8" Mar 10 09:50:16 crc kubenswrapper[4794]: I0310 09:50:16.413895 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d788d4a5003d3248803f7857ee8143040c3ba1fcf0389703ac3b7652edacabf8"} err="failed to get container status \"d788d4a5003d3248803f7857ee8143040c3ba1fcf0389703ac3b7652edacabf8\": rpc error: code = NotFound desc = could not find container \"d788d4a5003d3248803f7857ee8143040c3ba1fcf0389703ac3b7652edacabf8\": container with ID starting with d788d4a5003d3248803f7857ee8143040c3ba1fcf0389703ac3b7652edacabf8 not found: ID does not exist" Mar 10 09:50:17 crc kubenswrapper[4794]: I0310 09:50:17.406569 4794 generic.go:334] "Generic (PLEG): container finished" podID="dec1ce65-776a-4ed7-8441-0779bb0c5d93" containerID="f12f6d1a08902d49b512cf0cf93f6aae05f971c4ee1f0deae66ff06febde7456" exitCode=0 Mar 10 09:50:17 crc kubenswrapper[4794]: I0310 09:50:17.406647 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552270-rq675" event={"ID":"dec1ce65-776a-4ed7-8441-0779bb0c5d93","Type":"ContainerDied","Data":"f12f6d1a08902d49b512cf0cf93f6aae05f971c4ee1f0deae66ff06febde7456"} Mar 10 09:50:18 crc kubenswrapper[4794]: I0310 09:50:18.006563 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 10 09:50:18 crc kubenswrapper[4794]: I0310 09:50:18.007065 4794 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 10 09:50:18 crc kubenswrapper[4794]: I0310 09:50:18.017085 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 09:50:18 crc kubenswrapper[4794]: I0310 09:50:18.017130 4794 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a3f9d1af-5d92-47e5-9197-acf7c009171c" Mar 10 09:50:18 crc kubenswrapper[4794]: I0310 09:50:18.021404 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 09:50:18 crc kubenswrapper[4794]: I0310 09:50:18.021450 4794 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a3f9d1af-5d92-47e5-9197-acf7c009171c" Mar 10 09:50:18 crc kubenswrapper[4794]: I0310 09:50:18.748462 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552270-rq675" Mar 10 09:50:18 crc kubenswrapper[4794]: I0310 09:50:18.919370 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdvfp\" (UniqueName: \"kubernetes.io/projected/dec1ce65-776a-4ed7-8441-0779bb0c5d93-kube-api-access-fdvfp\") pod \"dec1ce65-776a-4ed7-8441-0779bb0c5d93\" (UID: \"dec1ce65-776a-4ed7-8441-0779bb0c5d93\") " Mar 10 09:50:18 crc kubenswrapper[4794]: I0310 09:50:18.924305 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec1ce65-776a-4ed7-8441-0779bb0c5d93-kube-api-access-fdvfp" (OuterVolumeSpecName: "kube-api-access-fdvfp") pod "dec1ce65-776a-4ed7-8441-0779bb0c5d93" (UID: "dec1ce65-776a-4ed7-8441-0779bb0c5d93"). InnerVolumeSpecName "kube-api-access-fdvfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:50:18 crc kubenswrapper[4794]: I0310 09:50:18.999432 4794 scope.go:117] "RemoveContainer" containerID="6c30f1395ceab6141b0846c7d21550b1f6c727ffafc9a4ab5db494dd6fb17a7c" Mar 10 09:50:19 crc kubenswrapper[4794]: E0310 09:50:19.000240 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:50:19 crc kubenswrapper[4794]: I0310 09:50:19.021314 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdvfp\" (UniqueName: \"kubernetes.io/projected/dec1ce65-776a-4ed7-8441-0779bb0c5d93-kube-api-access-fdvfp\") on node \"crc\" DevicePath \"\"" Mar 10 09:50:19 crc kubenswrapper[4794]: I0310 09:50:19.420657 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552270-rq675" event={"ID":"dec1ce65-776a-4ed7-8441-0779bb0c5d93","Type":"ContainerDied","Data":"796d897c1cac07dcc7a074a8defc3de5e31f63a38c7e34cb21e13aafc7a222c4"} Mar 10 09:50:19 crc kubenswrapper[4794]: I0310 09:50:19.420711 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="796d897c1cac07dcc7a074a8defc3de5e31f63a38c7e34cb21e13aafc7a222c4" Mar 10 09:50:19 crc kubenswrapper[4794]: I0310 09:50:19.420726 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552270-rq675" Mar 10 09:50:31 crc kubenswrapper[4794]: I0310 09:50:31.282639 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 09:50:31 crc kubenswrapper[4794]: I0310 09:50:31.489506 4794 generic.go:334] "Generic (PLEG): container finished" podID="a39fe093-da97-48ba-bdf3-a566eefc5208" containerID="670ae2fdad1919a87f46024509bc20c149bdc0af6ab62019439f2b21a599393d" exitCode=0 Mar 10 09:50:31 crc kubenswrapper[4794]: I0310 09:50:31.489546 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" event={"ID":"a39fe093-da97-48ba-bdf3-a566eefc5208","Type":"ContainerDied","Data":"670ae2fdad1919a87f46024509bc20c149bdc0af6ab62019439f2b21a599393d"} Mar 10 09:50:31 crc kubenswrapper[4794]: I0310 09:50:31.490023 4794 scope.go:117] "RemoveContainer" containerID="670ae2fdad1919a87f46024509bc20c149bdc0af6ab62019439f2b21a599393d" Mar 10 09:50:32 crc kubenswrapper[4794]: I0310 09:50:32.503925 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" event={"ID":"a39fe093-da97-48ba-bdf3-a566eefc5208","Type":"ContainerStarted","Data":"6b1766f225c554c356c240978ae44a06d85c28ef26ce5ae4ae5c928580823fb0"} Mar 10 09:50:32 crc kubenswrapper[4794]: I0310 09:50:32.505281 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" Mar 10 09:50:32 crc kubenswrapper[4794]: I0310 09:50:32.508171 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" Mar 10 09:50:32 crc kubenswrapper[4794]: I0310 09:50:32.639603 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 09:50:32 crc kubenswrapper[4794]: I0310 09:50:32.998527 4794 scope.go:117] "RemoveContainer" containerID="6c30f1395ceab6141b0846c7d21550b1f6c727ffafc9a4ab5db494dd6fb17a7c" Mar 10 09:50:33 crc kubenswrapper[4794]: I0310 09:50:33.509793 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 10 09:50:33 crc kubenswrapper[4794]: I0310 09:50:33.509876 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d1c44339935f5327d4f0e01d2624a97fe9720b88a863a0b3a7c6031e823afa8a"} Mar 10 09:50:33 crc kubenswrapper[4794]: I0310 09:50:33.852205 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 09:50:35 crc kubenswrapper[4794]: I0310 09:50:35.278857 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 09:50:35 crc kubenswrapper[4794]: I0310 09:50:35.979387 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 09:50:36 crc kubenswrapper[4794]: I0310 09:50:36.723636 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 09:50:39 crc kubenswrapper[4794]: I0310 09:50:39.115873 4794 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-oauth-apiserver"/"audit-1" Mar 10 09:50:41 crc kubenswrapper[4794]: I0310 09:50:41.697598 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 09:50:47 crc kubenswrapper[4794]: I0310 09:50:47.613051 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 09:50:48 crc kubenswrapper[4794]: I0310 09:50:48.355716 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 09:51:41 crc kubenswrapper[4794]: I0310 09:51:41.972688 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rvw4m"] Mar 10 09:51:41 crc kubenswrapper[4794]: E0310 09:51:41.973362 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec1ce65-776a-4ed7-8441-0779bb0c5d93" containerName="oc" Mar 10 09:51:41 crc kubenswrapper[4794]: I0310 09:51:41.973373 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec1ce65-776a-4ed7-8441-0779bb0c5d93" containerName="oc" Mar 10 09:51:41 crc kubenswrapper[4794]: I0310 09:51:41.973462 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec1ce65-776a-4ed7-8441-0779bb0c5d93" containerName="oc" Mar 10 09:51:41 crc kubenswrapper[4794]: I0310 09:51:41.973816 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:41 crc kubenswrapper[4794]: I0310 09:51:41.991368 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rvw4m"] Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.073867 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97cp8\" (UniqueName: \"kubernetes.io/projected/a243edcd-8113-4a6c-a113-d12e95c08b58-kube-api-access-97cp8\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.073916 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a243edcd-8113-4a6c-a113-d12e95c08b58-bound-sa-token\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.073951 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.074088 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a243edcd-8113-4a6c-a113-d12e95c08b58-trusted-ca\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.074183 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a243edcd-8113-4a6c-a113-d12e95c08b58-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.074343 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a243edcd-8113-4a6c-a113-d12e95c08b58-registry-tls\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.074378 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a243edcd-8113-4a6c-a113-d12e95c08b58-registry-certificates\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.074400 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a243edcd-8113-4a6c-a113-d12e95c08b58-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.092974 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.176196 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a243edcd-8113-4a6c-a113-d12e95c08b58-trusted-ca\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.176617 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a243edcd-8113-4a6c-a113-d12e95c08b58-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.176903 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a243edcd-8113-4a6c-a113-d12e95c08b58-registry-tls\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.177146 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/a243edcd-8113-4a6c-a113-d12e95c08b58-registry-certificates\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.177302 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a243edcd-8113-4a6c-a113-d12e95c08b58-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.177313 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a243edcd-8113-4a6c-a113-d12e95c08b58-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.177870 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97cp8\" (UniqueName: \"kubernetes.io/projected/a243edcd-8113-4a6c-a113-d12e95c08b58-kube-api-access-97cp8\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.177998 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a243edcd-8113-4a6c-a113-d12e95c08b58-bound-sa-token\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.178714 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a243edcd-8113-4a6c-a113-d12e95c08b58-trusted-ca\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.180272 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a243edcd-8113-4a6c-a113-d12e95c08b58-registry-certificates\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.185580 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a243edcd-8113-4a6c-a113-d12e95c08b58-registry-tls\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.186677 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a243edcd-8113-4a6c-a113-d12e95c08b58-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" Mar 10 
09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.198561 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97cp8\" (UniqueName: \"kubernetes.io/projected/a243edcd-8113-4a6c-a113-d12e95c08b58-kube-api-access-97cp8\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m"
Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.210504 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a243edcd-8113-4a6c-a113-d12e95c08b58-bound-sa-token\") pod \"image-registry-66df7c8f76-rvw4m\" (UID: \"a243edcd-8113-4a6c-a113-d12e95c08b58\") " pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m"
Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.290925 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m"
Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.513979 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rvw4m"]
Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.925839 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" event={"ID":"a243edcd-8113-4a6c-a113-d12e95c08b58","Type":"ContainerStarted","Data":"19ebfdbba6c0b3a1ed2cbf5179326dcbca03a11f9a4c11e741f1bfd131cee126"}
Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.925892 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" event={"ID":"a243edcd-8113-4a6c-a113-d12e95c08b58","Type":"ContainerStarted","Data":"fb6beacc0387d4715294d00181bc9bdbd8b078cc3cdf8f5e7e8e43c22a6d00a8"}
Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.926428 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m"
Mar 10 09:51:42 crc kubenswrapper[4794]: I0310 09:51:42.956469 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m" podStartSLOduration=1.956441673 podStartE2EDuration="1.956441673s" podCreationTimestamp="2026-03-10 09:51:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:51:42.94812756 +0000 UTC m=+451.704298418" watchObservedRunningTime="2026-03-10 09:51:42.956441673 +0000 UTC m=+451.712612521"
Mar 10 09:51:52 crc kubenswrapper[4794]: I0310 09:51:52.967727 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:51:52 crc kubenswrapper[4794]: I0310 09:51:52.968373 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:52:00 crc kubenswrapper[4794]: I0310 09:52:00.134164 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552272-f6597"]
Mar 10 09:52:00 crc kubenswrapper[4794]: I0310 09:52:00.135441 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552272-f6597"
Mar 10 09:52:00 crc kubenswrapper[4794]: I0310 09:52:00.139858 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 09:52:00 crc kubenswrapper[4794]: I0310 09:52:00.140434 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc"
Mar 10 09:52:00 crc kubenswrapper[4794]: I0310 09:52:00.141301 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 09:52:00 crc kubenswrapper[4794]: I0310 09:52:00.141595 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552272-f6597"]
Mar 10 09:52:00 crc kubenswrapper[4794]: I0310 09:52:00.219420 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6q67\" (UniqueName: \"kubernetes.io/projected/38cb9379-e79d-4708-a441-7e65e9dc7450-kube-api-access-g6q67\") pod \"auto-csr-approver-29552272-f6597\" (UID: \"38cb9379-e79d-4708-a441-7e65e9dc7450\") " pod="openshift-infra/auto-csr-approver-29552272-f6597"
Mar 10 09:52:00 crc kubenswrapper[4794]: I0310 09:52:00.320859 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6q67\" (UniqueName: \"kubernetes.io/projected/38cb9379-e79d-4708-a441-7e65e9dc7450-kube-api-access-g6q67\") pod \"auto-csr-approver-29552272-f6597\" (UID: \"38cb9379-e79d-4708-a441-7e65e9dc7450\") " pod="openshift-infra/auto-csr-approver-29552272-f6597"
Mar 10 09:52:00 crc kubenswrapper[4794]: I0310 09:52:00.346441 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6q67\" (UniqueName: \"kubernetes.io/projected/38cb9379-e79d-4708-a441-7e65e9dc7450-kube-api-access-g6q67\") pod \"auto-csr-approver-29552272-f6597\" (UID: \"38cb9379-e79d-4708-a441-7e65e9dc7450\") " pod="openshift-infra/auto-csr-approver-29552272-f6597"
Mar 10 09:52:00 crc kubenswrapper[4794]: I0310 09:52:00.452970 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552272-f6597"
Mar 10 09:52:00 crc kubenswrapper[4794]: I0310 09:52:00.860994 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552272-f6597"]
Mar 10 09:52:01 crc kubenswrapper[4794]: I0310 09:52:01.041855 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552272-f6597" event={"ID":"38cb9379-e79d-4708-a441-7e65e9dc7450","Type":"ContainerStarted","Data":"4a8dbafe13ea21b1c4e00482869d929dc391a0fa47fe2cf8a0d4a7c0d7d05fd9"}
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.048623 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552272-f6597" event={"ID":"38cb9379-e79d-4708-a441-7e65e9dc7450","Type":"ContainerStarted","Data":"3b119e0238b024cc6aef6badb35e23a741196bf39fb82d7295534f0a6c4afc22"}
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.063949 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552272-f6597" podStartSLOduration=1.194758294 podStartE2EDuration="2.063932597s" podCreationTimestamp="2026-03-10 09:52:00 +0000 UTC" firstStartedPulling="2026-03-10 09:52:00.86847812 +0000 UTC m=+469.624648958" lastFinishedPulling="2026-03-10 09:52:01.737652443 +0000 UTC m=+470.493823261" observedRunningTime="2026-03-10 09:52:02.059539312 +0000 UTC m=+470.815710140" watchObservedRunningTime="2026-03-10 09:52:02.063932597 +0000 UTC m=+470.820103405"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.297685 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-rvw4m"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.328176 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bnvgq"]
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.329827 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bnvgq" podUID="88245dbf-bf6b-4051-9a3c-91da5a183538" containerName="registry-server" containerID="cri-o://063c82c36304ffc8762f2979da18d291ed2033e28095ef4691b3ed815dfc3f7e" gracePeriod=30
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.376264 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zffn2"]
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.377344 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zffn2" podUID="869965fc-c355-4c93-9776-dc1a070c926e" containerName="registry-server" containerID="cri-o://71d967cd3d765afe905206227b4a185610ee4a8d660a82bc475e7aac4f1aa414" gracePeriod=30
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.387463 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n2b4p"]
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.387908 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" podUID="a39fe093-da97-48ba-bdf3-a566eefc5208" containerName="marketplace-operator" containerID="cri-o://6b1766f225c554c356c240978ae44a06d85c28ef26ce5ae4ae5c928580823fb0" gracePeriod=30
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.391402 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7js2"]
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.391680 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q7js2" podUID="0cbd0a5e-393e-4539-86ef-559427986faa" containerName="registry-server" containerID="cri-o://1edd73cb5575e96ed483f8bfdcf30f5d909505595954c18a9a542f61e3dc57b2" gracePeriod=30
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.404225 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zp5hg"]
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.413707 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cx8l7"]
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.414011 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cx8l7" podUID="1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" containerName="registry-server" containerID="cri-o://0d2f487e9a5fcf36206ba7fd5b76cc51ce7328a16051ac52831a122bfc158a54" gracePeriod=30
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.423643 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ztm8n"]
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.424551 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.436419 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ztm8n"]
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.561840 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e995e0b7-2d36-4d94-a424-f916e1cba0ac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ztm8n\" (UID: \"e995e0b7-2d36-4d94-a424-f916e1cba0ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.561907 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e995e0b7-2d36-4d94-a424-f916e1cba0ac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ztm8n\" (UID: \"e995e0b7-2d36-4d94-a424-f916e1cba0ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.561964 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c94rj\" (UniqueName: \"kubernetes.io/projected/e995e0b7-2d36-4d94-a424-f916e1cba0ac-kube-api-access-c94rj\") pod \"marketplace-operator-79b997595-ztm8n\" (UID: \"e995e0b7-2d36-4d94-a424-f916e1cba0ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.663325 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c94rj\" (UniqueName: \"kubernetes.io/projected/e995e0b7-2d36-4d94-a424-f916e1cba0ac-kube-api-access-c94rj\") pod \"marketplace-operator-79b997595-ztm8n\" (UID: \"e995e0b7-2d36-4d94-a424-f916e1cba0ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.663402 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e995e0b7-2d36-4d94-a424-f916e1cba0ac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ztm8n\" (UID: \"e995e0b7-2d36-4d94-a424-f916e1cba0ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.663435 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e995e0b7-2d36-4d94-a424-f916e1cba0ac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ztm8n\" (UID: \"e995e0b7-2d36-4d94-a424-f916e1cba0ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.664763 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e995e0b7-2d36-4d94-a424-f916e1cba0ac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ztm8n\" (UID: \"e995e0b7-2d36-4d94-a424-f916e1cba0ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.669681 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e995e0b7-2d36-4d94-a424-f916e1cba0ac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ztm8n\" (UID: \"e995e0b7-2d36-4d94-a424-f916e1cba0ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.679038 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c94rj\" (UniqueName: \"kubernetes.io/projected/e995e0b7-2d36-4d94-a424-f916e1cba0ac-kube-api-access-c94rj\") pod \"marketplace-operator-79b997595-ztm8n\" (UID: \"e995e0b7-2d36-4d94-a424-f916e1cba0ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.695439 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bnvgq"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.865885 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88245dbf-bf6b-4051-9a3c-91da5a183538-utilities\") pod \"88245dbf-bf6b-4051-9a3c-91da5a183538\" (UID: \"88245dbf-bf6b-4051-9a3c-91da5a183538\") "
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.867141 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88245dbf-bf6b-4051-9a3c-91da5a183538-catalog-content\") pod \"88245dbf-bf6b-4051-9a3c-91da5a183538\" (UID: \"88245dbf-bf6b-4051-9a3c-91da5a183538\") "
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.875512 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwx4g\" (UniqueName: \"kubernetes.io/projected/88245dbf-bf6b-4051-9a3c-91da5a183538-kube-api-access-fwx4g\") pod \"88245dbf-bf6b-4051-9a3c-91da5a183538\" (UID: \"88245dbf-bf6b-4051-9a3c-91da5a183538\") "
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.867069 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88245dbf-bf6b-4051-9a3c-91da5a183538-utilities" (OuterVolumeSpecName: "utilities") pod "88245dbf-bf6b-4051-9a3c-91da5a183538" (UID: "88245dbf-bf6b-4051-9a3c-91da5a183538"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.875863 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88245dbf-bf6b-4051-9a3c-91da5a183538-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.879716 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88245dbf-bf6b-4051-9a3c-91da5a183538-kube-api-access-fwx4g" (OuterVolumeSpecName: "kube-api-access-fwx4g") pod "88245dbf-bf6b-4051-9a3c-91da5a183538" (UID: "88245dbf-bf6b-4051-9a3c-91da5a183538"). InnerVolumeSpecName "kube-api-access-fwx4g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.925773 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.926094 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88245dbf-bf6b-4051-9a3c-91da5a183538-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88245dbf-bf6b-4051-9a3c-91da5a183538" (UID: "88245dbf-bf6b-4051-9a3c-91da5a183538"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.931896 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7js2"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.936149 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cx8l7"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.958438 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n"
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.976784 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwx4g\" (UniqueName: \"kubernetes.io/projected/88245dbf-bf6b-4051-9a3c-91da5a183538-kube-api-access-fwx4g\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:02 crc kubenswrapper[4794]: I0310 09:52:02.977248 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88245dbf-bf6b-4051-9a3c-91da5a183538-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.057018 4794 generic.go:334] "Generic (PLEG): container finished" podID="88245dbf-bf6b-4051-9a3c-91da5a183538" containerID="063c82c36304ffc8762f2979da18d291ed2033e28095ef4691b3ed815dfc3f7e" exitCode=0
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.057079 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnvgq" event={"ID":"88245dbf-bf6b-4051-9a3c-91da5a183538","Type":"ContainerDied","Data":"063c82c36304ffc8762f2979da18d291ed2033e28095ef4691b3ed815dfc3f7e"}
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.057089 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bnvgq"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.057110 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnvgq" event={"ID":"88245dbf-bf6b-4051-9a3c-91da5a183538","Type":"ContainerDied","Data":"1d1868622a62b82c40e3713c2ddc58846a44ce60f8e18319d20f62a5caf4b4b7"}
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.057129 4794 scope.go:117] "RemoveContainer" containerID="063c82c36304ffc8762f2979da18d291ed2033e28095ef4691b3ed815dfc3f7e"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.061196 4794 generic.go:334] "Generic (PLEG): container finished" podID="a39fe093-da97-48ba-bdf3-a566eefc5208" containerID="6b1766f225c554c356c240978ae44a06d85c28ef26ce5ae4ae5c928580823fb0" exitCode=0
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.061273 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" event={"ID":"a39fe093-da97-48ba-bdf3-a566eefc5208","Type":"ContainerDied","Data":"6b1766f225c554c356c240978ae44a06d85c28ef26ce5ae4ae5c928580823fb0"}
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.061303 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p" event={"ID":"a39fe093-da97-48ba-bdf3-a566eefc5208","Type":"ContainerDied","Data":"6f6471a789932655f56daead01973640ab8e411e76596cce7f6d130170f0f020"}
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.061419 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n2b4p"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.070234 4794 generic.go:334] "Generic (PLEG): container finished" podID="0cbd0a5e-393e-4539-86ef-559427986faa" containerID="1edd73cb5575e96ed483f8bfdcf30f5d909505595954c18a9a542f61e3dc57b2" exitCode=0
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.070285 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7js2" event={"ID":"0cbd0a5e-393e-4539-86ef-559427986faa","Type":"ContainerDied","Data":"1edd73cb5575e96ed483f8bfdcf30f5d909505595954c18a9a542f61e3dc57b2"}
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.070310 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7js2" event={"ID":"0cbd0a5e-393e-4539-86ef-559427986faa","Type":"ContainerDied","Data":"2d721804b25840827005cd1f692bc5cbc597518d2292b478ec625aa65a262b34"}
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.070442 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7js2"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.078024 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-catalog-content\") pod \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\" (UID: \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\") "
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.078053 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-utilities\") pod \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\" (UID: \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\") "
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.078096 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxz59\" (UniqueName: \"kubernetes.io/projected/a39fe093-da97-48ba-bdf3-a566eefc5208-kube-api-access-bxz59\") pod \"a39fe093-da97-48ba-bdf3-a566eefc5208\" (UID: \"a39fe093-da97-48ba-bdf3-a566eefc5208\") "
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.078116 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a39fe093-da97-48ba-bdf3-a566eefc5208-marketplace-trusted-ca\") pod \"a39fe093-da97-48ba-bdf3-a566eefc5208\" (UID: \"a39fe093-da97-48ba-bdf3-a566eefc5208\") "
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.078163 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cbd0a5e-393e-4539-86ef-559427986faa-utilities\") pod \"0cbd0a5e-393e-4539-86ef-559427986faa\" (UID: \"0cbd0a5e-393e-4539-86ef-559427986faa\") "
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.078192 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cbd0a5e-393e-4539-86ef-559427986faa-catalog-content\") pod \"0cbd0a5e-393e-4539-86ef-559427986faa\" (UID: \"0cbd0a5e-393e-4539-86ef-559427986faa\") "
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.078230 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j49q6\" (UniqueName: \"kubernetes.io/projected/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-kube-api-access-j49q6\") pod \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\" (UID: \"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a\") "
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.078254 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvqmm\" (UniqueName: \"kubernetes.io/projected/0cbd0a5e-393e-4539-86ef-559427986faa-kube-api-access-mvqmm\") pod \"0cbd0a5e-393e-4539-86ef-559427986faa\" (UID: \"0cbd0a5e-393e-4539-86ef-559427986faa\") "
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.078272 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a39fe093-da97-48ba-bdf3-a566eefc5208-marketplace-operator-metrics\") pod \"a39fe093-da97-48ba-bdf3-a566eefc5208\" (UID: \"a39fe093-da97-48ba-bdf3-a566eefc5208\") "
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.079441 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a39fe093-da97-48ba-bdf3-a566eefc5208-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a39fe093-da97-48ba-bdf3-a566eefc5208" (UID: "a39fe093-da97-48ba-bdf3-a566eefc5208"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.079753 4794 generic.go:334] "Generic (PLEG): container finished" podID="38cb9379-e79d-4708-a441-7e65e9dc7450" containerID="3b119e0238b024cc6aef6badb35e23a741196bf39fb82d7295534f0a6c4afc22" exitCode=0
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.079772 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-utilities" (OuterVolumeSpecName: "utilities") pod "1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" (UID: "1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.080036 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552272-f6597" event={"ID":"38cb9379-e79d-4708-a441-7e65e9dc7450","Type":"ContainerDied","Data":"3b119e0238b024cc6aef6badb35e23a741196bf39fb82d7295534f0a6c4afc22"}
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.081421 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cbd0a5e-393e-4539-86ef-559427986faa-utilities" (OuterVolumeSpecName: "utilities") pod "0cbd0a5e-393e-4539-86ef-559427986faa" (UID: "0cbd0a5e-393e-4539-86ef-559427986faa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.086659 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a39fe093-da97-48ba-bdf3-a566eefc5208-kube-api-access-bxz59" (OuterVolumeSpecName: "kube-api-access-bxz59") pod "a39fe093-da97-48ba-bdf3-a566eefc5208" (UID: "a39fe093-da97-48ba-bdf3-a566eefc5208"). InnerVolumeSpecName "kube-api-access-bxz59". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.086886 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a39fe093-da97-48ba-bdf3-a566eefc5208-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a39fe093-da97-48ba-bdf3-a566eefc5208" (UID: "a39fe093-da97-48ba-bdf3-a566eefc5208"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.087142 4794 scope.go:117] "RemoveContainer" containerID="b9ef099180e73d9bde1e4a27716c3151e3e2a880cec2afb1d00afd9253ad5771"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.087323 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cbd0a5e-393e-4539-86ef-559427986faa-kube-api-access-mvqmm" (OuterVolumeSpecName: "kube-api-access-mvqmm") pod "0cbd0a5e-393e-4539-86ef-559427986faa" (UID: "0cbd0a5e-393e-4539-86ef-559427986faa"). InnerVolumeSpecName "kube-api-access-mvqmm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.088515 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-kube-api-access-j49q6" (OuterVolumeSpecName: "kube-api-access-j49q6") pod "1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" (UID: "1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a"). InnerVolumeSpecName "kube-api-access-j49q6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.089417 4794 generic.go:334] "Generic (PLEG): container finished" podID="869965fc-c355-4c93-9776-dc1a070c926e" containerID="71d967cd3d765afe905206227b4a185610ee4a8d660a82bc475e7aac4f1aa414" exitCode=0
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.089484 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zffn2" event={"ID":"869965fc-c355-4c93-9776-dc1a070c926e","Type":"ContainerDied","Data":"71d967cd3d765afe905206227b4a185610ee4a8d660a82bc475e7aac4f1aa414"}
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.093744 4794 generic.go:334] "Generic (PLEG): container finished" podID="1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" containerID="0d2f487e9a5fcf36206ba7fd5b76cc51ce7328a16051ac52831a122bfc158a54" exitCode=0
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.093791 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx8l7" event={"ID":"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a","Type":"ContainerDied","Data":"0d2f487e9a5fcf36206ba7fd5b76cc51ce7328a16051ac52831a122bfc158a54"}
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.093820 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx8l7" event={"ID":"1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a","Type":"ContainerDied","Data":"1161b228ee1af342bca4f1d7c6009c98d690de7d323211060ad9ff2f517ebcb6"}
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.094735 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cx8l7"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.117442 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bnvgq"]
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.122207 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bnvgq"]
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.131603 4794 scope.go:117] "RemoveContainer" containerID="9a1950959578cf24faa2d48b936bcd50e5c83a87c9e318cdc2c43e24cb40dbdc"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.134729 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cbd0a5e-393e-4539-86ef-559427986faa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cbd0a5e-393e-4539-86ef-559427986faa" (UID: "0cbd0a5e-393e-4539-86ef-559427986faa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.147103 4794 scope.go:117] "RemoveContainer" containerID="063c82c36304ffc8762f2979da18d291ed2033e28095ef4691b3ed815dfc3f7e"
Mar 10 09:52:03 crc kubenswrapper[4794]: E0310 09:52:03.147698 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"063c82c36304ffc8762f2979da18d291ed2033e28095ef4691b3ed815dfc3f7e\": container with ID starting with 063c82c36304ffc8762f2979da18d291ed2033e28095ef4691b3ed815dfc3f7e not found: ID does not exist" containerID="063c82c36304ffc8762f2979da18d291ed2033e28095ef4691b3ed815dfc3f7e"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.147756 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"063c82c36304ffc8762f2979da18d291ed2033e28095ef4691b3ed815dfc3f7e"} err="failed to get container status \"063c82c36304ffc8762f2979da18d291ed2033e28095ef4691b3ed815dfc3f7e\": rpc error: code = NotFound desc = could not find container \"063c82c36304ffc8762f2979da18d291ed2033e28095ef4691b3ed815dfc3f7e\": container with ID starting with 063c82c36304ffc8762f2979da18d291ed2033e28095ef4691b3ed815dfc3f7e not found: ID does not exist"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.147789 4794 scope.go:117] "RemoveContainer" containerID="b9ef099180e73d9bde1e4a27716c3151e3e2a880cec2afb1d00afd9253ad5771"
Mar 10 09:52:03 crc kubenswrapper[4794]: E0310 09:52:03.148220 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9ef099180e73d9bde1e4a27716c3151e3e2a880cec2afb1d00afd9253ad5771\": container with ID starting with b9ef099180e73d9bde1e4a27716c3151e3e2a880cec2afb1d00afd9253ad5771 not found: ID does not exist" containerID="b9ef099180e73d9bde1e4a27716c3151e3e2a880cec2afb1d00afd9253ad5771"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.148252 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ef099180e73d9bde1e4a27716c3151e3e2a880cec2afb1d00afd9253ad5771"} err="failed to get container status \"b9ef099180e73d9bde1e4a27716c3151e3e2a880cec2afb1d00afd9253ad5771\": rpc error: code = NotFound desc = could not find container \"b9ef099180e73d9bde1e4a27716c3151e3e2a880cec2afb1d00afd9253ad5771\": container with ID starting with b9ef099180e73d9bde1e4a27716c3151e3e2a880cec2afb1d00afd9253ad5771 not found: ID does not exist"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.148271 4794 scope.go:117] "RemoveContainer" containerID="9a1950959578cf24faa2d48b936bcd50e5c83a87c9e318cdc2c43e24cb40dbdc"
Mar 10 09:52:03 crc kubenswrapper[4794]: E0310 09:52:03.148491 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a1950959578cf24faa2d48b936bcd50e5c83a87c9e318cdc2c43e24cb40dbdc\": container with ID starting with 9a1950959578cf24faa2d48b936bcd50e5c83a87c9e318cdc2c43e24cb40dbdc not found: ID does not exist" containerID="9a1950959578cf24faa2d48b936bcd50e5c83a87c9e318cdc2c43e24cb40dbdc"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.148519 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a1950959578cf24faa2d48b936bcd50e5c83a87c9e318cdc2c43e24cb40dbdc"} err="failed to get container status \"9a1950959578cf24faa2d48b936bcd50e5c83a87c9e318cdc2c43e24cb40dbdc\": rpc error: code = NotFound desc = could not find container \"9a1950959578cf24faa2d48b936bcd50e5c83a87c9e318cdc2c43e24cb40dbdc\": container with ID starting with 9a1950959578cf24faa2d48b936bcd50e5c83a87c9e318cdc2c43e24cb40dbdc not found: ID does not exist"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.148539 4794 scope.go:117] "RemoveContainer" containerID="6b1766f225c554c356c240978ae44a06d85c28ef26ce5ae4ae5c928580823fb0"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.159777 4794 scope.go:117] "RemoveContainer" containerID="670ae2fdad1919a87f46024509bc20c149bdc0af6ab62019439f2b21a599393d"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.173977 4794 scope.go:117] "RemoveContainer" containerID="6b1766f225c554c356c240978ae44a06d85c28ef26ce5ae4ae5c928580823fb0"
Mar 10 09:52:03 crc kubenswrapper[4794]: E0310 09:52:03.174326 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1766f225c554c356c240978ae44a06d85c28ef26ce5ae4ae5c928580823fb0\": container with ID starting with 6b1766f225c554c356c240978ae44a06d85c28ef26ce5ae4ae5c928580823fb0 not found: ID does not exist" containerID="6b1766f225c554c356c240978ae44a06d85c28ef26ce5ae4ae5c928580823fb0"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.174369 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1766f225c554c356c240978ae44a06d85c28ef26ce5ae4ae5c928580823fb0"} err="failed to get container status \"6b1766f225c554c356c240978ae44a06d85c28ef26ce5ae4ae5c928580823fb0\": rpc error: code = NotFound desc = could not find container \"6b1766f225c554c356c240978ae44a06d85c28ef26ce5ae4ae5c928580823fb0\": container with ID starting with 6b1766f225c554c356c240978ae44a06d85c28ef26ce5ae4ae5c928580823fb0 not found: ID does not exist"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.174388 4794 scope.go:117] "RemoveContainer" containerID="670ae2fdad1919a87f46024509bc20c149bdc0af6ab62019439f2b21a599393d"
Mar 10 09:52:03 crc kubenswrapper[4794]: E0310 09:52:03.174582 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670ae2fdad1919a87f46024509bc20c149bdc0af6ab62019439f2b21a599393d\": container with ID starting with 670ae2fdad1919a87f46024509bc20c149bdc0af6ab62019439f2b21a599393d not found: ID does not exist" containerID="670ae2fdad1919a87f46024509bc20c149bdc0af6ab62019439f2b21a599393d"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.174611 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670ae2fdad1919a87f46024509bc20c149bdc0af6ab62019439f2b21a599393d"} err="failed to get container status \"670ae2fdad1919a87f46024509bc20c149bdc0af6ab62019439f2b21a599393d\": rpc error: code = NotFound desc = could not find container \"670ae2fdad1919a87f46024509bc20c149bdc0af6ab62019439f2b21a599393d\": container with ID starting with 670ae2fdad1919a87f46024509bc20c149bdc0af6ab62019439f2b21a599393d not found: ID does not exist"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.174625 4794 scope.go:117] "RemoveContainer" containerID="1edd73cb5575e96ed483f8bfdcf30f5d909505595954c18a9a542f61e3dc57b2"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.180263 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.180292 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxz59\" (UniqueName: \"kubernetes.io/projected/a39fe093-da97-48ba-bdf3-a566eefc5208-kube-api-access-bxz59\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.180306 4794 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a39fe093-da97-48ba-bdf3-a566eefc5208-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.180319 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cbd0a5e-393e-4539-86ef-559427986faa-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.180349 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cbd0a5e-393e-4539-86ef-559427986faa-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.180361 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j49q6\" (UniqueName: \"kubernetes.io/projected/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-kube-api-access-j49q6\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.180372 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvqmm\" (UniqueName: \"kubernetes.io/projected/0cbd0a5e-393e-4539-86ef-559427986faa-kube-api-access-mvqmm\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.180383 4794 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a39fe093-da97-48ba-bdf3-a566eefc5208-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.186312 4794 scope.go:117] "RemoveContainer" containerID="4cccfbb55efed4efdee22b1df0454599adf64b78e912c51336c4bec0ca666c88"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.197656 4794 scope.go:117] "RemoveContainer" containerID="2f94243ea035d2b447baf542f6a149184fb1e3808fe881e9512042eedb1f2534"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.208581 4794 scope.go:117] "RemoveContainer" containerID="1edd73cb5575e96ed483f8bfdcf30f5d909505595954c18a9a542f61e3dc57b2"
Mar 10 09:52:03 crc kubenswrapper[4794]: E0310 09:52:03.209065 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1edd73cb5575e96ed483f8bfdcf30f5d909505595954c18a9a542f61e3dc57b2\": container with ID starting with 1edd73cb5575e96ed483f8bfdcf30f5d909505595954c18a9a542f61e3dc57b2 not found: ID does not exist" containerID="1edd73cb5575e96ed483f8bfdcf30f5d909505595954c18a9a542f61e3dc57b2"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.209464 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1edd73cb5575e96ed483f8bfdcf30f5d909505595954c18a9a542f61e3dc57b2"} err="failed to get container status \"1edd73cb5575e96ed483f8bfdcf30f5d909505595954c18a9a542f61e3dc57b2\": rpc error: code = NotFound desc = could not find container \"1edd73cb5575e96ed483f8bfdcf30f5d909505595954c18a9a542f61e3dc57b2\": container with ID starting with 1edd73cb5575e96ed483f8bfdcf30f5d909505595954c18a9a542f61e3dc57b2 not found: ID does not exist"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.209517 4794 scope.go:117] "RemoveContainer" containerID="4cccfbb55efed4efdee22b1df0454599adf64b78e912c51336c4bec0ca666c88"
Mar 10 09:52:03 crc kubenswrapper[4794]: E0310 09:52:03.209859 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cccfbb55efed4efdee22b1df0454599adf64b78e912c51336c4bec0ca666c88\": container with ID starting with 4cccfbb55efed4efdee22b1df0454599adf64b78e912c51336c4bec0ca666c88 not found: ID does not exist" containerID="4cccfbb55efed4efdee22b1df0454599adf64b78e912c51336c4bec0ca666c88"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.209902 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cccfbb55efed4efdee22b1df0454599adf64b78e912c51336c4bec0ca666c88"} err="failed to get container status \"4cccfbb55efed4efdee22b1df0454599adf64b78e912c51336c4bec0ca666c88\": rpc error: code = NotFound desc = could not find container \"4cccfbb55efed4efdee22b1df0454599adf64b78e912c51336c4bec0ca666c88\": container with ID starting with 4cccfbb55efed4efdee22b1df0454599adf64b78e912c51336c4bec0ca666c88 not found: ID does not exist"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.209931 4794 scope.go:117] "RemoveContainer" containerID="2f94243ea035d2b447baf542f6a149184fb1e3808fe881e9512042eedb1f2534"
Mar 10 09:52:03 crc kubenswrapper[4794]: E0310 09:52:03.210312 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f94243ea035d2b447baf542f6a149184fb1e3808fe881e9512042eedb1f2534\": container with ID starting with 2f94243ea035d2b447baf542f6a149184fb1e3808fe881e9512042eedb1f2534 not found: ID does not exist" containerID="2f94243ea035d2b447baf542f6a149184fb1e3808fe881e9512042eedb1f2534"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.210400 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f94243ea035d2b447baf542f6a149184fb1e3808fe881e9512042eedb1f2534"} err="failed to get container status \"2f94243ea035d2b447baf542f6a149184fb1e3808fe881e9512042eedb1f2534\": rpc error: code = NotFound desc = could not find container \"2f94243ea035d2b447baf542f6a149184fb1e3808fe881e9512042eedb1f2534\": container with ID starting with 2f94243ea035d2b447baf542f6a149184fb1e3808fe881e9512042eedb1f2534 not found: ID does not exist"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.210420 4794 scope.go:117] "RemoveContainer" containerID="0d2f487e9a5fcf36206ba7fd5b76cc51ce7328a16051ac52831a122bfc158a54"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.220359 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" (UID: "1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.224410 4794 scope.go:117] "RemoveContainer" containerID="e0c23d7e79ef5eac72c4b0ab3717e1faf359bf880623ab9990dd8488d38e240b"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.238137 4794 scope.go:117] "RemoveContainer" containerID="98ee9b77d2b1c7e0683d83e11e018be3037e3f647628e34b0989b07c2096c4f3"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.249202 4794 scope.go:117] "RemoveContainer" containerID="0d2f487e9a5fcf36206ba7fd5b76cc51ce7328a16051ac52831a122bfc158a54"
Mar 10 09:52:03 crc kubenswrapper[4794]: E0310 09:52:03.249695 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d2f487e9a5fcf36206ba7fd5b76cc51ce7328a16051ac52831a122bfc158a54\": container with ID starting with 0d2f487e9a5fcf36206ba7fd5b76cc51ce7328a16051ac52831a122bfc158a54 not found: ID does not exist" containerID="0d2f487e9a5fcf36206ba7fd5b76cc51ce7328a16051ac52831a122bfc158a54"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.249735 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d2f487e9a5fcf36206ba7fd5b76cc51ce7328a16051ac52831a122bfc158a54"} err="failed to get container status \"0d2f487e9a5fcf36206ba7fd5b76cc51ce7328a16051ac52831a122bfc158a54\": rpc error: code = NotFound desc = could not find container \"0d2f487e9a5fcf36206ba7fd5b76cc51ce7328a16051ac52831a122bfc158a54\": container with ID starting with 0d2f487e9a5fcf36206ba7fd5b76cc51ce7328a16051ac52831a122bfc158a54 not found: ID does not exist"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.249762 4794 scope.go:117] "RemoveContainer" containerID="e0c23d7e79ef5eac72c4b0ab3717e1faf359bf880623ab9990dd8488d38e240b"
Mar 10 09:52:03 crc kubenswrapper[4794]: E0310 09:52:03.250193 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0c23d7e79ef5eac72c4b0ab3717e1faf359bf880623ab9990dd8488d38e240b\": container with ID starting with e0c23d7e79ef5eac72c4b0ab3717e1faf359bf880623ab9990dd8488d38e240b not found: ID does not exist" containerID="e0c23d7e79ef5eac72c4b0ab3717e1faf359bf880623ab9990dd8488d38e240b"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.250240 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0c23d7e79ef5eac72c4b0ab3717e1faf359bf880623ab9990dd8488d38e240b"} err="failed to get container status \"e0c23d7e79ef5eac72c4b0ab3717e1faf359bf880623ab9990dd8488d38e240b\": rpc error: code = NotFound desc = could not find container \"e0c23d7e79ef5eac72c4b0ab3717e1faf359bf880623ab9990dd8488d38e240b\": container with ID starting with e0c23d7e79ef5eac72c4b0ab3717e1faf359bf880623ab9990dd8488d38e240b not found: ID does not exist"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.250271 4794 scope.go:117] "RemoveContainer" containerID="98ee9b77d2b1c7e0683d83e11e018be3037e3f647628e34b0989b07c2096c4f3"
Mar 10 09:52:03 crc kubenswrapper[4794]: E0310 09:52:03.250569 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98ee9b77d2b1c7e0683d83e11e018be3037e3f647628e34b0989b07c2096c4f3\": container with ID starting with 98ee9b77d2b1c7e0683d83e11e018be3037e3f647628e34b0989b07c2096c4f3 not found: ID does not exist" containerID="98ee9b77d2b1c7e0683d83e11e018be3037e3f647628e34b0989b07c2096c4f3"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.250592 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ee9b77d2b1c7e0683d83e11e018be3037e3f647628e34b0989b07c2096c4f3"} err="failed to get container status \"98ee9b77d2b1c7e0683d83e11e018be3037e3f647628e34b0989b07c2096c4f3\": rpc error: code = NotFound desc = could not find container \"98ee9b77d2b1c7e0683d83e11e018be3037e3f647628e34b0989b07c2096c4f3\": container with ID starting with 98ee9b77d2b1c7e0683d83e11e018be3037e3f647628e34b0989b07c2096c4f3 not found: ID does not exist"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.281561 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.352678 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ztm8n"]
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.400841 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n2b4p"]
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.400895 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n2b4p"]
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.412213 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7js2"]
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.416218 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7js2"]
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.436605 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cx8l7"]
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.442100 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cx8l7"]
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.545374 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zffn2"
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.685577 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869965fc-c355-4c93-9776-dc1a070c926e-catalog-content\") pod \"869965fc-c355-4c93-9776-dc1a070c926e\" (UID: \"869965fc-c355-4c93-9776-dc1a070c926e\") "
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.685697 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869965fc-c355-4c93-9776-dc1a070c926e-utilities\") pod \"869965fc-c355-4c93-9776-dc1a070c926e\" (UID: \"869965fc-c355-4c93-9776-dc1a070c926e\") "
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.685746 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmrnm\" (UniqueName: \"kubernetes.io/projected/869965fc-c355-4c93-9776-dc1a070c926e-kube-api-access-bmrnm\") pod \"869965fc-c355-4c93-9776-dc1a070c926e\" (UID: \"869965fc-c355-4c93-9776-dc1a070c926e\") "
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.686418 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/869965fc-c355-4c93-9776-dc1a070c926e-utilities" (OuterVolumeSpecName: "utilities") pod "869965fc-c355-4c93-9776-dc1a070c926e" (UID: "869965fc-c355-4c93-9776-dc1a070c926e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.690904 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869965fc-c355-4c93-9776-dc1a070c926e-kube-api-access-bmrnm" (OuterVolumeSpecName: "kube-api-access-bmrnm") pod "869965fc-c355-4c93-9776-dc1a070c926e" (UID: "869965fc-c355-4c93-9776-dc1a070c926e"). InnerVolumeSpecName "kube-api-access-bmrnm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.732421 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/869965fc-c355-4c93-9776-dc1a070c926e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "869965fc-c355-4c93-9776-dc1a070c926e" (UID: "869965fc-c355-4c93-9776-dc1a070c926e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.787536 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869965fc-c355-4c93-9776-dc1a070c926e-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.787588 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmrnm\" (UniqueName: \"kubernetes.io/projected/869965fc-c355-4c93-9776-dc1a070c926e-kube-api-access-bmrnm\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:03 crc kubenswrapper[4794]: I0310 09:52:03.787605 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869965fc-c355-4c93-9776-dc1a070c926e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.024014 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cbd0a5e-393e-4539-86ef-559427986faa" path="/var/lib/kubelet/pods/0cbd0a5e-393e-4539-86ef-559427986faa/volumes"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.026305 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" path="/var/lib/kubelet/pods/1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a/volumes"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.027782 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88245dbf-bf6b-4051-9a3c-91da5a183538" path="/var/lib/kubelet/pods/88245dbf-bf6b-4051-9a3c-91da5a183538/volumes"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.030417 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a39fe093-da97-48ba-bdf3-a566eefc5208" path="/var/lib/kubelet/pods/a39fe093-da97-48ba-bdf3-a566eefc5208/volumes"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.103466 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zffn2" event={"ID":"869965fc-c355-4c93-9776-dc1a070c926e","Type":"ContainerDied","Data":"0390edaa2c4571644faf4767f082b91047ffe17e250b5e8f9c9e444e2cd89902"}
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.103517 4794 scope.go:117] "RemoveContainer" containerID="71d967cd3d765afe905206227b4a185610ee4a8d660a82bc475e7aac4f1aa414"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.103812 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zffn2"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.114145 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n" event={"ID":"e995e0b7-2d36-4d94-a424-f916e1cba0ac","Type":"ContainerStarted","Data":"b720c76519f12a2923a35ae5ada6276c120c45e6c9f5f178be91479df1696fe3"}
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.114229 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n" event={"ID":"e995e0b7-2d36-4d94-a424-f916e1cba0ac","Type":"ContainerStarted","Data":"8e2609cbca1788ffe8abe4de04b32f5aa875c7b468996f6e5cd4d024013b1916"}
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.114969 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.125761 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zffn2"]
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.125819 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.128901 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zffn2"]
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.129496 4794 scope.go:117] "RemoveContainer" containerID="c369267a010db33c54459e5703bd75c5d1efea77f7e7614ca282912df8de29d7"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.140916 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ztm8n" podStartSLOduration=2.140893192 podStartE2EDuration="2.140893192s" podCreationTimestamp="2026-03-10 09:52:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:52:04.13504897 +0000 UTC m=+472.891219788" watchObservedRunningTime="2026-03-10 09:52:04.140893192 +0000 UTC m=+472.897064010"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.188691 4794 scope.go:117] "RemoveContainer" containerID="123bbb1f871d38f1241f71600926c356bbe1dc7c610499df38a8a78ee1ac4de0"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.341771 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552272-f6597"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.496202 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6q67\" (UniqueName: \"kubernetes.io/projected/38cb9379-e79d-4708-a441-7e65e9dc7450-kube-api-access-g6q67\") pod \"38cb9379-e79d-4708-a441-7e65e9dc7450\" (UID: \"38cb9379-e79d-4708-a441-7e65e9dc7450\") "
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.500103 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38cb9379-e79d-4708-a441-7e65e9dc7450-kube-api-access-g6q67" (OuterVolumeSpecName: "kube-api-access-g6q67") pod "38cb9379-e79d-4708-a441-7e65e9dc7450" (UID: "38cb9379-e79d-4708-a441-7e65e9dc7450"). InnerVolumeSpecName "kube-api-access-g6q67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541556 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-286c8"]
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.541785 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869965fc-c355-4c93-9776-dc1a070c926e" containerName="extract-utilities"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541797 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="869965fc-c355-4c93-9776-dc1a070c926e" containerName="extract-utilities"
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.541807 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" containerName="registry-server"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541812 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" containerName="registry-server"
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.541822 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a39fe093-da97-48ba-bdf3-a566eefc5208" containerName="marketplace-operator"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541829 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a39fe093-da97-48ba-bdf3-a566eefc5208" containerName="marketplace-operator"
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.541836 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88245dbf-bf6b-4051-9a3c-91da5a183538" containerName="extract-content"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541841 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="88245dbf-bf6b-4051-9a3c-91da5a183538" containerName="extract-content"
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.541849 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38cb9379-e79d-4708-a441-7e65e9dc7450" containerName="oc"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541855 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="38cb9379-e79d-4708-a441-7e65e9dc7450" containerName="oc"
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.541865 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88245dbf-bf6b-4051-9a3c-91da5a183538" containerName="extract-utilities"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541870 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="88245dbf-bf6b-4051-9a3c-91da5a183538" containerName="extract-utilities"
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.541879 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbd0a5e-393e-4539-86ef-559427986faa" containerName="registry-server"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541884 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbd0a5e-393e-4539-86ef-559427986faa" containerName="registry-server"
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.541892 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" containerName="extract-content"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541897 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" containerName="extract-content"
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.541909 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88245dbf-bf6b-4051-9a3c-91da5a183538" containerName="registry-server"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541914 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="88245dbf-bf6b-4051-9a3c-91da5a183538" containerName="registry-server"
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.541923 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbd0a5e-393e-4539-86ef-559427986faa" containerName="extract-utilities"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541929 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbd0a5e-393e-4539-86ef-559427986faa" containerName="extract-utilities"
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.541936 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbd0a5e-393e-4539-86ef-559427986faa" containerName="extract-content"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541943 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbd0a5e-393e-4539-86ef-559427986faa" containerName="extract-content"
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.541951 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869965fc-c355-4c93-9776-dc1a070c926e" containerName="extract-content"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541956 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="869965fc-c355-4c93-9776-dc1a070c926e" containerName="extract-content"
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.541964 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869965fc-c355-4c93-9776-dc1a070c926e" containerName="registry-server"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541969 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="869965fc-c355-4c93-9776-dc1a070c926e" containerName="registry-server"
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.541978 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" containerName="extract-utilities"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.541984 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" containerName="extract-utilities"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.542063 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a39fe093-da97-48ba-bdf3-a566eefc5208" containerName="marketplace-operator"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.542073 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="88245dbf-bf6b-4051-9a3c-91da5a183538" containerName="registry-server"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.542080 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="869965fc-c355-4c93-9776-dc1a070c926e" containerName="registry-server"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.542088 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="38cb9379-e79d-4708-a441-7e65e9dc7450" containerName="oc"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.542095 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cbd0a5e-393e-4539-86ef-559427986faa" containerName="registry-server"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.542101 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe60fc6-df84-4c3b-a73a-0d5a64aeb61a" containerName="registry-server"
Mar 10 09:52:04 crc kubenswrapper[4794]: E0310 09:52:04.542177 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a39fe093-da97-48ba-bdf3-a566eefc5208" containerName="marketplace-operator"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.542184 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a39fe093-da97-48ba-bdf3-a566eefc5208" containerName="marketplace-operator"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.542288 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a39fe093-da97-48ba-bdf3-a566eefc5208" containerName="marketplace-operator"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.542780 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-286c8"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.545678 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.552153 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-286c8"]
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.597593 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e836be-bdcf-44e3-a585-15f8942bf972-catalog-content\") pod \"redhat-marketplace-286c8\" (UID: \"23e836be-bdcf-44e3-a585-15f8942bf972\") " pod="openshift-marketplace/redhat-marketplace-286c8"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.597658 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7pvx\" (UniqueName: \"kubernetes.io/projected/23e836be-bdcf-44e3-a585-15f8942bf972-kube-api-access-b7pvx\") pod \"redhat-marketplace-286c8\" (UID: \"23e836be-bdcf-44e3-a585-15f8942bf972\") " pod="openshift-marketplace/redhat-marketplace-286c8"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.597706 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e836be-bdcf-44e3-a585-15f8942bf972-utilities\") pod \"redhat-marketplace-286c8\" (UID: \"23e836be-bdcf-44e3-a585-15f8942bf972\") " pod="openshift-marketplace/redhat-marketplace-286c8"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.597756 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6q67\" (UniqueName: \"kubernetes.io/projected/38cb9379-e79d-4708-a441-7e65e9dc7450-kube-api-access-g6q67\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.698401 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e836be-bdcf-44e3-a585-15f8942bf972-utilities\") pod \"redhat-marketplace-286c8\" (UID: \"23e836be-bdcf-44e3-a585-15f8942bf972\") " pod="openshift-marketplace/redhat-marketplace-286c8"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.698469 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e836be-bdcf-44e3-a585-15f8942bf972-catalog-content\") pod \"redhat-marketplace-286c8\" (UID: \"23e836be-bdcf-44e3-a585-15f8942bf972\") " pod="openshift-marketplace/redhat-marketplace-286c8"
Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.698493 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"kube-api-access-b7pvx\" (UniqueName: \"kubernetes.io/projected/23e836be-bdcf-44e3-a585-15f8942bf972-kube-api-access-b7pvx\") pod \"redhat-marketplace-286c8\" (UID: \"23e836be-bdcf-44e3-a585-15f8942bf972\") " pod="openshift-marketplace/redhat-marketplace-286c8" Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.698881 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e836be-bdcf-44e3-a585-15f8942bf972-utilities\") pod \"redhat-marketplace-286c8\" (UID: \"23e836be-bdcf-44e3-a585-15f8942bf972\") " pod="openshift-marketplace/redhat-marketplace-286c8" Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.699185 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e836be-bdcf-44e3-a585-15f8942bf972-catalog-content\") pod \"redhat-marketplace-286c8\" (UID: \"23e836be-bdcf-44e3-a585-15f8942bf972\") " pod="openshift-marketplace/redhat-marketplace-286c8" Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.714646 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7pvx\" (UniqueName: \"kubernetes.io/projected/23e836be-bdcf-44e3-a585-15f8942bf972-kube-api-access-b7pvx\") pod \"redhat-marketplace-286c8\" (UID: \"23e836be-bdcf-44e3-a585-15f8942bf972\") " pod="openshift-marketplace/redhat-marketplace-286c8" Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.735457 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kf727"] Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.737433 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.740189 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.748234 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kf727"] Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.863474 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-286c8" Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.900357 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dcab4f-0176-43ea-81d3-4ecbff649959-catalog-content\") pod \"redhat-operators-kf727\" (UID: \"75dcab4f-0176-43ea-81d3-4ecbff649959\") " pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.900719 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcab4f-0176-43ea-81d3-4ecbff649959-utilities\") pod \"redhat-operators-kf727\" (UID: \"75dcab4f-0176-43ea-81d3-4ecbff649959\") " pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:04 crc kubenswrapper[4794]: I0310 09:52:04.900775 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk8np\" (UniqueName: \"kubernetes.io/projected/75dcab4f-0176-43ea-81d3-4ecbff649959-kube-api-access-zk8np\") pod \"redhat-operators-kf727\" (UID: \"75dcab4f-0176-43ea-81d3-4ecbff649959\") " pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.001225 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcab4f-0176-43ea-81d3-4ecbff649959-utilities\") pod \"redhat-operators-kf727\" (UID: \"75dcab4f-0176-43ea-81d3-4ecbff649959\") " pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.001482 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk8np\" (UniqueName: \"kubernetes.io/projected/75dcab4f-0176-43ea-81d3-4ecbff649959-kube-api-access-zk8np\") pod \"redhat-operators-kf727\" (UID: \"75dcab4f-0176-43ea-81d3-4ecbff649959\") " pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.001536 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dcab4f-0176-43ea-81d3-4ecbff649959-catalog-content\") pod \"redhat-operators-kf727\" (UID: \"75dcab4f-0176-43ea-81d3-4ecbff649959\") " pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.001960 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dcab4f-0176-43ea-81d3-4ecbff649959-catalog-content\") pod \"redhat-operators-kf727\" (UID: \"75dcab4f-0176-43ea-81d3-4ecbff649959\") " pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.002287 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcab4f-0176-43ea-81d3-4ecbff649959-utilities\") pod \"redhat-operators-kf727\" (UID: \"75dcab4f-0176-43ea-81d3-4ecbff649959\") " pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.037716 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk8np\" (UniqueName: \"kubernetes.io/projected/75dcab4f-0176-43ea-81d3-4ecbff649959-kube-api-access-zk8np\") pod \"redhat-operators-kf727\" (UID: 
\"75dcab4f-0176-43ea-81d3-4ecbff649959\") " pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.060298 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.076598 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-286c8"] Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.082616 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552266-9tldc"] Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.086479 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552266-9tldc"] Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.129876 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552272-f6597" event={"ID":"38cb9379-e79d-4708-a441-7e65e9dc7450","Type":"ContainerDied","Data":"4a8dbafe13ea21b1c4e00482869d929dc391a0fa47fe2cf8a0d4a7c0d7d05fd9"} Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.129941 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a8dbafe13ea21b1c4e00482869d929dc391a0fa47fe2cf8a0d4a7c0d7d05fd9" Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.129879 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552272-f6597" Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.130754 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-286c8" event={"ID":"23e836be-bdcf-44e3-a585-15f8942bf972","Type":"ContainerStarted","Data":"2a87d6315832a4f0822f813c800e5a59bdcad1bc16e946e1017f7360f0997685"} Mar 10 09:52:05 crc kubenswrapper[4794]: I0310 09:52:05.235817 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kf727"] Mar 10 09:52:05 crc kubenswrapper[4794]: W0310 09:52:05.243432 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75dcab4f_0176_43ea_81d3_4ecbff649959.slice/crio-aa2b025a251823cc00aa455fe365354de5df2ce5c64ffc85e251bfd6fb5cd95a WatchSource:0}: Error finding container aa2b025a251823cc00aa455fe365354de5df2ce5c64ffc85e251bfd6fb5cd95a: Status 404 returned error can't find the container with id aa2b025a251823cc00aa455fe365354de5df2ce5c64ffc85e251bfd6fb5cd95a Mar 10 09:52:06 crc kubenswrapper[4794]: I0310 09:52:06.012047 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869965fc-c355-4c93-9776-dc1a070c926e" path="/var/lib/kubelet/pods/869965fc-c355-4c93-9776-dc1a070c926e/volumes" Mar 10 09:52:06 crc kubenswrapper[4794]: I0310 09:52:06.012808 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f12f506f-5226-41a3-9643-260415a884a5" path="/var/lib/kubelet/pods/f12f506f-5226-41a3-9643-260415a884a5/volumes" Mar 10 09:52:06 crc kubenswrapper[4794]: I0310 09:52:06.140442 4794 generic.go:334] "Generic (PLEG): container finished" podID="75dcab4f-0176-43ea-81d3-4ecbff649959" containerID="e6cfec15be4ea0143c5065b8eaccafdca4b38588762970ab15fe0fe9e6a34436" exitCode=0 Mar 10 09:52:06 crc kubenswrapper[4794]: I0310 09:52:06.140526 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf727" 
event={"ID":"75dcab4f-0176-43ea-81d3-4ecbff649959","Type":"ContainerDied","Data":"e6cfec15be4ea0143c5065b8eaccafdca4b38588762970ab15fe0fe9e6a34436"} Mar 10 09:52:06 crc kubenswrapper[4794]: I0310 09:52:06.140556 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf727" event={"ID":"75dcab4f-0176-43ea-81d3-4ecbff649959","Type":"ContainerStarted","Data":"aa2b025a251823cc00aa455fe365354de5df2ce5c64ffc85e251bfd6fb5cd95a"} Mar 10 09:52:06 crc kubenswrapper[4794]: I0310 09:52:06.143725 4794 generic.go:334] "Generic (PLEG): container finished" podID="23e836be-bdcf-44e3-a585-15f8942bf972" containerID="50d69d2cbef731305949e5b9062252f5f17016ec4fd4ebba004a727bd8cab8b7" exitCode=0 Mar 10 09:52:06 crc kubenswrapper[4794]: I0310 09:52:06.143897 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-286c8" event={"ID":"23e836be-bdcf-44e3-a585-15f8942bf972","Type":"ContainerDied","Data":"50d69d2cbef731305949e5b9062252f5f17016ec4fd4ebba004a727bd8cab8b7"} Mar 10 09:52:06 crc kubenswrapper[4794]: I0310 09:52:06.940879 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6gz6k"] Mar 10 09:52:06 crc kubenswrapper[4794]: I0310 09:52:06.942872 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:06 crc kubenswrapper[4794]: I0310 09:52:06.946301 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 09:52:06 crc kubenswrapper[4794]: I0310 09:52:06.955391 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6gz6k"] Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.125815 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7swz\" (UniqueName: \"kubernetes.io/projected/ffcd51d8-f4d3-46b5-8462-43b6499bd37c-kube-api-access-c7swz\") pod \"certified-operators-6gz6k\" (UID: \"ffcd51d8-f4d3-46b5-8462-43b6499bd37c\") " pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.125898 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffcd51d8-f4d3-46b5-8462-43b6499bd37c-catalog-content\") pod \"certified-operators-6gz6k\" (UID: \"ffcd51d8-f4d3-46b5-8462-43b6499bd37c\") " pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.125970 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffcd51d8-f4d3-46b5-8462-43b6499bd37c-utilities\") pod \"certified-operators-6gz6k\" (UID: \"ffcd51d8-f4d3-46b5-8462-43b6499bd37c\") " pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.143137 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sfbwb"] Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.144601 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.149467 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.151961 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf727" event={"ID":"75dcab4f-0176-43ea-81d3-4ecbff649959","Type":"ContainerStarted","Data":"c3e9d7eac2d3ffbd40e45ee87820d7ce0d466e670a6a3507ffeb59d342e45a27"} Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.153772 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sfbwb"] Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.153983 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-286c8" event={"ID":"23e836be-bdcf-44e3-a585-15f8942bf972","Type":"ContainerStarted","Data":"4aa655edfa71172c8f91fc484e5f6c19239cba965e298cc6e706b296cdc396bf"} Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.226607 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffcd51d8-f4d3-46b5-8462-43b6499bd37c-utilities\") pod \"certified-operators-6gz6k\" (UID: \"ffcd51d8-f4d3-46b5-8462-43b6499bd37c\") " pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.226949 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7swz\" (UniqueName: \"kubernetes.io/projected/ffcd51d8-f4d3-46b5-8462-43b6499bd37c-kube-api-access-c7swz\") pod \"certified-operators-6gz6k\" (UID: \"ffcd51d8-f4d3-46b5-8462-43b6499bd37c\") " pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.226980 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffcd51d8-f4d3-46b5-8462-43b6499bd37c-catalog-content\") pod \"certified-operators-6gz6k\" (UID: \"ffcd51d8-f4d3-46b5-8462-43b6499bd37c\") " pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.227178 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffcd51d8-f4d3-46b5-8462-43b6499bd37c-utilities\") pod \"certified-operators-6gz6k\" (UID: \"ffcd51d8-f4d3-46b5-8462-43b6499bd37c\") " pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.227362 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffcd51d8-f4d3-46b5-8462-43b6499bd37c-catalog-content\") pod \"certified-operators-6gz6k\" (UID: \"ffcd51d8-f4d3-46b5-8462-43b6499bd37c\") " pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.244785 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7swz\" (UniqueName: \"kubernetes.io/projected/ffcd51d8-f4d3-46b5-8462-43b6499bd37c-kube-api-access-c7swz\") pod \"certified-operators-6gz6k\" (UID: \"ffcd51d8-f4d3-46b5-8462-43b6499bd37c\") " pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.299818 4794 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.327732 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe02ea7-261f-448d-904e-07b6ad54a152-utilities\") pod \"community-operators-sfbwb\" (UID: \"bfe02ea7-261f-448d-904e-07b6ad54a152\") " pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.327768 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94d48\" (UniqueName: \"kubernetes.io/projected/bfe02ea7-261f-448d-904e-07b6ad54a152-kube-api-access-94d48\") pod \"community-operators-sfbwb\" (UID: \"bfe02ea7-261f-448d-904e-07b6ad54a152\") " pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.327815 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe02ea7-261f-448d-904e-07b6ad54a152-catalog-content\") pod \"community-operators-sfbwb\" (UID: \"bfe02ea7-261f-448d-904e-07b6ad54a152\") " pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.429180 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe02ea7-261f-448d-904e-07b6ad54a152-utilities\") pod \"community-operators-sfbwb\" (UID: \"bfe02ea7-261f-448d-904e-07b6ad54a152\") " pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.429214 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94d48\" (UniqueName: \"kubernetes.io/projected/bfe02ea7-261f-448d-904e-07b6ad54a152-kube-api-access-94d48\") pod \"community-operators-sfbwb\" (UID: \"bfe02ea7-261f-448d-904e-07b6ad54a152\") " pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.429277 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe02ea7-261f-448d-904e-07b6ad54a152-catalog-content\") pod \"community-operators-sfbwb\" (UID: \"bfe02ea7-261f-448d-904e-07b6ad54a152\") " pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.429792 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe02ea7-261f-448d-904e-07b6ad54a152-utilities\") pod \"community-operators-sfbwb\" (UID: \"bfe02ea7-261f-448d-904e-07b6ad54a152\") " pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.429831 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe02ea7-261f-448d-904e-07b6ad54a152-catalog-content\") pod \"community-operators-sfbwb\" (UID: \"bfe02ea7-261f-448d-904e-07b6ad54a152\") " pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.456298 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94d48\" (UniqueName: \"kubernetes.io/projected/bfe02ea7-261f-448d-904e-07b6ad54a152-kube-api-access-94d48\") 
pod \"community-operators-sfbwb\" (UID: \"bfe02ea7-261f-448d-904e-07b6ad54a152\") " pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.479244 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:07 crc kubenswrapper[4794]: I0310 09:52:07.490224 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6gz6k"] Mar 10 09:52:08 crc kubenswrapper[4794]: I0310 09:52:08.162093 4794 generic.go:334] "Generic (PLEG): container finished" podID="23e836be-bdcf-44e3-a585-15f8942bf972" containerID="4aa655edfa71172c8f91fc484e5f6c19239cba965e298cc6e706b296cdc396bf" exitCode=0 Mar 10 09:52:08 crc kubenswrapper[4794]: I0310 09:52:08.162205 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-286c8" event={"ID":"23e836be-bdcf-44e3-a585-15f8942bf972","Type":"ContainerDied","Data":"4aa655edfa71172c8f91fc484e5f6c19239cba965e298cc6e706b296cdc396bf"} Mar 10 09:52:08 crc kubenswrapper[4794]: I0310 09:52:08.164097 4794 generic.go:334] "Generic (PLEG): container finished" podID="75dcab4f-0176-43ea-81d3-4ecbff649959" containerID="c3e9d7eac2d3ffbd40e45ee87820d7ce0d466e670a6a3507ffeb59d342e45a27" exitCode=0 Mar 10 09:52:08 crc kubenswrapper[4794]: I0310 09:52:08.164151 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf727" event={"ID":"75dcab4f-0176-43ea-81d3-4ecbff649959","Type":"ContainerDied","Data":"c3e9d7eac2d3ffbd40e45ee87820d7ce0d466e670a6a3507ffeb59d342e45a27"} Mar 10 09:52:08 crc kubenswrapper[4794]: I0310 09:52:08.175491 4794 generic.go:334] "Generic (PLEG): container finished" podID="ffcd51d8-f4d3-46b5-8462-43b6499bd37c" containerID="420066642e32472ff00be9fcf86171b513197bfd3355cd25fcaec26f70d2d055" exitCode=0 Mar 10 09:52:08 crc kubenswrapper[4794]: I0310 09:52:08.175520 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gz6k" event={"ID":"ffcd51d8-f4d3-46b5-8462-43b6499bd37c","Type":"ContainerDied","Data":"420066642e32472ff00be9fcf86171b513197bfd3355cd25fcaec26f70d2d055"} Mar 10 09:52:08 crc kubenswrapper[4794]: I0310 09:52:08.175541 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gz6k" event={"ID":"ffcd51d8-f4d3-46b5-8462-43b6499bd37c","Type":"ContainerStarted","Data":"49e47bf47da0f32ccb53707c676701d9114fae5087a250c6955e3316091d8c50"} Mar 10 09:52:08 crc kubenswrapper[4794]: I0310 09:52:08.341427 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sfbwb"] Mar 10 09:52:09 crc kubenswrapper[4794]: I0310 09:52:09.190262 4794 generic.go:334] "Generic (PLEG): container finished" podID="ffcd51d8-f4d3-46b5-8462-43b6499bd37c" containerID="6616cf23f61d56f4476b60f703fdfa075f83963ec163cc2fe1c5292af6f0fd00" exitCode=0 Mar 10 09:52:09 crc kubenswrapper[4794]: I0310 09:52:09.190409 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gz6k" event={"ID":"ffcd51d8-f4d3-46b5-8462-43b6499bd37c","Type":"ContainerDied","Data":"6616cf23f61d56f4476b60f703fdfa075f83963ec163cc2fe1c5292af6f0fd00"} Mar 10 09:52:09 crc kubenswrapper[4794]: I0310 09:52:09.192510 4794 generic.go:334] "Generic (PLEG): container finished" podID="bfe02ea7-261f-448d-904e-07b6ad54a152" containerID="8a51cf14b3bdc2c7e27149e958b88c30382b37355731bab392cd2211a5ddfba2" 
exitCode=0 Mar 10 09:52:09 crc kubenswrapper[4794]: I0310 09:52:09.192565 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfbwb" event={"ID":"bfe02ea7-261f-448d-904e-07b6ad54a152","Type":"ContainerDied","Data":"8a51cf14b3bdc2c7e27149e958b88c30382b37355731bab392cd2211a5ddfba2"} Mar 10 09:52:09 crc kubenswrapper[4794]: I0310 09:52:09.192594 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfbwb" event={"ID":"bfe02ea7-261f-448d-904e-07b6ad54a152","Type":"ContainerStarted","Data":"044f4d4d7fa6ac993946759b80c155ac4485e1e48b22eb12f345ea3b7d59d7e4"} Mar 10 09:52:09 crc kubenswrapper[4794]: I0310 09:52:09.197656 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-286c8" event={"ID":"23e836be-bdcf-44e3-a585-15f8942bf972","Type":"ContainerStarted","Data":"f8093973090056ea0515a631629ab29d0b6ff8f1bd841793fa0ae9e78c3f23ec"} Mar 10 09:52:09 crc kubenswrapper[4794]: I0310 09:52:09.200105 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf727" event={"ID":"75dcab4f-0176-43ea-81d3-4ecbff649959","Type":"ContainerStarted","Data":"0ff396ca2070d65925f02a7bd4dd416883336cbd1322412504065f692cc25cfe"} Mar 10 09:52:09 crc kubenswrapper[4794]: I0310 09:52:09.249562 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-286c8" podStartSLOduration=2.7024097620000003 podStartE2EDuration="5.249543201s" podCreationTimestamp="2026-03-10 09:52:04 +0000 UTC" firstStartedPulling="2026-03-10 09:52:06.145449021 +0000 UTC m=+474.901619879" lastFinishedPulling="2026-03-10 09:52:08.6925825 +0000 UTC m=+477.448753318" observedRunningTime="2026-03-10 09:52:09.246677307 +0000 UTC m=+478.002848125" watchObservedRunningTime="2026-03-10 09:52:09.249543201 +0000 UTC m=+478.005714019" Mar 10 09:52:09 crc kubenswrapper[4794]: I0310 09:52:09.268216 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kf727" podStartSLOduration=2.836692847 podStartE2EDuration="5.268199203s" podCreationTimestamp="2026-03-10 09:52:04 +0000 UTC" firstStartedPulling="2026-03-10 09:52:06.142672 +0000 UTC m=+474.898842818" lastFinishedPulling="2026-03-10 09:52:08.574178356 +0000 UTC m=+477.330349174" observedRunningTime="2026-03-10 09:52:09.264893165 +0000 UTC m=+478.021063983" watchObservedRunningTime="2026-03-10 09:52:09.268199203 +0000 UTC m=+478.024370021" Mar 10 09:52:10 crc kubenswrapper[4794]: I0310 09:52:10.206930 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gz6k" event={"ID":"ffcd51d8-f4d3-46b5-8462-43b6499bd37c","Type":"ContainerStarted","Data":"a6c9c9760134b7610ba30b7f5109f20f8770a68d0c9377e9b335fe06b3f6798b"} Mar 10 09:52:10 crc kubenswrapper[4794]: I0310 09:52:10.209320 4794 generic.go:334] "Generic (PLEG): container finished" podID="bfe02ea7-261f-448d-904e-07b6ad54a152" containerID="03018e76bb953dec17191d251b8adcf1de9fafd18ca57527487db652c33647e4" exitCode=0 Mar 10 09:52:10 crc kubenswrapper[4794]: I0310 09:52:10.210227 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfbwb" event={"ID":"bfe02ea7-261f-448d-904e-07b6ad54a152","Type":"ContainerDied","Data":"03018e76bb953dec17191d251b8adcf1de9fafd18ca57527487db652c33647e4"} Mar 10 09:52:10 crc kubenswrapper[4794]: I0310 09:52:10.245493 4794 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6gz6k" podStartSLOduration=2.844646809 podStartE2EDuration="4.245475083s" podCreationTimestamp="2026-03-10 09:52:06 +0000 UTC" firstStartedPulling="2026-03-10 09:52:08.177523514 +0000 UTC m=+476.933694332" lastFinishedPulling="2026-03-10 09:52:09.578351778 +0000 UTC m=+478.334522606" observedRunningTime="2026-03-10 09:52:10.238846296 +0000 UTC m=+478.995017114" watchObservedRunningTime="2026-03-10 09:52:10.245475083 +0000 UTC m=+479.001645901" Mar 10 09:52:11 crc kubenswrapper[4794]: I0310 09:52:11.218213 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfbwb" event={"ID":"bfe02ea7-261f-448d-904e-07b6ad54a152","Type":"ContainerStarted","Data":"49cb5deada51807a57640376456c0f8697ad90ed45aaeb31fa6d8cedcdb86325"} Mar 10 09:52:11 crc kubenswrapper[4794]: I0310 09:52:11.233753 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sfbwb" podStartSLOduration=2.816544092 podStartE2EDuration="4.233734903s" podCreationTimestamp="2026-03-10 09:52:07 +0000 UTC" firstStartedPulling="2026-03-10 09:52:09.194231937 +0000 UTC m=+477.950402765" lastFinishedPulling="2026-03-10 09:52:10.611422758 +0000 UTC m=+479.367593576" observedRunningTime="2026-03-10 09:52:11.232697329 +0000 UTC m=+479.988868157" watchObservedRunningTime="2026-03-10 09:52:11.233734903 +0000 UTC m=+479.989905741" Mar 10 09:52:14 crc kubenswrapper[4794]: I0310 09:52:14.864767 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-286c8" Mar 10 09:52:14 crc kubenswrapper[4794]: I0310 09:52:14.865478 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-286c8" Mar 10 09:52:14 crc kubenswrapper[4794]: I0310 09:52:14.913856 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-286c8" Mar 10 09:52:15 crc kubenswrapper[4794]: I0310 09:52:15.061059 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:15 crc kubenswrapper[4794]: I0310 09:52:15.061290 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:15 crc kubenswrapper[4794]: I0310 09:52:15.295635 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-286c8" Mar 10 09:52:16 crc kubenswrapper[4794]: I0310 09:52:16.105868 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kf727" podUID="75dcab4f-0176-43ea-81d3-4ecbff649959" containerName="registry-server" probeResult="failure" output=< Mar 10 09:52:16 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 09:52:16 crc kubenswrapper[4794]: > Mar 10 09:52:17 crc kubenswrapper[4794]: I0310 09:52:17.300064 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:17 crc kubenswrapper[4794]: I0310 09:52:17.300541 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:17 crc kubenswrapper[4794]: I0310 09:52:17.345722 4794 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:17 crc kubenswrapper[4794]: I0310 09:52:17.479800 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:17 crc kubenswrapper[4794]: I0310 09:52:17.480156 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:17 crc kubenswrapper[4794]: I0310 09:52:17.528427 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:18 crc kubenswrapper[4794]: I0310 09:52:18.313317 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6gz6k" Mar 10 09:52:18 crc kubenswrapper[4794]: I0310 09:52:18.322965 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sfbwb" Mar 10 09:52:22 crc kubenswrapper[4794]: I0310 09:52:22.967999 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:52:22 crc kubenswrapper[4794]: I0310 09:52:22.968774 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:52:25 crc kubenswrapper[4794]: I0310 09:52:25.122441 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:25 crc kubenswrapper[4794]: I0310 09:52:25.192798 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kf727" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.449861 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" podUID="f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3" containerName="registry" containerID="cri-o://b10df9598b701f6239dcd28f244b86e29218a74a71906050e3f86265aedddb14" gracePeriod=30 Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.808102 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.838037 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-bound-sa-token\") pod \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.838077 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-ca-trust-extracted\") pod \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.838128 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-installation-pull-secrets\") pod \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.838158 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-trusted-ca\") pod \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.838174 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-registry-certificates\") pod \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.838399 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.838447 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-registry-tls\") pod \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.838488 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxn5t\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-kube-api-access-wxn5t\") pod \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\" (UID: \"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3\") " Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.841394 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.841434 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.846831 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.847621 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-kube-api-access-wxn5t" (OuterVolumeSpecName: "kube-api-access-wxn5t") pod "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3"). InnerVolumeSpecName "kube-api-access-wxn5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.850259 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.850406 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.851093 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.861771 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3" (UID: "f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.940244 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.940284 4794 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.940296 4794 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.940305 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxn5t\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-kube-api-access-wxn5t\") on node \"crc\" DevicePath \"\"" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.940314 4794 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.940359 4794 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 09:52:27 crc kubenswrapper[4794]: I0310 09:52:27.940373 4794 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 09:52:28 crc kubenswrapper[4794]: I0310 09:52:28.319494 4794 generic.go:334] "Generic (PLEG): container finished" podID="f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3" containerID="b10df9598b701f6239dcd28f244b86e29218a74a71906050e3f86265aedddb14" exitCode=0 Mar 10 09:52:28 crc kubenswrapper[4794]: I0310 09:52:28.319591 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" event={"ID":"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3","Type":"ContainerDied","Data":"b10df9598b701f6239dcd28f244b86e29218a74a71906050e3f86265aedddb14"} Mar 10 09:52:28 crc kubenswrapper[4794]: I0310 09:52:28.319764 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" event={"ID":"f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3","Type":"ContainerDied","Data":"ca2739fb9fb4d9f8c71b5b3a76a829b57c173d46fa0313e8d90d9780fe755a71"} Mar 10 09:52:28 crc kubenswrapper[4794]: I0310 09:52:28.319782 4794 scope.go:117] "RemoveContainer" containerID="b10df9598b701f6239dcd28f244b86e29218a74a71906050e3f86265aedddb14" Mar 10 09:52:28 crc kubenswrapper[4794]: I0310 09:52:28.319636 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zp5hg" Mar 10 09:52:28 crc kubenswrapper[4794]: I0310 09:52:28.338734 4794 scope.go:117] "RemoveContainer" containerID="b10df9598b701f6239dcd28f244b86e29218a74a71906050e3f86265aedddb14" Mar 10 09:52:28 crc kubenswrapper[4794]: I0310 09:52:28.338876 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zp5hg"] Mar 10 09:52:28 crc kubenswrapper[4794]: E0310 09:52:28.339223 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b10df9598b701f6239dcd28f244b86e29218a74a71906050e3f86265aedddb14\": container with ID starting with b10df9598b701f6239dcd28f244b86e29218a74a71906050e3f86265aedddb14 not found: ID does not exist" containerID="b10df9598b701f6239dcd28f244b86e29218a74a71906050e3f86265aedddb14" Mar 10 09:52:28 crc kubenswrapper[4794]: I0310 09:52:28.339256 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b10df9598b701f6239dcd28f244b86e29218a74a71906050e3f86265aedddb14"} err="failed to get container status \"b10df9598b701f6239dcd28f244b86e29218a74a71906050e3f86265aedddb14\": rpc error: code = NotFound desc = could not find container \"b10df9598b701f6239dcd28f244b86e29218a74a71906050e3f86265aedddb14\": container with ID starting with b10df9598b701f6239dcd28f244b86e29218a74a71906050e3f86265aedddb14 not found: ID does not exist" Mar 10 09:52:28 crc kubenswrapper[4794]: I0310 09:52:28.345067 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zp5hg"] Mar 10 09:52:30 crc kubenswrapper[4794]: I0310 09:52:30.004918 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3" path="/var/lib/kubelet/pods/f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3/volumes" Mar 10 09:52:38 crc kubenswrapper[4794]: I0310 09:52:38.165346 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-ckvtc" podUID="f97a286b-f0b0-4309-a3e4-33eea0aea5f8" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:52:52 crc kubenswrapper[4794]: I0310 09:52:52.967204 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:52:52 crc kubenswrapper[4794]: I0310 09:52:52.967735 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:52:52 crc kubenswrapper[4794]: I0310 09:52:52.967787 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:52:52 crc kubenswrapper[4794]: I0310 09:52:52.968494 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8bd93f36183b19386ef399a63319db0f77fccc38f2efa4fc0d1b62c277727e21"} 
pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:52:52 crc kubenswrapper[4794]: I0310 09:52:52.968582 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://8bd93f36183b19386ef399a63319db0f77fccc38f2efa4fc0d1b62c277727e21" gracePeriod=600 Mar 10 09:52:53 crc kubenswrapper[4794]: I0310 09:52:53.462609 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="8bd93f36183b19386ef399a63319db0f77fccc38f2efa4fc0d1b62c277727e21" exitCode=0 Mar 10 09:52:53 crc kubenswrapper[4794]: I0310 09:52:53.462746 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"8bd93f36183b19386ef399a63319db0f77fccc38f2efa4fc0d1b62c277727e21"} Mar 10 09:52:53 crc kubenswrapper[4794]: I0310 09:52:53.463202 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"008d2d6b5be2dee5277f2851a79ea6544ead77396a09ca0e14c2fa485aead805"} Mar 10 09:52:53 crc kubenswrapper[4794]: I0310 09:52:53.463268 4794 scope.go:117] "RemoveContainer" containerID="387c7d244ef3b150e61bf6db8b30580b19b8867628fab0e5c4eebdde4df81142" Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.146665 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552274-8sskt"] Mar 10 09:54:00 crc kubenswrapper[4794]: E0310 09:54:00.147864 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3" containerName="registry" Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.147897 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3" containerName="registry" Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.148106 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7eaacf5-bd8a-46e7-8584-1ec3ae00f6b3" containerName="registry" Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.148920 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552274-8sskt" Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.152389 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.152469 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.153247 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552274-8sskt"] Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.153255 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.282551 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68pc6\" (UniqueName: \"kubernetes.io/projected/35e20182-9b30-45e4-a638-2460b967250e-kube-api-access-68pc6\") pod \"auto-csr-approver-29552274-8sskt\" (UID: \"35e20182-9b30-45e4-a638-2460b967250e\") " pod="openshift-infra/auto-csr-approver-29552274-8sskt" Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.383520 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68pc6\" (UniqueName: \"kubernetes.io/projected/35e20182-9b30-45e4-a638-2460b967250e-kube-api-access-68pc6\") pod \"auto-csr-approver-29552274-8sskt\" (UID: \"35e20182-9b30-45e4-a638-2460b967250e\") " pod="openshift-infra/auto-csr-approver-29552274-8sskt" Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.415836 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68pc6\" (UniqueName: \"kubernetes.io/projected/35e20182-9b30-45e4-a638-2460b967250e-kube-api-access-68pc6\") pod \"auto-csr-approver-29552274-8sskt\" (UID: \"35e20182-9b30-45e4-a638-2460b967250e\") " pod="openshift-infra/auto-csr-approver-29552274-8sskt" Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.473601 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552274-8sskt" Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.673065 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552274-8sskt"] Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.682413 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:54:00 crc kubenswrapper[4794]: I0310 09:54:00.923921 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552274-8sskt" event={"ID":"35e20182-9b30-45e4-a638-2460b967250e","Type":"ContainerStarted","Data":"ee49ba05d585924f778e9e38815a992de86c9f2600ec5fc1261b4e6fe9499605"} Mar 10 09:54:02 crc kubenswrapper[4794]: I0310 09:54:02.938484 4794 generic.go:334] "Generic (PLEG): container finished" podID="35e20182-9b30-45e4-a638-2460b967250e" containerID="dc2dd6fc1d8274102b00f0c4c10a1dbffacb90ceb4d7d48b075b7d54fee08404" exitCode=0 Mar 10 09:54:02 crc kubenswrapper[4794]: I0310 09:54:02.938622 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552274-8sskt" event={"ID":"35e20182-9b30-45e4-a638-2460b967250e","Type":"ContainerDied","Data":"dc2dd6fc1d8274102b00f0c4c10a1dbffacb90ceb4d7d48b075b7d54fee08404"} Mar 10 09:54:04 crc kubenswrapper[4794]: I0310 09:54:04.162579 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552274-8sskt" Mar 10 09:54:04 crc kubenswrapper[4794]: I0310 09:54:04.333631 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68pc6\" (UniqueName: \"kubernetes.io/projected/35e20182-9b30-45e4-a638-2460b967250e-kube-api-access-68pc6\") pod \"35e20182-9b30-45e4-a638-2460b967250e\" (UID: \"35e20182-9b30-45e4-a638-2460b967250e\") " Mar 10 09:54:04 crc kubenswrapper[4794]: I0310 09:54:04.340560 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e20182-9b30-45e4-a638-2460b967250e-kube-api-access-68pc6" (OuterVolumeSpecName: "kube-api-access-68pc6") pod "35e20182-9b30-45e4-a638-2460b967250e" (UID: "35e20182-9b30-45e4-a638-2460b967250e"). InnerVolumeSpecName "kube-api-access-68pc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:54:04 crc kubenswrapper[4794]: I0310 09:54:04.435666 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68pc6\" (UniqueName: \"kubernetes.io/projected/35e20182-9b30-45e4-a638-2460b967250e-kube-api-access-68pc6\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:04 crc kubenswrapper[4794]: I0310 09:54:04.953325 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552274-8sskt" event={"ID":"35e20182-9b30-45e4-a638-2460b967250e","Type":"ContainerDied","Data":"ee49ba05d585924f778e9e38815a992de86c9f2600ec5fc1261b4e6fe9499605"} Mar 10 09:54:04 crc kubenswrapper[4794]: I0310 09:54:04.953592 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee49ba05d585924f778e9e38815a992de86c9f2600ec5fc1261b4e6fe9499605" Mar 10 09:54:04 crc kubenswrapper[4794]: I0310 09:54:04.953454 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552274-8sskt" Mar 10 09:54:05 crc kubenswrapper[4794]: I0310 09:54:05.215148 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552268-gzdlm"] Mar 10 09:54:05 crc kubenswrapper[4794]: I0310 09:54:05.220543 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552268-gzdlm"] Mar 10 09:54:06 crc kubenswrapper[4794]: I0310 09:54:06.010803 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d151a9-5d22-4241-9177-7856740702e4" path="/var/lib/kubelet/pods/75d151a9-5d22-4241-9177-7856740702e4/volumes" Mar 10 09:55:22 crc kubenswrapper[4794]: I0310 09:55:22.967966 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:55:22 crc kubenswrapper[4794]: I0310 09:55:22.969634 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:55:29 crc kubenswrapper[4794]: I0310 09:55:29.281426 4794 scope.go:117] "RemoveContainer" containerID="8b2b83c5135b4c86f50d23fecbfe18a99d36b7c57ddad61bcfc75ab45b8ce0c5" Mar 10 09:55:29 crc kubenswrapper[4794]: I0310 09:55:29.311411 4794 scope.go:117] "RemoveContainer" containerID="63117909e244ec7b2c31dfd6a3ef41360076d2e72b8e908ca37fe4233235e47f" Mar 10 09:55:29 crc kubenswrapper[4794]: I0310 09:55:29.355768 4794 scope.go:117] "RemoveContainer" containerID="c264f2b0994e995bf93906c45f1565d45f2ea144f2807b0b2316545f9b3c7131" Mar 10 09:55:52 crc kubenswrapper[4794]: I0310 09:55:52.968083 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:55:52 crc kubenswrapper[4794]: I0310 09:55:52.968549 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:56:00 crc kubenswrapper[4794]: I0310 09:56:00.146650 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552276-dc4zw"] Mar 10 09:56:00 crc kubenswrapper[4794]: E0310 09:56:00.147754 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e20182-9b30-45e4-a638-2460b967250e" containerName="oc" Mar 10 09:56:00 crc kubenswrapper[4794]: I0310 09:56:00.147775 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e20182-9b30-45e4-a638-2460b967250e" containerName="oc" Mar 10 09:56:00 crc kubenswrapper[4794]: I0310 09:56:00.147981 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e20182-9b30-45e4-a638-2460b967250e" containerName="oc" Mar 10 09:56:00 crc kubenswrapper[4794]: I0310 09:56:00.148820 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552276-dc4zw" Mar 10 09:56:00 crc kubenswrapper[4794]: I0310 09:56:00.152667 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:56:00 crc kubenswrapper[4794]: I0310 09:56:00.153089 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 09:56:00 crc kubenswrapper[4794]: I0310 09:56:00.153135 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:56:00 crc kubenswrapper[4794]: I0310 09:56:00.162124 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552276-dc4zw"] Mar 10 09:56:00 crc kubenswrapper[4794]: I0310 09:56:00.212262 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bk97\" (UniqueName: \"kubernetes.io/projected/d38f5eab-4cb0-42bc-9ceb-576829269e3e-kube-api-access-6bk97\") pod \"auto-csr-approver-29552276-dc4zw\" (UID: \"d38f5eab-4cb0-42bc-9ceb-576829269e3e\") " pod="openshift-infra/auto-csr-approver-29552276-dc4zw" Mar 10 09:56:00 crc kubenswrapper[4794]: I0310 09:56:00.313820 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bk97\" (UniqueName: \"kubernetes.io/projected/d38f5eab-4cb0-42bc-9ceb-576829269e3e-kube-api-access-6bk97\") pod \"auto-csr-approver-29552276-dc4zw\" (UID: \"d38f5eab-4cb0-42bc-9ceb-576829269e3e\") " pod="openshift-infra/auto-csr-approver-29552276-dc4zw" Mar 10 09:56:00 crc kubenswrapper[4794]: I0310 09:56:00.352149 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bk97\" (UniqueName: \"kubernetes.io/projected/d38f5eab-4cb0-42bc-9ceb-576829269e3e-kube-api-access-6bk97\") pod \"auto-csr-approver-29552276-dc4zw\" (UID: \"d38f5eab-4cb0-42bc-9ceb-576829269e3e\") " pod="openshift-infra/auto-csr-approver-29552276-dc4zw" Mar 10 09:56:00 crc kubenswrapper[4794]: I0310 09:56:00.504568 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552276-dc4zw" Mar 10 09:56:00 crc kubenswrapper[4794]: I0310 09:56:00.679074 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552276-dc4zw"] Mar 10 09:56:00 crc kubenswrapper[4794]: I0310 09:56:00.727170 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552276-dc4zw" event={"ID":"d38f5eab-4cb0-42bc-9ceb-576829269e3e","Type":"ContainerStarted","Data":"14c9dd6c19d6de97e47b73833b92f9ef002c262912112fd4db2b45a41479f386"} Mar 10 09:56:02 crc kubenswrapper[4794]: I0310 09:56:02.746993 4794 generic.go:334] "Generic (PLEG): container finished" podID="d38f5eab-4cb0-42bc-9ceb-576829269e3e" containerID="991701b72851d56d951b339ec74203a6d0acbb169329704a4623a42a0f002326" exitCode=0 Mar 10 09:56:02 crc kubenswrapper[4794]: I0310 09:56:02.747472 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552276-dc4zw" event={"ID":"d38f5eab-4cb0-42bc-9ceb-576829269e3e","Type":"ContainerDied","Data":"991701b72851d56d951b339ec74203a6d0acbb169329704a4623a42a0f002326"} Mar 10 09:56:04 crc kubenswrapper[4794]: I0310 09:56:04.073920 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552276-dc4zw" Mar 10 09:56:04 crc kubenswrapper[4794]: I0310 09:56:04.163917 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bk97\" (UniqueName: \"kubernetes.io/projected/d38f5eab-4cb0-42bc-9ceb-576829269e3e-kube-api-access-6bk97\") pod \"d38f5eab-4cb0-42bc-9ceb-576829269e3e\" (UID: \"d38f5eab-4cb0-42bc-9ceb-576829269e3e\") " Mar 10 09:56:04 crc kubenswrapper[4794]: I0310 09:56:04.169279 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d38f5eab-4cb0-42bc-9ceb-576829269e3e-kube-api-access-6bk97" (OuterVolumeSpecName: "kube-api-access-6bk97") pod "d38f5eab-4cb0-42bc-9ceb-576829269e3e" (UID: "d38f5eab-4cb0-42bc-9ceb-576829269e3e"). InnerVolumeSpecName "kube-api-access-6bk97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:56:04 crc kubenswrapper[4794]: I0310 09:56:04.265743 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bk97\" (UniqueName: \"kubernetes.io/projected/d38f5eab-4cb0-42bc-9ceb-576829269e3e-kube-api-access-6bk97\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:04 crc kubenswrapper[4794]: I0310 09:56:04.766733 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552276-dc4zw" event={"ID":"d38f5eab-4cb0-42bc-9ceb-576829269e3e","Type":"ContainerDied","Data":"14c9dd6c19d6de97e47b73833b92f9ef002c262912112fd4db2b45a41479f386"} Mar 10 09:56:04 crc kubenswrapper[4794]: I0310 09:56:04.766776 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14c9dd6c19d6de97e47b73833b92f9ef002c262912112fd4db2b45a41479f386" Mar 10 09:56:04 crc kubenswrapper[4794]: I0310 09:56:04.766817 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552276-dc4zw" Mar 10 09:56:05 crc kubenswrapper[4794]: I0310 09:56:05.127710 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552270-rq675"] Mar 10 09:56:05 crc kubenswrapper[4794]: I0310 09:56:05.132182 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552270-rq675"] Mar 10 09:56:06 crc kubenswrapper[4794]: I0310 09:56:06.010248 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec1ce65-776a-4ed7-8441-0779bb0c5d93" path="/var/lib/kubelet/pods/dec1ce65-776a-4ed7-8441-0779bb0c5d93/volumes" Mar 10 09:56:22 crc kubenswrapper[4794]: I0310 09:56:22.968414 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:56:22 crc kubenswrapper[4794]: I0310 09:56:22.969026 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:56:22 crc kubenswrapper[4794]: I0310 09:56:22.969070 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 09:56:22 crc kubenswrapper[4794]: I0310 09:56:22.969645 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"008d2d6b5be2dee5277f2851a79ea6544ead77396a09ca0e14c2fa485aead805"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:56:22 crc kubenswrapper[4794]: I0310 09:56:22.969696 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://008d2d6b5be2dee5277f2851a79ea6544ead77396a09ca0e14c2fa485aead805" gracePeriod=600 Mar 10 09:56:23 crc kubenswrapper[4794]: I0310 09:56:23.924939 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="008d2d6b5be2dee5277f2851a79ea6544ead77396a09ca0e14c2fa485aead805" exitCode=0 Mar 10 09:56:23 crc kubenswrapper[4794]: I0310 09:56:23.925892 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"008d2d6b5be2dee5277f2851a79ea6544ead77396a09ca0e14c2fa485aead805"} Mar 10 09:56:23 crc kubenswrapper[4794]: I0310 09:56:23.925941 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"900615b0bd1702fdf79917b75d57707d4e97b8f262a88b05aa4883f6d0d20891"} Mar 10 09:56:23 crc kubenswrapper[4794]: I0310 09:56:23.925971 4794 scope.go:117] "RemoveContainer" 
containerID="8bd93f36183b19386ef399a63319db0f77fccc38f2efa4fc0d1b62c277727e21" Mar 10 09:56:29 crc kubenswrapper[4794]: I0310 09:56:29.430009 4794 scope.go:117] "RemoveContainer" containerID="f12f6d1a08902d49b512cf0cf93f6aae05f971c4ee1f0deae66ff06febde7456" Mar 10 09:58:00 crc kubenswrapper[4794]: I0310 09:58:00.147810 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552278-4gp6d"] Mar 10 09:58:00 crc kubenswrapper[4794]: E0310 09:58:00.148787 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38f5eab-4cb0-42bc-9ceb-576829269e3e" containerName="oc" Mar 10 09:58:00 crc kubenswrapper[4794]: I0310 09:58:00.148810 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38f5eab-4cb0-42bc-9ceb-576829269e3e" containerName="oc" Mar 10 09:58:00 crc kubenswrapper[4794]: I0310 09:58:00.149019 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d38f5eab-4cb0-42bc-9ceb-576829269e3e" containerName="oc" Mar 10 09:58:00 crc kubenswrapper[4794]: I0310 09:58:00.149683 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552278-4gp6d" Mar 10 09:58:00 crc kubenswrapper[4794]: I0310 09:58:00.152498 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:58:00 crc kubenswrapper[4794]: I0310 09:58:00.152959 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 09:58:00 crc kubenswrapper[4794]: I0310 09:58:00.156493 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:58:00 crc kubenswrapper[4794]: I0310 09:58:00.162224 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552278-4gp6d"] Mar 10 09:58:00 crc kubenswrapper[4794]: I0310 09:58:00.341260 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpm28\" (UniqueName: \"kubernetes.io/projected/aad5d471-4640-4a77-9b35-85fb21b3e2de-kube-api-access-kpm28\") pod \"auto-csr-approver-29552278-4gp6d\" (UID: \"aad5d471-4640-4a77-9b35-85fb21b3e2de\") " pod="openshift-infra/auto-csr-approver-29552278-4gp6d" Mar 10 09:58:00 crc kubenswrapper[4794]: I0310 09:58:00.442369 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpm28\" (UniqueName: \"kubernetes.io/projected/aad5d471-4640-4a77-9b35-85fb21b3e2de-kube-api-access-kpm28\") pod \"auto-csr-approver-29552278-4gp6d\" (UID: \"aad5d471-4640-4a77-9b35-85fb21b3e2de\") " pod="openshift-infra/auto-csr-approver-29552278-4gp6d" Mar 10 09:58:00 crc kubenswrapper[4794]: I0310 09:58:00.475061 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpm28\" (UniqueName: \"kubernetes.io/projected/aad5d471-4640-4a77-9b35-85fb21b3e2de-kube-api-access-kpm28\") pod \"auto-csr-approver-29552278-4gp6d\" (UID: \"aad5d471-4640-4a77-9b35-85fb21b3e2de\") " pod="openshift-infra/auto-csr-approver-29552278-4gp6d" Mar 10 09:58:00 crc kubenswrapper[4794]: I0310 09:58:00.478540 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552278-4gp6d" Mar 10 09:58:00 crc kubenswrapper[4794]: I0310 09:58:00.672365 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552278-4gp6d"] Mar 10 09:58:01 crc kubenswrapper[4794]: I0310 09:58:01.579764 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552278-4gp6d" event={"ID":"aad5d471-4640-4a77-9b35-85fb21b3e2de","Type":"ContainerStarted","Data":"b683feca683d442315c6853cff0b05a34dffed2744b81d40f11694d9cf8b63fb"} Mar 10 09:58:02 crc kubenswrapper[4794]: I0310 09:58:02.587477 4794 generic.go:334] "Generic (PLEG): container finished" podID="aad5d471-4640-4a77-9b35-85fb21b3e2de" containerID="7d7da57ffe1e7fa8cf7ebd979d4e92467de34825c2105a784c90fecff649917b" exitCode=0 Mar 10 09:58:02 crc kubenswrapper[4794]: I0310 09:58:02.587569 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552278-4gp6d" event={"ID":"aad5d471-4640-4a77-9b35-85fb21b3e2de","Type":"ContainerDied","Data":"7d7da57ffe1e7fa8cf7ebd979d4e92467de34825c2105a784c90fecff649917b"} Mar 10 09:58:03 crc kubenswrapper[4794]: I0310 09:58:03.831505 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552278-4gp6d" Mar 10 09:58:03 crc kubenswrapper[4794]: I0310 09:58:03.987522 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpm28\" (UniqueName: \"kubernetes.io/projected/aad5d471-4640-4a77-9b35-85fb21b3e2de-kube-api-access-kpm28\") pod \"aad5d471-4640-4a77-9b35-85fb21b3e2de\" (UID: \"aad5d471-4640-4a77-9b35-85fb21b3e2de\") " Mar 10 09:58:03 crc kubenswrapper[4794]: I0310 09:58:03.993511 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aad5d471-4640-4a77-9b35-85fb21b3e2de-kube-api-access-kpm28" (OuterVolumeSpecName: "kube-api-access-kpm28") pod "aad5d471-4640-4a77-9b35-85fb21b3e2de" (UID: "aad5d471-4640-4a77-9b35-85fb21b3e2de"). InnerVolumeSpecName "kube-api-access-kpm28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:58:04 crc kubenswrapper[4794]: I0310 09:58:04.088437 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpm28\" (UniqueName: \"kubernetes.io/projected/aad5d471-4640-4a77-9b35-85fb21b3e2de-kube-api-access-kpm28\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:04 crc kubenswrapper[4794]: I0310 09:58:04.603683 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552278-4gp6d" event={"ID":"aad5d471-4640-4a77-9b35-85fb21b3e2de","Type":"ContainerDied","Data":"b683feca683d442315c6853cff0b05a34dffed2744b81d40f11694d9cf8b63fb"} Mar 10 09:58:04 crc kubenswrapper[4794]: I0310 09:58:04.603737 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b683feca683d442315c6853cff0b05a34dffed2744b81d40f11694d9cf8b63fb" Mar 10 09:58:04 crc kubenswrapper[4794]: I0310 09:58:04.603807 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552278-4gp6d" Mar 10 09:58:04 crc kubenswrapper[4794]: I0310 09:58:04.898821 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552272-f6597"] Mar 10 09:58:04 crc kubenswrapper[4794]: I0310 09:58:04.903827 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552272-f6597"] Mar 10 09:58:06 crc kubenswrapper[4794]: I0310 09:58:06.012850 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38cb9379-e79d-4708-a441-7e65e9dc7450" path="/var/lib/kubelet/pods/38cb9379-e79d-4708-a441-7e65e9dc7450/volumes" Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.582483 4794 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.735323 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4vzj5"] Mar 10 09:58:07 crc kubenswrapper[4794]: E0310 09:58:07.735539 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad5d471-4640-4a77-9b35-85fb21b3e2de" containerName="oc" Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.735550 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad5d471-4640-4a77-9b35-85fb21b3e2de" containerName="oc" Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.735643 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="aad5d471-4640-4a77-9b35-85fb21b3e2de" containerName="oc" Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.736324 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.752177 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6387ec1a-bd33-4fbe-8089-a804575a3728-catalog-content\") pod \"redhat-marketplace-4vzj5\" (UID: \"6387ec1a-bd33-4fbe-8089-a804575a3728\") " pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.752253 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6387ec1a-bd33-4fbe-8089-a804575a3728-utilities\") pod \"redhat-marketplace-4vzj5\" (UID: \"6387ec1a-bd33-4fbe-8089-a804575a3728\") " pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.752432 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdgkc\" (UniqueName: \"kubernetes.io/projected/6387ec1a-bd33-4fbe-8089-a804575a3728-kube-api-access-kdgkc\") pod \"redhat-marketplace-4vzj5\" (UID: \"6387ec1a-bd33-4fbe-8089-a804575a3728\") " pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.759887 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vzj5"] Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.854052 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6387ec1a-bd33-4fbe-8089-a804575a3728-catalog-content\") pod \"redhat-marketplace-4vzj5\" (UID: \"6387ec1a-bd33-4fbe-8089-a804575a3728\") " 
pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.854105 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6387ec1a-bd33-4fbe-8089-a804575a3728-utilities\") pod \"redhat-marketplace-4vzj5\" (UID: \"6387ec1a-bd33-4fbe-8089-a804575a3728\") " pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.854187 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdgkc\" (UniqueName: \"kubernetes.io/projected/6387ec1a-bd33-4fbe-8089-a804575a3728-kube-api-access-kdgkc\") pod \"redhat-marketplace-4vzj5\" (UID: \"6387ec1a-bd33-4fbe-8089-a804575a3728\") " pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.854588 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6387ec1a-bd33-4fbe-8089-a804575a3728-catalog-content\") pod \"redhat-marketplace-4vzj5\" (UID: \"6387ec1a-bd33-4fbe-8089-a804575a3728\") " pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.854631 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6387ec1a-bd33-4fbe-8089-a804575a3728-utilities\") pod \"redhat-marketplace-4vzj5\" (UID: \"6387ec1a-bd33-4fbe-8089-a804575a3728\") " pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:07 crc kubenswrapper[4794]: I0310 09:58:07.872313 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdgkc\" (UniqueName: \"kubernetes.io/projected/6387ec1a-bd33-4fbe-8089-a804575a3728-kube-api-access-kdgkc\") pod \"redhat-marketplace-4vzj5\" (UID: \"6387ec1a-bd33-4fbe-8089-a804575a3728\") " pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:08 crc kubenswrapper[4794]: I0310 09:58:08.051005 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:08 crc kubenswrapper[4794]: I0310 09:58:08.278308 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vzj5"] Mar 10 09:58:08 crc kubenswrapper[4794]: I0310 09:58:08.627777 4794 generic.go:334] "Generic (PLEG): container finished" podID="6387ec1a-bd33-4fbe-8089-a804575a3728" containerID="862495d2834f1870caa031bd4bbfc4f74ccccdcea12f87a0730994ee2345707e" exitCode=0 Mar 10 09:58:08 crc kubenswrapper[4794]: I0310 09:58:08.627839 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vzj5" event={"ID":"6387ec1a-bd33-4fbe-8089-a804575a3728","Type":"ContainerDied","Data":"862495d2834f1870caa031bd4bbfc4f74ccccdcea12f87a0730994ee2345707e"} Mar 10 09:58:08 crc kubenswrapper[4794]: I0310 09:58:08.628108 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vzj5" event={"ID":"6387ec1a-bd33-4fbe-8089-a804575a3728","Type":"ContainerStarted","Data":"88a28e85701565bb56e5b0cc3d8abeefba5bb62937a7ef79ab4b428e5453e255"} Mar 10 09:58:09 crc kubenswrapper[4794]: I0310 09:58:09.637296 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vzj5" event={"ID":"6387ec1a-bd33-4fbe-8089-a804575a3728","Type":"ContainerStarted","Data":"d5d6fcde38c1663b48165afc5b68801c7ff57f614a52f9ac87dc2d6403273e59"} Mar 10 09:58:10 crc kubenswrapper[4794]: I0310 09:58:10.644312 4794 generic.go:334] "Generic (PLEG): container finished" podID="6387ec1a-bd33-4fbe-8089-a804575a3728" containerID="d5d6fcde38c1663b48165afc5b68801c7ff57f614a52f9ac87dc2d6403273e59" exitCode=0 Mar 10 09:58:10 crc kubenswrapper[4794]: I0310 09:58:10.644362 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vzj5" event={"ID":"6387ec1a-bd33-4fbe-8089-a804575a3728","Type":"ContainerDied","Data":"d5d6fcde38c1663b48165afc5b68801c7ff57f614a52f9ac87dc2d6403273e59"} Mar 10 09:58:11 crc kubenswrapper[4794]: I0310 09:58:11.658942 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vzj5" event={"ID":"6387ec1a-bd33-4fbe-8089-a804575a3728","Type":"ContainerStarted","Data":"95c33883bc220b3262833c9f3a1f5712724cefc42956c86338c0822da9d7ae09"} Mar 10 09:58:11 crc kubenswrapper[4794]: I0310 09:58:11.688647 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4vzj5" podStartSLOduration=2.221321106 podStartE2EDuration="4.68862239s" podCreationTimestamp="2026-03-10 09:58:07 +0000 UTC" firstStartedPulling="2026-03-10 09:58:08.629619078 +0000 UTC m=+837.385789906" lastFinishedPulling="2026-03-10 09:58:11.096920332 +0000 UTC m=+839.853091190" observedRunningTime="2026-03-10 09:58:11.684750679 +0000 UTC m=+840.440921537" watchObservedRunningTime="2026-03-10 09:58:11.68862239 +0000 UTC m=+840.444793218" Mar 10 09:58:14 crc kubenswrapper[4794]: I0310 09:58:14.122872 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p76xn"] Mar 10 09:58:14 crc kubenswrapper[4794]: I0310 09:58:14.129115 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:14 crc kubenswrapper[4794]: I0310 09:58:14.149798 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p76xn"] Mar 10 09:58:14 crc kubenswrapper[4794]: I0310 09:58:14.232616 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzh9d\" (UniqueName: \"kubernetes.io/projected/250eec95-7242-403d-a887-c361d549d147-kube-api-access-qzh9d\") pod \"community-operators-p76xn\" (UID: \"250eec95-7242-403d-a887-c361d549d147\") " pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:14 crc kubenswrapper[4794]: I0310 09:58:14.232893 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250eec95-7242-403d-a887-c361d549d147-catalog-content\") pod \"community-operators-p76xn\" (UID: \"250eec95-7242-403d-a887-c361d549d147\") " pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:14 crc kubenswrapper[4794]: I0310 09:58:14.233018 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250eec95-7242-403d-a887-c361d549d147-utilities\") pod \"community-operators-p76xn\" (UID: \"250eec95-7242-403d-a887-c361d549d147\") " pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:14 crc kubenswrapper[4794]: I0310 09:58:14.333850 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250eec95-7242-403d-a887-c361d549d147-utilities\") pod \"community-operators-p76xn\" (UID: \"250eec95-7242-403d-a887-c361d549d147\") " pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:14 crc kubenswrapper[4794]: I0310 09:58:14.334317 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzh9d\" (UniqueName: \"kubernetes.io/projected/250eec95-7242-403d-a887-c361d549d147-kube-api-access-qzh9d\") pod \"community-operators-p76xn\" (UID: \"250eec95-7242-403d-a887-c361d549d147\") " pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:14 crc kubenswrapper[4794]: I0310 09:58:14.334598 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250eec95-7242-403d-a887-c361d549d147-catalog-content\") pod \"community-operators-p76xn\" (UID: \"250eec95-7242-403d-a887-c361d549d147\") " pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:14 crc kubenswrapper[4794]: I0310 09:58:14.334901 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250eec95-7242-403d-a887-c361d549d147-utilities\") pod \"community-operators-p76xn\" (UID: \"250eec95-7242-403d-a887-c361d549d147\") " pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:14 crc kubenswrapper[4794]: I0310 09:58:14.335160 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250eec95-7242-403d-a887-c361d549d147-catalog-content\") pod \"community-operators-p76xn\" (UID: \"250eec95-7242-403d-a887-c361d549d147\") " pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:14 crc kubenswrapper[4794]: I0310 09:58:14.352347 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qzh9d\" (UniqueName: \"kubernetes.io/projected/250eec95-7242-403d-a887-c361d549d147-kube-api-access-qzh9d\") pod \"community-operators-p76xn\" (UID: \"250eec95-7242-403d-a887-c361d549d147\") " pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:14 crc kubenswrapper[4794]: I0310 09:58:14.465455 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:14 crc kubenswrapper[4794]: I0310 09:58:14.747875 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p76xn"] Mar 10 09:58:15 crc kubenswrapper[4794]: I0310 09:58:15.688824 4794 generic.go:334] "Generic (PLEG): container finished" podID="250eec95-7242-403d-a887-c361d549d147" containerID="c8f562b153c02eda6cc19d78404b3ea7c932cdf0457f1f6b41c7eaba1dc3e3c1" exitCode=0 Mar 10 09:58:15 crc kubenswrapper[4794]: I0310 09:58:15.688912 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p76xn" event={"ID":"250eec95-7242-403d-a887-c361d549d147","Type":"ContainerDied","Data":"c8f562b153c02eda6cc19d78404b3ea7c932cdf0457f1f6b41c7eaba1dc3e3c1"} Mar 10 09:58:15 crc kubenswrapper[4794]: I0310 09:58:15.689202 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p76xn" event={"ID":"250eec95-7242-403d-a887-c361d549d147","Type":"ContainerStarted","Data":"5ae8ce80640d901fe4685374d302690b98098bda71238071f9fd8d8442c736d2"} Mar 10 09:58:17 crc kubenswrapper[4794]: I0310 09:58:17.705247 4794 generic.go:334] "Generic (PLEG): container finished" podID="250eec95-7242-403d-a887-c361d549d147" containerID="fd0b92b00a126142660bdfd55edae84cf99f30734db16cb34790389f63dfdcf6" exitCode=0 Mar 10 09:58:17 crc kubenswrapper[4794]: I0310 09:58:17.705349 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p76xn" event={"ID":"250eec95-7242-403d-a887-c361d549d147","Type":"ContainerDied","Data":"fd0b92b00a126142660bdfd55edae84cf99f30734db16cb34790389f63dfdcf6"} Mar 10 09:58:18 crc kubenswrapper[4794]: I0310 09:58:18.051389 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:18 crc kubenswrapper[4794]: I0310 09:58:18.051465 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:18 crc kubenswrapper[4794]: I0310 09:58:18.107362 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:18 crc kubenswrapper[4794]: I0310 09:58:18.715840 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p76xn" event={"ID":"250eec95-7242-403d-a887-c361d549d147","Type":"ContainerStarted","Data":"d649d04fb547e8ed2fe8d6a16528698e31ba8d6ee519dca9b33d0658a01513f0"} Mar 10 09:58:18 crc kubenswrapper[4794]: I0310 09:58:18.742369 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p76xn" podStartSLOduration=2.343780604 podStartE2EDuration="4.742310828s" podCreationTimestamp="2026-03-10 09:58:14 +0000 UTC" firstStartedPulling="2026-03-10 09:58:15.691903029 +0000 UTC m=+844.448073857" lastFinishedPulling="2026-03-10 09:58:18.090433223 +0000 UTC m=+846.846604081" observedRunningTime="2026-03-10 
09:58:18.737571281 +0000 UTC m=+847.493742149" watchObservedRunningTime="2026-03-10 09:58:18.742310828 +0000 UTC m=+847.498481696" Mar 10 09:58:18 crc kubenswrapper[4794]: I0310 09:58:18.779840 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:19 crc kubenswrapper[4794]: I0310 09:58:19.709627 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vzj5"] Mar 10 09:58:20 crc kubenswrapper[4794]: I0310 09:58:20.728375 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4vzj5" podUID="6387ec1a-bd33-4fbe-8089-a804575a3728" containerName="registry-server" containerID="cri-o://95c33883bc220b3262833c9f3a1f5712724cefc42956c86338c0822da9d7ae09" gracePeriod=2 Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.122932 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.224773 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdgkc\" (UniqueName: \"kubernetes.io/projected/6387ec1a-bd33-4fbe-8089-a804575a3728-kube-api-access-kdgkc\") pod \"6387ec1a-bd33-4fbe-8089-a804575a3728\" (UID: \"6387ec1a-bd33-4fbe-8089-a804575a3728\") " Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.226186 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6387ec1a-bd33-4fbe-8089-a804575a3728-utilities\") pod \"6387ec1a-bd33-4fbe-8089-a804575a3728\" (UID: \"6387ec1a-bd33-4fbe-8089-a804575a3728\") " Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.226518 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6387ec1a-bd33-4fbe-8089-a804575a3728-catalog-content\") pod \"6387ec1a-bd33-4fbe-8089-a804575a3728\" (UID: \"6387ec1a-bd33-4fbe-8089-a804575a3728\") " Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.227928 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6387ec1a-bd33-4fbe-8089-a804575a3728-utilities" (OuterVolumeSpecName: "utilities") pod "6387ec1a-bd33-4fbe-8089-a804575a3728" (UID: "6387ec1a-bd33-4fbe-8089-a804575a3728"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.233075 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6387ec1a-bd33-4fbe-8089-a804575a3728-kube-api-access-kdgkc" (OuterVolumeSpecName: "kube-api-access-kdgkc") pod "6387ec1a-bd33-4fbe-8089-a804575a3728" (UID: "6387ec1a-bd33-4fbe-8089-a804575a3728"). InnerVolumeSpecName "kube-api-access-kdgkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.255424 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6387ec1a-bd33-4fbe-8089-a804575a3728-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6387ec1a-bd33-4fbe-8089-a804575a3728" (UID: "6387ec1a-bd33-4fbe-8089-a804575a3728"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.328504 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdgkc\" (UniqueName: \"kubernetes.io/projected/6387ec1a-bd33-4fbe-8089-a804575a3728-kube-api-access-kdgkc\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.328554 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6387ec1a-bd33-4fbe-8089-a804575a3728-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.328611 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6387ec1a-bd33-4fbe-8089-a804575a3728-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.738976 4794 generic.go:334] "Generic (PLEG): container finished" podID="6387ec1a-bd33-4fbe-8089-a804575a3728" containerID="95c33883bc220b3262833c9f3a1f5712724cefc42956c86338c0822da9d7ae09" exitCode=0 Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.739020 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vzj5" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.739050 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vzj5" event={"ID":"6387ec1a-bd33-4fbe-8089-a804575a3728","Type":"ContainerDied","Data":"95c33883bc220b3262833c9f3a1f5712724cefc42956c86338c0822da9d7ae09"} Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.739444 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vzj5" event={"ID":"6387ec1a-bd33-4fbe-8089-a804575a3728","Type":"ContainerDied","Data":"88a28e85701565bb56e5b0cc3d8abeefba5bb62937a7ef79ab4b428e5453e255"} Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.739462 4794 scope.go:117] "RemoveContainer" containerID="95c33883bc220b3262833c9f3a1f5712724cefc42956c86338c0822da9d7ae09" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.759046 4794 scope.go:117] "RemoveContainer" containerID="d5d6fcde38c1663b48165afc5b68801c7ff57f614a52f9ac87dc2d6403273e59" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.778529 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vzj5"] Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.785976 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vzj5"] Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.789106 4794 scope.go:117] "RemoveContainer" containerID="862495d2834f1870caa031bd4bbfc4f74ccccdcea12f87a0730994ee2345707e" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.804395 4794 scope.go:117] "RemoveContainer" containerID="95c33883bc220b3262833c9f3a1f5712724cefc42956c86338c0822da9d7ae09" Mar 10 09:58:21 crc kubenswrapper[4794]: E0310 09:58:21.804832 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c33883bc220b3262833c9f3a1f5712724cefc42956c86338c0822da9d7ae09\": container with ID starting with 95c33883bc220b3262833c9f3a1f5712724cefc42956c86338c0822da9d7ae09 not found: ID does not exist" containerID="95c33883bc220b3262833c9f3a1f5712724cefc42956c86338c0822da9d7ae09" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.804870 4794 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c33883bc220b3262833c9f3a1f5712724cefc42956c86338c0822da9d7ae09"} err="failed to get container status \"95c33883bc220b3262833c9f3a1f5712724cefc42956c86338c0822da9d7ae09\": rpc error: code = NotFound desc = could not find container \"95c33883bc220b3262833c9f3a1f5712724cefc42956c86338c0822da9d7ae09\": container with ID starting with 95c33883bc220b3262833c9f3a1f5712724cefc42956c86338c0822da9d7ae09 not found: ID does not exist" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.804904 4794 scope.go:117] "RemoveContainer" containerID="d5d6fcde38c1663b48165afc5b68801c7ff57f614a52f9ac87dc2d6403273e59" Mar 10 09:58:21 crc kubenswrapper[4794]: E0310 09:58:21.805188 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5d6fcde38c1663b48165afc5b68801c7ff57f614a52f9ac87dc2d6403273e59\": container with ID starting with d5d6fcde38c1663b48165afc5b68801c7ff57f614a52f9ac87dc2d6403273e59 not found: ID does not exist" containerID="d5d6fcde38c1663b48165afc5b68801c7ff57f614a52f9ac87dc2d6403273e59" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.805266 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5d6fcde38c1663b48165afc5b68801c7ff57f614a52f9ac87dc2d6403273e59"} err="failed to get container status \"d5d6fcde38c1663b48165afc5b68801c7ff57f614a52f9ac87dc2d6403273e59\": rpc error: code = NotFound desc = could not find container \"d5d6fcde38c1663b48165afc5b68801c7ff57f614a52f9ac87dc2d6403273e59\": container with ID starting with d5d6fcde38c1663b48165afc5b68801c7ff57f614a52f9ac87dc2d6403273e59 not found: ID does not exist" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.805306 4794 scope.go:117] "RemoveContainer" containerID="862495d2834f1870caa031bd4bbfc4f74ccccdcea12f87a0730994ee2345707e" Mar 10 09:58:21 crc kubenswrapper[4794]: E0310 09:58:21.805615 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"862495d2834f1870caa031bd4bbfc4f74ccccdcea12f87a0730994ee2345707e\": container with ID starting with 862495d2834f1870caa031bd4bbfc4f74ccccdcea12f87a0730994ee2345707e not found: ID does not exist" containerID="862495d2834f1870caa031bd4bbfc4f74ccccdcea12f87a0730994ee2345707e" Mar 10 09:58:21 crc kubenswrapper[4794]: I0310 09:58:21.805647 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862495d2834f1870caa031bd4bbfc4f74ccccdcea12f87a0730994ee2345707e"} err="failed to get container status \"862495d2834f1870caa031bd4bbfc4f74ccccdcea12f87a0730994ee2345707e\": rpc error: code = NotFound desc = could not find container \"862495d2834f1870caa031bd4bbfc4f74ccccdcea12f87a0730994ee2345707e\": container with ID starting with 862495d2834f1870caa031bd4bbfc4f74ccccdcea12f87a0730994ee2345707e not found: ID does not exist" Mar 10 09:58:22 crc kubenswrapper[4794]: I0310 09:58:22.006914 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6387ec1a-bd33-4fbe-8089-a804575a3728" path="/var/lib/kubelet/pods/6387ec1a-bd33-4fbe-8089-a804575a3728/volumes" Mar 10 09:58:24 crc kubenswrapper[4794]: I0310 09:58:24.466178 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:24 crc kubenswrapper[4794]: I0310 09:58:24.466538 4794 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:24 crc kubenswrapper[4794]: I0310 09:58:24.525194 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:24 crc kubenswrapper[4794]: I0310 09:58:24.813424 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:25 crc kubenswrapper[4794]: I0310 09:58:25.503605 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p76xn"] Mar 10 09:58:26 crc kubenswrapper[4794]: I0310 09:58:26.779119 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p76xn" podUID="250eec95-7242-403d-a887-c361d549d147" containerName="registry-server" containerID="cri-o://d649d04fb547e8ed2fe8d6a16528698e31ba8d6ee519dca9b33d0658a01513f0" gracePeriod=2 Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.162599 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.213524 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250eec95-7242-403d-a887-c361d549d147-utilities\") pod \"250eec95-7242-403d-a887-c361d549d147\" (UID: \"250eec95-7242-403d-a887-c361d549d147\") " Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.213925 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzh9d\" (UniqueName: \"kubernetes.io/projected/250eec95-7242-403d-a887-c361d549d147-kube-api-access-qzh9d\") pod \"250eec95-7242-403d-a887-c361d549d147\" (UID: \"250eec95-7242-403d-a887-c361d549d147\") " Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.214305 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250eec95-7242-403d-a887-c361d549d147-catalog-content\") pod \"250eec95-7242-403d-a887-c361d549d147\" (UID: \"250eec95-7242-403d-a887-c361d549d147\") " Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.214772 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/250eec95-7242-403d-a887-c361d549d147-utilities" (OuterVolumeSpecName: "utilities") pod "250eec95-7242-403d-a887-c361d549d147" (UID: "250eec95-7242-403d-a887-c361d549d147"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.215189 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250eec95-7242-403d-a887-c361d549d147-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.219522 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/250eec95-7242-403d-a887-c361d549d147-kube-api-access-qzh9d" (OuterVolumeSpecName: "kube-api-access-qzh9d") pod "250eec95-7242-403d-a887-c361d549d147" (UID: "250eec95-7242-403d-a887-c361d549d147"). InnerVolumeSpecName "kube-api-access-qzh9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.284861 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/250eec95-7242-403d-a887-c361d549d147-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "250eec95-7242-403d-a887-c361d549d147" (UID: "250eec95-7242-403d-a887-c361d549d147"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.316834 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250eec95-7242-403d-a887-c361d549d147-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.316890 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzh9d\" (UniqueName: \"kubernetes.io/projected/250eec95-7242-403d-a887-c361d549d147-kube-api-access-qzh9d\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.798904 4794 generic.go:334] "Generic (PLEG): container finished" podID="250eec95-7242-403d-a887-c361d549d147" containerID="d649d04fb547e8ed2fe8d6a16528698e31ba8d6ee519dca9b33d0658a01513f0" exitCode=0 Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.798976 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p76xn" event={"ID":"250eec95-7242-403d-a887-c361d549d147","Type":"ContainerDied","Data":"d649d04fb547e8ed2fe8d6a16528698e31ba8d6ee519dca9b33d0658a01513f0"} Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.799011 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p76xn" event={"ID":"250eec95-7242-403d-a887-c361d549d147","Type":"ContainerDied","Data":"5ae8ce80640d901fe4685374d302690b98098bda71238071f9fd8d8442c736d2"} Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.799036 4794 scope.go:117] "RemoveContainer" containerID="d649d04fb547e8ed2fe8d6a16528698e31ba8d6ee519dca9b33d0658a01513f0" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.799101 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p76xn" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.836596 4794 scope.go:117] "RemoveContainer" containerID="fd0b92b00a126142660bdfd55edae84cf99f30734db16cb34790389f63dfdcf6" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.853760 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p76xn"] Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.859831 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p76xn"] Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.867171 4794 scope.go:117] "RemoveContainer" containerID="c8f562b153c02eda6cc19d78404b3ea7c932cdf0457f1f6b41c7eaba1dc3e3c1" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.884556 4794 scope.go:117] "RemoveContainer" containerID="d649d04fb547e8ed2fe8d6a16528698e31ba8d6ee519dca9b33d0658a01513f0" Mar 10 09:58:27 crc kubenswrapper[4794]: E0310 09:58:27.885254 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d649d04fb547e8ed2fe8d6a16528698e31ba8d6ee519dca9b33d0658a01513f0\": container with ID starting with d649d04fb547e8ed2fe8d6a16528698e31ba8d6ee519dca9b33d0658a01513f0 not found: ID does not exist" containerID="d649d04fb547e8ed2fe8d6a16528698e31ba8d6ee519dca9b33d0658a01513f0" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.885295 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d649d04fb547e8ed2fe8d6a16528698e31ba8d6ee519dca9b33d0658a01513f0"} err="failed to get container status \"d649d04fb547e8ed2fe8d6a16528698e31ba8d6ee519dca9b33d0658a01513f0\": rpc error: code = NotFound desc = could not find container \"d649d04fb547e8ed2fe8d6a16528698e31ba8d6ee519dca9b33d0658a01513f0\": container with ID starting with d649d04fb547e8ed2fe8d6a16528698e31ba8d6ee519dca9b33d0658a01513f0 not found: ID does not exist" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.885323 4794 scope.go:117] "RemoveContainer" containerID="fd0b92b00a126142660bdfd55edae84cf99f30734db16cb34790389f63dfdcf6" Mar 10 09:58:27 crc kubenswrapper[4794]: E0310 09:58:27.885818 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd0b92b00a126142660bdfd55edae84cf99f30734db16cb34790389f63dfdcf6\": container with ID starting with fd0b92b00a126142660bdfd55edae84cf99f30734db16cb34790389f63dfdcf6 not found: ID does not exist" containerID="fd0b92b00a126142660bdfd55edae84cf99f30734db16cb34790389f63dfdcf6" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.885845 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd0b92b00a126142660bdfd55edae84cf99f30734db16cb34790389f63dfdcf6"} err="failed to get container status \"fd0b92b00a126142660bdfd55edae84cf99f30734db16cb34790389f63dfdcf6\": rpc error: code = NotFound desc = could not find container \"fd0b92b00a126142660bdfd55edae84cf99f30734db16cb34790389f63dfdcf6\": container with ID starting with fd0b92b00a126142660bdfd55edae84cf99f30734db16cb34790389f63dfdcf6 not found: ID does not exist" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.885861 4794 scope.go:117] "RemoveContainer" containerID="c8f562b153c02eda6cc19d78404b3ea7c932cdf0457f1f6b41c7eaba1dc3e3c1" Mar 10 09:58:27 crc kubenswrapper[4794]: E0310 09:58:27.886153 4794 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c8f562b153c02eda6cc19d78404b3ea7c932cdf0457f1f6b41c7eaba1dc3e3c1\": container with ID starting with c8f562b153c02eda6cc19d78404b3ea7c932cdf0457f1f6b41c7eaba1dc3e3c1 not found: ID does not exist" containerID="c8f562b153c02eda6cc19d78404b3ea7c932cdf0457f1f6b41c7eaba1dc3e3c1" Mar 10 09:58:27 crc kubenswrapper[4794]: I0310 09:58:27.886190 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f562b153c02eda6cc19d78404b3ea7c932cdf0457f1f6b41c7eaba1dc3e3c1"} err="failed to get container status \"c8f562b153c02eda6cc19d78404b3ea7c932cdf0457f1f6b41c7eaba1dc3e3c1\": rpc error: code = NotFound desc = could not find container \"c8f562b153c02eda6cc19d78404b3ea7c932cdf0457f1f6b41c7eaba1dc3e3c1\": container with ID starting with c8f562b153c02eda6cc19d78404b3ea7c932cdf0457f1f6b41c7eaba1dc3e3c1 not found: ID does not exist" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.007719 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="250eec95-7242-403d-a887-c361d549d147" path="/var/lib/kubelet/pods/250eec95-7242-403d-a887-c361d549d147/volumes" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.520550 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tgrtq"] Mar 10 09:58:28 crc kubenswrapper[4794]: E0310 09:58:28.521169 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387ec1a-bd33-4fbe-8089-a804575a3728" containerName="registry-server" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.521313 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6387ec1a-bd33-4fbe-8089-a804575a3728" containerName="registry-server" Mar 10 09:58:28 crc kubenswrapper[4794]: E0310 09:58:28.521688 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387ec1a-bd33-4fbe-8089-a804575a3728" containerName="extract-content" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.521861 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6387ec1a-bd33-4fbe-8089-a804575a3728" containerName="extract-content" Mar 10 09:58:28 crc kubenswrapper[4794]: E0310 09:58:28.522003 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250eec95-7242-403d-a887-c361d549d147" containerName="extract-utilities" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.522123 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="250eec95-7242-403d-a887-c361d549d147" containerName="extract-utilities" Mar 10 09:58:28 crc kubenswrapper[4794]: E0310 09:58:28.522240 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250eec95-7242-403d-a887-c361d549d147" containerName="registry-server" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.522401 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="250eec95-7242-403d-a887-c361d549d147" containerName="registry-server" Mar 10 09:58:28 crc kubenswrapper[4794]: E0310 09:58:28.522546 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387ec1a-bd33-4fbe-8089-a804575a3728" containerName="extract-utilities" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.522657 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6387ec1a-bd33-4fbe-8089-a804575a3728" containerName="extract-utilities" Mar 10 09:58:28 crc kubenswrapper[4794]: E0310 09:58:28.522804 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250eec95-7242-403d-a887-c361d549d147" 
containerName="extract-content" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.522916 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="250eec95-7242-403d-a887-c361d549d147" containerName="extract-content" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.523227 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6387ec1a-bd33-4fbe-8089-a804575a3728" containerName="registry-server" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.523377 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="250eec95-7242-403d-a887-c361d549d147" containerName="registry-server" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.525381 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.552746 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tgrtq"] Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.640176 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cde9bff4-7089-47c3-a6e8-95c39511b06f-utilities\") pod \"certified-operators-tgrtq\" (UID: \"cde9bff4-7089-47c3-a6e8-95c39511b06f\") " pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.640284 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cde9bff4-7089-47c3-a6e8-95c39511b06f-catalog-content\") pod \"certified-operators-tgrtq\" (UID: \"cde9bff4-7089-47c3-a6e8-95c39511b06f\") " pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.640320 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn89k\" (UniqueName: \"kubernetes.io/projected/cde9bff4-7089-47c3-a6e8-95c39511b06f-kube-api-access-rn89k\") pod \"certified-operators-tgrtq\" (UID: \"cde9bff4-7089-47c3-a6e8-95c39511b06f\") " pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.741808 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cde9bff4-7089-47c3-a6e8-95c39511b06f-utilities\") pod \"certified-operators-tgrtq\" (UID: \"cde9bff4-7089-47c3-a6e8-95c39511b06f\") " pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.742247 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cde9bff4-7089-47c3-a6e8-95c39511b06f-catalog-content\") pod \"certified-operators-tgrtq\" (UID: \"cde9bff4-7089-47c3-a6e8-95c39511b06f\") " pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.742591 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn89k\" (UniqueName: \"kubernetes.io/projected/cde9bff4-7089-47c3-a6e8-95c39511b06f-kube-api-access-rn89k\") pod \"certified-operators-tgrtq\" (UID: \"cde9bff4-7089-47c3-a6e8-95c39511b06f\") " pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.742764 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cde9bff4-7089-47c3-a6e8-95c39511b06f-catalog-content\") pod \"certified-operators-tgrtq\" (UID: \"cde9bff4-7089-47c3-a6e8-95c39511b06f\") " pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.742522 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cde9bff4-7089-47c3-a6e8-95c39511b06f-utilities\") pod \"certified-operators-tgrtq\" (UID: \"cde9bff4-7089-47c3-a6e8-95c39511b06f\") " pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.767245 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn89k\" (UniqueName: \"kubernetes.io/projected/cde9bff4-7089-47c3-a6e8-95c39511b06f-kube-api-access-rn89k\") pod \"certified-operators-tgrtq\" (UID: \"cde9bff4-7089-47c3-a6e8-95c39511b06f\") " pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:28 crc kubenswrapper[4794]: I0310 09:58:28.860153 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:29 crc kubenswrapper[4794]: I0310 09:58:29.083427 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tgrtq"] Mar 10 09:58:29 crc kubenswrapper[4794]: I0310 09:58:29.516014 4794 scope.go:117] "RemoveContainer" containerID="3b119e0238b024cc6aef6badb35e23a741196bf39fb82d7295534f0a6c4afc22" Mar 10 09:58:29 crc kubenswrapper[4794]: I0310 09:58:29.813413 4794 generic.go:334] "Generic (PLEG): container finished" podID="cde9bff4-7089-47c3-a6e8-95c39511b06f" containerID="2bd6de5fc8aafffbf7471f4b53b6fefbb7f5ba9b9bf663c60c8772a471ea1d2f" exitCode=0 Mar 10 09:58:29 crc kubenswrapper[4794]: I0310 09:58:29.813461 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgrtq" event={"ID":"cde9bff4-7089-47c3-a6e8-95c39511b06f","Type":"ContainerDied","Data":"2bd6de5fc8aafffbf7471f4b53b6fefbb7f5ba9b9bf663c60c8772a471ea1d2f"} Mar 10 09:58:29 crc kubenswrapper[4794]: I0310 09:58:29.813489 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgrtq" event={"ID":"cde9bff4-7089-47c3-a6e8-95c39511b06f","Type":"ContainerStarted","Data":"f724aafe3240f73b61d5f5fb418ebd22c8caa12e7244e3f427025c5562ba6a11"} Mar 10 09:58:30 crc kubenswrapper[4794]: I0310 09:58:30.825747 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgrtq" event={"ID":"cde9bff4-7089-47c3-a6e8-95c39511b06f","Type":"ContainerStarted","Data":"d630c4004007614968b2ebd7bac4b646c5dd2caf2ad6bd76fe295c2b77da89c9"} Mar 10 09:58:31 crc kubenswrapper[4794]: I0310 09:58:31.832486 4794 generic.go:334] "Generic (PLEG): container finished" podID="cde9bff4-7089-47c3-a6e8-95c39511b06f" containerID="d630c4004007614968b2ebd7bac4b646c5dd2caf2ad6bd76fe295c2b77da89c9" exitCode=0 Mar 10 09:58:31 crc kubenswrapper[4794]: I0310 09:58:31.832598 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgrtq" event={"ID":"cde9bff4-7089-47c3-a6e8-95c39511b06f","Type":"ContainerDied","Data":"d630c4004007614968b2ebd7bac4b646c5dd2caf2ad6bd76fe295c2b77da89c9"} Mar 10 09:58:32 crc kubenswrapper[4794]: I0310 09:58:32.841649 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-tgrtq" event={"ID":"cde9bff4-7089-47c3-a6e8-95c39511b06f","Type":"ContainerStarted","Data":"3b1ab3d80e163729aec4868322a62124d8d1a7b8dd4eed1dd7c66e345782535a"} Mar 10 09:58:32 crc kubenswrapper[4794]: I0310 09:58:32.866473 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tgrtq" podStartSLOduration=2.427113194 podStartE2EDuration="4.866451755s" podCreationTimestamp="2026-03-10 09:58:28 +0000 UTC" firstStartedPulling="2026-03-10 09:58:29.815623554 +0000 UTC m=+858.571794402" lastFinishedPulling="2026-03-10 09:58:32.254962145 +0000 UTC m=+861.011132963" observedRunningTime="2026-03-10 09:58:32.859382326 +0000 UTC m=+861.615553214" watchObservedRunningTime="2026-03-10 09:58:32.866451755 +0000 UTC m=+861.622622613" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.035839 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mm9nq"] Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.037808 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovn-controller" containerID="cri-o://6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac" gracePeriod=30 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.037874 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="nbdb" containerID="cri-o://f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c" gracePeriod=30 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.037960 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="kube-rbac-proxy-node" containerID="cri-o://5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c" gracePeriod=30 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.037992 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovn-acl-logging" containerID="cri-o://4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57" gracePeriod=30 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.038039 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41" gracePeriod=30 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.037999 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="northd" containerID="cri-o://f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051" gracePeriod=30 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.038175 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="sbdb" containerID="cri-o://01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd" 
gracePeriod=30 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.086291 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" containerID="cri-o://b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57" gracePeriod=30 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.368078 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/3.log" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.370997 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovn-acl-logging/0.log" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.371537 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovn-controller/0.log" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.371991 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.428738 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7tzmx"] Mar 10 09:58:38 crc kubenswrapper[4794]: E0310 09:58:38.428932 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="sbdb" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.428944 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="sbdb" Mar 10 09:58:38 crc kubenswrapper[4794]: E0310 09:58:38.428954 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.428960 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: E0310 09:58:38.428968 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="nbdb" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.428974 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="nbdb" Mar 10 09:58:38 crc kubenswrapper[4794]: E0310 09:58:38.428981 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.428987 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 09:58:38 crc kubenswrapper[4794]: E0310 09:58:38.428996 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429002 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: E0310 09:58:38.429010 4794 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="kubecfg-setup" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429015 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="kubecfg-setup" Mar 10 09:58:38 crc kubenswrapper[4794]: E0310 09:58:38.429026 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="northd" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429034 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="northd" Mar 10 09:58:38 crc kubenswrapper[4794]: E0310 09:58:38.429045 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovn-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429053 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovn-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: E0310 09:58:38.429062 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovn-acl-logging" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429067 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovn-acl-logging" Mar 10 09:58:38 crc kubenswrapper[4794]: E0310 09:58:38.429079 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429084 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: E0310 09:58:38.429093 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="kube-rbac-proxy-node" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429100 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="kube-rbac-proxy-node" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429203 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovn-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429216 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429222 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="kube-rbac-proxy-node" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429230 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovn-acl-logging" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429237 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429244 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429250 4794 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429258 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="nbdb" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429265 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="sbdb" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429273 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="northd" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429281 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: E0310 09:58:38.429406 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429415 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: E0310 09:58:38.429428 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429433 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.429522 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerName="ovnkube-controller" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.430947 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.476849 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-run-netns\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.476887 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-systemd\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.476928 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dhqd\" (UniqueName: \"kubernetes.io/projected/d6907de6-7eb7-440a-a101-f492ffa28e39-kube-api-access-5dhqd\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.476944 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-cni-bin\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.476963 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-ovnkube-config\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.476993 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-env-overrides\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477015 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-var-lib-openvswitch\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477029 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-slash\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477059 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-ovn\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477089 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-openvswitch\") pod 
\"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477103 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-kubelet\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477120 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-log-socket\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477139 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-etc-openvswitch\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477158 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477181 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6907de6-7eb7-440a-a101-f492ffa28e39-ovn-node-metrics-cert\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477204 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-systemd-units\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477223 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-cni-netd\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477236 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-run-ovn-kubernetes\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477249 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-node-log\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477269 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-ovnkube-script-lib\") pod \"d6907de6-7eb7-440a-a101-f492ffa28e39\" (UID: \"d6907de6-7eb7-440a-a101-f492ffa28e39\") " Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477418 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84306302-a4d7-4ec6-8005-afca6c0f9191-env-overrides\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477444 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-kubelet\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477463 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-run-openvswitch\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477483 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-node-log\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477498 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-log-socket\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477514 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-cni-netd\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477531 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-etc-openvswitch\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477547 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-var-lib-openvswitch\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477570 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477590 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-run-ovn\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477607 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/84306302-a4d7-4ec6-8005-afca6c0f9191-ovnkube-script-lib\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.476982 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477668 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-node-log" (OuterVolumeSpecName: "node-log") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477681 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477701 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477624 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-run-netns\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477849 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-cni-bin\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477912 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdklq\" (UniqueName: \"kubernetes.io/projected/84306302-a4d7-4ec6-8005-afca6c0f9191-kube-api-access-hdklq\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477943 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-systemd-units\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478023 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84306302-a4d7-4ec6-8005-afca6c0f9191-ovn-node-metrics-cert\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477719 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-slash" (OuterVolumeSpecName: "host-slash") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477747 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477373 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477593 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477616 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477771 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477763 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-log-socket" (OuterVolumeSpecName: "log-socket") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477789 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478108 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84306302-a4d7-4ec6-8005-afca6c0f9191-ovnkube-config\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477802 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477013 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477826 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.477845 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478023 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478139 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-run-ovn-kubernetes\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478180 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-slash\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478202 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-run-systemd\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478278 4794 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478322 4794 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478348 4794 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-log-socket\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478361 4794 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478375 4794 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478387 4794 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478399 4794 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478411 4794 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-node-log\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478421 4794 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478431 4794 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478441 4794 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478451 4794 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478462 4794 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478472 4794 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6907de6-7eb7-440a-a101-f492ffa28e39-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478485 4794 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478495 4794 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-host-slash\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.478507 4794 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.481854 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6907de6-7eb7-440a-a101-f492ffa28e39-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.481918 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6907de6-7eb7-440a-a101-f492ffa28e39-kube-api-access-5dhqd" (OuterVolumeSpecName: "kube-api-access-5dhqd") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "kube-api-access-5dhqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.488919 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d6907de6-7eb7-440a-a101-f492ffa28e39" (UID: "d6907de6-7eb7-440a-a101-f492ffa28e39"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579155 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-node-log\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579194 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-log-socket\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579216 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-cni-netd\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579244 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-var-lib-openvswitch\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579263 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-etc-openvswitch\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579277 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-node-log\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579295 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579359 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-run-ovn\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579378 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-cni-netd\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579425 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-run-ovn\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579390 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/84306302-a4d7-4ec6-8005-afca6c0f9191-ovnkube-script-lib\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579399 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-log-socket\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579421 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-var-lib-openvswitch\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579419 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-etc-openvswitch\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579495 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579501 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-run-netns\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579535 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-run-netns\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579544 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-cni-bin\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579578 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-cni-bin\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579606 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdklq\" (UniqueName: \"kubernetes.io/projected/84306302-a4d7-4ec6-8005-afca6c0f9191-kube-api-access-hdklq\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579625 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-systemd-units\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579654 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84306302-a4d7-4ec6-8005-afca6c0f9191-ovn-node-metrics-cert\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579694 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84306302-a4d7-4ec6-8005-afca6c0f9191-ovnkube-config\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579718 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-run-ovn-kubernetes\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579743 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-slash\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579774 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-run-systemd\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579821 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84306302-a4d7-4ec6-8005-afca6c0f9191-env-overrides\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579853 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-kubelet\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579871 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-run-openvswitch\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579934 4794 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6907de6-7eb7-440a-a101-f492ffa28e39-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579950 4794 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6907de6-7eb7-440a-a101-f492ffa28e39-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579960 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dhqd\" (UniqueName: \"kubernetes.io/projected/d6907de6-7eb7-440a-a101-f492ffa28e39-kube-api-access-5dhqd\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579989 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-run-openvswitch\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.580017 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-run-systemd\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.580020 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-slash\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.580054 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/84306302-a4d7-4ec6-8005-afca6c0f9191-ovnkube-script-lib\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.580104 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-run-ovn-kubernetes\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.580112 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-host-kubelet\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.579717 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/84306302-a4d7-4ec6-8005-afca6c0f9191-systemd-units\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.580561 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84306302-a4d7-4ec6-8005-afca6c0f9191-env-overrides\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.580685 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84306302-a4d7-4ec6-8005-afca6c0f9191-ovnkube-config\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.583871 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84306302-a4d7-4ec6-8005-afca6c0f9191-ovn-node-metrics-cert\") pod \"ovnkube-node-7tzmx\" (UID: \"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.596154 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdklq\" (UniqueName: \"kubernetes.io/projected/84306302-a4d7-4ec6-8005-afca6c0f9191-kube-api-access-hdklq\") pod \"ovnkube-node-7tzmx\" (UID: 
\"84306302-a4d7-4ec6-8005-afca6c0f9191\") " pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.745528 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:38 crc kubenswrapper[4794]: W0310 09:58:38.772863 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84306302_a4d7_4ec6_8005_afca6c0f9191.slice/crio-ac900d66c8e1b791f0ae2d5eceac594d2d482a1a26a0c31ffce252d1ef1fbda1 WatchSource:0}: Error finding container ac900d66c8e1b791f0ae2d5eceac594d2d482a1a26a0c31ffce252d1ef1fbda1: Status 404 returned error can't find the container with id ac900d66c8e1b791f0ae2d5eceac594d2d482a1a26a0c31ffce252d1ef1fbda1 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.860626 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.860931 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.876592 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpdth_11028118-385a-4a2a-8bc4-49aad67ce147/kube-multus/2.log" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.877036 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpdth_11028118-385a-4a2a-8bc4-49aad67ce147/kube-multus/1.log" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.877075 4794 generic.go:334] "Generic (PLEG): container finished" podID="11028118-385a-4a2a-8bc4-49aad67ce147" containerID="3920ff9b5b2c0054283f96181e8561137507b6a3b05185c78e1f5bf2968a1845" exitCode=2 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.877124 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpdth" event={"ID":"11028118-385a-4a2a-8bc4-49aad67ce147","Type":"ContainerDied","Data":"3920ff9b5b2c0054283f96181e8561137507b6a3b05185c78e1f5bf2968a1845"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.877156 4794 scope.go:117] "RemoveContainer" containerID="7132e8357d430e33f2f8e8172b67b9aba8deac7f2725d5222c6ccece6e4589d1" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.877605 4794 scope.go:117] "RemoveContainer" containerID="3920ff9b5b2c0054283f96181e8561137507b6a3b05185c78e1f5bf2968a1845" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.882412 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovnkube-controller/3.log" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.887702 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovn-acl-logging/0.log" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.888388 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm9nq_d6907de6-7eb7-440a-a101-f492ffa28e39/ovn-controller/0.log" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.888875 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerID="b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57" exitCode=0 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 
09:58:38.888901 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerID="01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd" exitCode=0 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.888912 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerID="f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c" exitCode=0 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.888923 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerID="f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051" exitCode=0 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.888931 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerID="cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41" exitCode=0 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.888938 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerID="5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c" exitCode=0 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.888935 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerDied","Data":"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.888996 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerDied","Data":"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889020 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerDied","Data":"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.888946 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerID="4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57" exitCode=143 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889037 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerDied","Data":"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889054 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerDied","Data":"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889070 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerDied","Data":"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889084 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889098 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889107 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889115 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889123 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889131 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889137 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889145 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889152 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889159 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.888952 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889169 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerDied","Data":"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889263 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889275 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889281 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889286 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889291 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889296 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889301 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889306 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889311 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889315 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889345 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerDied","Data":"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889359 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889367 4794 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889373 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889379 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889386 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889392 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889398 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889402 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889408 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889414 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889049 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6907de6-7eb7-440a-a101-f492ffa28e39" containerID="6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac" exitCode=143 Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889463 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm9nq" event={"ID":"d6907de6-7eb7-440a-a101-f492ffa28e39","Type":"ContainerDied","Data":"37811c6f400dc349945266a557e31553144442b3b962784c47f1679857265df2"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889473 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889478 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889483 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889489 4794 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889494 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889498 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889505 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889510 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889514 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.889519 4794 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.892772 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" event={"ID":"84306302-a4d7-4ec6-8005-afca6c0f9191","Type":"ContainerStarted","Data":"31f25f492872b6b74068bdaee672ce4deb20baa7e783cac059ea247efde2f65b"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.892795 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" event={"ID":"84306302-a4d7-4ec6-8005-afca6c0f9191","Type":"ContainerStarted","Data":"ac900d66c8e1b791f0ae2d5eceac594d2d482a1a26a0c31ffce252d1ef1fbda1"} Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.911854 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.944879 4794 scope.go:117] "RemoveContainer" containerID="b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.955695 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.981470 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mm9nq"] Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.986888 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mm9nq"] Mar 10 09:58:38 crc kubenswrapper[4794]: I0310 09:58:38.989860 4794 scope.go:117] "RemoveContainer" containerID="db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.009944 4794 scope.go:117] "RemoveContainer" containerID="01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd" Mar 10 
09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.024026 4794 scope.go:117] "RemoveContainer" containerID="f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.047976 4794 scope.go:117] "RemoveContainer" containerID="f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.073547 4794 scope.go:117] "RemoveContainer" containerID="cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.089236 4794 scope.go:117] "RemoveContainer" containerID="5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.103399 4794 scope.go:117] "RemoveContainer" containerID="4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.120249 4794 scope.go:117] "RemoveContainer" containerID="6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.140041 4794 scope.go:117] "RemoveContainer" containerID="40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.157594 4794 scope.go:117] "RemoveContainer" containerID="b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57" Mar 10 09:58:39 crc kubenswrapper[4794]: E0310 09:58:39.158005 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57\": container with ID starting with b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57 not found: ID does not exist" containerID="b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.158048 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57"} err="failed to get container status \"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57\": rpc error: code = NotFound desc = could not find container \"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57\": container with ID starting with b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.158073 4794 scope.go:117] "RemoveContainer" containerID="db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58" Mar 10 09:58:39 crc kubenswrapper[4794]: E0310 09:58:39.158317 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58\": container with ID starting with db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58 not found: ID does not exist" containerID="db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.158356 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58"} err="failed to get container status \"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58\": rpc error: code = NotFound desc = could not find container 
\"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58\": container with ID starting with db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.158377 4794 scope.go:117] "RemoveContainer" containerID="01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd" Mar 10 09:58:39 crc kubenswrapper[4794]: E0310 09:58:39.158653 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\": container with ID starting with 01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd not found: ID does not exist" containerID="01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.158681 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd"} err="failed to get container status \"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\": rpc error: code = NotFound desc = could not find container \"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\": container with ID starting with 01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.158709 4794 scope.go:117] "RemoveContainer" containerID="f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c" Mar 10 09:58:39 crc kubenswrapper[4794]: E0310 09:58:39.158924 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\": container with ID starting with f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c not found: ID does not exist" containerID="f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.158951 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c"} err="failed to get container status \"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\": rpc error: code = NotFound desc = could not find container \"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\": container with ID starting with f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.158970 4794 scope.go:117] "RemoveContainer" containerID="f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051" Mar 10 09:58:39 crc kubenswrapper[4794]: E0310 09:58:39.159169 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\": container with ID starting with f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051 not found: ID does not exist" containerID="f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.159200 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051"} 
err="failed to get container status \"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\": rpc error: code = NotFound desc = could not find container \"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\": container with ID starting with f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.159221 4794 scope.go:117] "RemoveContainer" containerID="cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41" Mar 10 09:58:39 crc kubenswrapper[4794]: E0310 09:58:39.159440 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\": container with ID starting with cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41 not found: ID does not exist" containerID="cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.159468 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41"} err="failed to get container status \"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\": rpc error: code = NotFound desc = could not find container \"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\": container with ID starting with cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.159484 4794 scope.go:117] "RemoveContainer" containerID="5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c" Mar 10 09:58:39 crc kubenswrapper[4794]: E0310 09:58:39.159697 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\": container with ID starting with 5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c not found: ID does not exist" containerID="5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.159721 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c"} err="failed to get container status \"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\": rpc error: code = NotFound desc = could not find container \"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\": container with ID starting with 5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.159739 4794 scope.go:117] "RemoveContainer" containerID="4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57" Mar 10 09:58:39 crc kubenswrapper[4794]: E0310 09:58:39.159973 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\": container with ID starting with 4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57 not found: ID does not exist" containerID="4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.160000 4794 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57"} err="failed to get container status \"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\": rpc error: code = NotFound desc = could not find container \"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\": container with ID starting with 4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.160015 4794 scope.go:117] "RemoveContainer" containerID="6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac" Mar 10 09:58:39 crc kubenswrapper[4794]: E0310 09:58:39.160368 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\": container with ID starting with 6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac not found: ID does not exist" containerID="6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.160398 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac"} err="failed to get container status \"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\": rpc error: code = NotFound desc = could not find container \"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\": container with ID starting with 6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.160414 4794 scope.go:117] "RemoveContainer" containerID="40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd" Mar 10 09:58:39 crc kubenswrapper[4794]: E0310 09:58:39.160708 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\": container with ID starting with 40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd not found: ID does not exist" containerID="40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.160756 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd"} err="failed to get container status \"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\": rpc error: code = NotFound desc = could not find container \"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\": container with ID starting with 40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.160789 4794 scope.go:117] "RemoveContainer" containerID="b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.161117 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57"} err="failed to get container status \"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57\": rpc error: code = NotFound desc = could 
not find container \"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57\": container with ID starting with b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.161146 4794 scope.go:117] "RemoveContainer" containerID="db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.161425 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58"} err="failed to get container status \"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58\": rpc error: code = NotFound desc = could not find container \"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58\": container with ID starting with db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.161451 4794 scope.go:117] "RemoveContainer" containerID="01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.161720 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd"} err="failed to get container status \"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\": rpc error: code = NotFound desc = could not find container \"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\": container with ID starting with 01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.161747 4794 scope.go:117] "RemoveContainer" containerID="f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.161949 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c"} err="failed to get container status \"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\": rpc error: code = NotFound desc = could not find container \"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\": container with ID starting with f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.161972 4794 scope.go:117] "RemoveContainer" containerID="f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.162372 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tgrtq"] Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.162782 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051"} err="failed to get container status \"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\": rpc error: code = NotFound desc = could not find container \"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\": container with ID starting with f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.162801 4794 
scope.go:117] "RemoveContainer" containerID="cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.163034 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41"} err="failed to get container status \"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\": rpc error: code = NotFound desc = could not find container \"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\": container with ID starting with cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.163061 4794 scope.go:117] "RemoveContainer" containerID="5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.163303 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c"} err="failed to get container status \"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\": rpc error: code = NotFound desc = could not find container \"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\": container with ID starting with 5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.163377 4794 scope.go:117] "RemoveContainer" containerID="4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.163652 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57"} err="failed to get container status \"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\": rpc error: code = NotFound desc = could not find container \"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\": container with ID starting with 4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.163681 4794 scope.go:117] "RemoveContainer" containerID="6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.163912 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac"} err="failed to get container status \"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\": rpc error: code = NotFound desc = could not find container \"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\": container with ID starting with 6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.163939 4794 scope.go:117] "RemoveContainer" containerID="40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.164191 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd"} err="failed to get container status \"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\": rpc error: code = 
NotFound desc = could not find container \"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\": container with ID starting with 40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.164209 4794 scope.go:117] "RemoveContainer" containerID="b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.164644 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57"} err="failed to get container status \"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57\": rpc error: code = NotFound desc = could not find container \"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57\": container with ID starting with b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.164672 4794 scope.go:117] "RemoveContainer" containerID="db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.164883 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58"} err="failed to get container status \"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58\": rpc error: code = NotFound desc = could not find container \"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58\": container with ID starting with db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.164910 4794 scope.go:117] "RemoveContainer" containerID="01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.165403 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd"} err="failed to get container status \"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\": rpc error: code = NotFound desc = could not find container \"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\": container with ID starting with 01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.165427 4794 scope.go:117] "RemoveContainer" containerID="f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.165697 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c"} err="failed to get container status \"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\": rpc error: code = NotFound desc = could not find container \"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\": container with ID starting with f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.165722 4794 scope.go:117] "RemoveContainer" containerID="f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 
09:58:39.166040 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051"} err="failed to get container status \"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\": rpc error: code = NotFound desc = could not find container \"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\": container with ID starting with f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.166064 4794 scope.go:117] "RemoveContainer" containerID="cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.166310 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41"} err="failed to get container status \"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\": rpc error: code = NotFound desc = could not find container \"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\": container with ID starting with cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.166352 4794 scope.go:117] "RemoveContainer" containerID="5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.166595 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c"} err="failed to get container status \"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\": rpc error: code = NotFound desc = could not find container \"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\": container with ID starting with 5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.166622 4794 scope.go:117] "RemoveContainer" containerID="4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.166875 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57"} err="failed to get container status \"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\": rpc error: code = NotFound desc = could not find container \"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\": container with ID starting with 4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.166901 4794 scope.go:117] "RemoveContainer" containerID="6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.167113 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac"} err="failed to get container status \"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\": rpc error: code = NotFound desc = could not find container \"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\": container with ID starting with 
6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.167136 4794 scope.go:117] "RemoveContainer" containerID="40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.167381 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd"} err="failed to get container status \"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\": rpc error: code = NotFound desc = could not find container \"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\": container with ID starting with 40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.167410 4794 scope.go:117] "RemoveContainer" containerID="b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.167702 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57"} err="failed to get container status \"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57\": rpc error: code = NotFound desc = could not find container \"b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57\": container with ID starting with b88a0e22611d7a1518609affa6087b133d21efdb5a7e80a5d4de584a1e106e57 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.167728 4794 scope.go:117] "RemoveContainer" containerID="db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.167975 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58"} err="failed to get container status \"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58\": rpc error: code = NotFound desc = could not find container \"db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58\": container with ID starting with db67319402db24bdefe093e2db9fa128803504208bed6a528ba39f3dc8996c58 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.168002 4794 scope.go:117] "RemoveContainer" containerID="01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.168300 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd"} err="failed to get container status \"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\": rpc error: code = NotFound desc = could not find container \"01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd\": container with ID starting with 01f874a0c0eedbf796a95816f7e3fa80891f260f22087c48aea06498330167dd not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.168326 4794 scope.go:117] "RemoveContainer" containerID="f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.168601 4794 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c"} err="failed to get container status \"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\": rpc error: code = NotFound desc = could not find container \"f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c\": container with ID starting with f392e7eaab035834aa954a19e6c9d926bbedf2a547b3a11fba67c0bb966df58c not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.168632 4794 scope.go:117] "RemoveContainer" containerID="f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.168926 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051"} err="failed to get container status \"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\": rpc error: code = NotFound desc = could not find container \"f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051\": container with ID starting with f4fd74482a81888186636681a0568f49e919ac0d729db583ba9178c42e488051 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.168950 4794 scope.go:117] "RemoveContainer" containerID="cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.169210 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41"} err="failed to get container status \"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\": rpc error: code = NotFound desc = could not find container \"cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41\": container with ID starting with cf907aae6bd62c6969511c2483d9891a3d89407fa596e7bd4d6e18a3739aea41 not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.169234 4794 scope.go:117] "RemoveContainer" containerID="5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.169494 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c"} err="failed to get container status \"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\": rpc error: code = NotFound desc = could not find container \"5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c\": container with ID starting with 5d6a3c6de4d134994477dbeb7d3ccdac86f691a7c465321397a93be240c84c4c not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.169517 4794 scope.go:117] "RemoveContainer" containerID="4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.169803 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57"} err="failed to get container status \"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\": rpc error: code = NotFound desc = could not find container \"4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57\": container with ID starting with 4fe68076842f5ec0c1922f36a3fdae71ce2e309fd4fa24d4ee4e15c16e514b57 not found: ID does not exist" Mar 
10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.169831 4794 scope.go:117] "RemoveContainer" containerID="6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.170061 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac"} err="failed to get container status \"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\": rpc error: code = NotFound desc = could not find container \"6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac\": container with ID starting with 6993b19937515a45f9fab7b20a6856320e79cb3526295d52f48338126d519cac not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.170083 4794 scope.go:117] "RemoveContainer" containerID="40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.170300 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd"} err="failed to get container status \"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\": rpc error: code = NotFound desc = could not find container \"40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd\": container with ID starting with 40c8d4f75a838a0ea1ce416d08c6d4628c2cbdbc209d50922f2f02867142bedd not found: ID does not exist" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.904700 4794 generic.go:334] "Generic (PLEG): container finished" podID="84306302-a4d7-4ec6-8005-afca6c0f9191" containerID="31f25f492872b6b74068bdaee672ce4deb20baa7e783cac059ea247efde2f65b" exitCode=0 Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.904796 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" event={"ID":"84306302-a4d7-4ec6-8005-afca6c0f9191","Type":"ContainerDied","Data":"31f25f492872b6b74068bdaee672ce4deb20baa7e783cac059ea247efde2f65b"} Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.905160 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" event={"ID":"84306302-a4d7-4ec6-8005-afca6c0f9191","Type":"ContainerStarted","Data":"34d3b772b63f0bb0b530c8a008a839334ca190c99f0d474b087e791cb273601a"} Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.905178 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" event={"ID":"84306302-a4d7-4ec6-8005-afca6c0f9191","Type":"ContainerStarted","Data":"8d77463748cab4a48c94540687331b4c6e389d5c593368ae1340703ad3ddf555"} Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.905190 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" event={"ID":"84306302-a4d7-4ec6-8005-afca6c0f9191","Type":"ContainerStarted","Data":"88b88b7ff6cce02fd7004e3b910368bcfce2e7852d7d62bbbe9ac151689aa023"} Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.905202 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" event={"ID":"84306302-a4d7-4ec6-8005-afca6c0f9191","Type":"ContainerStarted","Data":"2adc95d562200348241b0862bb2fc63dd586d9f84920284107c01f99714df086"} Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.905213 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" event={"ID":"84306302-a4d7-4ec6-8005-afca6c0f9191","Type":"ContainerStarted","Data":"d9fac3c8e119f05e8af2d18a1bb143731d68ef75c5012495ec8e58ce0a0d2f3d"} Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.905227 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" event={"ID":"84306302-a4d7-4ec6-8005-afca6c0f9191","Type":"ContainerStarted","Data":"eb9e109f287db13efabc52ee49d7aeee921e18009cd4f01dfdc03a15d89d6655"} Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.907625 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpdth_11028118-385a-4a2a-8bc4-49aad67ce147/kube-multus/2.log" Mar 10 09:58:39 crc kubenswrapper[4794]: I0310 09:58:39.907707 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpdth" event={"ID":"11028118-385a-4a2a-8bc4-49aad67ce147","Type":"ContainerStarted","Data":"af95a91a74195c99022588d17414531b1c6e6d30c77298bebef8907175665bf2"} Mar 10 09:58:40 crc kubenswrapper[4794]: I0310 09:58:40.004765 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6907de6-7eb7-440a-a101-f492ffa28e39" path="/var/lib/kubelet/pods/d6907de6-7eb7-440a-a101-f492ffa28e39/volumes" Mar 10 09:58:40 crc kubenswrapper[4794]: I0310 09:58:40.915649 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tgrtq" podUID="cde9bff4-7089-47c3-a6e8-95c39511b06f" containerName="registry-server" containerID="cri-o://3b1ab3d80e163729aec4868322a62124d8d1a7b8dd4eed1dd7c66e345782535a" gracePeriod=2 Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.089093 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.214748 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn89k\" (UniqueName: \"kubernetes.io/projected/cde9bff4-7089-47c3-a6e8-95c39511b06f-kube-api-access-rn89k\") pod \"cde9bff4-7089-47c3-a6e8-95c39511b06f\" (UID: \"cde9bff4-7089-47c3-a6e8-95c39511b06f\") " Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.214827 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cde9bff4-7089-47c3-a6e8-95c39511b06f-catalog-content\") pod \"cde9bff4-7089-47c3-a6e8-95c39511b06f\" (UID: \"cde9bff4-7089-47c3-a6e8-95c39511b06f\") " Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.214883 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cde9bff4-7089-47c3-a6e8-95c39511b06f-utilities\") pod \"cde9bff4-7089-47c3-a6e8-95c39511b06f\" (UID: \"cde9bff4-7089-47c3-a6e8-95c39511b06f\") " Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.215803 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cde9bff4-7089-47c3-a6e8-95c39511b06f-utilities" (OuterVolumeSpecName: "utilities") pod "cde9bff4-7089-47c3-a6e8-95c39511b06f" (UID: "cde9bff4-7089-47c3-a6e8-95c39511b06f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.222808 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cde9bff4-7089-47c3-a6e8-95c39511b06f-kube-api-access-rn89k" (OuterVolumeSpecName: "kube-api-access-rn89k") pod "cde9bff4-7089-47c3-a6e8-95c39511b06f" (UID: "cde9bff4-7089-47c3-a6e8-95c39511b06f"). InnerVolumeSpecName "kube-api-access-rn89k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.316598 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn89k\" (UniqueName: \"kubernetes.io/projected/cde9bff4-7089-47c3-a6e8-95c39511b06f-kube-api-access-rn89k\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.316634 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cde9bff4-7089-47c3-a6e8-95c39511b06f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.779964 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cde9bff4-7089-47c3-a6e8-95c39511b06f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cde9bff4-7089-47c3-a6e8-95c39511b06f" (UID: "cde9bff4-7089-47c3-a6e8-95c39511b06f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.823780 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cde9bff4-7089-47c3-a6e8-95c39511b06f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.929549 4794 generic.go:334] "Generic (PLEG): container finished" podID="cde9bff4-7089-47c3-a6e8-95c39511b06f" containerID="3b1ab3d80e163729aec4868322a62124d8d1a7b8dd4eed1dd7c66e345782535a" exitCode=0 Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.929599 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgrtq" event={"ID":"cde9bff4-7089-47c3-a6e8-95c39511b06f","Type":"ContainerDied","Data":"3b1ab3d80e163729aec4868322a62124d8d1a7b8dd4eed1dd7c66e345782535a"} Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.929864 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgrtq" event={"ID":"cde9bff4-7089-47c3-a6e8-95c39511b06f","Type":"ContainerDied","Data":"f724aafe3240f73b61d5f5fb418ebd22c8caa12e7244e3f427025c5562ba6a11"} Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.929624 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tgrtq" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.929895 4794 scope.go:117] "RemoveContainer" containerID="3b1ab3d80e163729aec4868322a62124d8d1a7b8dd4eed1dd7c66e345782535a" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.935674 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" event={"ID":"84306302-a4d7-4ec6-8005-afca6c0f9191","Type":"ContainerStarted","Data":"3d0aa2d165dea4f197f0edafaf72e17075d4a795da6766313dfb5ff179639d88"} Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.950774 4794 scope.go:117] "RemoveContainer" containerID="d630c4004007614968b2ebd7bac4b646c5dd2caf2ad6bd76fe295c2b77da89c9" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.959186 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tgrtq"] Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.965207 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tgrtq"] Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.979390 4794 scope.go:117] "RemoveContainer" containerID="2bd6de5fc8aafffbf7471f4b53b6fefbb7f5ba9b9bf663c60c8772a471ea1d2f" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.995386 4794 scope.go:117] "RemoveContainer" containerID="3b1ab3d80e163729aec4868322a62124d8d1a7b8dd4eed1dd7c66e345782535a" Mar 10 09:58:41 crc kubenswrapper[4794]: E0310 09:58:41.996169 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b1ab3d80e163729aec4868322a62124d8d1a7b8dd4eed1dd7c66e345782535a\": container with ID starting with 3b1ab3d80e163729aec4868322a62124d8d1a7b8dd4eed1dd7c66e345782535a not found: ID does not exist" containerID="3b1ab3d80e163729aec4868322a62124d8d1a7b8dd4eed1dd7c66e345782535a" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.996206 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b1ab3d80e163729aec4868322a62124d8d1a7b8dd4eed1dd7c66e345782535a"} err="failed to get container status \"3b1ab3d80e163729aec4868322a62124d8d1a7b8dd4eed1dd7c66e345782535a\": rpc error: code = NotFound desc = could not find container \"3b1ab3d80e163729aec4868322a62124d8d1a7b8dd4eed1dd7c66e345782535a\": container with ID starting with 3b1ab3d80e163729aec4868322a62124d8d1a7b8dd4eed1dd7c66e345782535a not found: ID does not exist" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.996229 4794 scope.go:117] "RemoveContainer" containerID="d630c4004007614968b2ebd7bac4b646c5dd2caf2ad6bd76fe295c2b77da89c9" Mar 10 09:58:41 crc kubenswrapper[4794]: E0310 09:58:41.996749 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d630c4004007614968b2ebd7bac4b646c5dd2caf2ad6bd76fe295c2b77da89c9\": container with ID starting with d630c4004007614968b2ebd7bac4b646c5dd2caf2ad6bd76fe295c2b77da89c9 not found: ID does not exist" containerID="d630c4004007614968b2ebd7bac4b646c5dd2caf2ad6bd76fe295c2b77da89c9" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.996880 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d630c4004007614968b2ebd7bac4b646c5dd2caf2ad6bd76fe295c2b77da89c9"} err="failed to get container status \"d630c4004007614968b2ebd7bac4b646c5dd2caf2ad6bd76fe295c2b77da89c9\": rpc error: code = NotFound desc = could not find 
container \"d630c4004007614968b2ebd7bac4b646c5dd2caf2ad6bd76fe295c2b77da89c9\": container with ID starting with d630c4004007614968b2ebd7bac4b646c5dd2caf2ad6bd76fe295c2b77da89c9 not found: ID does not exist" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.996986 4794 scope.go:117] "RemoveContainer" containerID="2bd6de5fc8aafffbf7471f4b53b6fefbb7f5ba9b9bf663c60c8772a471ea1d2f" Mar 10 09:58:41 crc kubenswrapper[4794]: E0310 09:58:41.997481 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bd6de5fc8aafffbf7471f4b53b6fefbb7f5ba9b9bf663c60c8772a471ea1d2f\": container with ID starting with 2bd6de5fc8aafffbf7471f4b53b6fefbb7f5ba9b9bf663c60c8772a471ea1d2f not found: ID does not exist" containerID="2bd6de5fc8aafffbf7471f4b53b6fefbb7f5ba9b9bf663c60c8772a471ea1d2f" Mar 10 09:58:41 crc kubenswrapper[4794]: I0310 09:58:41.997510 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bd6de5fc8aafffbf7471f4b53b6fefbb7f5ba9b9bf663c60c8772a471ea1d2f"} err="failed to get container status \"2bd6de5fc8aafffbf7471f4b53b6fefbb7f5ba9b9bf663c60c8772a471ea1d2f\": rpc error: code = NotFound desc = could not find container \"2bd6de5fc8aafffbf7471f4b53b6fefbb7f5ba9b9bf663c60c8772a471ea1d2f\": container with ID starting with 2bd6de5fc8aafffbf7471f4b53b6fefbb7f5ba9b9bf663c60c8772a471ea1d2f not found: ID does not exist" Mar 10 09:58:42 crc kubenswrapper[4794]: I0310 09:58:42.016182 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cde9bff4-7089-47c3-a6e8-95c39511b06f" path="/var/lib/kubelet/pods/cde9bff4-7089-47c3-a6e8-95c39511b06f/volumes" Mar 10 09:58:44 crc kubenswrapper[4794]: I0310 09:58:44.958667 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" event={"ID":"84306302-a4d7-4ec6-8005-afca6c0f9191","Type":"ContainerStarted","Data":"7f2db4ef49902b48865397cb518277d96e6eb863cd97da7cfdc5bedebf01ad17"} Mar 10 09:58:44 crc kubenswrapper[4794]: I0310 09:58:44.959265 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:44 crc kubenswrapper[4794]: I0310 09:58:44.959276 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:44 crc kubenswrapper[4794]: I0310 09:58:44.988221 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:44 crc kubenswrapper[4794]: I0310 09:58:44.999593 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" podStartSLOduration=6.999577509 podStartE2EDuration="6.999577509s" podCreationTimestamp="2026-03-10 09:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:58:44.997255487 +0000 UTC m=+873.753426345" watchObservedRunningTime="2026-03-10 09:58:44.999577509 +0000 UTC m=+873.755748327" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.059401 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-9vh2h"] Mar 10 09:58:45 crc kubenswrapper[4794]: E0310 09:58:45.059606 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde9bff4-7089-47c3-a6e8-95c39511b06f" containerName="registry-server" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 
09:58:45.059621 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde9bff4-7089-47c3-a6e8-95c39511b06f" containerName="registry-server" Mar 10 09:58:45 crc kubenswrapper[4794]: E0310 09:58:45.059636 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde9bff4-7089-47c3-a6e8-95c39511b06f" containerName="extract-content" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.059643 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde9bff4-7089-47c3-a6e8-95c39511b06f" containerName="extract-content" Mar 10 09:58:45 crc kubenswrapper[4794]: E0310 09:58:45.059657 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde9bff4-7089-47c3-a6e8-95c39511b06f" containerName="extract-utilities" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.059664 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde9bff4-7089-47c3-a6e8-95c39511b06f" containerName="extract-utilities" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.059769 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde9bff4-7089-47c3-a6e8-95c39511b06f" containerName="registry-server" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.060158 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.062927 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.063137 4794 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-76wvz" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.069142 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.069276 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.090989 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9vh2h"] Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.163850 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-crc-storage\") pod \"crc-storage-crc-9vh2h\" (UID: \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\") " pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.163892 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpkd4\" (UniqueName: \"kubernetes.io/projected/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-kube-api-access-hpkd4\") pod \"crc-storage-crc-9vh2h\" (UID: \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\") " pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.163913 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-node-mnt\") pod \"crc-storage-crc-9vh2h\" (UID: \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\") " pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.265064 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-crc-storage\") pod \"crc-storage-crc-9vh2h\" (UID: \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\") " pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.265110 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpkd4\" (UniqueName: \"kubernetes.io/projected/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-kube-api-access-hpkd4\") pod \"crc-storage-crc-9vh2h\" (UID: \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\") " pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.265132 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-node-mnt\") pod \"crc-storage-crc-9vh2h\" (UID: \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\") " pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.265310 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-node-mnt\") pod \"crc-storage-crc-9vh2h\" (UID: \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\") " pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.266171 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-crc-storage\") pod \"crc-storage-crc-9vh2h\" (UID: \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\") " pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.288446 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpkd4\" (UniqueName: \"kubernetes.io/projected/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-kube-api-access-hpkd4\") pod \"crc-storage-crc-9vh2h\" (UID: \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\") " pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.400578 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:45 crc kubenswrapper[4794]: E0310 09:58:45.438824 4794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9vh2h_crc-storage_2cdb631e-df45-47a7-bcfe-d659cbd1fd1e_0(3ca58058784a5e70992e510e25f4b4d9e6637b603e6a710bef80552382ab1061): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:58:45 crc kubenswrapper[4794]: E0310 09:58:45.438952 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9vh2h_crc-storage_2cdb631e-df45-47a7-bcfe-d659cbd1fd1e_0(3ca58058784a5e70992e510e25f4b4d9e6637b603e6a710bef80552382ab1061): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:45 crc kubenswrapper[4794]: E0310 09:58:45.439034 4794 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9vh2h_crc-storage_2cdb631e-df45-47a7-bcfe-d659cbd1fd1e_0(3ca58058784a5e70992e510e25f4b4d9e6637b603e6a710bef80552382ab1061): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:45 crc kubenswrapper[4794]: E0310 09:58:45.439125 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-9vh2h_crc-storage(2cdb631e-df45-47a7-bcfe-d659cbd1fd1e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-9vh2h_crc-storage(2cdb631e-df45-47a7-bcfe-d659cbd1fd1e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9vh2h_crc-storage_2cdb631e-df45-47a7-bcfe-d659cbd1fd1e_0(3ca58058784a5e70992e510e25f4b4d9e6637b603e6a710bef80552382ab1061): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-9vh2h" podUID="2cdb631e-df45-47a7-bcfe-d659cbd1fd1e" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.964645 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.965254 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:45 crc kubenswrapper[4794]: I0310 09:58:45.965405 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:46 crc kubenswrapper[4794]: I0310 09:58:46.015806 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:58:46 crc kubenswrapper[4794]: E0310 09:58:46.026415 4794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9vh2h_crc-storage_2cdb631e-df45-47a7-bcfe-d659cbd1fd1e_0(5eb5c11a7465bb9b665e14de9815a08e0760d61326d2366d3f33edb9d98020ea): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:58:46 crc kubenswrapper[4794]: E0310 09:58:46.026549 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9vh2h_crc-storage_2cdb631e-df45-47a7-bcfe-d659cbd1fd1e_0(5eb5c11a7465bb9b665e14de9815a08e0760d61326d2366d3f33edb9d98020ea): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:46 crc kubenswrapper[4794]: E0310 09:58:46.026595 4794 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9vh2h_crc-storage_2cdb631e-df45-47a7-bcfe-d659cbd1fd1e_0(5eb5c11a7465bb9b665e14de9815a08e0760d61326d2366d3f33edb9d98020ea): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:58:46 crc kubenswrapper[4794]: E0310 09:58:46.026670 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-9vh2h_crc-storage(2cdb631e-df45-47a7-bcfe-d659cbd1fd1e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-9vh2h_crc-storage(2cdb631e-df45-47a7-bcfe-d659cbd1fd1e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9vh2h_crc-storage_2cdb631e-df45-47a7-bcfe-d659cbd1fd1e_0(5eb5c11a7465bb9b665e14de9815a08e0760d61326d2366d3f33edb9d98020ea): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-9vh2h" podUID="2cdb631e-df45-47a7-bcfe-d659cbd1fd1e" Mar 10 09:58:52 crc kubenswrapper[4794]: I0310 09:58:52.968322 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:58:52 crc kubenswrapper[4794]: I0310 09:58:52.968793 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:59:00 crc kubenswrapper[4794]: I0310 09:59:00.998910 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:59:01 crc kubenswrapper[4794]: I0310 09:59:00.999976 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:59:01 crc kubenswrapper[4794]: I0310 09:59:01.181948 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9vh2h"] Mar 10 09:59:01 crc kubenswrapper[4794]: I0310 09:59:01.192482 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:59:02 crc kubenswrapper[4794]: I0310 09:59:02.060975 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9vh2h" event={"ID":"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e","Type":"ContainerStarted","Data":"72bc8e0cdf0f66b7711a06fa317d1ad298508ae9de36570d8d9c5e53931f293d"} Mar 10 09:59:03 crc kubenswrapper[4794]: I0310 09:59:03.068932 4794 generic.go:334] "Generic (PLEG): container finished" podID="2cdb631e-df45-47a7-bcfe-d659cbd1fd1e" containerID="800525ae1fae6bbec2e45f78754b0cad9cf595eaf5b4cd7024b07fe4c8a121ef" exitCode=0 Mar 10 09:59:03 crc kubenswrapper[4794]: I0310 09:59:03.069049 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9vh2h" event={"ID":"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e","Type":"ContainerDied","Data":"800525ae1fae6bbec2e45f78754b0cad9cf595eaf5b4cd7024b07fe4c8a121ef"} Mar 10 09:59:04 crc kubenswrapper[4794]: I0310 09:59:04.293996 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:59:04 crc kubenswrapper[4794]: I0310 09:59:04.430200 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-crc-storage\") pod \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\" (UID: \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\") " Mar 10 09:59:04 crc kubenswrapper[4794]: I0310 09:59:04.430417 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-node-mnt\") pod \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\" (UID: \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\") " Mar 10 09:59:04 crc kubenswrapper[4794]: I0310 09:59:04.430542 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpkd4\" (UniqueName: \"kubernetes.io/projected/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-kube-api-access-hpkd4\") pod \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\" (UID: \"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e\") " Mar 10 09:59:04 crc kubenswrapper[4794]: I0310 09:59:04.430612 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "2cdb631e-df45-47a7-bcfe-d659cbd1fd1e" (UID: "2cdb631e-df45-47a7-bcfe-d659cbd1fd1e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:59:04 crc kubenswrapper[4794]: I0310 09:59:04.430947 4794 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 10 09:59:04 crc kubenswrapper[4794]: I0310 09:59:04.435032 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-kube-api-access-hpkd4" (OuterVolumeSpecName: "kube-api-access-hpkd4") pod "2cdb631e-df45-47a7-bcfe-d659cbd1fd1e" (UID: "2cdb631e-df45-47a7-bcfe-d659cbd1fd1e"). InnerVolumeSpecName "kube-api-access-hpkd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:59:04 crc kubenswrapper[4794]: I0310 09:59:04.447009 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "2cdb631e-df45-47a7-bcfe-d659cbd1fd1e" (UID: "2cdb631e-df45-47a7-bcfe-d659cbd1fd1e"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:59:04 crc kubenswrapper[4794]: I0310 09:59:04.532477 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpkd4\" (UniqueName: \"kubernetes.io/projected/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-kube-api-access-hpkd4\") on node \"crc\" DevicePath \"\"" Mar 10 09:59:04 crc kubenswrapper[4794]: I0310 09:59:04.532521 4794 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 10 09:59:05 crc kubenswrapper[4794]: I0310 09:59:05.081349 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9vh2h" event={"ID":"2cdb631e-df45-47a7-bcfe-d659cbd1fd1e","Type":"ContainerDied","Data":"72bc8e0cdf0f66b7711a06fa317d1ad298508ae9de36570d8d9c5e53931f293d"} Mar 10 09:59:05 crc kubenswrapper[4794]: I0310 09:59:05.081389 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9vh2h" Mar 10 09:59:05 crc kubenswrapper[4794]: I0310 09:59:05.081404 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72bc8e0cdf0f66b7711a06fa317d1ad298508ae9de36570d8d9c5e53931f293d" Mar 10 09:59:08 crc kubenswrapper[4794]: I0310 09:59:08.779563 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7tzmx" Mar 10 09:59:11 crc kubenswrapper[4794]: I0310 09:59:11.788092 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld"] Mar 10 09:59:11 crc kubenswrapper[4794]: E0310 09:59:11.789182 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdb631e-df45-47a7-bcfe-d659cbd1fd1e" containerName="storage" Mar 10 09:59:11 crc kubenswrapper[4794]: I0310 09:59:11.789216 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdb631e-df45-47a7-bcfe-d659cbd1fd1e" containerName="storage" Mar 10 09:59:11 crc kubenswrapper[4794]: I0310 09:59:11.789477 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cdb631e-df45-47a7-bcfe-d659cbd1fd1e" containerName="storage" Mar 10 09:59:11 crc kubenswrapper[4794]: I0310 09:59:11.790946 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" Mar 10 09:59:11 crc kubenswrapper[4794]: I0310 09:59:11.793693 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 09:59:11 crc kubenswrapper[4794]: I0310 09:59:11.795775 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld"] Mar 10 09:59:11 crc kubenswrapper[4794]: I0310 09:59:11.924975 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxd9q\" (UniqueName: \"kubernetes.io/projected/cf8896bc-caa9-4120-8390-a5ffe0897859-kube-api-access-qxd9q\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld\" (UID: \"cf8896bc-caa9-4120-8390-a5ffe0897859\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" Mar 10 09:59:11 crc kubenswrapper[4794]: I0310 09:59:11.925060 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf8896bc-caa9-4120-8390-a5ffe0897859-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld\" (UID: \"cf8896bc-caa9-4120-8390-a5ffe0897859\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" Mar 10 09:59:11 crc kubenswrapper[4794]: I0310 09:59:11.925126 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf8896bc-caa9-4120-8390-a5ffe0897859-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld\" (UID: \"cf8896bc-caa9-4120-8390-a5ffe0897859\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" Mar 10 09:59:12 crc kubenswrapper[4794]: I0310 09:59:12.026516 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf8896bc-caa9-4120-8390-a5ffe0897859-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld\" (UID: \"cf8896bc-caa9-4120-8390-a5ffe0897859\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" Mar 10 09:59:12 crc kubenswrapper[4794]: I0310 09:59:12.026605 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf8896bc-caa9-4120-8390-a5ffe0897859-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld\" (UID: \"cf8896bc-caa9-4120-8390-a5ffe0897859\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" Mar 10 09:59:12 crc kubenswrapper[4794]: I0310 09:59:12.026697 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxd9q\" (UniqueName: \"kubernetes.io/projected/cf8896bc-caa9-4120-8390-a5ffe0897859-kube-api-access-qxd9q\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld\" (UID: \"cf8896bc-caa9-4120-8390-a5ffe0897859\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" Mar 10 09:59:12 crc kubenswrapper[4794]: I0310 09:59:12.027416 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/cf8896bc-caa9-4120-8390-a5ffe0897859-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld\" (UID: \"cf8896bc-caa9-4120-8390-a5ffe0897859\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" Mar 10 09:59:12 crc kubenswrapper[4794]: I0310 09:59:12.027495 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf8896bc-caa9-4120-8390-a5ffe0897859-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld\" (UID: \"cf8896bc-caa9-4120-8390-a5ffe0897859\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" Mar 10 09:59:12 crc kubenswrapper[4794]: I0310 09:59:12.050704 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxd9q\" (UniqueName: \"kubernetes.io/projected/cf8896bc-caa9-4120-8390-a5ffe0897859-kube-api-access-qxd9q\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld\" (UID: \"cf8896bc-caa9-4120-8390-a5ffe0897859\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" Mar 10 09:59:12 crc kubenswrapper[4794]: I0310 09:59:12.120956 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 09:59:12 crc kubenswrapper[4794]: I0310 09:59:12.128653 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" Mar 10 09:59:12 crc kubenswrapper[4794]: I0310 09:59:12.579977 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld"] Mar 10 09:59:12 crc kubenswrapper[4794]: W0310 09:59:12.591428 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf8896bc_caa9_4120_8390_a5ffe0897859.slice/crio-8cb9fd8824b3991a79567861a16793774ab26b78d13c15ecc0828c42e42eacfc WatchSource:0}: Error finding container 8cb9fd8824b3991a79567861a16793774ab26b78d13c15ecc0828c42e42eacfc: Status 404 returned error can't find the container with id 8cb9fd8824b3991a79567861a16793774ab26b78d13c15ecc0828c42e42eacfc Mar 10 09:59:13 crc kubenswrapper[4794]: I0310 09:59:13.154021 4794 generic.go:334] "Generic (PLEG): container finished" podID="cf8896bc-caa9-4120-8390-a5ffe0897859" containerID="39718189970b0a94f27370c5b80b826321e303f227e6c7e09f19a1711ad505fa" exitCode=0 Mar 10 09:59:13 crc kubenswrapper[4794]: I0310 09:59:13.154281 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" event={"ID":"cf8896bc-caa9-4120-8390-a5ffe0897859","Type":"ContainerDied","Data":"39718189970b0a94f27370c5b80b826321e303f227e6c7e09f19a1711ad505fa"} Mar 10 09:59:13 crc kubenswrapper[4794]: I0310 09:59:13.154534 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" event={"ID":"cf8896bc-caa9-4120-8390-a5ffe0897859","Type":"ContainerStarted","Data":"8cb9fd8824b3991a79567861a16793774ab26b78d13c15ecc0828c42e42eacfc"} Mar 10 09:59:14 crc kubenswrapper[4794]: I0310 09:59:14.133860 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8h9tq"] Mar 10 09:59:14 crc kubenswrapper[4794]: 
I0310 09:59:14.135353 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8h9tq" Mar 10 09:59:14 crc kubenswrapper[4794]: I0310 09:59:14.147992 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8h9tq"] Mar 10 09:59:14 crc kubenswrapper[4794]: I0310 09:59:14.299234 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/392e3be1-f7e7-4164-b474-bb4a91036e8e-utilities\") pod \"redhat-operators-8h9tq\" (UID: \"392e3be1-f7e7-4164-b474-bb4a91036e8e\") " pod="openshift-marketplace/redhat-operators-8h9tq" Mar 10 09:59:14 crc kubenswrapper[4794]: I0310 09:59:14.299284 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n94nb\" (UniqueName: \"kubernetes.io/projected/392e3be1-f7e7-4164-b474-bb4a91036e8e-kube-api-access-n94nb\") pod \"redhat-operators-8h9tq\" (UID: \"392e3be1-f7e7-4164-b474-bb4a91036e8e\") " pod="openshift-marketplace/redhat-operators-8h9tq" Mar 10 09:59:14 crc kubenswrapper[4794]: I0310 09:59:14.299311 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392e3be1-f7e7-4164-b474-bb4a91036e8e-catalog-content\") pod \"redhat-operators-8h9tq\" (UID: \"392e3be1-f7e7-4164-b474-bb4a91036e8e\") " pod="openshift-marketplace/redhat-operators-8h9tq" Mar 10 09:59:14 crc kubenswrapper[4794]: I0310 09:59:14.401312 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392e3be1-f7e7-4164-b474-bb4a91036e8e-catalog-content\") pod \"redhat-operators-8h9tq\" (UID: \"392e3be1-f7e7-4164-b474-bb4a91036e8e\") " pod="openshift-marketplace/redhat-operators-8h9tq" Mar 10 09:59:14 crc kubenswrapper[4794]: I0310 09:59:14.401411 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/392e3be1-f7e7-4164-b474-bb4a91036e8e-utilities\") pod \"redhat-operators-8h9tq\" (UID: \"392e3be1-f7e7-4164-b474-bb4a91036e8e\") " pod="openshift-marketplace/redhat-operators-8h9tq" Mar 10 09:59:14 crc kubenswrapper[4794]: I0310 09:59:14.401439 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n94nb\" (UniqueName: \"kubernetes.io/projected/392e3be1-f7e7-4164-b474-bb4a91036e8e-kube-api-access-n94nb\") pod \"redhat-operators-8h9tq\" (UID: \"392e3be1-f7e7-4164-b474-bb4a91036e8e\") " pod="openshift-marketplace/redhat-operators-8h9tq" Mar 10 09:59:14 crc kubenswrapper[4794]: I0310 09:59:14.402262 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392e3be1-f7e7-4164-b474-bb4a91036e8e-catalog-content\") pod \"redhat-operators-8h9tq\" (UID: \"392e3be1-f7e7-4164-b474-bb4a91036e8e\") " pod="openshift-marketplace/redhat-operators-8h9tq" Mar 10 09:59:14 crc kubenswrapper[4794]: I0310 09:59:14.402440 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/392e3be1-f7e7-4164-b474-bb4a91036e8e-utilities\") pod \"redhat-operators-8h9tq\" (UID: \"392e3be1-f7e7-4164-b474-bb4a91036e8e\") " pod="openshift-marketplace/redhat-operators-8h9tq" Mar 10 09:59:14 crc kubenswrapper[4794]: I0310 09:59:14.435814 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n94nb\" (UniqueName: \"kubernetes.io/projected/392e3be1-f7e7-4164-b474-bb4a91036e8e-kube-api-access-n94nb\") pod \"redhat-operators-8h9tq\" (UID: \"392e3be1-f7e7-4164-b474-bb4a91036e8e\") " pod="openshift-marketplace/redhat-operators-8h9tq"
Mar 10 09:59:14 crc kubenswrapper[4794]: I0310 09:59:14.492889 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8h9tq"
Mar 10 09:59:14 crc kubenswrapper[4794]: I0310 09:59:14.687352 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8h9tq"]
Mar 10 09:59:14 crc kubenswrapper[4794]: W0310 09:59:14.693991 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod392e3be1_f7e7_4164_b474_bb4a91036e8e.slice/crio-80cd649ddff3483b73f0f8752dbe9469ce1eaaaa3fee25abccfdcc4ec96a4835 WatchSource:0}: Error finding container 80cd649ddff3483b73f0f8752dbe9469ce1eaaaa3fee25abccfdcc4ec96a4835: Status 404 returned error can't find the container with id 80cd649ddff3483b73f0f8752dbe9469ce1eaaaa3fee25abccfdcc4ec96a4835
Mar 10 09:59:15 crc kubenswrapper[4794]: I0310 09:59:15.200930 4794 generic.go:334] "Generic (PLEG): container finished" podID="cf8896bc-caa9-4120-8390-a5ffe0897859" containerID="67120802a19c8fdeb8d5801bfeb0f6d920613cf5677c6f50b315f04ad1cf54aa" exitCode=0
Mar 10 09:59:15 crc kubenswrapper[4794]: I0310 09:59:15.201021 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" event={"ID":"cf8896bc-caa9-4120-8390-a5ffe0897859","Type":"ContainerDied","Data":"67120802a19c8fdeb8d5801bfeb0f6d920613cf5677c6f50b315f04ad1cf54aa"}
Mar 10 09:59:15 crc kubenswrapper[4794]: I0310 09:59:15.202445 4794 generic.go:334] "Generic (PLEG): container finished" podID="392e3be1-f7e7-4164-b474-bb4a91036e8e" containerID="8a97bf30063dfaa5261d74db97b8cf8ed7a3308124d8cce61410873826005dec" exitCode=0
Mar 10 09:59:15 crc kubenswrapper[4794]: I0310 09:59:15.202466 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8h9tq" event={"ID":"392e3be1-f7e7-4164-b474-bb4a91036e8e","Type":"ContainerDied","Data":"8a97bf30063dfaa5261d74db97b8cf8ed7a3308124d8cce61410873826005dec"}
Mar 10 09:59:15 crc kubenswrapper[4794]: I0310 09:59:15.202485 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8h9tq" event={"ID":"392e3be1-f7e7-4164-b474-bb4a91036e8e","Type":"ContainerStarted","Data":"80cd649ddff3483b73f0f8752dbe9469ce1eaaaa3fee25abccfdcc4ec96a4835"}
Mar 10 09:59:16 crc kubenswrapper[4794]: I0310 09:59:16.209571 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8h9tq" event={"ID":"392e3be1-f7e7-4164-b474-bb4a91036e8e","Type":"ContainerStarted","Data":"685defe5ebe9fcd6635a67a1eabbacd7436301adea79952df99cef9b53499f92"}
Mar 10 09:59:16 crc kubenswrapper[4794]: I0310 09:59:16.213172 4794 generic.go:334] "Generic (PLEG): container finished" podID="cf8896bc-caa9-4120-8390-a5ffe0897859" containerID="06e84878ae7b279f8ea7814ed7e8f361b0f1d6cc90dfc3e4af8a68aaeb0f23b7" exitCode=0
Mar 10 09:59:16 crc kubenswrapper[4794]: I0310 09:59:16.213216 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" event={"ID":"cf8896bc-caa9-4120-8390-a5ffe0897859","Type":"ContainerDied","Data":"06e84878ae7b279f8ea7814ed7e8f361b0f1d6cc90dfc3e4af8a68aaeb0f23b7"}
Mar 10 09:59:17 crc kubenswrapper[4794]: I0310 09:59:17.221103 4794 generic.go:334] "Generic (PLEG): container finished" podID="392e3be1-f7e7-4164-b474-bb4a91036e8e" containerID="685defe5ebe9fcd6635a67a1eabbacd7436301adea79952df99cef9b53499f92" exitCode=0
Mar 10 09:59:17 crc kubenswrapper[4794]: I0310 09:59:17.221191 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8h9tq" event={"ID":"392e3be1-f7e7-4164-b474-bb4a91036e8e","Type":"ContainerDied","Data":"685defe5ebe9fcd6635a67a1eabbacd7436301adea79952df99cef9b53499f92"}
Mar 10 09:59:17 crc kubenswrapper[4794]: I0310 09:59:17.457871 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld"
Mar 10 09:59:17 crc kubenswrapper[4794]: I0310 09:59:17.539889 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf8896bc-caa9-4120-8390-a5ffe0897859-bundle\") pod \"cf8896bc-caa9-4120-8390-a5ffe0897859\" (UID: \"cf8896bc-caa9-4120-8390-a5ffe0897859\") "
Mar 10 09:59:17 crc kubenswrapper[4794]: I0310 09:59:17.540033 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf8896bc-caa9-4120-8390-a5ffe0897859-util\") pod \"cf8896bc-caa9-4120-8390-a5ffe0897859\" (UID: \"cf8896bc-caa9-4120-8390-a5ffe0897859\") "
Mar 10 09:59:17 crc kubenswrapper[4794]: I0310 09:59:17.540088 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxd9q\" (UniqueName: \"kubernetes.io/projected/cf8896bc-caa9-4120-8390-a5ffe0897859-kube-api-access-qxd9q\") pod \"cf8896bc-caa9-4120-8390-a5ffe0897859\" (UID: \"cf8896bc-caa9-4120-8390-a5ffe0897859\") "
Mar 10 09:59:17 crc kubenswrapper[4794]: I0310 09:59:17.540694 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf8896bc-caa9-4120-8390-a5ffe0897859-bundle" (OuterVolumeSpecName: "bundle") pod "cf8896bc-caa9-4120-8390-a5ffe0897859" (UID: "cf8896bc-caa9-4120-8390-a5ffe0897859"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:59:17 crc kubenswrapper[4794]: I0310 09:59:17.556305 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf8896bc-caa9-4120-8390-a5ffe0897859-kube-api-access-qxd9q" (OuterVolumeSpecName: "kube-api-access-qxd9q") pod "cf8896bc-caa9-4120-8390-a5ffe0897859" (UID: "cf8896bc-caa9-4120-8390-a5ffe0897859"). InnerVolumeSpecName "kube-api-access-qxd9q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:59:17 crc kubenswrapper[4794]: I0310 09:59:17.558803 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf8896bc-caa9-4120-8390-a5ffe0897859-util" (OuterVolumeSpecName: "util") pod "cf8896bc-caa9-4120-8390-a5ffe0897859" (UID: "cf8896bc-caa9-4120-8390-a5ffe0897859"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:59:17 crc kubenswrapper[4794]: I0310 09:59:17.641658 4794 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf8896bc-caa9-4120-8390-a5ffe0897859-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:59:17 crc kubenswrapper[4794]: I0310 09:59:17.641705 4794 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf8896bc-caa9-4120-8390-a5ffe0897859-util\") on node \"crc\" DevicePath \"\""
Mar 10 09:59:17 crc kubenswrapper[4794]: I0310 09:59:17.641717 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxd9q\" (UniqueName: \"kubernetes.io/projected/cf8896bc-caa9-4120-8390-a5ffe0897859-kube-api-access-qxd9q\") on node \"crc\" DevicePath \"\""
Mar 10 09:59:18 crc kubenswrapper[4794]: I0310 09:59:18.231720 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld" event={"ID":"cf8896bc-caa9-4120-8390-a5ffe0897859","Type":"ContainerDied","Data":"8cb9fd8824b3991a79567861a16793774ab26b78d13c15ecc0828c42e42eacfc"}
Mar 10 09:59:18 crc kubenswrapper[4794]: I0310 09:59:18.232117 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cb9fd8824b3991a79567861a16793774ab26b78d13c15ecc0828c42e42eacfc"
Mar 10 09:59:18 crc kubenswrapper[4794]: I0310 09:59:18.231833 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld"
Mar 10 09:59:18 crc kubenswrapper[4794]: I0310 09:59:18.233837 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8h9tq" event={"ID":"392e3be1-f7e7-4164-b474-bb4a91036e8e","Type":"ContainerStarted","Data":"213e81194a36c49b49660c99398fb995cc1fb08d36ed3faecd924c8b29e5aff0"}
Mar 10 09:59:18 crc kubenswrapper[4794]: I0310 09:59:18.260697 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8h9tq" podStartSLOduration=1.6854650009999999 podStartE2EDuration="4.260667546s" podCreationTimestamp="2026-03-10 09:59:14 +0000 UTC" firstStartedPulling="2026-03-10 09:59:15.203657543 +0000 UTC m=+903.959828371" lastFinishedPulling="2026-03-10 09:59:17.778860098 +0000 UTC m=+906.535030916" observedRunningTime="2026-03-10 09:59:18.252949777 +0000 UTC m=+907.009120645" watchObservedRunningTime="2026-03-10 09:59:18.260667546 +0000 UTC m=+907.016838394"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.470218 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-hpmnv"]
Mar 10 09:59:22 crc kubenswrapper[4794]: E0310 09:59:22.471632 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8896bc-caa9-4120-8390-a5ffe0897859" containerName="pull"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.471720 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8896bc-caa9-4120-8390-a5ffe0897859" containerName="pull"
Mar 10 09:59:22 crc kubenswrapper[4794]: E0310 09:59:22.471823 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8896bc-caa9-4120-8390-a5ffe0897859" containerName="util"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.471899 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8896bc-caa9-4120-8390-a5ffe0897859" containerName="util"
Mar 10 09:59:22 crc kubenswrapper[4794]: E0310 09:59:22.471973 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8896bc-caa9-4120-8390-a5ffe0897859" containerName="extract"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.472050 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8896bc-caa9-4120-8390-a5ffe0897859" containerName="extract"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.472235 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf8896bc-caa9-4120-8390-a5ffe0897859" containerName="extract"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.472732 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpmnv"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.482571 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-hpmnv"]
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.488032 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.488279 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.488522 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-w47hw"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.601596 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2kvc\" (UniqueName: \"kubernetes.io/projected/03b196f2-90f3-4491-bdc7-298a917ccda7-kube-api-access-j2kvc\") pod \"nmstate-operator-75c5dccd6c-hpmnv\" (UID: \"03b196f2-90f3-4491-bdc7-298a917ccda7\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpmnv"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.702868 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2kvc\" (UniqueName: \"kubernetes.io/projected/03b196f2-90f3-4491-bdc7-298a917ccda7-kube-api-access-j2kvc\") pod \"nmstate-operator-75c5dccd6c-hpmnv\" (UID: \"03b196f2-90f3-4491-bdc7-298a917ccda7\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpmnv"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.724981 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2kvc\" (UniqueName: \"kubernetes.io/projected/03b196f2-90f3-4491-bdc7-298a917ccda7-kube-api-access-j2kvc\") pod \"nmstate-operator-75c5dccd6c-hpmnv\" (UID: \"03b196f2-90f3-4491-bdc7-298a917ccda7\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpmnv"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.791854 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpmnv"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.967996 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.968306 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:59:22 crc kubenswrapper[4794]: I0310 09:59:22.981458 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-hpmnv"]
Mar 10 09:59:23 crc kubenswrapper[4794]: I0310 09:59:23.261142 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpmnv" event={"ID":"03b196f2-90f3-4491-bdc7-298a917ccda7","Type":"ContainerStarted","Data":"e962984870de9e20b1cc6e97ecf6709509ed9bd7d54fd141d6938f2cb8170776"}
Mar 10 09:59:24 crc kubenswrapper[4794]: I0310 09:59:24.493760 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8h9tq"
Mar 10 09:59:24 crc kubenswrapper[4794]: I0310 09:59:24.494088 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8h9tq"
Mar 10 09:59:24 crc kubenswrapper[4794]: I0310 09:59:24.547591 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8h9tq"
Mar 10 09:59:25 crc kubenswrapper[4794]: I0310 09:59:25.317047 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8h9tq"
Mar 10 09:59:26 crc kubenswrapper[4794]: I0310 09:59:26.281074 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpmnv" event={"ID":"03b196f2-90f3-4491-bdc7-298a917ccda7","Type":"ContainerStarted","Data":"af1eb741af05884ceb582214bbce00d49d9b21bad9004b115f055e128cccab05"}
Mar 10 09:59:26 crc kubenswrapper[4794]: I0310 09:59:26.309634 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpmnv" podStartSLOduration=2.077742183 podStartE2EDuration="4.309613715s" podCreationTimestamp="2026-03-10 09:59:22 +0000 UTC" firstStartedPulling="2026-03-10 09:59:22.99360125 +0000 UTC m=+911.749772068" lastFinishedPulling="2026-03-10 09:59:25.225472772 +0000 UTC m=+913.981643600" observedRunningTime="2026-03-10 09:59:26.303717751 +0000 UTC m=+915.059888579" watchObservedRunningTime="2026-03-10 09:59:26.309613715 +0000 UTC m=+915.065784543"
Mar 10 09:59:27 crc kubenswrapper[4794]: I0310 09:59:27.126596 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8h9tq"]
Mar 10 09:59:27 crc kubenswrapper[4794]: I0310 09:59:27.288203 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8h9tq" podUID="392e3be1-f7e7-4164-b474-bb4a91036e8e" containerName="registry-server" containerID="cri-o://213e81194a36c49b49660c99398fb995cc1fb08d36ed3faecd924c8b29e5aff0" gracePeriod=2
Mar 10 09:59:28 crc kubenswrapper[4794]: I0310 09:59:28.296691 4794 generic.go:334] "Generic (PLEG): container finished" podID="392e3be1-f7e7-4164-b474-bb4a91036e8e" containerID="213e81194a36c49b49660c99398fb995cc1fb08d36ed3faecd924c8b29e5aff0" exitCode=0
Mar 10 09:59:28 crc kubenswrapper[4794]: I0310 09:59:28.296742 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8h9tq" event={"ID":"392e3be1-f7e7-4164-b474-bb4a91036e8e","Type":"ContainerDied","Data":"213e81194a36c49b49660c99398fb995cc1fb08d36ed3faecd924c8b29e5aff0"}
Mar 10 09:59:28 crc kubenswrapper[4794]: I0310 09:59:28.754885 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8h9tq"
Mar 10 09:59:28 crc kubenswrapper[4794]: I0310 09:59:28.882057 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/392e3be1-f7e7-4164-b474-bb4a91036e8e-utilities\") pod \"392e3be1-f7e7-4164-b474-bb4a91036e8e\" (UID: \"392e3be1-f7e7-4164-b474-bb4a91036e8e\") "
Mar 10 09:59:28 crc kubenswrapper[4794]: I0310 09:59:28.882154 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392e3be1-f7e7-4164-b474-bb4a91036e8e-catalog-content\") pod \"392e3be1-f7e7-4164-b474-bb4a91036e8e\" (UID: \"392e3be1-f7e7-4164-b474-bb4a91036e8e\") "
Mar 10 09:59:28 crc kubenswrapper[4794]: I0310 09:59:28.883527 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392e3be1-f7e7-4164-b474-bb4a91036e8e-utilities" (OuterVolumeSpecName: "utilities") pod "392e3be1-f7e7-4164-b474-bb4a91036e8e" (UID: "392e3be1-f7e7-4164-b474-bb4a91036e8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:59:28 crc kubenswrapper[4794]: I0310 09:59:28.882330 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n94nb\" (UniqueName: \"kubernetes.io/projected/392e3be1-f7e7-4164-b474-bb4a91036e8e-kube-api-access-n94nb\") pod \"392e3be1-f7e7-4164-b474-bb4a91036e8e\" (UID: \"392e3be1-f7e7-4164-b474-bb4a91036e8e\") "
Mar 10 09:59:28 crc kubenswrapper[4794]: I0310 09:59:28.884375 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/392e3be1-f7e7-4164-b474-bb4a91036e8e-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:59:28 crc kubenswrapper[4794]: I0310 09:59:28.888805 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392e3be1-f7e7-4164-b474-bb4a91036e8e-kube-api-access-n94nb" (OuterVolumeSpecName: "kube-api-access-n94nb") pod "392e3be1-f7e7-4164-b474-bb4a91036e8e" (UID: "392e3be1-f7e7-4164-b474-bb4a91036e8e"). InnerVolumeSpecName "kube-api-access-n94nb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:59:28 crc kubenswrapper[4794]: I0310 09:59:28.986062 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n94nb\" (UniqueName: \"kubernetes.io/projected/392e3be1-f7e7-4164-b474-bb4a91036e8e-kube-api-access-n94nb\") on node \"crc\" DevicePath \"\""
Mar 10 09:59:29 crc kubenswrapper[4794]: I0310 09:59:29.062290 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392e3be1-f7e7-4164-b474-bb4a91036e8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "392e3be1-f7e7-4164-b474-bb4a91036e8e" (UID: "392e3be1-f7e7-4164-b474-bb4a91036e8e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:59:29 crc kubenswrapper[4794]: I0310 09:59:29.087692 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392e3be1-f7e7-4164-b474-bb4a91036e8e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:59:29 crc kubenswrapper[4794]: I0310 09:59:29.307357 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8h9tq" event={"ID":"392e3be1-f7e7-4164-b474-bb4a91036e8e","Type":"ContainerDied","Data":"80cd649ddff3483b73f0f8752dbe9469ce1eaaaa3fee25abccfdcc4ec96a4835"}
Mar 10 09:59:29 crc kubenswrapper[4794]: I0310 09:59:29.307422 4794 scope.go:117] "RemoveContainer" containerID="213e81194a36c49b49660c99398fb995cc1fb08d36ed3faecd924c8b29e5aff0"
Mar 10 09:59:29 crc kubenswrapper[4794]: I0310 09:59:29.307446 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8h9tq"
Mar 10 09:59:29 crc kubenswrapper[4794]: I0310 09:59:29.333690 4794 scope.go:117] "RemoveContainer" containerID="685defe5ebe9fcd6635a67a1eabbacd7436301adea79952df99cef9b53499f92"
Mar 10 09:59:29 crc kubenswrapper[4794]: I0310 09:59:29.350453 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8h9tq"]
Mar 10 09:59:29 crc kubenswrapper[4794]: I0310 09:59:29.357584 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8h9tq"]
Mar 10 09:59:29 crc kubenswrapper[4794]: I0310 09:59:29.367759 4794 scope.go:117] "RemoveContainer" containerID="8a97bf30063dfaa5261d74db97b8cf8ed7a3308124d8cce61410873826005dec"
Mar 10 09:59:30 crc kubenswrapper[4794]: I0310 09:59:30.012289 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392e3be1-f7e7-4164-b474-bb4a91036e8e" path="/var/lib/kubelet/pods/392e3be1-f7e7-4164-b474-bb4a91036e8e/volumes"
Mar 10 09:59:31 crc kubenswrapper[4794]: I0310 09:59:31.903619 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-kl5qq"]
Mar 10 09:59:31 crc kubenswrapper[4794]: E0310 09:59:31.903824 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392e3be1-f7e7-4164-b474-bb4a91036e8e" containerName="extract-content"
Mar 10 09:59:31 crc kubenswrapper[4794]: I0310 09:59:31.903838 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="392e3be1-f7e7-4164-b474-bb4a91036e8e" containerName="extract-content"
Mar 10 09:59:31 crc kubenswrapper[4794]: E0310 09:59:31.903848 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392e3be1-f7e7-4164-b474-bb4a91036e8e" containerName="registry-server"
Mar 10 09:59:31 crc kubenswrapper[4794]: I0310 09:59:31.903853 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="392e3be1-f7e7-4164-b474-bb4a91036e8e" containerName="registry-server"
Mar 10 09:59:31 crc kubenswrapper[4794]: E0310 09:59:31.903870 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392e3be1-f7e7-4164-b474-bb4a91036e8e" containerName="extract-utilities"
Mar 10 09:59:31 crc kubenswrapper[4794]: I0310 09:59:31.903877 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="392e3be1-f7e7-4164-b474-bb4a91036e8e" containerName="extract-utilities"
Mar 10 09:59:31 crc kubenswrapper[4794]: I0310 09:59:31.903968 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="392e3be1-f7e7-4164-b474-bb4a91036e8e" containerName="registry-server"
Mar 10 09:59:31 crc kubenswrapper[4794]: I0310 09:59:31.904518 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-kl5qq"
Mar 10 09:59:31 crc kubenswrapper[4794]: I0310 09:59:31.907785 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-twphc"
Mar 10 09:59:31 crc kubenswrapper[4794]: I0310 09:59:31.914543 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-kl5qq"]
Mar 10 09:59:31 crc kubenswrapper[4794]: I0310 09:59:31.918753 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b"]
Mar 10 09:59:31 crc kubenswrapper[4794]: I0310 09:59:31.922175 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b"
Mar 10 09:59:31 crc kubenswrapper[4794]: I0310 09:59:31.923907 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 10 09:59:31 crc kubenswrapper[4794]: I0310 09:59:31.936235 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b"]
Mar 10 09:59:31 crc kubenswrapper[4794]: I0310 09:59:31.963634 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-mmjmm"]
Mar 10 09:59:31 crc kubenswrapper[4794]: I0310 09:59:31.964378 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.025709 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/25233eef-8e10-48bd-bd95-1366cc06f956-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-x8x2b\" (UID: \"25233eef-8e10-48bd-bd95-1366cc06f956\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.025777 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpm5h\" (UniqueName: \"kubernetes.io/projected/25233eef-8e10-48bd-bd95-1366cc06f956-kube-api-access-rpm5h\") pod \"nmstate-webhook-786f45cff4-x8x2b\" (UID: \"25233eef-8e10-48bd-bd95-1366cc06f956\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.025825 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfwx4\" (UniqueName: \"kubernetes.io/projected/c22565b3-2d95-40ea-beb3-17d2daec0262-kube-api-access-mfwx4\") pod \"nmstate-metrics-69594cc75-kl5qq\" (UID: \"c22565b3-2d95-40ea-beb3-17d2daec0262\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-kl5qq"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.042266 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj"]
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.043560 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.048955 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-84spw"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.051639 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.051872 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.057771 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj"]
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.127004 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eb034770-4c69-4e29-b08e-88af9c78c76e-dbus-socket\") pod \"nmstate-handler-mmjmm\" (UID: \"eb034770-4c69-4e29-b08e-88af9c78c76e\") " pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.127251 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eb034770-4c69-4e29-b08e-88af9c78c76e-nmstate-lock\") pod \"nmstate-handler-mmjmm\" (UID: \"eb034770-4c69-4e29-b08e-88af9c78c76e\") " pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.127363 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfwx4\" (UniqueName: \"kubernetes.io/projected/c22565b3-2d95-40ea-beb3-17d2daec0262-kube-api-access-mfwx4\") pod \"nmstate-metrics-69594cc75-kl5qq\" (UID: \"c22565b3-2d95-40ea-beb3-17d2daec0262\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-kl5qq"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.127475 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e60c58eb-6276-4eeb-b32b-21b8bb5c4d08-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-54vjj\" (UID: \"e60c58eb-6276-4eeb-b32b-21b8bb5c4d08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.127571 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eb034770-4c69-4e29-b08e-88af9c78c76e-ovs-socket\") pod \"nmstate-handler-mmjmm\" (UID: \"eb034770-4c69-4e29-b08e-88af9c78c76e\") " pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.127648 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx5fc\" (UniqueName: \"kubernetes.io/projected/e60c58eb-6276-4eeb-b32b-21b8bb5c4d08-kube-api-access-dx5fc\") pod \"nmstate-console-plugin-5dcbbd79cf-54vjj\" (UID: \"e60c58eb-6276-4eeb-b32b-21b8bb5c4d08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.127741 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcdh6\" (UniqueName: \"kubernetes.io/projected/eb034770-4c69-4e29-b08e-88af9c78c76e-kube-api-access-lcdh6\") pod \"nmstate-handler-mmjmm\" (UID: \"eb034770-4c69-4e29-b08e-88af9c78c76e\") " pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.127814 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/25233eef-8e10-48bd-bd95-1366cc06f956-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-x8x2b\" (UID: \"25233eef-8e10-48bd-bd95-1366cc06f956\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.127901 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpm5h\" (UniqueName: \"kubernetes.io/projected/25233eef-8e10-48bd-bd95-1366cc06f956-kube-api-access-rpm5h\") pod \"nmstate-webhook-786f45cff4-x8x2b\" (UID: \"25233eef-8e10-48bd-bd95-1366cc06f956\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.127986 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e60c58eb-6276-4eeb-b32b-21b8bb5c4d08-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-54vjj\" (UID: \"e60c58eb-6276-4eeb-b32b-21b8bb5c4d08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.143514 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/25233eef-8e10-48bd-bd95-1366cc06f956-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-x8x2b\" (UID: \"25233eef-8e10-48bd-bd95-1366cc06f956\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.147321 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpm5h\" (UniqueName: \"kubernetes.io/projected/25233eef-8e10-48bd-bd95-1366cc06f956-kube-api-access-rpm5h\") pod \"nmstate-webhook-786f45cff4-x8x2b\" (UID: \"25233eef-8e10-48bd-bd95-1366cc06f956\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.147591 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfwx4\" (UniqueName: \"kubernetes.io/projected/c22565b3-2d95-40ea-beb3-17d2daec0262-kube-api-access-mfwx4\") pod \"nmstate-metrics-69594cc75-kl5qq\" (UID: \"c22565b3-2d95-40ea-beb3-17d2daec0262\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-kl5qq"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.223542 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-kl5qq"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.229519 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e60c58eb-6276-4eeb-b32b-21b8bb5c4d08-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-54vjj\" (UID: \"e60c58eb-6276-4eeb-b32b-21b8bb5c4d08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.229571 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eb034770-4c69-4e29-b08e-88af9c78c76e-ovs-socket\") pod \"nmstate-handler-mmjmm\" (UID: \"eb034770-4c69-4e29-b08e-88af9c78c76e\") " pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.229588 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx5fc\" (UniqueName: \"kubernetes.io/projected/e60c58eb-6276-4eeb-b32b-21b8bb5c4d08-kube-api-access-dx5fc\") pod \"nmstate-console-plugin-5dcbbd79cf-54vjj\" (UID: \"e60c58eb-6276-4eeb-b32b-21b8bb5c4d08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.229620 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcdh6\" (UniqueName: \"kubernetes.io/projected/eb034770-4c69-4e29-b08e-88af9c78c76e-kube-api-access-lcdh6\") pod \"nmstate-handler-mmjmm\" (UID: \"eb034770-4c69-4e29-b08e-88af9c78c76e\") " pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.229661 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e60c58eb-6276-4eeb-b32b-21b8bb5c4d08-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-54vjj\" (UID: \"e60c58eb-6276-4eeb-b32b-21b8bb5c4d08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.229678 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eb034770-4c69-4e29-b08e-88af9c78c76e-dbus-socket\") pod \"nmstate-handler-mmjmm\" (UID: \"eb034770-4c69-4e29-b08e-88af9c78c76e\") " pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.229697 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eb034770-4c69-4e29-b08e-88af9c78c76e-nmstate-lock\") pod \"nmstate-handler-mmjmm\" (UID: \"eb034770-4c69-4e29-b08e-88af9c78c76e\") " pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.229717 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eb034770-4c69-4e29-b08e-88af9c78c76e-ovs-socket\") pod \"nmstate-handler-mmjmm\" (UID: \"eb034770-4c69-4e29-b08e-88af9c78c76e\") " pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.229765 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eb034770-4c69-4e29-b08e-88af9c78c76e-nmstate-lock\") pod \"nmstate-handler-mmjmm\" (UID: \"eb034770-4c69-4e29-b08e-88af9c78c76e\") " pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.230058 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eb034770-4c69-4e29-b08e-88af9c78c76e-dbus-socket\") pod \"nmstate-handler-mmjmm\" (UID: \"eb034770-4c69-4e29-b08e-88af9c78c76e\") " pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.230803 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e60c58eb-6276-4eeb-b32b-21b8bb5c4d08-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-54vjj\" (UID: \"e60c58eb-6276-4eeb-b32b-21b8bb5c4d08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.235085 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e60c58eb-6276-4eeb-b32b-21b8bb5c4d08-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-54vjj\" (UID: \"e60c58eb-6276-4eeb-b32b-21b8bb5c4d08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.235753 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.250024 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcdh6\" (UniqueName: \"kubernetes.io/projected/eb034770-4c69-4e29-b08e-88af9c78c76e-kube-api-access-lcdh6\") pod \"nmstate-handler-mmjmm\" (UID: \"eb034770-4c69-4e29-b08e-88af9c78c76e\") " pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.264688 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx5fc\" (UniqueName: \"kubernetes.io/projected/e60c58eb-6276-4eeb-b32b-21b8bb5c4d08-kube-api-access-dx5fc\") pod \"nmstate-console-plugin-5dcbbd79cf-54vjj\" (UID: \"e60c58eb-6276-4eeb-b32b-21b8bb5c4d08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.279869 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:32 crc kubenswrapper[4794]: W0310 09:59:32.320300 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb034770_4c69_4e29_b08e_88af9c78c76e.slice/crio-bbdcf73804fc2bf0422e688c56455e55030b352dd8bb81bdf31ff4a6fd2ccf11 WatchSource:0}: Error finding container bbdcf73804fc2bf0422e688c56455e55030b352dd8bb81bdf31ff4a6fd2ccf11: Status 404 returned error can't find the container with id bbdcf73804fc2bf0422e688c56455e55030b352dd8bb81bdf31ff4a6fd2ccf11
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.321315 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-74499ccd79-jbccz"]
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.321971 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.332854 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mmjmm" event={"ID":"eb034770-4c69-4e29-b08e-88af9c78c76e","Type":"ContainerStarted","Data":"bbdcf73804fc2bf0422e688c56455e55030b352dd8bb81bdf31ff4a6fd2ccf11"}
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.368656 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.376190 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74499ccd79-jbccz"]
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.433022 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51e3997e-596c-4f50-a319-1f2e89c622f4-oauth-serving-cert\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.433065 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51e3997e-596c-4f50-a319-1f2e89c622f4-console-config\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.433102 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51e3997e-596c-4f50-a319-1f2e89c622f4-console-serving-cert\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.433139 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsbpl\" (UniqueName: \"kubernetes.io/projected/51e3997e-596c-4f50-a319-1f2e89c622f4-kube-api-access-tsbpl\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.433155 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51e3997e-596c-4f50-a319-1f2e89c622f4-console-oauth-config\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.433171 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51e3997e-596c-4f50-a319-1f2e89c622f4-service-ca\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.433184 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51e3997e-596c-4f50-a319-1f2e89c622f4-trusted-ca-bundle\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.533946 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51e3997e-596c-4f50-a319-1f2e89c622f4-console-serving-cert\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.534057 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51e3997e-596c-4f50-a319-1f2e89c622f4-console-oauth-config\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.534080 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsbpl\" (UniqueName: \"kubernetes.io/projected/51e3997e-596c-4f50-a319-1f2e89c622f4-kube-api-access-tsbpl\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.534101 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51e3997e-596c-4f50-a319-1f2e89c622f4-service-ca\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.534122 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51e3997e-596c-4f50-a319-1f2e89c622f4-trusted-ca-bundle\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.534174 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51e3997e-596c-4f50-a319-1f2e89c622f4-oauth-serving-cert\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.534203 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51e3997e-596c-4f50-a319-1f2e89c622f4-console-config\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.534982 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51e3997e-596c-4f50-a319-1f2e89c622f4-service-ca\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.535047 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51e3997e-596c-4f50-a319-1f2e89c622f4-console-config\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.535173 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51e3997e-596c-4f50-a319-1f2e89c622f4-oauth-serving-cert\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.535847 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51e3997e-596c-4f50-a319-1f2e89c622f4-trusted-ca-bundle\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.543851 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51e3997e-596c-4f50-a319-1f2e89c622f4-console-oauth-config\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.543946 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51e3997e-596c-4f50-a319-1f2e89c622f4-console-serving-cert\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.553791 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsbpl\" (UniqueName: \"kubernetes.io/projected/51e3997e-596c-4f50-a319-1f2e89c622f4-kube-api-access-tsbpl\") pod \"console-74499ccd79-jbccz\" (UID: \"51e3997e-596c-4f50-a319-1f2e89c622f4\") " pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.587026 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj"]
Mar 10 09:59:32 crc kubenswrapper[4794]: W0310 09:59:32.590300 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode60c58eb_6276_4eeb_b32b_21b8bb5c4d08.slice/crio-e2e4551081abc1ca225aa40c1d51d0613665f1bca522583f8d5ba739eb3e0d62 WatchSource:0}: Error finding container e2e4551081abc1ca225aa40c1d51d0613665f1bca522583f8d5ba739eb3e0d62: Status 404 returned error can't find the container with id e2e4551081abc1ca225aa40c1d51d0613665f1bca522583f8d5ba739eb3e0d62
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.641013 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.683305 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b"]
Mar 10 09:59:32 crc kubenswrapper[4794]: W0310 09:59:32.689124 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25233eef_8e10_48bd_bd95_1366cc06f956.slice/crio-d152621cdeabfd1609197147969c17ee94398829ace6b6e6bba17c5f2071c55e WatchSource:0}: Error finding container d152621cdeabfd1609197147969c17ee94398829ace6b6e6bba17c5f2071c55e: Status 404 returned error can't find the container with id d152621cdeabfd1609197147969c17ee94398829ace6b6e6bba17c5f2071c55e
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.733580 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-kl5qq"]
Mar 10 09:59:32 crc kubenswrapper[4794]: W0310 09:59:32.740525 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc22565b3_2d95_40ea_beb3_17d2daec0262.slice/crio-e6092bb6be8034b8d1862f47f6c739f9819e3cbcd02620882a62886975bc137c WatchSource:0}: Error finding container e6092bb6be8034b8d1862f47f6c739f9819e3cbcd02620882a62886975bc137c: Status 404 returned error can't find the container with id e6092bb6be8034b8d1862f47f6c739f9819e3cbcd02620882a62886975bc137c
Mar 10 09:59:32 crc kubenswrapper[4794]: I0310 09:59:32.840360 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74499ccd79-jbccz"]
Mar 10 09:59:32 crc kubenswrapper[4794]: W0310 09:59:32.844445 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51e3997e_596c_4f50_a319_1f2e89c622f4.slice/crio-128bd139ccabc2dc17ec86c65251af4d42b9bff4307b398e65158adff5d3215a WatchSource:0}: Error finding container 128bd139ccabc2dc17ec86c65251af4d42b9bff4307b398e65158adff5d3215a: Status 404 returned error can't find the container with id 128bd139ccabc2dc17ec86c65251af4d42b9bff4307b398e65158adff5d3215a
Mar 10 09:59:33 crc kubenswrapper[4794]: I0310 09:59:33.340755 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-kl5qq" event={"ID":"c22565b3-2d95-40ea-beb3-17d2daec0262","Type":"ContainerStarted","Data":"e6092bb6be8034b8d1862f47f6c739f9819e3cbcd02620882a62886975bc137c"}
Mar 10 09:59:33 crc kubenswrapper[4794]: I0310 09:59:33.342554 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74499ccd79-jbccz" event={"ID":"51e3997e-596c-4f50-a319-1f2e89c622f4","Type":"ContainerStarted","Data":"d73a64e3ba70ea1aa6aa3d13c9a75c5e2450530619f447a39cfa79b69923685c"}
Mar 10 09:59:33 crc kubenswrapper[4794]: I0310 09:59:33.342592 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74499ccd79-jbccz" event={"ID":"51e3997e-596c-4f50-a319-1f2e89c622f4","Type":"ContainerStarted","Data":"128bd139ccabc2dc17ec86c65251af4d42b9bff4307b398e65158adff5d3215a"}
Mar 10 09:59:33 crc kubenswrapper[4794]: I0310 09:59:33.343774 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b" event={"ID":"25233eef-8e10-48bd-bd95-1366cc06f956","Type":"ContainerStarted","Data":"d152621cdeabfd1609197147969c17ee94398829ace6b6e6bba17c5f2071c55e"}
Mar 10 09:59:33 crc kubenswrapper[4794]: I0310 09:59:33.345628 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj" event={"ID":"e60c58eb-6276-4eeb-b32b-21b8bb5c4d08","Type":"ContainerStarted","Data":"e2e4551081abc1ca225aa40c1d51d0613665f1bca522583f8d5ba739eb3e0d62"}
Mar 10 09:59:33 crc kubenswrapper[4794]: I0310 09:59:33.365155 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74499ccd79-jbccz" podStartSLOduration=1.365134579 podStartE2EDuration="1.365134579s" podCreationTimestamp="2026-03-10 09:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:59:33.359371059 +0000 UTC m=+922.115541887" watchObservedRunningTime="2026-03-10 09:59:33.365134579 +0000 UTC m=+922.121305407"
Mar 10 09:59:36 crc kubenswrapper[4794]: I0310 09:59:36.365874 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b" event={"ID":"25233eef-8e10-48bd-bd95-1366cc06f956","Type":"ContainerStarted","Data":"104b55451ff99c56a85b6505ff210e28fb78659f5e814d329a6c5b57f518adfc"}
Mar 10 09:59:36 crc kubenswrapper[4794]: I0310 09:59:36.368512 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b"
Mar 10 09:59:36 crc kubenswrapper[4794]: I0310 09:59:36.369196 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mmjmm" event={"ID":"eb034770-4c69-4e29-b08e-88af9c78c76e","Type":"ContainerStarted","Data":"757f14f9890def56441c8c25c47e9d5732420f1c9d6cb1fabc0e871df0da94d9"}
Mar 10 09:59:36 crc kubenswrapper[4794]: I0310 09:59:36.369429 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:36 crc kubenswrapper[4794]: I0310 09:59:36.372971 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj" event={"ID":"e60c58eb-6276-4eeb-b32b-21b8bb5c4d08","Type":"ContainerStarted","Data":"b62b8b37f5c55f58f3bf218f494c67f18c0ceb580c08ee06c4d0092f96ea9cf4"}
Mar 10 09:59:36 crc kubenswrapper[4794]: I0310 09:59:36.376084 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-kl5qq" event={"ID":"c22565b3-2d95-40ea-beb3-17d2daec0262","Type":"ContainerStarted","Data":"ee2a82cfa0f6603f6f6e161ac4a0990fdc9f95132a9c446f6730593e5597e3e8"}
Mar 10 09:59:36 crc kubenswrapper[4794]: I0310 09:59:36.389112 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b" podStartSLOduration=2.637814525 podStartE2EDuration="5.389092595s" podCreationTimestamp="2026-03-10 09:59:31 +0000 UTC" firstStartedPulling="2026-03-10 09:59:32.692475677 +0000 UTC m=+921.448646515" lastFinishedPulling="2026-03-10 09:59:35.443753747 +0000 UTC m=+924.199924585" observedRunningTime="2026-03-10 09:59:36.387402502 +0000 UTC m=+925.143573330" watchObservedRunningTime="2026-03-10 09:59:36.389092595 +0000 UTC m=+925.145263413"
Mar 10 09:59:36 crc kubenswrapper[4794]: I0310 09:59:36.410269 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-mmjmm" podStartSLOduration=2.362740594 podStartE2EDuration="5.410246482s" podCreationTimestamp="2026-03-10 09:59:31 +0000 UTC" firstStartedPulling="2026-03-10 09:59:32.326973525 +0000 UTC m=+921.083144343" lastFinishedPulling="2026-03-10 09:59:35.374479393 +0000 UTC m=+924.130650231" observedRunningTime="2026-03-10 09:59:36.405835675 +0000 UTC m=+925.162006523" watchObservedRunningTime="2026-03-10 09:59:36.410246482 +0000 UTC m=+925.166417320"
Mar 10 09:59:38 crc kubenswrapper[4794]: I0310 09:59:38.390202 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-kl5qq" event={"ID":"c22565b3-2d95-40ea-beb3-17d2daec0262","Type":"ContainerStarted","Data":"e8b1fcc316b37990d0d909432c78d90a0c402a9a975d2b80cf3e34cd49979180"}
Mar 10 09:59:38 crc kubenswrapper[4794]: I0310 09:59:38.419120 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-kl5qq" podStartSLOduration=2.382562781 podStartE2EDuration="7.419089632s" podCreationTimestamp="2026-03-10 09:59:31 +0000 UTC" firstStartedPulling="2026-03-10 09:59:32.743839424 +0000 UTC m=+921.500010252" lastFinishedPulling="2026-03-10 09:59:37.780366265 +0000 UTC m=+926.536537103" observedRunningTime="2026-03-10 09:59:38.413108066 +0000 UTC m=+927.169278964" watchObservedRunningTime="2026-03-10 09:59:38.419089632 +0000 UTC m=+927.175260490"
Mar 10 09:59:38 crc kubenswrapper[4794]: I0310 09:59:38.421016 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-54vjj" podStartSLOduration=3.663637573 podStartE2EDuration="6.420995581s" podCreationTimestamp="2026-03-10 09:59:32 +0000 UTC" firstStartedPulling="2026-03-10 09:59:32.601070156 +0000 UTC m=+921.357240974" lastFinishedPulling="2026-03-10 09:59:35.358428164 +0000 UTC m=+924.114598982" observedRunningTime="2026-03-10 09:59:36.427024794 +0000 UTC m=+925.183195652" watchObservedRunningTime="2026-03-10 09:59:38.420995581 +0000 UTC m=+927.177166429"
Mar 10 09:59:42 crc kubenswrapper[4794]: I0310 09:59:42.305849 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-mmjmm"
Mar 10 09:59:42 crc kubenswrapper[4794]: I0310 09:59:42.642044 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:42 crc kubenswrapper[4794]: I0310 09:59:42.642143 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:42 crc kubenswrapper[4794]: I0310 09:59:42.647624 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:43 crc kubenswrapper[4794]: I0310 09:59:43.423444 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74499ccd79-jbccz"
Mar 10 09:59:43 crc kubenswrapper[4794]: I0310 09:59:43.487205 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-skqw6"]
Mar 10 09:59:52 crc kubenswrapper[4794]: I0310 09:59:52.242988 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-x8x2b"
Mar 10 09:59:52 crc kubenswrapper[4794]: I0310 09:59:52.968456 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:59:52 crc kubenswrapper[4794]: I0310 09:59:52.969234 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:59:52 crc kubenswrapper[4794]: I0310 09:59:52.969506 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278"
Mar 10 09:59:52 crc kubenswrapper[4794]: I0310 09:59:52.970720 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"900615b0bd1702fdf79917b75d57707d4e97b8f262a88b05aa4883f6d0d20891"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 09:59:52 crc kubenswrapper[4794]: I0310 09:59:52.970975 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://900615b0bd1702fdf79917b75d57707d4e97b8f262a88b05aa4883f6d0d20891" gracePeriod=600
Mar 10 09:59:53 crc kubenswrapper[4794]: I0310 09:59:53.482837 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="900615b0bd1702fdf79917b75d57707d4e97b8f262a88b05aa4883f6d0d20891" exitCode=0
Mar 10 09:59:53 crc kubenswrapper[4794]: I0310 09:59:53.483124 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"900615b0bd1702fdf79917b75d57707d4e97b8f262a88b05aa4883f6d0d20891"}
Mar 10 09:59:53 crc kubenswrapper[4794]: I0310 09:59:53.483150 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"961f68351db8d19b5c0a0d1359e0a0bfe3a6d383630ab326fdce756a36734d0e"}
Mar 10 09:59:53 crc kubenswrapper[4794]: I0310 09:59:53.483165 4794 scope.go:117] "RemoveContainer" containerID="008d2d6b5be2dee5277f2851a79ea6544ead77396a09ca0e14c2fa485aead805"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.142935 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552280-m8kkn"]
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.147797 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552280-m8kkn"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.151733 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.151818 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.151948 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.169101 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"]
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.170216 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.172748 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.172890 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.174266 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552280-m8kkn"]
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.179933 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"]
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.212902 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sct9\" (UniqueName: \"kubernetes.io/projected/449e1884-a4e4-4b83-b831-ddbc4f598eff-kube-api-access-2sct9\") pod \"auto-csr-approver-29552280-m8kkn\" (UID: \"449e1884-a4e4-4b83-b831-ddbc4f598eff\") " pod="openshift-infra/auto-csr-approver-29552280-m8kkn"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.314319 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m2zd\" (UniqueName: \"kubernetes.io/projected/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-kube-api-access-4m2zd\") pod \"collect-profiles-29552280-t9n75\" (UID: \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.314415 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sct9\" (UniqueName: \"kubernetes.io/projected/449e1884-a4e4-4b83-b831-ddbc4f598eff-kube-api-access-2sct9\") pod \"auto-csr-approver-29552280-m8kkn\" (UID: \"449e1884-a4e4-4b83-b831-ddbc4f598eff\") " pod="openshift-infra/auto-csr-approver-29552280-m8kkn"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.314444 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-secret-volume\") pod \"collect-profiles-29552280-t9n75\" (UID: \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.314530 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-config-volume\") pod \"collect-profiles-29552280-t9n75\" (UID: \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.332979 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sct9\" (UniqueName: \"kubernetes.io/projected/449e1884-a4e4-4b83-b831-ddbc4f598eff-kube-api-access-2sct9\") pod \"auto-csr-approver-29552280-m8kkn\" (UID: \"449e1884-a4e4-4b83-b831-ddbc4f598eff\") " pod="openshift-infra/auto-csr-approver-29552280-m8kkn"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.415547 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-secret-volume\") pod \"collect-profiles-29552280-t9n75\" (UID: \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.415639 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-config-volume\") pod \"collect-profiles-29552280-t9n75\" (UID: \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.415667 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m2zd\" (UniqueName: \"kubernetes.io/projected/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-kube-api-access-4m2zd\") pod \"collect-profiles-29552280-t9n75\" (UID: \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.416550 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-config-volume\") pod \"collect-profiles-29552280-t9n75\" (UID: \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.419571 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-secret-volume\") pod \"collect-profiles-29552280-t9n75\" (UID: \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.432485 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m2zd\" (UniqueName: \"kubernetes.io/projected/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-kube-api-access-4m2zd\") pod \"collect-profiles-29552280-t9n75\" (UID: \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.470695 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552280-m8kkn"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.490502 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.878046 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552280-m8kkn"]
Mar 10 10:00:00 crc kubenswrapper[4794]: I0310 10:00:00.924247 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"]
Mar 10 10:00:00 crc kubenswrapper[4794]: W0310 10:00:00.932090 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9be7aae6_30f2_4a0f_8aa3_c88cc81603d7.slice/crio-b0f47c6420676b8bfbe820ffd17e2290450fd8f3f7d11968682c8159d2e3d67c WatchSource:0}: Error finding container b0f47c6420676b8bfbe820ffd17e2290450fd8f3f7d11968682c8159d2e3d67c: Status 404 returned error can't find the container with id b0f47c6420676b8bfbe820ffd17e2290450fd8f3f7d11968682c8159d2e3d67c
Mar 10 10:00:01 crc kubenswrapper[4794]: I0310 10:00:01.558573 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552280-m8kkn" event={"ID":"449e1884-a4e4-4b83-b831-ddbc4f598eff","Type":"ContainerStarted","Data":"74d014e0778dffb0925520662fe9f4f8467621bfc66a0444c56c799a3b05987c"}
Mar 10 10:00:01 crc kubenswrapper[4794]: I0310 10:00:01.561395 4794 generic.go:334] "Generic (PLEG): container finished" podID="9be7aae6-30f2-4a0f-8aa3-c88cc81603d7" containerID="2e16baaeb24214f08df54071f10bd8f70b65ec2a0a85f920ccc8fbe70ee66c9b" exitCode=0
Mar 10 10:00:01 crc kubenswrapper[4794]: I0310 10:00:01.561453 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75" event={"ID":"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7","Type":"ContainerDied","Data":"2e16baaeb24214f08df54071f10bd8f70b65ec2a0a85f920ccc8fbe70ee66c9b"}
Mar 10 10:00:01 crc kubenswrapper[4794]: I0310 10:00:01.561486 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75" event={"ID":"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7","Type":"ContainerStarted","Data":"b0f47c6420676b8bfbe820ffd17e2290450fd8f3f7d11968682c8159d2e3d67c"}
Mar 10 10:00:02 crc kubenswrapper[4794]: I0310 10:00:02.779593 4794 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75" Mar 10 10:00:02 crc kubenswrapper[4794]: I0310 10:00:02.847642 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-secret-volume\") pod \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\" (UID: \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\") " Mar 10 10:00:02 crc kubenswrapper[4794]: I0310 10:00:02.847707 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-config-volume\") pod \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\" (UID: \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\") " Mar 10 10:00:02 crc kubenswrapper[4794]: I0310 10:00:02.847770 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m2zd\" (UniqueName: \"kubernetes.io/projected/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-kube-api-access-4m2zd\") pod \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\" (UID: \"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7\") " Mar 10 10:00:02 crc kubenswrapper[4794]: I0310 10:00:02.849593 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-config-volume" (OuterVolumeSpecName: "config-volume") pod "9be7aae6-30f2-4a0f-8aa3-c88cc81603d7" (UID: "9be7aae6-30f2-4a0f-8aa3-c88cc81603d7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:00:02 crc kubenswrapper[4794]: I0310 10:00:02.853379 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-kube-api-access-4m2zd" (OuterVolumeSpecName: "kube-api-access-4m2zd") pod "9be7aae6-30f2-4a0f-8aa3-c88cc81603d7" (UID: "9be7aae6-30f2-4a0f-8aa3-c88cc81603d7"). InnerVolumeSpecName "kube-api-access-4m2zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:00:02 crc kubenswrapper[4794]: I0310 10:00:02.853551 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9be7aae6-30f2-4a0f-8aa3-c88cc81603d7" (UID: "9be7aae6-30f2-4a0f-8aa3-c88cc81603d7"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:00:02 crc kubenswrapper[4794]: I0310 10:00:02.949081 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:02 crc kubenswrapper[4794]: I0310 10:00:02.949404 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:02 crc kubenswrapper[4794]: I0310 10:00:02.949418 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m2zd\" (UniqueName: \"kubernetes.io/projected/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7-kube-api-access-4m2zd\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:03 crc kubenswrapper[4794]: I0310 10:00:03.575568 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75" event={"ID":"9be7aae6-30f2-4a0f-8aa3-c88cc81603d7","Type":"ContainerDied","Data":"b0f47c6420676b8bfbe820ffd17e2290450fd8f3f7d11968682c8159d2e3d67c"} Mar 10 10:00:03 crc kubenswrapper[4794]: I0310 10:00:03.575799 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0f47c6420676b8bfbe820ffd17e2290450fd8f3f7d11968682c8159d2e3d67c" Mar 10 10:00:03 crc kubenswrapper[4794]: I0310 10:00:03.575659 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75" Mar 10 10:00:04 crc kubenswrapper[4794]: I0310 10:00:04.582753 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552280-m8kkn" event={"ID":"449e1884-a4e4-4b83-b831-ddbc4f598eff","Type":"ContainerStarted","Data":"10faf05a6df463cbd3424d36728612a9072a29a6e68134f130f0004c2fc9cd93"} Mar 10 10:00:04 crc kubenswrapper[4794]: I0310 10:00:04.611999 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552280-m8kkn" podStartSLOduration=1.265497801 podStartE2EDuration="4.611975433s" podCreationTimestamp="2026-03-10 10:00:00 +0000 UTC" firstStartedPulling="2026-03-10 10:00:00.888578173 +0000 UTC m=+949.644749011" lastFinishedPulling="2026-03-10 10:00:04.235055825 +0000 UTC m=+952.991226643" observedRunningTime="2026-03-10 10:00:04.610864288 +0000 UTC m=+953.367035116" watchObservedRunningTime="2026-03-10 10:00:04.611975433 +0000 UTC m=+953.368146271" Mar 10 10:00:04 crc kubenswrapper[4794]: I0310 10:00:04.981326 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58"] Mar 10 10:00:04 crc kubenswrapper[4794]: E0310 10:00:04.981652 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be7aae6-30f2-4a0f-8aa3-c88cc81603d7" containerName="collect-profiles" Mar 10 10:00:04 crc kubenswrapper[4794]: I0310 10:00:04.981674 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be7aae6-30f2-4a0f-8aa3-c88cc81603d7" containerName="collect-profiles" Mar 10 10:00:04 crc kubenswrapper[4794]: I0310 10:00:04.981792 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9be7aae6-30f2-4a0f-8aa3-c88cc81603d7" containerName="collect-profiles" Mar 10 10:00:04 crc kubenswrapper[4794]: I0310 10:00:04.982731 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" Mar 10 10:00:04 crc kubenswrapper[4794]: I0310 10:00:04.984704 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 10:00:04 crc kubenswrapper[4794]: I0310 10:00:04.988067 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58"] Mar 10 10:00:05 crc kubenswrapper[4794]: I0310 10:00:05.085299 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cce41c0b-f107-450b-95f5-64dae06dbe14-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58\" (UID: \"cce41c0b-f107-450b-95f5-64dae06dbe14\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" Mar 10 10:00:05 crc kubenswrapper[4794]: I0310 10:00:05.085398 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cce41c0b-f107-450b-95f5-64dae06dbe14-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58\" (UID: \"cce41c0b-f107-450b-95f5-64dae06dbe14\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" Mar 10 10:00:05 crc kubenswrapper[4794]: I0310 10:00:05.085424 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whvg4\" (UniqueName: \"kubernetes.io/projected/cce41c0b-f107-450b-95f5-64dae06dbe14-kube-api-access-whvg4\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58\" (UID: \"cce41c0b-f107-450b-95f5-64dae06dbe14\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" Mar 10 10:00:05 crc kubenswrapper[4794]: I0310 10:00:05.187386 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cce41c0b-f107-450b-95f5-64dae06dbe14-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58\" (UID: \"cce41c0b-f107-450b-95f5-64dae06dbe14\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" Mar 10 10:00:05 crc kubenswrapper[4794]: I0310 10:00:05.187498 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cce41c0b-f107-450b-95f5-64dae06dbe14-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58\" (UID: \"cce41c0b-f107-450b-95f5-64dae06dbe14\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" Mar 10 10:00:05 crc kubenswrapper[4794]: I0310 10:00:05.187537 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whvg4\" (UniqueName: \"kubernetes.io/projected/cce41c0b-f107-450b-95f5-64dae06dbe14-kube-api-access-whvg4\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58\" (UID: \"cce41c0b-f107-450b-95f5-64dae06dbe14\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" Mar 10 10:00:05 crc kubenswrapper[4794]: I0310 10:00:05.188141 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/cce41c0b-f107-450b-95f5-64dae06dbe14-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58\" (UID: \"cce41c0b-f107-450b-95f5-64dae06dbe14\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" Mar 10 10:00:05 crc kubenswrapper[4794]: I0310 10:00:05.188243 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cce41c0b-f107-450b-95f5-64dae06dbe14-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58\" (UID: \"cce41c0b-f107-450b-95f5-64dae06dbe14\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" Mar 10 10:00:05 crc kubenswrapper[4794]: I0310 10:00:05.218317 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whvg4\" (UniqueName: \"kubernetes.io/projected/cce41c0b-f107-450b-95f5-64dae06dbe14-kube-api-access-whvg4\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58\" (UID: \"cce41c0b-f107-450b-95f5-64dae06dbe14\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" Mar 10 10:00:05 crc kubenswrapper[4794]: I0310 10:00:05.296690 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" Mar 10 10:00:05 crc kubenswrapper[4794]: I0310 10:00:05.504662 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58"] Mar 10 10:00:05 crc kubenswrapper[4794]: I0310 10:00:05.591218 4794 generic.go:334] "Generic (PLEG): container finished" podID="449e1884-a4e4-4b83-b831-ddbc4f598eff" containerID="10faf05a6df463cbd3424d36728612a9072a29a6e68134f130f0004c2fc9cd93" exitCode=0 Mar 10 10:00:05 crc kubenswrapper[4794]: I0310 10:00:05.591267 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552280-m8kkn" event={"ID":"449e1884-a4e4-4b83-b831-ddbc4f598eff","Type":"ContainerDied","Data":"10faf05a6df463cbd3424d36728612a9072a29a6e68134f130f0004c2fc9cd93"} Mar 10 10:00:05 crc kubenswrapper[4794]: I0310 10:00:05.593631 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" event={"ID":"cce41c0b-f107-450b-95f5-64dae06dbe14","Type":"ContainerStarted","Data":"a8b4fd7dd292480f6d7b2be9183c73a41e7db910885067d41e12a3eeb31187ab"} Mar 10 10:00:06 crc kubenswrapper[4794]: I0310 10:00:06.605230 4794 generic.go:334] "Generic (PLEG): container finished" podID="cce41c0b-f107-450b-95f5-64dae06dbe14" containerID="21348514b78c221074c7d243231de628e7a71b4b5a43c1950c8096ba6599d18f" exitCode=0 Mar 10 10:00:06 crc kubenswrapper[4794]: I0310 10:00:06.605328 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" event={"ID":"cce41c0b-f107-450b-95f5-64dae06dbe14","Type":"ContainerDied","Data":"21348514b78c221074c7d243231de628e7a71b4b5a43c1950c8096ba6599d18f"} Mar 10 10:00:06 crc kubenswrapper[4794]: I0310 10:00:06.953925 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552280-m8kkn" Mar 10 10:00:07 crc kubenswrapper[4794]: I0310 10:00:07.016806 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sct9\" (UniqueName: \"kubernetes.io/projected/449e1884-a4e4-4b83-b831-ddbc4f598eff-kube-api-access-2sct9\") pod \"449e1884-a4e4-4b83-b831-ddbc4f598eff\" (UID: \"449e1884-a4e4-4b83-b831-ddbc4f598eff\") " Mar 10 10:00:07 crc kubenswrapper[4794]: I0310 10:00:07.021818 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/449e1884-a4e4-4b83-b831-ddbc4f598eff-kube-api-access-2sct9" (OuterVolumeSpecName: "kube-api-access-2sct9") pod "449e1884-a4e4-4b83-b831-ddbc4f598eff" (UID: "449e1884-a4e4-4b83-b831-ddbc4f598eff"). InnerVolumeSpecName "kube-api-access-2sct9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:00:07 crc kubenswrapper[4794]: I0310 10:00:07.118751 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sct9\" (UniqueName: \"kubernetes.io/projected/449e1884-a4e4-4b83-b831-ddbc4f598eff-kube-api-access-2sct9\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:07 crc kubenswrapper[4794]: I0310 10:00:07.612843 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552280-m8kkn" event={"ID":"449e1884-a4e4-4b83-b831-ddbc4f598eff","Type":"ContainerDied","Data":"74d014e0778dffb0925520662fe9f4f8467621bfc66a0444c56c799a3b05987c"} Mar 10 10:00:07 crc kubenswrapper[4794]: I0310 10:00:07.612890 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74d014e0778dffb0925520662fe9f4f8467621bfc66a0444c56c799a3b05987c" Mar 10 10:00:07 crc kubenswrapper[4794]: I0310 10:00:07.612978 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552280-m8kkn" Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.012865 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552274-8sskt"] Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.021651 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552274-8sskt"] Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.529850 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-skqw6" podUID="7827a543-d8b2-460b-aee5-212ea1208c0d" containerName="console" containerID="cri-o://ac23f35647b7d6a9a513d7618e602b0baae643c80748e46291a1562c1ffa3ac9" gracePeriod=15 Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.871382 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-skqw6_7827a543-d8b2-460b-aee5-212ea1208c0d/console/0.log" Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.871682 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-skqw6" Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.943424 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-oauth-serving-cert\") pod \"7827a543-d8b2-460b-aee5-212ea1208c0d\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.943549 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7827a543-d8b2-460b-aee5-212ea1208c0d-console-serving-cert\") pod \"7827a543-d8b2-460b-aee5-212ea1208c0d\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.943647 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9j4m\" (UniqueName: \"kubernetes.io/projected/7827a543-d8b2-460b-aee5-212ea1208c0d-kube-api-access-s9j4m\") pod \"7827a543-d8b2-460b-aee5-212ea1208c0d\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.943717 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-console-config\") pod \"7827a543-d8b2-460b-aee5-212ea1208c0d\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.943748 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-service-ca\") pod \"7827a543-d8b2-460b-aee5-212ea1208c0d\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.943803 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-trusted-ca-bundle\") pod \"7827a543-d8b2-460b-aee5-212ea1208c0d\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.943835 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7827a543-d8b2-460b-aee5-212ea1208c0d-console-oauth-config\") pod \"7827a543-d8b2-460b-aee5-212ea1208c0d\" (UID: \"7827a543-d8b2-460b-aee5-212ea1208c0d\") " Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.945048 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-service-ca" (OuterVolumeSpecName: "service-ca") pod "7827a543-d8b2-460b-aee5-212ea1208c0d" (UID: "7827a543-d8b2-460b-aee5-212ea1208c0d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.945374 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7827a543-d8b2-460b-aee5-212ea1208c0d" (UID: "7827a543-d8b2-460b-aee5-212ea1208c0d"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.945466 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7827a543-d8b2-460b-aee5-212ea1208c0d" (UID: "7827a543-d8b2-460b-aee5-212ea1208c0d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.945504 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-console-config" (OuterVolumeSpecName: "console-config") pod "7827a543-d8b2-460b-aee5-212ea1208c0d" (UID: "7827a543-d8b2-460b-aee5-212ea1208c0d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.949999 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7827a543-d8b2-460b-aee5-212ea1208c0d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7827a543-d8b2-460b-aee5-212ea1208c0d" (UID: "7827a543-d8b2-460b-aee5-212ea1208c0d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.950101 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7827a543-d8b2-460b-aee5-212ea1208c0d-kube-api-access-s9j4m" (OuterVolumeSpecName: "kube-api-access-s9j4m") pod "7827a543-d8b2-460b-aee5-212ea1208c0d" (UID: "7827a543-d8b2-460b-aee5-212ea1208c0d"). InnerVolumeSpecName "kube-api-access-s9j4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:00:08 crc kubenswrapper[4794]: I0310 10:00:08.951476 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7827a543-d8b2-460b-aee5-212ea1208c0d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7827a543-d8b2-460b-aee5-212ea1208c0d" (UID: "7827a543-d8b2-460b-aee5-212ea1208c0d"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.045444 4794 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.045492 4794 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7827a543-d8b2-460b-aee5-212ea1208c0d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.045506 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9j4m\" (UniqueName: \"kubernetes.io/projected/7827a543-d8b2-460b-aee5-212ea1208c0d-kube-api-access-s9j4m\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.045519 4794 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.045545 4794 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.045557 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7827a543-d8b2-460b-aee5-212ea1208c0d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.045568 4794 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7827a543-d8b2-460b-aee5-212ea1208c0d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.633021 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-skqw6_7827a543-d8b2-460b-aee5-212ea1208c0d/console/0.log" Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.633137 4794 generic.go:334] "Generic (PLEG): container finished" podID="7827a543-d8b2-460b-aee5-212ea1208c0d" containerID="ac23f35647b7d6a9a513d7618e602b0baae643c80748e46291a1562c1ffa3ac9" exitCode=2 Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.633192 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-skqw6" event={"ID":"7827a543-d8b2-460b-aee5-212ea1208c0d","Type":"ContainerDied","Data":"ac23f35647b7d6a9a513d7618e602b0baae643c80748e46291a1562c1ffa3ac9"} Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.633255 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-skqw6" event={"ID":"7827a543-d8b2-460b-aee5-212ea1208c0d","Type":"ContainerDied","Data":"7be1444afeeb28da0d5ec0820c7bf898af23aea3cfa1fc5668516d44d4782b6c"} Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.633279 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-skqw6" Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.633299 4794 scope.go:117] "RemoveContainer" containerID="ac23f35647b7d6a9a513d7618e602b0baae643c80748e46291a1562c1ffa3ac9" Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.658732 4794 scope.go:117] "RemoveContainer" containerID="ac23f35647b7d6a9a513d7618e602b0baae643c80748e46291a1562c1ffa3ac9" Mar 10 10:00:09 crc kubenswrapper[4794]: E0310 10:00:09.659603 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac23f35647b7d6a9a513d7618e602b0baae643c80748e46291a1562c1ffa3ac9\": container with ID starting with ac23f35647b7d6a9a513d7618e602b0baae643c80748e46291a1562c1ffa3ac9 not found: ID does not exist" containerID="ac23f35647b7d6a9a513d7618e602b0baae643c80748e46291a1562c1ffa3ac9" Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.659636 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac23f35647b7d6a9a513d7618e602b0baae643c80748e46291a1562c1ffa3ac9"} err="failed to get container status \"ac23f35647b7d6a9a513d7618e602b0baae643c80748e46291a1562c1ffa3ac9\": rpc error: code = NotFound desc = could not find container \"ac23f35647b7d6a9a513d7618e602b0baae643c80748e46291a1562c1ffa3ac9\": container with ID starting with ac23f35647b7d6a9a513d7618e602b0baae643c80748e46291a1562c1ffa3ac9 not found: ID does not exist" Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.675587 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-skqw6"] Mar 10 10:00:09 crc kubenswrapper[4794]: I0310 10:00:09.682186 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-skqw6"] Mar 10 10:00:10 crc kubenswrapper[4794]: I0310 10:00:10.011379 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35e20182-9b30-45e4-a638-2460b967250e" path="/var/lib/kubelet/pods/35e20182-9b30-45e4-a638-2460b967250e/volumes" Mar 10 10:00:10 crc kubenswrapper[4794]: I0310 10:00:10.012861 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7827a543-d8b2-460b-aee5-212ea1208c0d" path="/var/lib/kubelet/pods/7827a543-d8b2-460b-aee5-212ea1208c0d/volumes" Mar 10 10:00:11 crc kubenswrapper[4794]: I0310 10:00:11.654454 4794 generic.go:334] "Generic (PLEG): container finished" podID="cce41c0b-f107-450b-95f5-64dae06dbe14" containerID="60a13e00ba8b0f7350045359817f2e8726350e3b746eb945ba630f1c89bc029a" exitCode=0 Mar 10 10:00:11 crc kubenswrapper[4794]: I0310 10:00:11.654553 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" event={"ID":"cce41c0b-f107-450b-95f5-64dae06dbe14","Type":"ContainerDied","Data":"60a13e00ba8b0f7350045359817f2e8726350e3b746eb945ba630f1c89bc029a"} Mar 10 10:00:12 crc kubenswrapper[4794]: I0310 10:00:12.667102 4794 generic.go:334] "Generic (PLEG): container finished" podID="cce41c0b-f107-450b-95f5-64dae06dbe14" containerID="ae614ba715905fc452902249d8dfbdfdc4c854eb6ce2a2f46e3275a9684ef023" exitCode=0 Mar 10 10:00:12 crc kubenswrapper[4794]: I0310 10:00:12.667245 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" event={"ID":"cce41c0b-f107-450b-95f5-64dae06dbe14","Type":"ContainerDied","Data":"ae614ba715905fc452902249d8dfbdfdc4c854eb6ce2a2f46e3275a9684ef023"} 
Mar 10 10:00:13 crc kubenswrapper[4794]: I0310 10:00:13.940189 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" Mar 10 10:00:14 crc kubenswrapper[4794]: I0310 10:00:14.028902 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cce41c0b-f107-450b-95f5-64dae06dbe14-util\") pod \"cce41c0b-f107-450b-95f5-64dae06dbe14\" (UID: \"cce41c0b-f107-450b-95f5-64dae06dbe14\") " Mar 10 10:00:14 crc kubenswrapper[4794]: I0310 10:00:14.028980 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whvg4\" (UniqueName: \"kubernetes.io/projected/cce41c0b-f107-450b-95f5-64dae06dbe14-kube-api-access-whvg4\") pod \"cce41c0b-f107-450b-95f5-64dae06dbe14\" (UID: \"cce41c0b-f107-450b-95f5-64dae06dbe14\") " Mar 10 10:00:14 crc kubenswrapper[4794]: I0310 10:00:14.029074 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cce41c0b-f107-450b-95f5-64dae06dbe14-bundle\") pod \"cce41c0b-f107-450b-95f5-64dae06dbe14\" (UID: \"cce41c0b-f107-450b-95f5-64dae06dbe14\") " Mar 10 10:00:14 crc kubenswrapper[4794]: I0310 10:00:14.031158 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cce41c0b-f107-450b-95f5-64dae06dbe14-bundle" (OuterVolumeSpecName: "bundle") pod "cce41c0b-f107-450b-95f5-64dae06dbe14" (UID: "cce41c0b-f107-450b-95f5-64dae06dbe14"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:00:14 crc kubenswrapper[4794]: I0310 10:00:14.035819 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce41c0b-f107-450b-95f5-64dae06dbe14-kube-api-access-whvg4" (OuterVolumeSpecName: "kube-api-access-whvg4") pod "cce41c0b-f107-450b-95f5-64dae06dbe14" (UID: "cce41c0b-f107-450b-95f5-64dae06dbe14"). InnerVolumeSpecName "kube-api-access-whvg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:00:14 crc kubenswrapper[4794]: I0310 10:00:14.043952 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cce41c0b-f107-450b-95f5-64dae06dbe14-util" (OuterVolumeSpecName: "util") pod "cce41c0b-f107-450b-95f5-64dae06dbe14" (UID: "cce41c0b-f107-450b-95f5-64dae06dbe14"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:00:14 crc kubenswrapper[4794]: I0310 10:00:14.130750 4794 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cce41c0b-f107-450b-95f5-64dae06dbe14-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:14 crc kubenswrapper[4794]: I0310 10:00:14.131289 4794 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cce41c0b-f107-450b-95f5-64dae06dbe14-util\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:14 crc kubenswrapper[4794]: I0310 10:00:14.131310 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whvg4\" (UniqueName: \"kubernetes.io/projected/cce41c0b-f107-450b-95f5-64dae06dbe14-kube-api-access-whvg4\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:14 crc kubenswrapper[4794]: I0310 10:00:14.683438 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" event={"ID":"cce41c0b-f107-450b-95f5-64dae06dbe14","Type":"ContainerDied","Data":"a8b4fd7dd292480f6d7b2be9183c73a41e7db910885067d41e12a3eeb31187ab"} Mar 10 10:00:14 crc kubenswrapper[4794]: I0310 10:00:14.683505 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8b4fd7dd292480f6d7b2be9183c73a41e7db910885067d41e12a3eeb31187ab" Mar 10 10:00:14 crc kubenswrapper[4794]: I0310 10:00:14.683520 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.159168 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn"] Mar 10 10:00:23 crc kubenswrapper[4794]: E0310 10:00:23.159774 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce41c0b-f107-450b-95f5-64dae06dbe14" containerName="util" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.159790 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce41c0b-f107-450b-95f5-64dae06dbe14" containerName="util" Mar 10 10:00:23 crc kubenswrapper[4794]: E0310 10:00:23.159807 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce41c0b-f107-450b-95f5-64dae06dbe14" containerName="extract" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.159815 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce41c0b-f107-450b-95f5-64dae06dbe14" containerName="extract" Mar 10 10:00:23 crc kubenswrapper[4794]: E0310 10:00:23.159826 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449e1884-a4e4-4b83-b831-ddbc4f598eff" containerName="oc" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.159835 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="449e1884-a4e4-4b83-b831-ddbc4f598eff" containerName="oc" Mar 10 10:00:23 crc kubenswrapper[4794]: E0310 10:00:23.159854 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce41c0b-f107-450b-95f5-64dae06dbe14" containerName="pull" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.159862 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce41c0b-f107-450b-95f5-64dae06dbe14" containerName="pull" Mar 10 10:00:23 crc kubenswrapper[4794]: E0310 10:00:23.159873 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7827a543-d8b2-460b-aee5-212ea1208c0d" containerName="console" Mar 10 
10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.159881 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7827a543-d8b2-460b-aee5-212ea1208c0d" containerName="console" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.160020 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="7827a543-d8b2-460b-aee5-212ea1208c0d" containerName="console" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.160045 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="449e1884-a4e4-4b83-b831-ddbc4f598eff" containerName="oc" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.160060 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce41c0b-f107-450b-95f5-64dae06dbe14" containerName="extract" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.160741 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.164610 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-4pxj8" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.164958 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.165197 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.165439 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.165564 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.177576 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn"] Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.239688 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1881d46-f628-435d-8f46-23a62f6fcaee-webhook-cert\") pod \"metallb-operator-controller-manager-5854bdf585-wdfdn\" (UID: \"b1881d46-f628-435d-8f46-23a62f6fcaee\") " pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.239751 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw82j\" (UniqueName: \"kubernetes.io/projected/b1881d46-f628-435d-8f46-23a62f6fcaee-kube-api-access-zw82j\") pod \"metallb-operator-controller-manager-5854bdf585-wdfdn\" (UID: \"b1881d46-f628-435d-8f46-23a62f6fcaee\") " pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.239793 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1881d46-f628-435d-8f46-23a62f6fcaee-apiservice-cert\") pod \"metallb-operator-controller-manager-5854bdf585-wdfdn\" (UID: \"b1881d46-f628-435d-8f46-23a62f6fcaee\") " pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.341011 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1881d46-f628-435d-8f46-23a62f6fcaee-apiservice-cert\") pod \"metallb-operator-controller-manager-5854bdf585-wdfdn\" (UID: \"b1881d46-f628-435d-8f46-23a62f6fcaee\") " pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.341074 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1881d46-f628-435d-8f46-23a62f6fcaee-webhook-cert\") pod \"metallb-operator-controller-manager-5854bdf585-wdfdn\" (UID: \"b1881d46-f628-435d-8f46-23a62f6fcaee\") " pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.341111 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw82j\" (UniqueName: \"kubernetes.io/projected/b1881d46-f628-435d-8f46-23a62f6fcaee-kube-api-access-zw82j\") pod \"metallb-operator-controller-manager-5854bdf585-wdfdn\" (UID: \"b1881d46-f628-435d-8f46-23a62f6fcaee\") " pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.355218 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1881d46-f628-435d-8f46-23a62f6fcaee-webhook-cert\") pod \"metallb-operator-controller-manager-5854bdf585-wdfdn\" (UID: \"b1881d46-f628-435d-8f46-23a62f6fcaee\") " pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.355270 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1881d46-f628-435d-8f46-23a62f6fcaee-apiservice-cert\") pod \"metallb-operator-controller-manager-5854bdf585-wdfdn\" (UID: \"b1881d46-f628-435d-8f46-23a62f6fcaee\") " pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.357792 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw82j\" (UniqueName: \"kubernetes.io/projected/b1881d46-f628-435d-8f46-23a62f6fcaee-kube-api-access-zw82j\") pod \"metallb-operator-controller-manager-5854bdf585-wdfdn\" (UID: \"b1881d46-f628-435d-8f46-23a62f6fcaee\") " pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.398560 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw"] Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.399290 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.401661 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-k4ncw" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.401983 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.404958 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.412803 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw"] Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.485096 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.542971 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc-webhook-cert\") pod \"metallb-operator-webhook-server-57ff7f97d6-nq8rw\" (UID: \"1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc\") " pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.543055 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc-apiservice-cert\") pod \"metallb-operator-webhook-server-57ff7f97d6-nq8rw\" (UID: \"1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc\") " pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.543087 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggftv\" (UniqueName: \"kubernetes.io/projected/1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc-kube-api-access-ggftv\") pod \"metallb-operator-webhook-server-57ff7f97d6-nq8rw\" (UID: \"1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc\") " pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.644141 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc-webhook-cert\") pod \"metallb-operator-webhook-server-57ff7f97d6-nq8rw\" (UID: \"1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc\") " pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.644625 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc-apiservice-cert\") pod \"metallb-operator-webhook-server-57ff7f97d6-nq8rw\" (UID: \"1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc\") " pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.644660 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggftv\" (UniqueName: \"kubernetes.io/projected/1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc-kube-api-access-ggftv\") pod 
\"metallb-operator-webhook-server-57ff7f97d6-nq8rw\" (UID: \"1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc\") " pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.654270 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc-apiservice-cert\") pod \"metallb-operator-webhook-server-57ff7f97d6-nq8rw\" (UID: \"1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc\") " pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.678997 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc-webhook-cert\") pod \"metallb-operator-webhook-server-57ff7f97d6-nq8rw\" (UID: \"1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc\") " pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.682653 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggftv\" (UniqueName: \"kubernetes.io/projected/1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc-kube-api-access-ggftv\") pod \"metallb-operator-webhook-server-57ff7f97d6-nq8rw\" (UID: \"1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc\") " pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.714615 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.737420 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn"] Mar 10 10:00:23 crc kubenswrapper[4794]: W0310 10:00:23.771225 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1881d46_f628_435d_8f46_23a62f6fcaee.slice/crio-367d661d394a00e902c1c3e0a036bc0c61aaf1de6789d266af2390301198c720 WatchSource:0}: Error finding container 367d661d394a00e902c1c3e0a036bc0c61aaf1de6789d266af2390301198c720: Status 404 returned error can't find the container with id 367d661d394a00e902c1c3e0a036bc0c61aaf1de6789d266af2390301198c720 Mar 10 10:00:23 crc kubenswrapper[4794]: I0310 10:00:23.959863 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw"] Mar 10 10:00:24 crc kubenswrapper[4794]: I0310 10:00:24.735792 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" event={"ID":"1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc","Type":"ContainerStarted","Data":"022b0a910bda1cd5e63a5c184aacd5e23add7c3272f6659753f71a9cadedca07"} Mar 10 10:00:24 crc kubenswrapper[4794]: I0310 10:00:24.737091 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" event={"ID":"b1881d46-f628-435d-8f46-23a62f6fcaee","Type":"ContainerStarted","Data":"367d661d394a00e902c1c3e0a036bc0c61aaf1de6789d266af2390301198c720"} Mar 10 10:00:27 crc kubenswrapper[4794]: I0310 10:00:27.760895 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" 
event={"ID":"b1881d46-f628-435d-8f46-23a62f6fcaee","Type":"ContainerStarted","Data":"5d6729c390cdd21b1412abd02d83665d5cca69af8e405fd3bf47519c02c5cc29"} Mar 10 10:00:27 crc kubenswrapper[4794]: I0310 10:00:27.761241 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" Mar 10 10:00:27 crc kubenswrapper[4794]: I0310 10:00:27.781856 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" podStartSLOduration=1.715100064 podStartE2EDuration="4.781839794s" podCreationTimestamp="2026-03-10 10:00:23 +0000 UTC" firstStartedPulling="2026-03-10 10:00:23.773846599 +0000 UTC m=+972.530017417" lastFinishedPulling="2026-03-10 10:00:26.840586329 +0000 UTC m=+975.596757147" observedRunningTime="2026-03-10 10:00:27.781165905 +0000 UTC m=+976.537336723" watchObservedRunningTime="2026-03-10 10:00:27.781839794 +0000 UTC m=+976.538010612" Mar 10 10:00:28 crc kubenswrapper[4794]: I0310 10:00:28.768146 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" event={"ID":"1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc","Type":"ContainerStarted","Data":"867752ca5b12869b81f5466e22bebe70448d2ecb7d1858b2b664a2edffb7daab"} Mar 10 10:00:28 crc kubenswrapper[4794]: I0310 10:00:28.788481 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" podStartSLOduration=1.226182708 podStartE2EDuration="5.788463722s" podCreationTimestamp="2026-03-10 10:00:23 +0000 UTC" firstStartedPulling="2026-03-10 10:00:23.970545454 +0000 UTC m=+972.726716272" lastFinishedPulling="2026-03-10 10:00:28.532826468 +0000 UTC m=+977.288997286" observedRunningTime="2026-03-10 10:00:28.78419069 +0000 UTC m=+977.540361518" watchObservedRunningTime="2026-03-10 10:00:28.788463722 +0000 UTC m=+977.544634550" Mar 10 10:00:29 crc kubenswrapper[4794]: I0310 10:00:29.621512 4794 scope.go:117] "RemoveContainer" containerID="dc2dd6fc1d8274102b00f0c4c10a1dbffacb90ceb4d7d48b075b7d54fee08404" Mar 10 10:00:29 crc kubenswrapper[4794]: I0310 10:00:29.772687 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" Mar 10 10:00:43 crc kubenswrapper[4794]: I0310 10:00:43.728541 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-57ff7f97d6-nq8rw" Mar 10 10:01:03 crc kubenswrapper[4794]: I0310 10:01:03.488968 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5854bdf585-wdfdn" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.195241 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq"] Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.196216 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.198754 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.198821 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-9jgxt" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.215222 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-vftrb"] Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.218046 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.220297 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.222282 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.223562 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq"] Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.253054 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9797c8e7-cef5-4987-9eb3-6d9214e0e871-metrics-certs\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.253120 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9797c8e7-cef5-4987-9eb3-6d9214e0e871-reloader\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.253199 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9797c8e7-cef5-4987-9eb3-6d9214e0e871-frr-startup\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.253252 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9797c8e7-cef5-4987-9eb3-6d9214e0e871-frr-conf\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.253297 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bac622e3-6c34-4772-a6c6-99112d6e77fb-cert\") pod \"frr-k8s-webhook-server-7f989f654f-tvkkq\" (UID: \"bac622e3-6c34-4772-a6c6-99112d6e77fb\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.253321 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9797c8e7-cef5-4987-9eb3-6d9214e0e871-frr-sockets\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " 
pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.253388 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cgrh\" (UniqueName: \"kubernetes.io/projected/9797c8e7-cef5-4987-9eb3-6d9214e0e871-kube-api-access-9cgrh\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.253408 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9797c8e7-cef5-4987-9eb3-6d9214e0e871-metrics\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.253441 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4p8n\" (UniqueName: \"kubernetes.io/projected/bac622e3-6c34-4772-a6c6-99112d6e77fb-kube-api-access-v4p8n\") pod \"frr-k8s-webhook-server-7f989f654f-tvkkq\" (UID: \"bac622e3-6c34-4772-a6c6-99112d6e77fb\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.270813 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-cdztl"] Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.271911 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cdztl" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.273375 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.273811 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-vrczf" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.273976 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.274143 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.287405 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-8rmct"] Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.288325 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-8rmct" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.289872 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.303804 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-8rmct"] Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.353984 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bac622e3-6c34-4772-a6c6-99112d6e77fb-cert\") pod \"frr-k8s-webhook-server-7f989f654f-tvkkq\" (UID: \"bac622e3-6c34-4772-a6c6-99112d6e77fb\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354026 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xd4w\" (UniqueName: \"kubernetes.io/projected/7ee9da20-7d24-4c95-a86b-7cde5025a756-kube-api-access-6xd4w\") pod \"speaker-cdztl\" (UID: \"7ee9da20-7d24-4c95-a86b-7cde5025a756\") " pod="metallb-system/speaker-cdztl" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354064 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9797c8e7-cef5-4987-9eb3-6d9214e0e871-frr-sockets\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354088 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7ee9da20-7d24-4c95-a86b-7cde5025a756-metallb-excludel2\") pod \"speaker-cdztl\" (UID: \"7ee9da20-7d24-4c95-a86b-7cde5025a756\") " pod="metallb-system/speaker-cdztl" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354105 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cb2a87f-5847-4fba-8544-21262656a693-cert\") pod \"controller-86ddb6bd46-8rmct\" (UID: \"5cb2a87f-5847-4fba-8544-21262656a693\") " pod="metallb-system/controller-86ddb6bd46-8rmct" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354123 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ee9da20-7d24-4c95-a86b-7cde5025a756-metrics-certs\") pod \"speaker-cdztl\" (UID: \"7ee9da20-7d24-4c95-a86b-7cde5025a756\") " pod="metallb-system/speaker-cdztl" Mar 10 10:01:04 crc kubenswrapper[4794]: E0310 10:01:04.354151 4794 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354179 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cgrh\" (UniqueName: \"kubernetes.io/projected/9797c8e7-cef5-4987-9eb3-6d9214e0e871-kube-api-access-9cgrh\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: E0310 10:01:04.354227 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bac622e3-6c34-4772-a6c6-99112d6e77fb-cert podName:bac622e3-6c34-4772-a6c6-99112d6e77fb nodeName:}" failed. 
No retries permitted until 2026-03-10 10:01:04.854208981 +0000 UTC m=+1013.610379799 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bac622e3-6c34-4772-a6c6-99112d6e77fb-cert") pod "frr-k8s-webhook-server-7f989f654f-tvkkq" (UID: "bac622e3-6c34-4772-a6c6-99112d6e77fb") : secret "frr-k8s-webhook-server-cert" not found Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354372 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9797c8e7-cef5-4987-9eb3-6d9214e0e871-metrics\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354394 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4p8n\" (UniqueName: \"kubernetes.io/projected/bac622e3-6c34-4772-a6c6-99112d6e77fb-kube-api-access-v4p8n\") pod \"frr-k8s-webhook-server-7f989f654f-tvkkq\" (UID: \"bac622e3-6c34-4772-a6c6-99112d6e77fb\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354419 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9797c8e7-cef5-4987-9eb3-6d9214e0e871-frr-sockets\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354487 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cb2a87f-5847-4fba-8544-21262656a693-metrics-certs\") pod \"controller-86ddb6bd46-8rmct\" (UID: \"5cb2a87f-5847-4fba-8544-21262656a693\") " pod="metallb-system/controller-86ddb6bd46-8rmct" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354525 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9797c8e7-cef5-4987-9eb3-6d9214e0e871-metrics-certs\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354556 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9797c8e7-cef5-4987-9eb3-6d9214e0e871-reloader\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354573 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9797c8e7-cef5-4987-9eb3-6d9214e0e871-frr-startup\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354619 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7ee9da20-7d24-4c95-a86b-7cde5025a756-memberlist\") pod \"speaker-cdztl\" (UID: \"7ee9da20-7d24-4c95-a86b-7cde5025a756\") " pod="metallb-system/speaker-cdztl" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354671 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/9797c8e7-cef5-4987-9eb3-6d9214e0e871-frr-conf\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354695 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gmtv\" (UniqueName: \"kubernetes.io/projected/5cb2a87f-5847-4fba-8544-21262656a693-kube-api-access-7gmtv\") pod \"controller-86ddb6bd46-8rmct\" (UID: \"5cb2a87f-5847-4fba-8544-21262656a693\") " pod="metallb-system/controller-86ddb6bd46-8rmct" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354710 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9797c8e7-cef5-4987-9eb3-6d9214e0e871-metrics\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354898 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9797c8e7-cef5-4987-9eb3-6d9214e0e871-reloader\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.354931 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9797c8e7-cef5-4987-9eb3-6d9214e0e871-frr-conf\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.355426 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9797c8e7-cef5-4987-9eb3-6d9214e0e871-frr-startup\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.361161 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9797c8e7-cef5-4987-9eb3-6d9214e0e871-metrics-certs\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.371751 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cgrh\" (UniqueName: \"kubernetes.io/projected/9797c8e7-cef5-4987-9eb3-6d9214e0e871-kube-api-access-9cgrh\") pod \"frr-k8s-vftrb\" (UID: \"9797c8e7-cef5-4987-9eb3-6d9214e0e871\") " pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.374865 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4p8n\" (UniqueName: \"kubernetes.io/projected/bac622e3-6c34-4772-a6c6-99112d6e77fb-kube-api-access-v4p8n\") pod \"frr-k8s-webhook-server-7f989f654f-tvkkq\" (UID: \"bac622e3-6c34-4772-a6c6-99112d6e77fb\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.455620 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cb2a87f-5847-4fba-8544-21262656a693-metrics-certs\") pod \"controller-86ddb6bd46-8rmct\" (UID: \"5cb2a87f-5847-4fba-8544-21262656a693\") " pod="metallb-system/controller-86ddb6bd46-8rmct" Mar 10 10:01:04 crc 
kubenswrapper[4794]: I0310 10:01:04.455696 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7ee9da20-7d24-4c95-a86b-7cde5025a756-memberlist\") pod \"speaker-cdztl\" (UID: \"7ee9da20-7d24-4c95-a86b-7cde5025a756\") " pod="metallb-system/speaker-cdztl" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.455731 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmtv\" (UniqueName: \"kubernetes.io/projected/5cb2a87f-5847-4fba-8544-21262656a693-kube-api-access-7gmtv\") pod \"controller-86ddb6bd46-8rmct\" (UID: \"5cb2a87f-5847-4fba-8544-21262656a693\") " pod="metallb-system/controller-86ddb6bd46-8rmct" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.455787 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xd4w\" (UniqueName: \"kubernetes.io/projected/7ee9da20-7d24-4c95-a86b-7cde5025a756-kube-api-access-6xd4w\") pod \"speaker-cdztl\" (UID: \"7ee9da20-7d24-4c95-a86b-7cde5025a756\") " pod="metallb-system/speaker-cdztl" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.455827 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7ee9da20-7d24-4c95-a86b-7cde5025a756-metallb-excludel2\") pod \"speaker-cdztl\" (UID: \"7ee9da20-7d24-4c95-a86b-7cde5025a756\") " pod="metallb-system/speaker-cdztl" Mar 10 10:01:04 crc kubenswrapper[4794]: E0310 10:01:04.455829 4794 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.455850 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cb2a87f-5847-4fba-8544-21262656a693-cert\") pod \"controller-86ddb6bd46-8rmct\" (UID: \"5cb2a87f-5847-4fba-8544-21262656a693\") " pod="metallb-system/controller-86ddb6bd46-8rmct" Mar 10 10:01:04 crc kubenswrapper[4794]: E0310 10:01:04.455913 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ee9da20-7d24-4c95-a86b-7cde5025a756-memberlist podName:7ee9da20-7d24-4c95-a86b-7cde5025a756 nodeName:}" failed. No retries permitted until 2026-03-10 10:01:04.955886513 +0000 UTC m=+1013.712057331 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7ee9da20-7d24-4c95-a86b-7cde5025a756-memberlist") pod "speaker-cdztl" (UID: "7ee9da20-7d24-4c95-a86b-7cde5025a756") : secret "metallb-memberlist" not found Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.456384 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ee9da20-7d24-4c95-a86b-7cde5025a756-metrics-certs\") pod \"speaker-cdztl\" (UID: \"7ee9da20-7d24-4c95-a86b-7cde5025a756\") " pod="metallb-system/speaker-cdztl" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.456534 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7ee9da20-7d24-4c95-a86b-7cde5025a756-metallb-excludel2\") pod \"speaker-cdztl\" (UID: \"7ee9da20-7d24-4c95-a86b-7cde5025a756\") " pod="metallb-system/speaker-cdztl" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.459926 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ee9da20-7d24-4c95-a86b-7cde5025a756-metrics-certs\") pod \"speaker-cdztl\" (UID: \"7ee9da20-7d24-4c95-a86b-7cde5025a756\") " pod="metallb-system/speaker-cdztl" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.460232 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cb2a87f-5847-4fba-8544-21262656a693-metrics-certs\") pod \"controller-86ddb6bd46-8rmct\" (UID: \"5cb2a87f-5847-4fba-8544-21262656a693\") " pod="metallb-system/controller-86ddb6bd46-8rmct" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.470855 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cb2a87f-5847-4fba-8544-21262656a693-cert\") pod \"controller-86ddb6bd46-8rmct\" (UID: \"5cb2a87f-5847-4fba-8544-21262656a693\") " pod="metallb-system/controller-86ddb6bd46-8rmct" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.471923 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmtv\" (UniqueName: \"kubernetes.io/projected/5cb2a87f-5847-4fba-8544-21262656a693-kube-api-access-7gmtv\") pod \"controller-86ddb6bd46-8rmct\" (UID: \"5cb2a87f-5847-4fba-8544-21262656a693\") " pod="metallb-system/controller-86ddb6bd46-8rmct" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.476371 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xd4w\" (UniqueName: \"kubernetes.io/projected/7ee9da20-7d24-4c95-a86b-7cde5025a756-kube-api-access-6xd4w\") pod \"speaker-cdztl\" (UID: \"7ee9da20-7d24-4c95-a86b-7cde5025a756\") " pod="metallb-system/speaker-cdztl" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.539109 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.600384 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-8rmct" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.778913 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-8rmct"] Mar 10 10:01:04 crc kubenswrapper[4794]: W0310 10:01:04.782930 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cb2a87f_5847_4fba_8544_21262656a693.slice/crio-a96f0ecdfaed61bee9932481321878b8ebcdee67bb6507ae4f40123a496fb0c2 WatchSource:0}: Error finding container a96f0ecdfaed61bee9932481321878b8ebcdee67bb6507ae4f40123a496fb0c2: Status 404 returned error can't find the container with id a96f0ecdfaed61bee9932481321878b8ebcdee67bb6507ae4f40123a496fb0c2 Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.860172 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bac622e3-6c34-4772-a6c6-99112d6e77fb-cert\") pod \"frr-k8s-webhook-server-7f989f654f-tvkkq\" (UID: \"bac622e3-6c34-4772-a6c6-99112d6e77fb\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.864435 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bac622e3-6c34-4772-a6c6-99112d6e77fb-cert\") pod \"frr-k8s-webhook-server-7f989f654f-tvkkq\" (UID: \"bac622e3-6c34-4772-a6c6-99112d6e77fb\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.961645 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7ee9da20-7d24-4c95-a86b-7cde5025a756-memberlist\") pod \"speaker-cdztl\" (UID: \"7ee9da20-7d24-4c95-a86b-7cde5025a756\") " pod="metallb-system/speaker-cdztl" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.966640 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7ee9da20-7d24-4c95-a86b-7cde5025a756-memberlist\") pod \"speaker-cdztl\" (UID: \"7ee9da20-7d24-4c95-a86b-7cde5025a756\") " pod="metallb-system/speaker-cdztl" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.990721 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-8rmct" event={"ID":"5cb2a87f-5847-4fba-8544-21262656a693","Type":"ContainerStarted","Data":"9b96dc2bc116a476145e783c0b9464479f0acbcabc25878b2dd93ff39e2e0598"} Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.990782 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-8rmct" event={"ID":"5cb2a87f-5847-4fba-8544-21262656a693","Type":"ContainerStarted","Data":"3d04f261276662d8ed4ea5b207898e7c222c9f6cb6de542fb55348ddb8c9ac88"} Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.990803 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-8rmct" event={"ID":"5cb2a87f-5847-4fba-8544-21262656a693","Type":"ContainerStarted","Data":"a96f0ecdfaed61bee9932481321878b8ebcdee67bb6507ae4f40123a496fb0c2"} Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.990865 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-8rmct" Mar 10 10:01:04 crc kubenswrapper[4794]: I0310 10:01:04.991713 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vftrb" 
event={"ID":"9797c8e7-cef5-4987-9eb3-6d9214e0e871","Type":"ContainerStarted","Data":"314bf63dbcca2340f8ff9f92e2ba8f193a4ac42b1b13a42db3fb23d1c9e5cafe"} Mar 10 10:01:05 crc kubenswrapper[4794]: I0310 10:01:05.009625 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-8rmct" podStartSLOduration=1.009609278 podStartE2EDuration="1.009609278s" podCreationTimestamp="2026-03-10 10:01:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:01:05.006776992 +0000 UTC m=+1013.762947810" watchObservedRunningTime="2026-03-10 10:01:05.009609278 +0000 UTC m=+1013.765780096" Mar 10 10:01:05 crc kubenswrapper[4794]: I0310 10:01:05.115663 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq" Mar 10 10:01:05 crc kubenswrapper[4794]: I0310 10:01:05.191866 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cdztl" Mar 10 10:01:05 crc kubenswrapper[4794]: W0310 10:01:05.246278 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ee9da20_7d24_4c95_a86b_7cde5025a756.slice/crio-a6ccafc887992d88aeef19be9282740a4e4ee86bf65ac326ec1785ca6abdabc2 WatchSource:0}: Error finding container a6ccafc887992d88aeef19be9282740a4e4ee86bf65ac326ec1785ca6abdabc2: Status 404 returned error can't find the container with id a6ccafc887992d88aeef19be9282740a4e4ee86bf65ac326ec1785ca6abdabc2 Mar 10 10:01:05 crc kubenswrapper[4794]: I0310 10:01:05.551978 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq"] Mar 10 10:01:06 crc kubenswrapper[4794]: I0310 10:01:06.006073 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq" event={"ID":"bac622e3-6c34-4772-a6c6-99112d6e77fb","Type":"ContainerStarted","Data":"38bf5b24aebef2af27800be6317433733ad384e678e9031d8a7d655714f5ef64"} Mar 10 10:01:06 crc kubenswrapper[4794]: I0310 10:01:06.006369 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cdztl" event={"ID":"7ee9da20-7d24-4c95-a86b-7cde5025a756","Type":"ContainerStarted","Data":"4f927cc4fe9de7c84ec4a790508298f55e03cd79ea11627a90201a0a7b6e4597"} Mar 10 10:01:06 crc kubenswrapper[4794]: I0310 10:01:06.006386 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cdztl" event={"ID":"7ee9da20-7d24-4c95-a86b-7cde5025a756","Type":"ContainerStarted","Data":"dd5cbc7214a4b6e356624607257edbe7800db025a5a9bfb006ef8f9b55a5f1c1"} Mar 10 10:01:06 crc kubenswrapper[4794]: I0310 10:01:06.006395 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cdztl" event={"ID":"7ee9da20-7d24-4c95-a86b-7cde5025a756","Type":"ContainerStarted","Data":"a6ccafc887992d88aeef19be9282740a4e4ee86bf65ac326ec1785ca6abdabc2"} Mar 10 10:01:06 crc kubenswrapper[4794]: I0310 10:01:06.006544 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-cdztl" Mar 10 10:01:06 crc kubenswrapper[4794]: I0310 10:01:06.023369 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-cdztl" podStartSLOduration=2.023329392 podStartE2EDuration="2.023329392s" podCreationTimestamp="2026-03-10 10:01:04 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:01:06.020988851 +0000 UTC m=+1014.777159679" watchObservedRunningTime="2026-03-10 10:01:06.023329392 +0000 UTC m=+1014.779500220" Mar 10 10:01:12 crc kubenswrapper[4794]: I0310 10:01:12.057197 4794 generic.go:334] "Generic (PLEG): container finished" podID="9797c8e7-cef5-4987-9eb3-6d9214e0e871" containerID="56b20c464c8d6938d16e9b6d810a50b40efe720c644c5c8d6143cf1c1b9bbebe" exitCode=0 Mar 10 10:01:12 crc kubenswrapper[4794]: I0310 10:01:12.057312 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vftrb" event={"ID":"9797c8e7-cef5-4987-9eb3-6d9214e0e871","Type":"ContainerDied","Data":"56b20c464c8d6938d16e9b6d810a50b40efe720c644c5c8d6143cf1c1b9bbebe"} Mar 10 10:01:12 crc kubenswrapper[4794]: I0310 10:01:12.060678 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq" event={"ID":"bac622e3-6c34-4772-a6c6-99112d6e77fb","Type":"ContainerStarted","Data":"6596fd55a9c110be70c021a0497b1514d66133e224a2305da0f4444286ac2a2c"} Mar 10 10:01:12 crc kubenswrapper[4794]: I0310 10:01:12.060890 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq" Mar 10 10:01:13 crc kubenswrapper[4794]: I0310 10:01:13.067971 4794 generic.go:334] "Generic (PLEG): container finished" podID="9797c8e7-cef5-4987-9eb3-6d9214e0e871" containerID="55593b3ef5a935942f13570566c02b6a4af7c2c087e43b1b97cb16520a1e7e13" exitCode=0 Mar 10 10:01:13 crc kubenswrapper[4794]: I0310 10:01:13.068064 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vftrb" event={"ID":"9797c8e7-cef5-4987-9eb3-6d9214e0e871","Type":"ContainerDied","Data":"55593b3ef5a935942f13570566c02b6a4af7c2c087e43b1b97cb16520a1e7e13"} Mar 10 10:01:13 crc kubenswrapper[4794]: I0310 10:01:13.106855 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq" podStartSLOduration=3.081393282 podStartE2EDuration="9.106835844s" podCreationTimestamp="2026-03-10 10:01:04 +0000 UTC" firstStartedPulling="2026-03-10 10:01:05.558150266 +0000 UTC m=+1014.314321084" lastFinishedPulling="2026-03-10 10:01:11.583592828 +0000 UTC m=+1020.339763646" observedRunningTime="2026-03-10 10:01:12.102958352 +0000 UTC m=+1020.859129210" watchObservedRunningTime="2026-03-10 10:01:13.106835844 +0000 UTC m=+1021.863006672" Mar 10 10:01:14 crc kubenswrapper[4794]: I0310 10:01:14.079681 4794 generic.go:334] "Generic (PLEG): container finished" podID="9797c8e7-cef5-4987-9eb3-6d9214e0e871" containerID="2535a6bd8d0800b480c783c8e00edfcc114246c295ddcae0a6eea5ad4360ac3e" exitCode=0 Mar 10 10:01:14 crc kubenswrapper[4794]: I0310 10:01:14.079960 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vftrb" event={"ID":"9797c8e7-cef5-4987-9eb3-6d9214e0e871","Type":"ContainerDied","Data":"2535a6bd8d0800b480c783c8e00edfcc114246c295ddcae0a6eea5ad4360ac3e"} Mar 10 10:01:14 crc kubenswrapper[4794]: I0310 10:01:14.604465 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-8rmct" Mar 10 10:01:15 crc kubenswrapper[4794]: I0310 10:01:15.094009 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vftrb" 
event={"ID":"9797c8e7-cef5-4987-9eb3-6d9214e0e871","Type":"ContainerStarted","Data":"4bead06bf82ce69dbcbe77bc36ac16407ef01ac7a98f2c98c3ebad657287a4e1"} Mar 10 10:01:15 crc kubenswrapper[4794]: I0310 10:01:15.094079 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vftrb" event={"ID":"9797c8e7-cef5-4987-9eb3-6d9214e0e871","Type":"ContainerStarted","Data":"d0a4f131cea108d75de9d8d5c58dd9a103273c33ad1421d8401a6dc21a550908"} Mar 10 10:01:15 crc kubenswrapper[4794]: I0310 10:01:15.094093 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vftrb" event={"ID":"9797c8e7-cef5-4987-9eb3-6d9214e0e871","Type":"ContainerStarted","Data":"1c9e1cf298344e8eac6c2855c836c42ba4e71853fe0c6ce100ff9875e14090b9"} Mar 10 10:01:15 crc kubenswrapper[4794]: I0310 10:01:15.094104 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vftrb" event={"ID":"9797c8e7-cef5-4987-9eb3-6d9214e0e871","Type":"ContainerStarted","Data":"d568c7e28ffb8c50bbe46a2f31d8e922844c1fa2f455e5a3ad5e368de6cd6a96"} Mar 10 10:01:15 crc kubenswrapper[4794]: I0310 10:01:15.094115 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vftrb" event={"ID":"9797c8e7-cef5-4987-9eb3-6d9214e0e871","Type":"ContainerStarted","Data":"a02ae14a5045720081dd3e3435c5aab57cc0aec914e9ab63359628aa675af640"} Mar 10 10:01:15 crc kubenswrapper[4794]: I0310 10:01:15.094125 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vftrb" event={"ID":"9797c8e7-cef5-4987-9eb3-6d9214e0e871","Type":"ContainerStarted","Data":"63844b17ae4c1226a3efc8b5612c514b4598b25182464736bceb273cc4ada546"} Mar 10 10:01:15 crc kubenswrapper[4794]: I0310 10:01:15.094365 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:15 crc kubenswrapper[4794]: I0310 10:01:15.124842 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-vftrb" podStartSLOduration=4.210684564 podStartE2EDuration="11.124824943s" podCreationTimestamp="2026-03-10 10:01:04 +0000 UTC" firstStartedPulling="2026-03-10 10:01:04.650611462 +0000 UTC m=+1013.406782280" lastFinishedPulling="2026-03-10 10:01:11.564751841 +0000 UTC m=+1020.320922659" observedRunningTime="2026-03-10 10:01:15.123066099 +0000 UTC m=+1023.879236927" watchObservedRunningTime="2026-03-10 10:01:15.124824943 +0000 UTC m=+1023.880995771" Mar 10 10:01:15 crc kubenswrapper[4794]: I0310 10:01:15.196031 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-cdztl" Mar 10 10:01:16 crc kubenswrapper[4794]: I0310 10:01:16.633864 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4"] Mar 10 10:01:16 crc kubenswrapper[4794]: I0310 10:01:16.635518 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" Mar 10 10:01:16 crc kubenswrapper[4794]: I0310 10:01:16.643250 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 10:01:16 crc kubenswrapper[4794]: I0310 10:01:16.649095 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4"] Mar 10 10:01:16 crc kubenswrapper[4794]: I0310 10:01:16.734600 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ad61494-e023-4b0b-babf-452f7e2a5532-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4\" (UID: \"1ad61494-e023-4b0b-babf-452f7e2a5532\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" Mar 10 10:01:16 crc kubenswrapper[4794]: I0310 10:01:16.734756 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ftr6\" (UniqueName: \"kubernetes.io/projected/1ad61494-e023-4b0b-babf-452f7e2a5532-kube-api-access-4ftr6\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4\" (UID: \"1ad61494-e023-4b0b-babf-452f7e2a5532\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" Mar 10 10:01:16 crc kubenswrapper[4794]: I0310 10:01:16.734786 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ad61494-e023-4b0b-babf-452f7e2a5532-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4\" (UID: \"1ad61494-e023-4b0b-babf-452f7e2a5532\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" Mar 10 10:01:16 crc kubenswrapper[4794]: I0310 10:01:16.835932 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ad61494-e023-4b0b-babf-452f7e2a5532-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4\" (UID: \"1ad61494-e023-4b0b-babf-452f7e2a5532\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" Mar 10 10:01:16 crc kubenswrapper[4794]: I0310 10:01:16.836093 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ad61494-e023-4b0b-babf-452f7e2a5532-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4\" (UID: \"1ad61494-e023-4b0b-babf-452f7e2a5532\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" Mar 10 10:01:16 crc kubenswrapper[4794]: I0310 10:01:16.836131 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ftr6\" (UniqueName: \"kubernetes.io/projected/1ad61494-e023-4b0b-babf-452f7e2a5532-kube-api-access-4ftr6\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4\" (UID: \"1ad61494-e023-4b0b-babf-452f7e2a5532\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" Mar 10 10:01:16 crc kubenswrapper[4794]: I0310 10:01:16.836456 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/1ad61494-e023-4b0b-babf-452f7e2a5532-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4\" (UID: \"1ad61494-e023-4b0b-babf-452f7e2a5532\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" Mar 10 10:01:16 crc kubenswrapper[4794]: I0310 10:01:16.836628 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ad61494-e023-4b0b-babf-452f7e2a5532-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4\" (UID: \"1ad61494-e023-4b0b-babf-452f7e2a5532\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" Mar 10 10:01:16 crc kubenswrapper[4794]: I0310 10:01:16.854056 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ftr6\" (UniqueName: \"kubernetes.io/projected/1ad61494-e023-4b0b-babf-452f7e2a5532-kube-api-access-4ftr6\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4\" (UID: \"1ad61494-e023-4b0b-babf-452f7e2a5532\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" Mar 10 10:01:16 crc kubenswrapper[4794]: I0310 10:01:16.954380 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" Mar 10 10:01:17 crc kubenswrapper[4794]: I0310 10:01:17.168358 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4"] Mar 10 10:01:17 crc kubenswrapper[4794]: W0310 10:01:17.179497 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ad61494_e023_4b0b_babf_452f7e2a5532.slice/crio-196c6550bfcc4b9e3e868b49c2ce8f698628ab803e3cd73ff842747238c438d4 WatchSource:0}: Error finding container 196c6550bfcc4b9e3e868b49c2ce8f698628ab803e3cd73ff842747238c438d4: Status 404 returned error can't find the container with id 196c6550bfcc4b9e3e868b49c2ce8f698628ab803e3cd73ff842747238c438d4 Mar 10 10:01:18 crc kubenswrapper[4794]: I0310 10:01:18.111980 4794 generic.go:334] "Generic (PLEG): container finished" podID="1ad61494-e023-4b0b-babf-452f7e2a5532" containerID="0daf4c0e921669cfccfb4d62b1d6e7d42be2ea67679e1246c307174d3246441d" exitCode=0 Mar 10 10:01:18 crc kubenswrapper[4794]: I0310 10:01:18.112036 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" event={"ID":"1ad61494-e023-4b0b-babf-452f7e2a5532","Type":"ContainerDied","Data":"0daf4c0e921669cfccfb4d62b1d6e7d42be2ea67679e1246c307174d3246441d"} Mar 10 10:01:18 crc kubenswrapper[4794]: I0310 10:01:18.112320 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" event={"ID":"1ad61494-e023-4b0b-babf-452f7e2a5532","Type":"ContainerStarted","Data":"196c6550bfcc4b9e3e868b49c2ce8f698628ab803e3cd73ff842747238c438d4"} Mar 10 10:01:19 crc kubenswrapper[4794]: I0310 10:01:19.540594 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:19 crc kubenswrapper[4794]: I0310 10:01:19.580399 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:23 crc kubenswrapper[4794]: 
I0310 10:01:23.155668 4794 generic.go:334] "Generic (PLEG): container finished" podID="1ad61494-e023-4b0b-babf-452f7e2a5532" containerID="897b5da08807f6ef90132bf0ec9ae54ac05e5cfe4784cf9ec11186ab96f20672" exitCode=0 Mar 10 10:01:23 crc kubenswrapper[4794]: I0310 10:01:23.155770 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" event={"ID":"1ad61494-e023-4b0b-babf-452f7e2a5532","Type":"ContainerDied","Data":"897b5da08807f6ef90132bf0ec9ae54ac05e5cfe4784cf9ec11186ab96f20672"} Mar 10 10:01:24 crc kubenswrapper[4794]: I0310 10:01:24.164205 4794 generic.go:334] "Generic (PLEG): container finished" podID="1ad61494-e023-4b0b-babf-452f7e2a5532" containerID="9b6b975b7f24c6317be16f55af96b19a0d9e6785a5fa1f8af8696b5cc359f9f2" exitCode=0 Mar 10 10:01:24 crc kubenswrapper[4794]: I0310 10:01:24.164252 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" event={"ID":"1ad61494-e023-4b0b-babf-452f7e2a5532","Type":"ContainerDied","Data":"9b6b975b7f24c6317be16f55af96b19a0d9e6785a5fa1f8af8696b5cc359f9f2"} Mar 10 10:01:24 crc kubenswrapper[4794]: I0310 10:01:24.545611 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-vftrb" Mar 10 10:01:25 crc kubenswrapper[4794]: I0310 10:01:25.122706 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-tvkkq" Mar 10 10:01:25 crc kubenswrapper[4794]: I0310 10:01:25.437620 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" Mar 10 10:01:25 crc kubenswrapper[4794]: I0310 10:01:25.553986 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ftr6\" (UniqueName: \"kubernetes.io/projected/1ad61494-e023-4b0b-babf-452f7e2a5532-kube-api-access-4ftr6\") pod \"1ad61494-e023-4b0b-babf-452f7e2a5532\" (UID: \"1ad61494-e023-4b0b-babf-452f7e2a5532\") " Mar 10 10:01:25 crc kubenswrapper[4794]: I0310 10:01:25.554281 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ad61494-e023-4b0b-babf-452f7e2a5532-bundle\") pod \"1ad61494-e023-4b0b-babf-452f7e2a5532\" (UID: \"1ad61494-e023-4b0b-babf-452f7e2a5532\") " Mar 10 10:01:25 crc kubenswrapper[4794]: I0310 10:01:25.554431 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ad61494-e023-4b0b-babf-452f7e2a5532-util\") pod \"1ad61494-e023-4b0b-babf-452f7e2a5532\" (UID: \"1ad61494-e023-4b0b-babf-452f7e2a5532\") " Mar 10 10:01:25 crc kubenswrapper[4794]: I0310 10:01:25.555094 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ad61494-e023-4b0b-babf-452f7e2a5532-bundle" (OuterVolumeSpecName: "bundle") pod "1ad61494-e023-4b0b-babf-452f7e2a5532" (UID: "1ad61494-e023-4b0b-babf-452f7e2a5532"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:01:25 crc kubenswrapper[4794]: I0310 10:01:25.560144 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ad61494-e023-4b0b-babf-452f7e2a5532-kube-api-access-4ftr6" (OuterVolumeSpecName: "kube-api-access-4ftr6") pod "1ad61494-e023-4b0b-babf-452f7e2a5532" (UID: "1ad61494-e023-4b0b-babf-452f7e2a5532"). InnerVolumeSpecName "kube-api-access-4ftr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:01:25 crc kubenswrapper[4794]: I0310 10:01:25.565998 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ad61494-e023-4b0b-babf-452f7e2a5532-util" (OuterVolumeSpecName: "util") pod "1ad61494-e023-4b0b-babf-452f7e2a5532" (UID: "1ad61494-e023-4b0b-babf-452f7e2a5532"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:01:25 crc kubenswrapper[4794]: I0310 10:01:25.656262 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ftr6\" (UniqueName: \"kubernetes.io/projected/1ad61494-e023-4b0b-babf-452f7e2a5532-kube-api-access-4ftr6\") on node \"crc\" DevicePath \"\"" Mar 10 10:01:25 crc kubenswrapper[4794]: I0310 10:01:25.656304 4794 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ad61494-e023-4b0b-babf-452f7e2a5532-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:01:25 crc kubenswrapper[4794]: I0310 10:01:25.656317 4794 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ad61494-e023-4b0b-babf-452f7e2a5532-util\") on node \"crc\" DevicePath \"\"" Mar 10 10:01:26 crc kubenswrapper[4794]: I0310 10:01:26.180620 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" event={"ID":"1ad61494-e023-4b0b-babf-452f7e2a5532","Type":"ContainerDied","Data":"196c6550bfcc4b9e3e868b49c2ce8f698628ab803e3cd73ff842747238c438d4"} Mar 10 10:01:26 crc kubenswrapper[4794]: I0310 10:01:26.180682 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="196c6550bfcc4b9e3e868b49c2ce8f698628ab803e3cd73ff842747238c438d4" Mar 10 10:01:26 crc kubenswrapper[4794]: I0310 10:01:26.180795 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4" Mar 10 10:01:29 crc kubenswrapper[4794]: I0310 10:01:29.438233 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6r9n4"] Mar 10 10:01:29 crc kubenswrapper[4794]: E0310 10:01:29.438948 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad61494-e023-4b0b-babf-452f7e2a5532" containerName="util" Mar 10 10:01:29 crc kubenswrapper[4794]: I0310 10:01:29.438971 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad61494-e023-4b0b-babf-452f7e2a5532" containerName="util" Mar 10 10:01:29 crc kubenswrapper[4794]: E0310 10:01:29.439009 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad61494-e023-4b0b-babf-452f7e2a5532" containerName="extract" Mar 10 10:01:29 crc kubenswrapper[4794]: I0310 10:01:29.439019 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad61494-e023-4b0b-babf-452f7e2a5532" containerName="extract" Mar 10 10:01:29 crc kubenswrapper[4794]: E0310 10:01:29.439038 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad61494-e023-4b0b-babf-452f7e2a5532" containerName="pull" Mar 10 10:01:29 crc kubenswrapper[4794]: I0310 10:01:29.439049 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad61494-e023-4b0b-babf-452f7e2a5532" containerName="pull" Mar 10 10:01:29 crc kubenswrapper[4794]: I0310 10:01:29.439218 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ad61494-e023-4b0b-babf-452f7e2a5532" containerName="extract" Mar 10 10:01:29 crc kubenswrapper[4794]: I0310 10:01:29.439901 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6r9n4" Mar 10 10:01:29 crc kubenswrapper[4794]: W0310 10:01:29.441983 4794 reflector.go:561] object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-grdbm": failed to list *v1.Secret: secrets "cert-manager-operator-controller-manager-dockercfg-grdbm" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "cert-manager-operator": no relationship found between node 'crc' and this object Mar 10 10:01:29 crc kubenswrapper[4794]: W0310 10:01:29.442112 4794 reflector.go:561] object-"cert-manager-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "cert-manager-operator": no relationship found between node 'crc' and this object Mar 10 10:01:29 crc kubenswrapper[4794]: E0310 10:01:29.442268 4794 reflector.go:158] "Unhandled Error" err="object-\"cert-manager-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"cert-manager-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 10:01:29 crc kubenswrapper[4794]: E0310 10:01:29.442204 4794 reflector.go:158] "Unhandled Error" err="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-grdbm\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-manager-operator-controller-manager-dockercfg-grdbm\" is forbidden: User \"system:node:crc\" 
cannot list resource \"secrets\" in API group \"\" in the namespace \"cert-manager-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 10:01:29 crc kubenswrapper[4794]: W0310 10:01:29.443004 4794 reflector.go:561] object-"cert-manager-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "cert-manager-operator": no relationship found between node 'crc' and this object Mar 10 10:01:29 crc kubenswrapper[4794]: E0310 10:01:29.443048 4794 reflector.go:158] "Unhandled Error" err="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"cert-manager-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 10:01:29 crc kubenswrapper[4794]: I0310 10:01:29.460546 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6r9n4"] Mar 10 10:01:29 crc kubenswrapper[4794]: I0310 10:01:29.609545 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzj48\" (UniqueName: \"kubernetes.io/projected/e64e5b5f-c243-452f-afda-f69d1fd2a341-kube-api-access-dzj48\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6r9n4\" (UID: \"e64e5b5f-c243-452f-afda-f69d1fd2a341\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6r9n4" Mar 10 10:01:29 crc kubenswrapper[4794]: I0310 10:01:29.609673 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e64e5b5f-c243-452f-afda-f69d1fd2a341-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6r9n4\" (UID: \"e64e5b5f-c243-452f-afda-f69d1fd2a341\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6r9n4" Mar 10 10:01:29 crc kubenswrapper[4794]: I0310 10:01:29.710922 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzj48\" (UniqueName: \"kubernetes.io/projected/e64e5b5f-c243-452f-afda-f69d1fd2a341-kube-api-access-dzj48\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6r9n4\" (UID: \"e64e5b5f-c243-452f-afda-f69d1fd2a341\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6r9n4" Mar 10 10:01:29 crc kubenswrapper[4794]: I0310 10:01:29.711071 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e64e5b5f-c243-452f-afda-f69d1fd2a341-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6r9n4\" (UID: \"e64e5b5f-c243-452f-afda-f69d1fd2a341\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6r9n4" Mar 10 10:01:29 crc kubenswrapper[4794]: I0310 10:01:29.711551 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e64e5b5f-c243-452f-afda-f69d1fd2a341-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6r9n4\" (UID: \"e64e5b5f-c243-452f-afda-f69d1fd2a341\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6r9n4" Mar 10 10:01:30 crc kubenswrapper[4794]: I0310 10:01:30.296087 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 10 10:01:30 crc kubenswrapper[4794]: I0310 10:01:30.677420 4794 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-grdbm" Mar 10 10:01:30 crc kubenswrapper[4794]: I0310 10:01:30.899947 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 10 10:01:30 crc kubenswrapper[4794]: I0310 10:01:30.910819 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzj48\" (UniqueName: \"kubernetes.io/projected/e64e5b5f-c243-452f-afda-f69d1fd2a341-kube-api-access-dzj48\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6r9n4\" (UID: \"e64e5b5f-c243-452f-afda-f69d1fd2a341\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6r9n4" Mar 10 10:01:30 crc kubenswrapper[4794]: I0310 10:01:30.987502 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6r9n4" Mar 10 10:01:31 crc kubenswrapper[4794]: I0310 10:01:31.424022 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6r9n4"] Mar 10 10:01:31 crc kubenswrapper[4794]: W0310 10:01:31.429528 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode64e5b5f_c243_452f_afda_f69d1fd2a341.slice/crio-9deb3854301a4e863bf3b5b6956a799f638cf6eaa1a9ff2b7aacc23398f88820 WatchSource:0}: Error finding container 9deb3854301a4e863bf3b5b6956a799f638cf6eaa1a9ff2b7aacc23398f88820: Status 404 returned error can't find the container with id 9deb3854301a4e863bf3b5b6956a799f638cf6eaa1a9ff2b7aacc23398f88820 Mar 10 10:01:32 crc kubenswrapper[4794]: I0310 10:01:32.217554 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6r9n4" event={"ID":"e64e5b5f-c243-452f-afda-f69d1fd2a341","Type":"ContainerStarted","Data":"9deb3854301a4e863bf3b5b6956a799f638cf6eaa1a9ff2b7aacc23398f88820"} Mar 10 10:01:35 crc kubenswrapper[4794]: I0310 10:01:35.238912 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6r9n4" event={"ID":"e64e5b5f-c243-452f-afda-f69d1fd2a341","Type":"ContainerStarted","Data":"c16a2851be47b9cca5915efa8415e38831af31895ee07f472d3d384b943449fe"} Mar 10 10:01:38 crc kubenswrapper[4794]: I0310 10:01:38.436100 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6r9n4" podStartSLOduration=6.440114288 podStartE2EDuration="9.436076766s" podCreationTimestamp="2026-03-10 10:01:29 +0000 UTC" firstStartedPulling="2026-03-10 10:01:31.43278758 +0000 UTC m=+1040.188958438" lastFinishedPulling="2026-03-10 10:01:34.428750098 +0000 UTC m=+1043.184920916" observedRunningTime="2026-03-10 10:01:35.256673605 +0000 UTC m=+1044.012844423" watchObservedRunningTime="2026-03-10 10:01:38.436076766 +0000 UTC m=+1047.192247624" Mar 10 10:01:38 crc kubenswrapper[4794]: I0310 10:01:38.442716 4794 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-4dfk8"] Mar 10 10:01:38 crc kubenswrapper[4794]: I0310 10:01:38.443956 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-4dfk8" Mar 10 10:01:38 crc kubenswrapper[4794]: I0310 10:01:38.447563 4794 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-tvc5v" Mar 10 10:01:38 crc kubenswrapper[4794]: I0310 10:01:38.447966 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 10 10:01:38 crc kubenswrapper[4794]: I0310 10:01:38.448021 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 10 10:01:38 crc kubenswrapper[4794]: I0310 10:01:38.448692 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-4dfk8"] Mar 10 10:01:38 crc kubenswrapper[4794]: I0310 10:01:38.525636 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1aab61bc-b01a-4965-96bd-261ceaca0636-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-4dfk8\" (UID: \"1aab61bc-b01a-4965-96bd-261ceaca0636\") " pod="cert-manager/cert-manager-webhook-6888856db4-4dfk8" Mar 10 10:01:38 crc kubenswrapper[4794]: I0310 10:01:38.525704 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbrlg\" (UniqueName: \"kubernetes.io/projected/1aab61bc-b01a-4965-96bd-261ceaca0636-kube-api-access-jbrlg\") pod \"cert-manager-webhook-6888856db4-4dfk8\" (UID: \"1aab61bc-b01a-4965-96bd-261ceaca0636\") " pod="cert-manager/cert-manager-webhook-6888856db4-4dfk8" Mar 10 10:01:38 crc kubenswrapper[4794]: I0310 10:01:38.627386 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1aab61bc-b01a-4965-96bd-261ceaca0636-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-4dfk8\" (UID: \"1aab61bc-b01a-4965-96bd-261ceaca0636\") " pod="cert-manager/cert-manager-webhook-6888856db4-4dfk8" Mar 10 10:01:38 crc kubenswrapper[4794]: I0310 10:01:38.627437 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbrlg\" (UniqueName: \"kubernetes.io/projected/1aab61bc-b01a-4965-96bd-261ceaca0636-kube-api-access-jbrlg\") pod \"cert-manager-webhook-6888856db4-4dfk8\" (UID: \"1aab61bc-b01a-4965-96bd-261ceaca0636\") " pod="cert-manager/cert-manager-webhook-6888856db4-4dfk8" Mar 10 10:01:38 crc kubenswrapper[4794]: I0310 10:01:38.645550 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbrlg\" (UniqueName: \"kubernetes.io/projected/1aab61bc-b01a-4965-96bd-261ceaca0636-kube-api-access-jbrlg\") pod \"cert-manager-webhook-6888856db4-4dfk8\" (UID: \"1aab61bc-b01a-4965-96bd-261ceaca0636\") " pod="cert-manager/cert-manager-webhook-6888856db4-4dfk8" Mar 10 10:01:38 crc kubenswrapper[4794]: I0310 10:01:38.654552 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1aab61bc-b01a-4965-96bd-261ceaca0636-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-4dfk8\" (UID: \"1aab61bc-b01a-4965-96bd-261ceaca0636\") " pod="cert-manager/cert-manager-webhook-6888856db4-4dfk8" Mar 10 10:01:38 crc 
kubenswrapper[4794]: I0310 10:01:38.766914 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-4dfk8" Mar 10 10:01:39 crc kubenswrapper[4794]: I0310 10:01:39.326794 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-4dfk8"] Mar 10 10:01:39 crc kubenswrapper[4794]: I0310 10:01:39.511050 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-h6lmz"] Mar 10 10:01:39 crc kubenswrapper[4794]: I0310 10:01:39.511827 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-h6lmz" Mar 10 10:01:39 crc kubenswrapper[4794]: I0310 10:01:39.519784 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-h6lmz"] Mar 10 10:01:39 crc kubenswrapper[4794]: I0310 10:01:39.520805 4794 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-8tt8d" Mar 10 10:01:39 crc kubenswrapper[4794]: I0310 10:01:39.641446 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9d3b9ca-7539-4c04-a6f1-7332587d32ff-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-h6lmz\" (UID: \"c9d3b9ca-7539-4c04-a6f1-7332587d32ff\") " pod="cert-manager/cert-manager-cainjector-5545bd876-h6lmz" Mar 10 10:01:39 crc kubenswrapper[4794]: I0310 10:01:39.641492 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz7xh\" (UniqueName: \"kubernetes.io/projected/c9d3b9ca-7539-4c04-a6f1-7332587d32ff-kube-api-access-pz7xh\") pod \"cert-manager-cainjector-5545bd876-h6lmz\" (UID: \"c9d3b9ca-7539-4c04-a6f1-7332587d32ff\") " pod="cert-manager/cert-manager-cainjector-5545bd876-h6lmz" Mar 10 10:01:39 crc kubenswrapper[4794]: I0310 10:01:39.743496 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9d3b9ca-7539-4c04-a6f1-7332587d32ff-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-h6lmz\" (UID: \"c9d3b9ca-7539-4c04-a6f1-7332587d32ff\") " pod="cert-manager/cert-manager-cainjector-5545bd876-h6lmz" Mar 10 10:01:39 crc kubenswrapper[4794]: I0310 10:01:39.743554 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz7xh\" (UniqueName: \"kubernetes.io/projected/c9d3b9ca-7539-4c04-a6f1-7332587d32ff-kube-api-access-pz7xh\") pod \"cert-manager-cainjector-5545bd876-h6lmz\" (UID: \"c9d3b9ca-7539-4c04-a6f1-7332587d32ff\") " pod="cert-manager/cert-manager-cainjector-5545bd876-h6lmz" Mar 10 10:01:39 crc kubenswrapper[4794]: I0310 10:01:39.760355 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9d3b9ca-7539-4c04-a6f1-7332587d32ff-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-h6lmz\" (UID: \"c9d3b9ca-7539-4c04-a6f1-7332587d32ff\") " pod="cert-manager/cert-manager-cainjector-5545bd876-h6lmz" Mar 10 10:01:39 crc kubenswrapper[4794]: I0310 10:01:39.760442 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz7xh\" (UniqueName: \"kubernetes.io/projected/c9d3b9ca-7539-4c04-a6f1-7332587d32ff-kube-api-access-pz7xh\") pod \"cert-manager-cainjector-5545bd876-h6lmz\" (UID: 
\"c9d3b9ca-7539-4c04-a6f1-7332587d32ff\") " pod="cert-manager/cert-manager-cainjector-5545bd876-h6lmz" Mar 10 10:01:39 crc kubenswrapper[4794]: I0310 10:01:39.830891 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-h6lmz" Mar 10 10:01:40 crc kubenswrapper[4794]: I0310 10:01:40.007789 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-h6lmz"] Mar 10 10:01:40 crc kubenswrapper[4794]: W0310 10:01:40.015729 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9d3b9ca_7539_4c04_a6f1_7332587d32ff.slice/crio-38f49823befa45f8157f8db2a8e849df20b454471114da6b2630c7749c4ab734 WatchSource:0}: Error finding container 38f49823befa45f8157f8db2a8e849df20b454471114da6b2630c7749c4ab734: Status 404 returned error can't find the container with id 38f49823befa45f8157f8db2a8e849df20b454471114da6b2630c7749c4ab734 Mar 10 10:01:40 crc kubenswrapper[4794]: I0310 10:01:40.268716 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-4dfk8" event={"ID":"1aab61bc-b01a-4965-96bd-261ceaca0636","Type":"ContainerStarted","Data":"dab5bc76e6a3383485a7063995ee74b75ea142b4a0ad7f450ddc3a86465d3dff"} Mar 10 10:01:40 crc kubenswrapper[4794]: I0310 10:01:40.270070 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-h6lmz" event={"ID":"c9d3b9ca-7539-4c04-a6f1-7332587d32ff","Type":"ContainerStarted","Data":"38f49823befa45f8157f8db2a8e849df20b454471114da6b2630c7749c4ab734"} Mar 10 10:01:45 crc kubenswrapper[4794]: I0310 10:01:45.303168 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-h6lmz" event={"ID":"c9d3b9ca-7539-4c04-a6f1-7332587d32ff","Type":"ContainerStarted","Data":"6d848ab68a8f383f6f7c2be1c33397f89327a2ce5003f033385dedf3e3f31afc"} Mar 10 10:01:45 crc kubenswrapper[4794]: I0310 10:01:45.304468 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-4dfk8" event={"ID":"1aab61bc-b01a-4965-96bd-261ceaca0636","Type":"ContainerStarted","Data":"b369389c3a0806cee9f71f68b090bf1fc91cad18e5c637ae563cdadf99f44754"} Mar 10 10:01:45 crc kubenswrapper[4794]: I0310 10:01:45.304604 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-4dfk8" Mar 10 10:01:45 crc kubenswrapper[4794]: I0310 10:01:45.316470 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-h6lmz" podStartSLOduration=2.11203856 podStartE2EDuration="6.316454341s" podCreationTimestamp="2026-03-10 10:01:39 +0000 UTC" firstStartedPulling="2026-03-10 10:01:40.01739529 +0000 UTC m=+1048.773566108" lastFinishedPulling="2026-03-10 10:01:44.221811071 +0000 UTC m=+1052.977981889" observedRunningTime="2026-03-10 10:01:45.315288846 +0000 UTC m=+1054.071459684" watchObservedRunningTime="2026-03-10 10:01:45.316454341 +0000 UTC m=+1054.072625159" Mar 10 10:01:45 crc kubenswrapper[4794]: I0310 10:01:45.330035 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-4dfk8" podStartSLOduration=2.446565875 podStartE2EDuration="7.330021526s" podCreationTimestamp="2026-03-10 10:01:38 +0000 UTC" firstStartedPulling="2026-03-10 10:01:39.332526521 +0000 UTC m=+1048.088697339" 
lastFinishedPulling="2026-03-10 10:01:44.215982162 +0000 UTC m=+1052.972152990" observedRunningTime="2026-03-10 10:01:45.328096828 +0000 UTC m=+1054.084267656" watchObservedRunningTime="2026-03-10 10:01:45.330021526 +0000 UTC m=+1054.086192344" Mar 10 10:01:53 crc kubenswrapper[4794]: I0310 10:01:53.770956 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-4dfk8" Mar 10 10:01:56 crc kubenswrapper[4794]: I0310 10:01:56.463945 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-lhr9c"] Mar 10 10:01:56 crc kubenswrapper[4794]: I0310 10:01:56.465423 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-lhr9c" Mar 10 10:01:56 crc kubenswrapper[4794]: I0310 10:01:56.468161 4794 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-hq7ww" Mar 10 10:01:56 crc kubenswrapper[4794]: I0310 10:01:56.479186 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-lhr9c"] Mar 10 10:01:56 crc kubenswrapper[4794]: I0310 10:01:56.569968 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8fa8c73-f52b-4cee-a7ff-1c0288142c4c-bound-sa-token\") pod \"cert-manager-545d4d4674-lhr9c\" (UID: \"c8fa8c73-f52b-4cee-a7ff-1c0288142c4c\") " pod="cert-manager/cert-manager-545d4d4674-lhr9c" Mar 10 10:01:56 crc kubenswrapper[4794]: I0310 10:01:56.570055 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6xz9\" (UniqueName: \"kubernetes.io/projected/c8fa8c73-f52b-4cee-a7ff-1c0288142c4c-kube-api-access-n6xz9\") pod \"cert-manager-545d4d4674-lhr9c\" (UID: \"c8fa8c73-f52b-4cee-a7ff-1c0288142c4c\") " pod="cert-manager/cert-manager-545d4d4674-lhr9c" Mar 10 10:01:56 crc kubenswrapper[4794]: I0310 10:01:56.671301 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8fa8c73-f52b-4cee-a7ff-1c0288142c4c-bound-sa-token\") pod \"cert-manager-545d4d4674-lhr9c\" (UID: \"c8fa8c73-f52b-4cee-a7ff-1c0288142c4c\") " pod="cert-manager/cert-manager-545d4d4674-lhr9c" Mar 10 10:01:56 crc kubenswrapper[4794]: I0310 10:01:56.671447 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6xz9\" (UniqueName: \"kubernetes.io/projected/c8fa8c73-f52b-4cee-a7ff-1c0288142c4c-kube-api-access-n6xz9\") pod \"cert-manager-545d4d4674-lhr9c\" (UID: \"c8fa8c73-f52b-4cee-a7ff-1c0288142c4c\") " pod="cert-manager/cert-manager-545d4d4674-lhr9c" Mar 10 10:01:56 crc kubenswrapper[4794]: I0310 10:01:56.689188 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8fa8c73-f52b-4cee-a7ff-1c0288142c4c-bound-sa-token\") pod \"cert-manager-545d4d4674-lhr9c\" (UID: \"c8fa8c73-f52b-4cee-a7ff-1c0288142c4c\") " pod="cert-manager/cert-manager-545d4d4674-lhr9c" Mar 10 10:01:56 crc kubenswrapper[4794]: I0310 10:01:56.691394 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6xz9\" (UniqueName: \"kubernetes.io/projected/c8fa8c73-f52b-4cee-a7ff-1c0288142c4c-kube-api-access-n6xz9\") pod \"cert-manager-545d4d4674-lhr9c\" (UID: \"c8fa8c73-f52b-4cee-a7ff-1c0288142c4c\") " 
pod="cert-manager/cert-manager-545d4d4674-lhr9c" Mar 10 10:01:56 crc kubenswrapper[4794]: I0310 10:01:56.784502 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-lhr9c" Mar 10 10:01:57 crc kubenswrapper[4794]: I0310 10:01:57.029245 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-lhr9c"] Mar 10 10:01:57 crc kubenswrapper[4794]: I0310 10:01:57.383285 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-lhr9c" event={"ID":"c8fa8c73-f52b-4cee-a7ff-1c0288142c4c","Type":"ContainerStarted","Data":"212d3255cbb8fa7e1c72c9b295fa993bbabd6494f14febef0a18fa1784aaec81"} Mar 10 10:01:57 crc kubenswrapper[4794]: I0310 10:01:57.384219 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-lhr9c" event={"ID":"c8fa8c73-f52b-4cee-a7ff-1c0288142c4c","Type":"ContainerStarted","Data":"9443286fd9bef8d6ed5e7511e60428b48d74c1820af59b1117e9e56f1a575dc5"} Mar 10 10:01:57 crc kubenswrapper[4794]: I0310 10:01:57.403086 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-lhr9c" podStartSLOduration=1.403056679 podStartE2EDuration="1.403056679s" podCreationTimestamp="2026-03-10 10:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:01:57.40181606 +0000 UTC m=+1066.157986908" watchObservedRunningTime="2026-03-10 10:01:57.403056679 +0000 UTC m=+1066.159227527" Mar 10 10:02:00 crc kubenswrapper[4794]: I0310 10:02:00.134584 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552282-5r6p7"] Mar 10 10:02:00 crc kubenswrapper[4794]: I0310 10:02:00.136247 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552282-5r6p7" Mar 10 10:02:00 crc kubenswrapper[4794]: I0310 10:02:00.138271 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:02:00 crc kubenswrapper[4794]: I0310 10:02:00.138619 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:02:00 crc kubenswrapper[4794]: I0310 10:02:00.138778 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:02:00 crc kubenswrapper[4794]: I0310 10:02:00.140114 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552282-5r6p7"] Mar 10 10:02:00 crc kubenswrapper[4794]: I0310 10:02:00.218803 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt55h\" (UniqueName: \"kubernetes.io/projected/ef0c15f0-8d9a-41da-a93f-3799c7e84d28-kube-api-access-mt55h\") pod \"auto-csr-approver-29552282-5r6p7\" (UID: \"ef0c15f0-8d9a-41da-a93f-3799c7e84d28\") " pod="openshift-infra/auto-csr-approver-29552282-5r6p7" Mar 10 10:02:00 crc kubenswrapper[4794]: I0310 10:02:00.320757 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt55h\" (UniqueName: \"kubernetes.io/projected/ef0c15f0-8d9a-41da-a93f-3799c7e84d28-kube-api-access-mt55h\") pod \"auto-csr-approver-29552282-5r6p7\" (UID: \"ef0c15f0-8d9a-41da-a93f-3799c7e84d28\") " pod="openshift-infra/auto-csr-approver-29552282-5r6p7" Mar 10 10:02:00 crc kubenswrapper[4794]: I0310 10:02:00.353611 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt55h\" (UniqueName: \"kubernetes.io/projected/ef0c15f0-8d9a-41da-a93f-3799c7e84d28-kube-api-access-mt55h\") pod \"auto-csr-approver-29552282-5r6p7\" (UID: \"ef0c15f0-8d9a-41da-a93f-3799c7e84d28\") " pod="openshift-infra/auto-csr-approver-29552282-5r6p7" Mar 10 10:02:00 crc kubenswrapper[4794]: I0310 10:02:00.470361 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552282-5r6p7" Mar 10 10:02:00 crc kubenswrapper[4794]: I0310 10:02:00.720826 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552282-5r6p7"] Mar 10 10:02:00 crc kubenswrapper[4794]: W0310 10:02:00.742032 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef0c15f0_8d9a_41da_a93f_3799c7e84d28.slice/crio-08a33ee1a3b8e10e33eebb4c3e3ea03136cf9aa62d580f582f9f107de376a828 WatchSource:0}: Error finding container 08a33ee1a3b8e10e33eebb4c3e3ea03136cf9aa62d580f582f9f107de376a828: Status 404 returned error can't find the container with id 08a33ee1a3b8e10e33eebb4c3e3ea03136cf9aa62d580f582f9f107de376a828 Mar 10 10:02:01 crc kubenswrapper[4794]: I0310 10:02:01.413417 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552282-5r6p7" event={"ID":"ef0c15f0-8d9a-41da-a93f-3799c7e84d28","Type":"ContainerStarted","Data":"08a33ee1a3b8e10e33eebb4c3e3ea03136cf9aa62d580f582f9f107de376a828"} Mar 10 10:02:02 crc kubenswrapper[4794]: I0310 10:02:02.421715 4794 generic.go:334] "Generic (PLEG): container finished" podID="ef0c15f0-8d9a-41da-a93f-3799c7e84d28" containerID="be31c254c980a9b29478470f979589994d488f44e0eb6a6f25567fd7bda621ef" exitCode=0 Mar 10 10:02:02 crc kubenswrapper[4794]: I0310 10:02:02.421799 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552282-5r6p7" event={"ID":"ef0c15f0-8d9a-41da-a93f-3799c7e84d28","Type":"ContainerDied","Data":"be31c254c980a9b29478470f979589994d488f44e0eb6a6f25567fd7bda621ef"} Mar 10 10:02:03 crc kubenswrapper[4794]: I0310 10:02:03.740313 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552282-5r6p7" Mar 10 10:02:03 crc kubenswrapper[4794]: I0310 10:02:03.881795 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt55h\" (UniqueName: \"kubernetes.io/projected/ef0c15f0-8d9a-41da-a93f-3799c7e84d28-kube-api-access-mt55h\") pod \"ef0c15f0-8d9a-41da-a93f-3799c7e84d28\" (UID: \"ef0c15f0-8d9a-41da-a93f-3799c7e84d28\") " Mar 10 10:02:03 crc kubenswrapper[4794]: I0310 10:02:03.894515 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef0c15f0-8d9a-41da-a93f-3799c7e84d28-kube-api-access-mt55h" (OuterVolumeSpecName: "kube-api-access-mt55h") pod "ef0c15f0-8d9a-41da-a93f-3799c7e84d28" (UID: "ef0c15f0-8d9a-41da-a93f-3799c7e84d28"). InnerVolumeSpecName "kube-api-access-mt55h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:02:03 crc kubenswrapper[4794]: I0310 10:02:03.984048 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt55h\" (UniqueName: \"kubernetes.io/projected/ef0c15f0-8d9a-41da-a93f-3799c7e84d28-kube-api-access-mt55h\") on node \"crc\" DevicePath \"\"" Mar 10 10:02:04 crc kubenswrapper[4794]: I0310 10:02:04.443049 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552282-5r6p7" event={"ID":"ef0c15f0-8d9a-41da-a93f-3799c7e84d28","Type":"ContainerDied","Data":"08a33ee1a3b8e10e33eebb4c3e3ea03136cf9aa62d580f582f9f107de376a828"} Mar 10 10:02:04 crc kubenswrapper[4794]: I0310 10:02:04.443087 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08a33ee1a3b8e10e33eebb4c3e3ea03136cf9aa62d580f582f9f107de376a828" Mar 10 10:02:04 crc kubenswrapper[4794]: I0310 10:02:04.443136 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552282-5r6p7" Mar 10 10:02:04 crc kubenswrapper[4794]: I0310 10:02:04.801173 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552276-dc4zw"] Mar 10 10:02:04 crc kubenswrapper[4794]: I0310 10:02:04.805802 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552276-dc4zw"] Mar 10 10:02:06 crc kubenswrapper[4794]: I0310 10:02:06.006790 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d38f5eab-4cb0-42bc-9ceb-576829269e3e" path="/var/lib/kubelet/pods/d38f5eab-4cb0-42bc-9ceb-576829269e3e/volumes" Mar 10 10:02:07 crc kubenswrapper[4794]: I0310 10:02:07.118145 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-72hkz"] Mar 10 10:02:07 crc kubenswrapper[4794]: E0310 10:02:07.118899 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef0c15f0-8d9a-41da-a93f-3799c7e84d28" containerName="oc" Mar 10 10:02:07 crc kubenswrapper[4794]: I0310 10:02:07.118918 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef0c15f0-8d9a-41da-a93f-3799c7e84d28" containerName="oc" Mar 10 10:02:07 crc kubenswrapper[4794]: I0310 10:02:07.119091 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef0c15f0-8d9a-41da-a93f-3799c7e84d28" containerName="oc" Mar 10 10:02:07 crc kubenswrapper[4794]: I0310 10:02:07.119684 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-72hkz" Mar 10 10:02:07 crc kubenswrapper[4794]: I0310 10:02:07.123282 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-n5q8j" Mar 10 10:02:07 crc kubenswrapper[4794]: I0310 10:02:07.123974 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 10 10:02:07 crc kubenswrapper[4794]: I0310 10:02:07.124411 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 10 10:02:07 crc kubenswrapper[4794]: I0310 10:02:07.137165 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-72hkz"] Mar 10 10:02:07 crc kubenswrapper[4794]: I0310 10:02:07.229145 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smvk6\" (UniqueName: \"kubernetes.io/projected/3869e308-2a5b-4aff-84b2-ff1ce169458b-kube-api-access-smvk6\") pod \"openstack-operator-index-72hkz\" (UID: \"3869e308-2a5b-4aff-84b2-ff1ce169458b\") " pod="openstack-operators/openstack-operator-index-72hkz" Mar 10 10:02:07 crc kubenswrapper[4794]: I0310 10:02:07.330495 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smvk6\" (UniqueName: \"kubernetes.io/projected/3869e308-2a5b-4aff-84b2-ff1ce169458b-kube-api-access-smvk6\") pod \"openstack-operator-index-72hkz\" (UID: \"3869e308-2a5b-4aff-84b2-ff1ce169458b\") " pod="openstack-operators/openstack-operator-index-72hkz" Mar 10 10:02:07 crc kubenswrapper[4794]: I0310 10:02:07.350846 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smvk6\" (UniqueName: \"kubernetes.io/projected/3869e308-2a5b-4aff-84b2-ff1ce169458b-kube-api-access-smvk6\") pod \"openstack-operator-index-72hkz\" (UID: \"3869e308-2a5b-4aff-84b2-ff1ce169458b\") " pod="openstack-operators/openstack-operator-index-72hkz" Mar 10 10:02:07 crc kubenswrapper[4794]: I0310 10:02:07.444868 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-72hkz" Mar 10 10:02:07 crc kubenswrapper[4794]: I0310 10:02:07.861227 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-72hkz"] Mar 10 10:02:08 crc kubenswrapper[4794]: I0310 10:02:08.480720 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-72hkz" event={"ID":"3869e308-2a5b-4aff-84b2-ff1ce169458b","Type":"ContainerStarted","Data":"5984a1f718221a7dd4bbef0231b3125d909b8584e236580ad50d30577b4f41c7"} Mar 10 10:02:09 crc kubenswrapper[4794]: I0310 10:02:09.489903 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-72hkz" event={"ID":"3869e308-2a5b-4aff-84b2-ff1ce169458b","Type":"ContainerStarted","Data":"254edfea62aa9da7ab28f7ef70d6e91c8019e3a351eb2634f9902dfed38461da"} Mar 10 10:02:09 crc kubenswrapper[4794]: I0310 10:02:09.510679 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-72hkz" podStartSLOduration=1.710112718 podStartE2EDuration="2.510657548s" podCreationTimestamp="2026-03-10 10:02:07 +0000 UTC" firstStartedPulling="2026-03-10 10:02:07.87269224 +0000 UTC m=+1076.628863058" lastFinishedPulling="2026-03-10 10:02:08.67323707 +0000 UTC m=+1077.429407888" observedRunningTime="2026-03-10 10:02:09.508771421 +0000 UTC m=+1078.264942249" watchObservedRunningTime="2026-03-10 10:02:09.510657548 +0000 UTC m=+1078.266828376" Mar 10 10:02:10 crc kubenswrapper[4794]: I0310 10:02:10.280254 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-72hkz"] Mar 10 10:02:10 crc kubenswrapper[4794]: I0310 10:02:10.886881 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-8fx8m"] Mar 10 10:02:10 crc kubenswrapper[4794]: I0310 10:02:10.887794 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-8fx8m" Mar 10 10:02:10 crc kubenswrapper[4794]: I0310 10:02:10.897985 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-8fx8m"] Mar 10 10:02:10 crc kubenswrapper[4794]: I0310 10:02:10.977807 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zcz6\" (UniqueName: \"kubernetes.io/projected/3b5b4e59-c15f-427b-a84f-08f07cda5dbc-kube-api-access-6zcz6\") pod \"openstack-operator-index-8fx8m\" (UID: \"3b5b4e59-c15f-427b-a84f-08f07cda5dbc\") " pod="openstack-operators/openstack-operator-index-8fx8m" Mar 10 10:02:11 crc kubenswrapper[4794]: I0310 10:02:11.078978 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zcz6\" (UniqueName: \"kubernetes.io/projected/3b5b4e59-c15f-427b-a84f-08f07cda5dbc-kube-api-access-6zcz6\") pod \"openstack-operator-index-8fx8m\" (UID: \"3b5b4e59-c15f-427b-a84f-08f07cda5dbc\") " pod="openstack-operators/openstack-operator-index-8fx8m" Mar 10 10:02:11 crc kubenswrapper[4794]: I0310 10:02:11.117825 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zcz6\" (UniqueName: \"kubernetes.io/projected/3b5b4e59-c15f-427b-a84f-08f07cda5dbc-kube-api-access-6zcz6\") pod \"openstack-operator-index-8fx8m\" (UID: \"3b5b4e59-c15f-427b-a84f-08f07cda5dbc\") " pod="openstack-operators/openstack-operator-index-8fx8m" Mar 10 10:02:11 crc kubenswrapper[4794]: I0310 10:02:11.250079 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-8fx8m" Mar 10 10:02:11 crc kubenswrapper[4794]: I0310 10:02:11.504905 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-72hkz" podUID="3869e308-2a5b-4aff-84b2-ff1ce169458b" containerName="registry-server" containerID="cri-o://254edfea62aa9da7ab28f7ef70d6e91c8019e3a351eb2634f9902dfed38461da" gracePeriod=2 Mar 10 10:02:11 crc kubenswrapper[4794]: I0310 10:02:11.684898 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-8fx8m"] Mar 10 10:02:11 crc kubenswrapper[4794]: W0310 10:02:11.689160 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b5b4e59_c15f_427b_a84f_08f07cda5dbc.slice/crio-80057b45025d223fffd81f1efa2b7997005d851788ebf5ba317cbe1f2bd91610 WatchSource:0}: Error finding container 80057b45025d223fffd81f1efa2b7997005d851788ebf5ba317cbe1f2bd91610: Status 404 returned error can't find the container with id 80057b45025d223fffd81f1efa2b7997005d851788ebf5ba317cbe1f2bd91610 Mar 10 10:02:11 crc kubenswrapper[4794]: I0310 10:02:11.864016 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-72hkz" Mar 10 10:02:11 crc kubenswrapper[4794]: I0310 10:02:11.988835 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smvk6\" (UniqueName: \"kubernetes.io/projected/3869e308-2a5b-4aff-84b2-ff1ce169458b-kube-api-access-smvk6\") pod \"3869e308-2a5b-4aff-84b2-ff1ce169458b\" (UID: \"3869e308-2a5b-4aff-84b2-ff1ce169458b\") " Mar 10 10:02:11 crc kubenswrapper[4794]: I0310 10:02:11.998215 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3869e308-2a5b-4aff-84b2-ff1ce169458b-kube-api-access-smvk6" (OuterVolumeSpecName: "kube-api-access-smvk6") pod "3869e308-2a5b-4aff-84b2-ff1ce169458b" (UID: "3869e308-2a5b-4aff-84b2-ff1ce169458b"). InnerVolumeSpecName "kube-api-access-smvk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:02:12 crc kubenswrapper[4794]: I0310 10:02:12.114363 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smvk6\" (UniqueName: \"kubernetes.io/projected/3869e308-2a5b-4aff-84b2-ff1ce169458b-kube-api-access-smvk6\") on node \"crc\" DevicePath \"\"" Mar 10 10:02:12 crc kubenswrapper[4794]: I0310 10:02:12.512238 4794 generic.go:334] "Generic (PLEG): container finished" podID="3869e308-2a5b-4aff-84b2-ff1ce169458b" containerID="254edfea62aa9da7ab28f7ef70d6e91c8019e3a351eb2634f9902dfed38461da" exitCode=0 Mar 10 10:02:12 crc kubenswrapper[4794]: I0310 10:02:12.512355 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-72hkz" event={"ID":"3869e308-2a5b-4aff-84b2-ff1ce169458b","Type":"ContainerDied","Data":"254edfea62aa9da7ab28f7ef70d6e91c8019e3a351eb2634f9902dfed38461da"} Mar 10 10:02:12 crc kubenswrapper[4794]: I0310 10:02:12.512390 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-72hkz" event={"ID":"3869e308-2a5b-4aff-84b2-ff1ce169458b","Type":"ContainerDied","Data":"5984a1f718221a7dd4bbef0231b3125d909b8584e236580ad50d30577b4f41c7"} Mar 10 10:02:12 crc kubenswrapper[4794]: I0310 10:02:12.512359 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-72hkz" Mar 10 10:02:12 crc kubenswrapper[4794]: I0310 10:02:12.512419 4794 scope.go:117] "RemoveContainer" containerID="254edfea62aa9da7ab28f7ef70d6e91c8019e3a351eb2634f9902dfed38461da" Mar 10 10:02:12 crc kubenswrapper[4794]: I0310 10:02:12.513756 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-8fx8m" event={"ID":"3b5b4e59-c15f-427b-a84f-08f07cda5dbc","Type":"ContainerStarted","Data":"63d68c24a99f3a0f0bbdd7c59b8c4a3c13420ab3107c9b3320806182a421e3d5"} Mar 10 10:02:12 crc kubenswrapper[4794]: I0310 10:02:12.513814 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-8fx8m" event={"ID":"3b5b4e59-c15f-427b-a84f-08f07cda5dbc","Type":"ContainerStarted","Data":"80057b45025d223fffd81f1efa2b7997005d851788ebf5ba317cbe1f2bd91610"} Mar 10 10:02:12 crc kubenswrapper[4794]: I0310 10:02:12.534994 4794 scope.go:117] "RemoveContainer" containerID="254edfea62aa9da7ab28f7ef70d6e91c8019e3a351eb2634f9902dfed38461da" Mar 10 10:02:12 crc kubenswrapper[4794]: E0310 10:02:12.535954 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"254edfea62aa9da7ab28f7ef70d6e91c8019e3a351eb2634f9902dfed38461da\": container with ID starting with 254edfea62aa9da7ab28f7ef70d6e91c8019e3a351eb2634f9902dfed38461da not found: ID does not exist" containerID="254edfea62aa9da7ab28f7ef70d6e91c8019e3a351eb2634f9902dfed38461da" Mar 10 10:02:12 crc kubenswrapper[4794]: I0310 10:02:12.536141 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254edfea62aa9da7ab28f7ef70d6e91c8019e3a351eb2634f9902dfed38461da"} err="failed to get container status \"254edfea62aa9da7ab28f7ef70d6e91c8019e3a351eb2634f9902dfed38461da\": rpc error: code = NotFound desc = could not find container \"254edfea62aa9da7ab28f7ef70d6e91c8019e3a351eb2634f9902dfed38461da\": container with ID starting with 254edfea62aa9da7ab28f7ef70d6e91c8019e3a351eb2634f9902dfed38461da not found: ID does not exist" Mar 10 10:02:12 crc kubenswrapper[4794]: I0310 10:02:12.545426 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-8fx8m" podStartSLOduration=2.095187855 podStartE2EDuration="2.545404283s" podCreationTimestamp="2026-03-10 10:02:10 +0000 UTC" firstStartedPulling="2026-03-10 10:02:11.695543524 +0000 UTC m=+1080.451714342" lastFinishedPulling="2026-03-10 10:02:12.145759952 +0000 UTC m=+1080.901930770" observedRunningTime="2026-03-10 10:02:12.535960894 +0000 UTC m=+1081.292131742" watchObservedRunningTime="2026-03-10 10:02:12.545404283 +0000 UTC m=+1081.301575111" Mar 10 10:02:12 crc kubenswrapper[4794]: I0310 10:02:12.555589 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-72hkz"] Mar 10 10:02:12 crc kubenswrapper[4794]: I0310 10:02:12.560637 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-72hkz"] Mar 10 10:02:14 crc kubenswrapper[4794]: I0310 10:02:14.009511 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3869e308-2a5b-4aff-84b2-ff1ce169458b" path="/var/lib/kubelet/pods/3869e308-2a5b-4aff-84b2-ff1ce169458b/volumes" Mar 10 10:02:21 crc kubenswrapper[4794]: I0310 10:02:21.250846 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-index-8fx8m" Mar 10 10:02:21 crc kubenswrapper[4794]: I0310 10:02:21.251190 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-8fx8m" Mar 10 10:02:21 crc kubenswrapper[4794]: I0310 10:02:21.283898 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-8fx8m" Mar 10 10:02:21 crc kubenswrapper[4794]: I0310 10:02:21.625158 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-8fx8m" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.515974 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl"] Mar 10 10:02:22 crc kubenswrapper[4794]: E0310 10:02:22.516229 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3869e308-2a5b-4aff-84b2-ff1ce169458b" containerName="registry-server" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.516240 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3869e308-2a5b-4aff-84b2-ff1ce169458b" containerName="registry-server" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.516378 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3869e308-2a5b-4aff-84b2-ff1ce169458b" containerName="registry-server" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.517135 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.519287 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-xplxl" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.525220 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl"] Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.562422 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9889ae6-f864-48e1-b8ed-42ae046882fa-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl\" (UID: \"a9889ae6-f864-48e1-b8ed-42ae046882fa\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.562502 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b88xq\" (UniqueName: \"kubernetes.io/projected/a9889ae6-f864-48e1-b8ed-42ae046882fa-kube-api-access-b88xq\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl\" (UID: \"a9889ae6-f864-48e1-b8ed-42ae046882fa\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.562605 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9889ae6-f864-48e1-b8ed-42ae046882fa-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl\" (UID: \"a9889ae6-f864-48e1-b8ed-42ae046882fa\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 
10:02:22.664425 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9889ae6-f864-48e1-b8ed-42ae046882fa-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl\" (UID: \"a9889ae6-f864-48e1-b8ed-42ae046882fa\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.664489 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b88xq\" (UniqueName: \"kubernetes.io/projected/a9889ae6-f864-48e1-b8ed-42ae046882fa-kube-api-access-b88xq\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl\" (UID: \"a9889ae6-f864-48e1-b8ed-42ae046882fa\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.664564 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9889ae6-f864-48e1-b8ed-42ae046882fa-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl\" (UID: \"a9889ae6-f864-48e1-b8ed-42ae046882fa\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.665222 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9889ae6-f864-48e1-b8ed-42ae046882fa-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl\" (UID: \"a9889ae6-f864-48e1-b8ed-42ae046882fa\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.665402 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9889ae6-f864-48e1-b8ed-42ae046882fa-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl\" (UID: \"a9889ae6-f864-48e1-b8ed-42ae046882fa\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.699791 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b88xq\" (UniqueName: \"kubernetes.io/projected/a9889ae6-f864-48e1-b8ed-42ae046882fa-kube-api-access-b88xq\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl\" (UID: \"a9889ae6-f864-48e1-b8ed-42ae046882fa\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.838124 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.967436 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:02:22 crc kubenswrapper[4794]: I0310 10:02:22.967490 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:02:23 crc kubenswrapper[4794]: I0310 10:02:23.318857 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl"] Mar 10 10:02:23 crc kubenswrapper[4794]: I0310 10:02:23.604265 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" event={"ID":"a9889ae6-f864-48e1-b8ed-42ae046882fa","Type":"ContainerStarted","Data":"cc675ee95f2f0f82c02b60171e2a6f7022fa559b152a05163c02925c0c212ff3"} Mar 10 10:02:23 crc kubenswrapper[4794]: I0310 10:02:23.604313 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" event={"ID":"a9889ae6-f864-48e1-b8ed-42ae046882fa","Type":"ContainerStarted","Data":"3f45d0893d700077a7773d9f749799bcf5aa94b2bf9d66b145b401d15cb14cb1"} Mar 10 10:02:24 crc kubenswrapper[4794]: I0310 10:02:24.613853 4794 generic.go:334] "Generic (PLEG): container finished" podID="a9889ae6-f864-48e1-b8ed-42ae046882fa" containerID="cc675ee95f2f0f82c02b60171e2a6f7022fa559b152a05163c02925c0c212ff3" exitCode=0 Mar 10 10:02:24 crc kubenswrapper[4794]: I0310 10:02:24.613897 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" event={"ID":"a9889ae6-f864-48e1-b8ed-42ae046882fa","Type":"ContainerDied","Data":"cc675ee95f2f0f82c02b60171e2a6f7022fa559b152a05163c02925c0c212ff3"} Mar 10 10:02:26 crc kubenswrapper[4794]: I0310 10:02:26.631183 4794 generic.go:334] "Generic (PLEG): container finished" podID="a9889ae6-f864-48e1-b8ed-42ae046882fa" containerID="6fa2c1e0b4837779ee982ea36883a666f8e24a6469c8d9f11cc7a248118088b5" exitCode=0 Mar 10 10:02:26 crc kubenswrapper[4794]: I0310 10:02:26.631253 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" event={"ID":"a9889ae6-f864-48e1-b8ed-42ae046882fa","Type":"ContainerDied","Data":"6fa2c1e0b4837779ee982ea36883a666f8e24a6469c8d9f11cc7a248118088b5"} Mar 10 10:02:27 crc kubenswrapper[4794]: I0310 10:02:27.642618 4794 generic.go:334] "Generic (PLEG): container finished" podID="a9889ae6-f864-48e1-b8ed-42ae046882fa" containerID="bfa93e324c6b11e524c9962a97f0eca9ddba67cb4a57ed90b75b50a445ca2514" exitCode=0 Mar 10 10:02:27 crc kubenswrapper[4794]: I0310 10:02:27.642712 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" 
event={"ID":"a9889ae6-f864-48e1-b8ed-42ae046882fa","Type":"ContainerDied","Data":"bfa93e324c6b11e524c9962a97f0eca9ddba67cb4a57ed90b75b50a445ca2514"} Mar 10 10:02:28 crc kubenswrapper[4794]: I0310 10:02:28.871115 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" Mar 10 10:02:28 crc kubenswrapper[4794]: I0310 10:02:28.966071 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9889ae6-f864-48e1-b8ed-42ae046882fa-bundle\") pod \"a9889ae6-f864-48e1-b8ed-42ae046882fa\" (UID: \"a9889ae6-f864-48e1-b8ed-42ae046882fa\") " Mar 10 10:02:28 crc kubenswrapper[4794]: I0310 10:02:28.966311 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b88xq\" (UniqueName: \"kubernetes.io/projected/a9889ae6-f864-48e1-b8ed-42ae046882fa-kube-api-access-b88xq\") pod \"a9889ae6-f864-48e1-b8ed-42ae046882fa\" (UID: \"a9889ae6-f864-48e1-b8ed-42ae046882fa\") " Mar 10 10:02:28 crc kubenswrapper[4794]: I0310 10:02:28.966495 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9889ae6-f864-48e1-b8ed-42ae046882fa-util\") pod \"a9889ae6-f864-48e1-b8ed-42ae046882fa\" (UID: \"a9889ae6-f864-48e1-b8ed-42ae046882fa\") " Mar 10 10:02:28 crc kubenswrapper[4794]: I0310 10:02:28.967886 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9889ae6-f864-48e1-b8ed-42ae046882fa-bundle" (OuterVolumeSpecName: "bundle") pod "a9889ae6-f864-48e1-b8ed-42ae046882fa" (UID: "a9889ae6-f864-48e1-b8ed-42ae046882fa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:02:28 crc kubenswrapper[4794]: I0310 10:02:28.972328 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9889ae6-f864-48e1-b8ed-42ae046882fa-kube-api-access-b88xq" (OuterVolumeSpecName: "kube-api-access-b88xq") pod "a9889ae6-f864-48e1-b8ed-42ae046882fa" (UID: "a9889ae6-f864-48e1-b8ed-42ae046882fa"). InnerVolumeSpecName "kube-api-access-b88xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:02:29 crc kubenswrapper[4794]: I0310 10:02:29.000319 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9889ae6-f864-48e1-b8ed-42ae046882fa-util" (OuterVolumeSpecName: "util") pod "a9889ae6-f864-48e1-b8ed-42ae046882fa" (UID: "a9889ae6-f864-48e1-b8ed-42ae046882fa"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:02:29 crc kubenswrapper[4794]: I0310 10:02:29.069139 4794 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9889ae6-f864-48e1-b8ed-42ae046882fa-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:02:29 crc kubenswrapper[4794]: I0310 10:02:29.069213 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b88xq\" (UniqueName: \"kubernetes.io/projected/a9889ae6-f864-48e1-b8ed-42ae046882fa-kube-api-access-b88xq\") on node \"crc\" DevicePath \"\"" Mar 10 10:02:29 crc kubenswrapper[4794]: I0310 10:02:29.069244 4794 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9889ae6-f864-48e1-b8ed-42ae046882fa-util\") on node \"crc\" DevicePath \"\"" Mar 10 10:02:29 crc kubenswrapper[4794]: I0310 10:02:29.664326 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" event={"ID":"a9889ae6-f864-48e1-b8ed-42ae046882fa","Type":"ContainerDied","Data":"3f45d0893d700077a7773d9f749799bcf5aa94b2bf9d66b145b401d15cb14cb1"} Mar 10 10:02:29 crc kubenswrapper[4794]: I0310 10:02:29.664407 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f45d0893d700077a7773d9f749799bcf5aa94b2bf9d66b145b401d15cb14cb1" Mar 10 10:02:29 crc kubenswrapper[4794]: I0310 10:02:29.664470 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl" Mar 10 10:02:29 crc kubenswrapper[4794]: I0310 10:02:29.681324 4794 scope.go:117] "RemoveContainer" containerID="991701b72851d56d951b339ec74203a6d0acbb169329704a4623a42a0f002326" Mar 10 10:02:34 crc kubenswrapper[4794]: I0310 10:02:34.821559 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-pbmxv"] Mar 10 10:02:34 crc kubenswrapper[4794]: E0310 10:02:34.822382 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9889ae6-f864-48e1-b8ed-42ae046882fa" containerName="pull" Mar 10 10:02:34 crc kubenswrapper[4794]: I0310 10:02:34.822397 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9889ae6-f864-48e1-b8ed-42ae046882fa" containerName="pull" Mar 10 10:02:34 crc kubenswrapper[4794]: E0310 10:02:34.822408 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9889ae6-f864-48e1-b8ed-42ae046882fa" containerName="extract" Mar 10 10:02:34 crc kubenswrapper[4794]: I0310 10:02:34.822415 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9889ae6-f864-48e1-b8ed-42ae046882fa" containerName="extract" Mar 10 10:02:34 crc kubenswrapper[4794]: E0310 10:02:34.822428 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9889ae6-f864-48e1-b8ed-42ae046882fa" containerName="util" Mar 10 10:02:34 crc kubenswrapper[4794]: I0310 10:02:34.822434 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9889ae6-f864-48e1-b8ed-42ae046882fa" containerName="util" Mar 10 10:02:34 crc kubenswrapper[4794]: I0310 10:02:34.822571 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9889ae6-f864-48e1-b8ed-42ae046882fa" containerName="extract" Mar 10 10:02:34 crc kubenswrapper[4794]: I0310 10:02:34.823034 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-pbmxv" Mar 10 10:02:34 crc kubenswrapper[4794]: I0310 10:02:34.826305 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-8g2kk" Mar 10 10:02:34 crc kubenswrapper[4794]: I0310 10:02:34.858030 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-pbmxv"] Mar 10 10:02:34 crc kubenswrapper[4794]: I0310 10:02:34.955379 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qntn\" (UniqueName: \"kubernetes.io/projected/a51e8442-a0da-4aec-91c1-383cef679edf-kube-api-access-8qntn\") pod \"openstack-operator-controller-init-6cf8df7788-pbmxv\" (UID: \"a51e8442-a0da-4aec-91c1-383cef679edf\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-pbmxv" Mar 10 10:02:35 crc kubenswrapper[4794]: I0310 10:02:35.056716 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qntn\" (UniqueName: \"kubernetes.io/projected/a51e8442-a0da-4aec-91c1-383cef679edf-kube-api-access-8qntn\") pod \"openstack-operator-controller-init-6cf8df7788-pbmxv\" (UID: \"a51e8442-a0da-4aec-91c1-383cef679edf\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-pbmxv" Mar 10 10:02:35 crc kubenswrapper[4794]: I0310 10:02:35.087250 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qntn\" (UniqueName: \"kubernetes.io/projected/a51e8442-a0da-4aec-91c1-383cef679edf-kube-api-access-8qntn\") pod \"openstack-operator-controller-init-6cf8df7788-pbmxv\" (UID: \"a51e8442-a0da-4aec-91c1-383cef679edf\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-pbmxv" Mar 10 10:02:35 crc kubenswrapper[4794]: I0310 10:02:35.142169 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-pbmxv" Mar 10 10:02:35 crc kubenswrapper[4794]: I0310 10:02:35.598061 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-pbmxv"] Mar 10 10:02:35 crc kubenswrapper[4794]: I0310 10:02:35.715886 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-pbmxv" event={"ID":"a51e8442-a0da-4aec-91c1-383cef679edf","Type":"ContainerStarted","Data":"5d64c718e4ffda2ec4cfbb77a5c00756412b996e04bcfe6005ed41ce1c83381b"} Mar 10 10:02:40 crc kubenswrapper[4794]: I0310 10:02:40.750278 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-pbmxv" event={"ID":"a51e8442-a0da-4aec-91c1-383cef679edf","Type":"ContainerStarted","Data":"bb7253b8521cf725f5dc49315f2b6eb0845a563b46f94740849f54445dd9f7f1"} Mar 10 10:02:40 crc kubenswrapper[4794]: I0310 10:02:40.750834 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-pbmxv" Mar 10 10:02:40 crc kubenswrapper[4794]: I0310 10:02:40.779206 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-pbmxv" podStartSLOduration=2.573349892 podStartE2EDuration="6.77918915s" podCreationTimestamp="2026-03-10 10:02:34 +0000 UTC" firstStartedPulling="2026-03-10 10:02:35.596577966 +0000 UTC m=+1104.352748774" lastFinishedPulling="2026-03-10 10:02:39.802417194 +0000 UTC m=+1108.558588032" observedRunningTime="2026-03-10 10:02:40.773516352 +0000 UTC m=+1109.529687200" watchObservedRunningTime="2026-03-10 10:02:40.77918915 +0000 UTC m=+1109.535359968" Mar 10 10:02:45 crc kubenswrapper[4794]: I0310 10:02:45.145421 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-pbmxv" Mar 10 10:02:52 crc kubenswrapper[4794]: I0310 10:02:52.967641 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:02:52 crc kubenswrapper[4794]: I0310 10:02:52.968156 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.886251 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842"] Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.894392 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-22vcf"] Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.895193 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-22vcf" Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.897531 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-z74mr" Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.901884 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842" Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.904869 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-l95dj" Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.910757 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842"] Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.922815 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-zcsq9"] Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.923605 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zcsq9" Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.926580 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gnvtr" Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.940565 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-22vcf"] Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.964931 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-zcsq9"] Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.977424 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-2xrkm"] Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.979025 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2xrkm" Mar 10 10:03:04 crc kubenswrapper[4794]: I0310 10:03:04.982614 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-7vm4v" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.001881 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-2xrkm"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.021060 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-d9zzm"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.023283 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-d9zzm" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.027019 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-d72sl" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.028949 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-d9zzm"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.036978 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-ldq59"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.037812 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-ldq59" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.040129 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-nfdtm" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.069935 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-ldq59"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.071582 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9rrr\" (UniqueName: \"kubernetes.io/projected/aea0d607-d1b3-4a15-993a-c571f49c1337-kube-api-access-x9rrr\") pod \"cinder-operator-controller-manager-984cd4dcf-vh842\" (UID: \"aea0d607-d1b3-4a15-993a-c571f49c1337\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.071645 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wdz4\" (UniqueName: \"kubernetes.io/projected/491021e8-371d-44ff-bc8b-6cb379531865-kube-api-access-8wdz4\") pod \"glance-operator-controller-manager-5964f64c48-2xrkm\" (UID: \"491021e8-371d-44ff-bc8b-6cb379531865\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2xrkm" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.071696 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8zz4\" (UniqueName: \"kubernetes.io/projected/690c6868-b841-43ec-9a82-42f6b2250153-kube-api-access-m8zz4\") pod \"barbican-operator-controller-manager-677bd678f7-22vcf\" (UID: \"690c6868-b841-43ec-9a82-42f6b2250153\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-22vcf" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.071720 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdc5b\" (UniqueName: \"kubernetes.io/projected/8c9a844a-91eb-4909-961a-79ab54ac592c-kube-api-access-sdc5b\") pod \"designate-operator-controller-manager-66d56f6ff4-zcsq9\" (UID: \"8c9a844a-91eb-4909-961a-79ab54ac592c\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zcsq9" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.078101 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.079412 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.084733 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kfdrb" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.088007 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.096391 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.107721 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-tbxps"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.108643 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-tbxps" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.117391 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-p2ghv"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.117592 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-c4nlf" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.118361 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-p2ghv" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.121345 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-d45d6" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.146472 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-tbxps"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.164430 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.165255 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.171476 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-sqkhk" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.172604 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-p2ghv"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.173130 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8zz4\" (UniqueName: \"kubernetes.io/projected/690c6868-b841-43ec-9a82-42f6b2250153-kube-api-access-m8zz4\") pod \"barbican-operator-controller-manager-677bd678f7-22vcf\" (UID: \"690c6868-b841-43ec-9a82-42f6b2250153\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-22vcf" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.173178 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdc5b\" (UniqueName: \"kubernetes.io/projected/8c9a844a-91eb-4909-961a-79ab54ac592c-kube-api-access-sdc5b\") pod \"designate-operator-controller-manager-66d56f6ff4-zcsq9\" (UID: \"8c9a844a-91eb-4909-961a-79ab54ac592c\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zcsq9" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.173239 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert\") pod \"infra-operator-controller-manager-5995f4446f-sf5hn\" (UID: \"1201bc13-f478-4ce1-9e86-3b1fcf9bcde1\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.173288 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d46hj\" (UniqueName: \"kubernetes.io/projected/51d5cfa9-c743-4fe4-8965-e88568a7e266-kube-api-access-d46hj\") pod \"heat-operator-controller-manager-77b6666d85-d9zzm\" (UID: \"51d5cfa9-c743-4fe4-8965-e88568a7e266\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-d9zzm" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.173351 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9rrr\" (UniqueName: \"kubernetes.io/projected/aea0d607-d1b3-4a15-993a-c571f49c1337-kube-api-access-x9rrr\") pod \"cinder-operator-controller-manager-984cd4dcf-vh842\" (UID: \"aea0d607-d1b3-4a15-993a-c571f49c1337\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.173385 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4pcq\" (UniqueName: \"kubernetes.io/projected/6cc76a8e-a3ac-4499-ab5b-f3fba0d8702c-kube-api-access-f4pcq\") pod \"horizon-operator-controller-manager-6d9d6b584d-ldq59\" (UID: \"6cc76a8e-a3ac-4499-ab5b-f3fba0d8702c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-ldq59" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.173443 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wdz4\" (UniqueName: 
\"kubernetes.io/projected/491021e8-371d-44ff-bc8b-6cb379531865-kube-api-access-8wdz4\") pod \"glance-operator-controller-manager-5964f64c48-2xrkm\" (UID: \"491021e8-371d-44ff-bc8b-6cb379531865\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2xrkm" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.173471 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkhgx\" (UniqueName: \"kubernetes.io/projected/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-kube-api-access-vkhgx\") pod \"infra-operator-controller-manager-5995f4446f-sf5hn\" (UID: \"1201bc13-f478-4ce1-9e86-3b1fcf9bcde1\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.197836 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.207274 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8zz4\" (UniqueName: \"kubernetes.io/projected/690c6868-b841-43ec-9a82-42f6b2250153-kube-api-access-m8zz4\") pod \"barbican-operator-controller-manager-677bd678f7-22vcf\" (UID: \"690c6868-b841-43ec-9a82-42f6b2250153\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-22vcf" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.210937 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wdz4\" (UniqueName: \"kubernetes.io/projected/491021e8-371d-44ff-bc8b-6cb379531865-kube-api-access-8wdz4\") pod \"glance-operator-controller-manager-5964f64c48-2xrkm\" (UID: \"491021e8-371d-44ff-bc8b-6cb379531865\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2xrkm" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.217088 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9rrr\" (UniqueName: \"kubernetes.io/projected/aea0d607-d1b3-4a15-993a-c571f49c1337-kube-api-access-x9rrr\") pod \"cinder-operator-controller-manager-984cd4dcf-vh842\" (UID: \"aea0d607-d1b3-4a15-993a-c571f49c1337\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.233752 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-22vcf" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.238476 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-5nt4h"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.238877 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdc5b\" (UniqueName: \"kubernetes.io/projected/8c9a844a-91eb-4909-961a-79ab54ac592c-kube-api-access-sdc5b\") pod \"designate-operator-controller-manager-66d56f6ff4-zcsq9\" (UID: \"8c9a844a-91eb-4909-961a-79ab54ac592c\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zcsq9" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.238915 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.240018 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-5nt4h" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.249509 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zcsq9" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.267919 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-vp95p" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.276394 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-pgqx8"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.277222 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-pgqx8" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.278023 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkhgx\" (UniqueName: \"kubernetes.io/projected/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-kube-api-access-vkhgx\") pod \"infra-operator-controller-manager-5995f4446f-sf5hn\" (UID: \"1201bc13-f478-4ce1-9e86-3b1fcf9bcde1\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.278071 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert\") pod \"infra-operator-controller-manager-5995f4446f-sf5hn\" (UID: \"1201bc13-f478-4ce1-9e86-3b1fcf9bcde1\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.278093 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2cfv\" (UniqueName: \"kubernetes.io/projected/63cc23d4-8955-4725-886e-b1379acf91dc-kube-api-access-n2cfv\") pod \"ironic-operator-controller-manager-6bbb499bbc-tbxps\" (UID: \"63cc23d4-8955-4725-886e-b1379acf91dc\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-tbxps" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.278122 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d46hj\" (UniqueName: \"kubernetes.io/projected/51d5cfa9-c743-4fe4-8965-e88568a7e266-kube-api-access-d46hj\") pod \"heat-operator-controller-manager-77b6666d85-d9zzm\" (UID: \"51d5cfa9-c743-4fe4-8965-e88568a7e266\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-d9zzm" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.278146 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98wcf\" (UniqueName: \"kubernetes.io/projected/f67ba1e8-d8af-4850-8133-6c50df162861-kube-api-access-98wcf\") pod \"keystone-operator-controller-manager-684f77d66d-p2ghv\" (UID: \"f67ba1e8-d8af-4850-8133-6c50df162861\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-p2ghv" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.278166 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4pcq\" (UniqueName: \"kubernetes.io/projected/6cc76a8e-a3ac-4499-ab5b-f3fba0d8702c-kube-api-access-f4pcq\") pod 
\"horizon-operator-controller-manager-6d9d6b584d-ldq59\" (UID: \"6cc76a8e-a3ac-4499-ab5b-f3fba0d8702c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-ldq59" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.278201 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlcfm\" (UniqueName: \"kubernetes.io/projected/f69e77cb-6b5c-4caf-a7de-7604ba460682-kube-api-access-mlcfm\") pod \"manila-operator-controller-manager-68f45f9d9f-zb879\" (UID: \"f69e77cb-6b5c-4caf-a7de-7604ba460682\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879" Mar 10 10:03:05 crc kubenswrapper[4794]: E0310 10:03:05.278438 4794 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 10:03:05 crc kubenswrapper[4794]: E0310 10:03:05.278474 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert podName:1201bc13-f478-4ce1-9e86-3b1fcf9bcde1 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:05.778460529 +0000 UTC m=+1134.534631337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert") pod "infra-operator-controller-manager-5995f4446f-sf5hn" (UID: "1201bc13-f478-4ce1-9e86-3b1fcf9bcde1") : secret "infra-operator-webhook-server-cert" not found Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.286857 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-v8jtg" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.290408 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-5nt4h"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.321968 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-pgqx8"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.322877 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2xrkm" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.322909 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkhgx\" (UniqueName: \"kubernetes.io/projected/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-kube-api-access-vkhgx\") pod \"infra-operator-controller-manager-5995f4446f-sf5hn\" (UID: \"1201bc13-f478-4ce1-9e86-3b1fcf9bcde1\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.338027 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d46hj\" (UniqueName: \"kubernetes.io/projected/51d5cfa9-c743-4fe4-8965-e88568a7e266-kube-api-access-d46hj\") pod \"heat-operator-controller-manager-77b6666d85-d9zzm\" (UID: \"51d5cfa9-c743-4fe4-8965-e88568a7e266\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-d9zzm" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.345134 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4pcq\" (UniqueName: \"kubernetes.io/projected/6cc76a8e-a3ac-4499-ab5b-f3fba0d8702c-kube-api-access-f4pcq\") pod \"horizon-operator-controller-manager-6d9d6b584d-ldq59\" (UID: \"6cc76a8e-a3ac-4499-ab5b-f3fba0d8702c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-ldq59" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.365116 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-ldq59" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.379999 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2cfv\" (UniqueName: \"kubernetes.io/projected/63cc23d4-8955-4725-886e-b1379acf91dc-kube-api-access-n2cfv\") pod \"ironic-operator-controller-manager-6bbb499bbc-tbxps\" (UID: \"63cc23d4-8955-4725-886e-b1379acf91dc\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-tbxps" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.380066 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98wcf\" (UniqueName: \"kubernetes.io/projected/f67ba1e8-d8af-4850-8133-6c50df162861-kube-api-access-98wcf\") pod \"keystone-operator-controller-manager-684f77d66d-p2ghv\" (UID: \"f67ba1e8-d8af-4850-8133-6c50df162861\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-p2ghv" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.380111 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d249m\" (UniqueName: \"kubernetes.io/projected/f6c79bcc-08f3-4c8a-aa30-ce9db5e7bd27-kube-api-access-d249m\") pod \"mariadb-operator-controller-manager-658d4cdd5-5nt4h\" (UID: \"f6c79bcc-08f3-4c8a-aa30-ce9db5e7bd27\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-5nt4h" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.380147 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlcfm\" (UniqueName: \"kubernetes.io/projected/f69e77cb-6b5c-4caf-a7de-7604ba460682-kube-api-access-mlcfm\") pod \"manila-operator-controller-manager-68f45f9d9f-zb879\" (UID: \"f69e77cb-6b5c-4caf-a7de-7604ba460682\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879" Mar 10 10:03:05 crc 
kubenswrapper[4794]: I0310 10:03:05.380186 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxffv\" (UniqueName: \"kubernetes.io/projected/f22ad8cb-2f75-4695-b030-67a2991aa07c-kube-api-access-dxffv\") pod \"neutron-operator-controller-manager-776c5696bf-pgqx8\" (UID: \"f22ad8cb-2f75-4695-b030-67a2991aa07c\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-pgqx8" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.398134 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-xqghl"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.398930 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-xqghl" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.406733 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-qjxlr" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.410092 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlcfm\" (UniqueName: \"kubernetes.io/projected/f69e77cb-6b5c-4caf-a7de-7604ba460682-kube-api-access-mlcfm\") pod \"manila-operator-controller-manager-68f45f9d9f-zb879\" (UID: \"f69e77cb-6b5c-4caf-a7de-7604ba460682\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.414956 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98wcf\" (UniqueName: \"kubernetes.io/projected/f67ba1e8-d8af-4850-8133-6c50df162861-kube-api-access-98wcf\") pod \"keystone-operator-controller-manager-684f77d66d-p2ghv\" (UID: \"f67ba1e8-d8af-4850-8133-6c50df162861\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-p2ghv" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.435014 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2cfv\" (UniqueName: \"kubernetes.io/projected/63cc23d4-8955-4725-886e-b1379acf91dc-kube-api-access-n2cfv\") pod \"ironic-operator-controller-manager-6bbb499bbc-tbxps\" (UID: \"63cc23d4-8955-4725-886e-b1379acf91dc\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-tbxps" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.440959 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-tbxps" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.453655 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-p2ghv" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.460401 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-xqghl"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.464202 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-d8hqc"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.465121 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-d8hqc" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.482964 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d249m\" (UniqueName: \"kubernetes.io/projected/f6c79bcc-08f3-4c8a-aa30-ce9db5e7bd27-kube-api-access-d249m\") pod \"mariadb-operator-controller-manager-658d4cdd5-5nt4h\" (UID: \"f6c79bcc-08f3-4c8a-aa30-ce9db5e7bd27\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-5nt4h" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.483055 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxffv\" (UniqueName: \"kubernetes.io/projected/f22ad8cb-2f75-4695-b030-67a2991aa07c-kube-api-access-dxffv\") pod \"neutron-operator-controller-manager-776c5696bf-pgqx8\" (UID: \"f22ad8cb-2f75-4695-b030-67a2991aa07c\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-pgqx8" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.489744 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-l96rz" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.490771 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.520712 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxffv\" (UniqueName: \"kubernetes.io/projected/f22ad8cb-2f75-4695-b030-67a2991aa07c-kube-api-access-dxffv\") pod \"neutron-operator-controller-manager-776c5696bf-pgqx8\" (UID: \"f22ad8cb-2f75-4695-b030-67a2991aa07c\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-pgqx8" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.528968 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d249m\" (UniqueName: \"kubernetes.io/projected/f6c79bcc-08f3-4c8a-aa30-ce9db5e7bd27-kube-api-access-d249m\") pod \"mariadb-operator-controller-manager-658d4cdd5-5nt4h\" (UID: \"f6c79bcc-08f3-4c8a-aa30-ce9db5e7bd27\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-5nt4h" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.561435 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-d8hqc"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.583983 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrwkr\" (UniqueName: \"kubernetes.io/projected/8dc7b4da-0fd5-489a-a3c6-8c613fb3c5a9-kube-api-access-zrwkr\") pod \"nova-operator-controller-manager-569cc54c5-xqghl\" (UID: \"8dc7b4da-0fd5-489a-a3c6-8c613fb3c5a9\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-xqghl" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.584102 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt52c\" (UniqueName: \"kubernetes.io/projected/cd0343b5-41b0-44c4-8c72-997df328e4ef-kube-api-access-mt52c\") pod \"octavia-operator-controller-manager-5f4f55cb5c-d8hqc\" (UID: \"cd0343b5-41b0-44c4-8c72-997df328e4ef\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-d8hqc" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 
10:03:05.615394 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.616432 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.620096 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-4rd6j" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.634640 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.684657 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.685989 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-5nt4h" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.687177 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt52c\" (UniqueName: \"kubernetes.io/projected/cd0343b5-41b0-44c4-8c72-997df328e4ef-kube-api-access-mt52c\") pod \"octavia-operator-controller-manager-5f4f55cb5c-d8hqc\" (UID: \"cd0343b5-41b0-44c4-8c72-997df328e4ef\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-d8hqc" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.687237 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrwkr\" (UniqueName: \"kubernetes.io/projected/8dc7b4da-0fd5-489a-a3c6-8c613fb3c5a9-kube-api-access-zrwkr\") pod \"nova-operator-controller-manager-569cc54c5-xqghl\" (UID: \"8dc7b4da-0fd5-489a-a3c6-8c613fb3c5a9\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-xqghl" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.687832 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-d9zzm" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.688206 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.688231 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-d84td" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.688468 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-pgqx8" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.715378 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.716727 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.725701 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-x6w4g" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.740391 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrwkr\" (UniqueName: \"kubernetes.io/projected/8dc7b4da-0fd5-489a-a3c6-8c613fb3c5a9-kube-api-access-zrwkr\") pod \"nova-operator-controller-manager-569cc54c5-xqghl\" (UID: \"8dc7b4da-0fd5-489a-a3c6-8c613fb3c5a9\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-xqghl" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.742260 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt52c\" (UniqueName: \"kubernetes.io/projected/cd0343b5-41b0-44c4-8c72-997df328e4ef-kube-api-access-mt52c\") pod \"octavia-operator-controller-manager-5f4f55cb5c-d8hqc\" (UID: \"cd0343b5-41b0-44c4-8c72-997df328e4ef\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-d8hqc" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.747975 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-xqghl" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.748125 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.791100 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86sk5\" (UniqueName: \"kubernetes.io/projected/7929c2d3-601b-4c69-970b-a69550d9852c-kube-api-access-86sk5\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f664lg\" (UID: \"7929c2d3-601b-4c69-970b-a69550d9852c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.791204 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert\") pod \"infra-operator-controller-manager-5995f4446f-sf5hn\" (UID: \"1201bc13-f478-4ce1-9e86-3b1fcf9bcde1\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.791266 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smh67\" (UniqueName: \"kubernetes.io/projected/a5511434-20de-4512-91d2-be8a49738d22-kube-api-access-smh67\") pod \"ovn-operator-controller-manager-bbc5b68f9-xn8mp\" (UID: \"a5511434-20de-4512-91d2-be8a49738d22\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.791286 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f664lg\" (UID: \"7929c2d3-601b-4c69-970b-a69550d9852c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" Mar 10 10:03:05 crc kubenswrapper[4794]: E0310 10:03:05.791432 4794 
secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 10:03:05 crc kubenswrapper[4794]: E0310 10:03:05.791473 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert podName:1201bc13-f478-4ce1-9e86-3b1fcf9bcde1 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:06.79145926 +0000 UTC m=+1135.547630078 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert") pod "infra-operator-controller-manager-5995f4446f-sf5hn" (UID: "1201bc13-f478-4ce1-9e86-3b1fcf9bcde1") : secret "infra-operator-webhook-server-cert" not found Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.806882 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.817529 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-d8hqc" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.825904 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.835905 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-6664x"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.836912 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6664x" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.843317 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-545hv" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.849790 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.850963 4794 util.go:30] "No sandbox for pod can be found. 
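
Note the retry delays on the infra-operator cert volume: the first MountVolume.SetUp failure (10:03:05.278 above) scheduled a retry after 500ms, and this one after 1s, i.e. the per-operation delay doubles on each consecutive failure. A sketch of that doubling; the cap below is an assumption for illustration and never appears in this log:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // First observed delay in the log; subsequent failures double it.
        delay := 500 * time.Millisecond
        maxDelay := 2 * time.Minute // hypothetical ceiling, not taken from the log
        for failure := 1; failure <= 8; failure++ {
            fmt.Printf("failure %d: no retries permitted for %v\n", failure, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }

The first two printed delays (500ms, then 1s) match the two durationBeforeRetry values logged for this volume.
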
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.854562 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-6664x"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.862698 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-nnc6b" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.865087 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.892003 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx922\" (UniqueName: \"kubernetes.io/projected/0f893a2c-14c6-4d56-a798-30e94b0e89af-kube-api-access-bx922\") pod \"placement-operator-controller-manager-574d45c66c-s7nkh\" (UID: \"0f893a2c-14c6-4d56-a798-30e94b0e89af\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.892091 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smh67\" (UniqueName: \"kubernetes.io/projected/a5511434-20de-4512-91d2-be8a49738d22-kube-api-access-smh67\") pod \"ovn-operator-controller-manager-bbc5b68f9-xn8mp\" (UID: \"a5511434-20de-4512-91d2-be8a49738d22\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.892119 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f664lg\" (UID: \"7929c2d3-601b-4c69-970b-a69550d9852c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.892148 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86sk5\" (UniqueName: \"kubernetes.io/projected/7929c2d3-601b-4c69-970b-a69550d9852c-kube-api-access-86sk5\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f664lg\" (UID: \"7929c2d3-601b-4c69-970b-a69550d9852c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" Mar 10 10:03:05 crc kubenswrapper[4794]: E0310 10:03:05.892608 4794 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 10:03:05 crc kubenswrapper[4794]: E0310 10:03:05.892657 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert podName:7929c2d3-601b-4c69-970b-a69550d9852c nodeName:}" failed. No retries permitted until 2026-03-10 10:03:06.392639315 +0000 UTC m=+1135.148810133 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f664lg" (UID: "7929c2d3-601b-4c69-970b-a69550d9852c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.903935 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.904930 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.909658 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.908640 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qmn28" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.930041 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.932009 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86sk5\" (UniqueName: \"kubernetes.io/projected/7929c2d3-601b-4c69-970b-a69550d9852c-kube-api-access-86sk5\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f664lg\" (UID: \"7929c2d3-601b-4c69-970b-a69550d9852c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.934652 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.939893 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.940437 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-tvbt6" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.941539 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smh67\" (UniqueName: \"kubernetes.io/projected/a5511434-20de-4512-91d2-be8a49738d22-kube-api-access-smh67\") pod \"ovn-operator-controller-manager-bbc5b68f9-xn8mp\" (UID: \"a5511434-20de-4512-91d2-be8a49738d22\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.970686 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.971567 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.978370 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg"] Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.995098 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v8zl\" (UniqueName: \"kubernetes.io/projected/dd5bf891-8f83-47c7-9d66-06a814fcc5ee-kube-api-access-7v8zl\") pod \"swift-operator-controller-manager-677c674df7-6664x\" (UID: \"dd5bf891-8f83-47c7-9d66-06a814fcc5ee\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-6664x" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.995185 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx922\" (UniqueName: \"kubernetes.io/projected/0f893a2c-14c6-4d56-a798-30e94b0e89af-kube-api-access-bx922\") pod \"placement-operator-controller-manager-574d45c66c-s7nkh\" (UID: \"0f893a2c-14c6-4d56-a798-30e94b0e89af\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.995236 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p575q\" (UniqueName: \"kubernetes.io/projected/2ebeb7da-d08d-4e4a-a294-b5d0ce38d07f-kube-api-access-p575q\") pod \"test-operator-controller-manager-5c5cb9c4d7-gv6df\" (UID: \"2ebeb7da-d08d-4e4a-a294-b5d0ce38d07f\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df" Mar 10 10:03:05 crc kubenswrapper[4794]: I0310 10:03:05.995274 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dflz2\" (UniqueName: \"kubernetes.io/projected/5f503ef3-9c39-4afb-a266-431c3a44d21e-kube-api-access-dflz2\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-h7m6z\" (UID: \"5f503ef3-9c39-4afb-a266-431c3a44d21e\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.007518 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wsbdq"] Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.007644 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.010796 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.011955 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-bg7s8" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.012272 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.014236 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wsbdq" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.018277 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-mvk9h" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.049132 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx922\" (UniqueName: \"kubernetes.io/projected/0f893a2c-14c6-4d56-a798-30e94b0e89af-kube-api-access-bx922\") pod \"placement-operator-controller-manager-574d45c66c-s7nkh\" (UID: \"0f893a2c-14c6-4d56-a798-30e94b0e89af\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.097040 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.097126 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p575q\" (UniqueName: \"kubernetes.io/projected/2ebeb7da-d08d-4e4a-a294-b5d0ce38d07f-kube-api-access-p575q\") pod \"test-operator-controller-manager-5c5cb9c4d7-gv6df\" (UID: \"2ebeb7da-d08d-4e4a-a294-b5d0ce38d07f\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.097149 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn5rg\" (UniqueName: \"kubernetes.io/projected/a11f8148-52e7-47c7-8ff3-e9c172925ebe-kube-api-access-hn5rg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wsbdq\" (UID: \"a11f8148-52e7-47c7-8ff3-e9c172925ebe\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wsbdq" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.097187 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dflz2\" (UniqueName: \"kubernetes.io/projected/5f503ef3-9c39-4afb-a266-431c3a44d21e-kube-api-access-dflz2\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-h7m6z\" (UID: \"5f503ef3-9c39-4afb-a266-431c3a44d21e\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.097220 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz2wp\" (UniqueName: \"kubernetes.io/projected/220c126b-fc41-4aa9-89ef-fb9ad27e9719-kube-api-access-fz2wp\") pod \"watcher-operator-controller-manager-6dd88c6f67-sdplr\" (UID: \"220c126b-fc41-4aa9-89ef-fb9ad27e9719\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.097242 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v8zl\" (UniqueName: \"kubernetes.io/projected/dd5bf891-8f83-47c7-9d66-06a814fcc5ee-kube-api-access-7v8zl\") pod \"swift-operator-controller-manager-677c674df7-6664x\" (UID: \"dd5bf891-8f83-47c7-9d66-06a814fcc5ee\") " 
pod="openstack-operators/swift-operator-controller-manager-677c674df7-6664x" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.097278 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f7zt\" (UniqueName: \"kubernetes.io/projected/ea65018d-9031-45ab-89c3-2846e861d0a2-kube-api-access-6f7zt\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.097297 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.102593 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.122044 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p575q\" (UniqueName: \"kubernetes.io/projected/2ebeb7da-d08d-4e4a-a294-b5d0ce38d07f-kube-api-access-p575q\") pod \"test-operator-controller-manager-5c5cb9c4d7-gv6df\" (UID: \"2ebeb7da-d08d-4e4a-a294-b5d0ce38d07f\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.139717 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v8zl\" (UniqueName: \"kubernetes.io/projected/dd5bf891-8f83-47c7-9d66-06a814fcc5ee-kube-api-access-7v8zl\") pod \"swift-operator-controller-manager-677c674df7-6664x\" (UID: \"dd5bf891-8f83-47c7-9d66-06a814fcc5ee\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-6664x" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.156012 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wsbdq"] Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.156064 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-22vcf"] Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.163672 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6664x" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.166779 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dflz2\" (UniqueName: \"kubernetes.io/projected/5f503ef3-9c39-4afb-a266-431c3a44d21e-kube-api-access-dflz2\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-h7m6z\" (UID: \"5f503ef3-9c39-4afb-a266-431c3a44d21e\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.176003 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.215704 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.215982 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn5rg\" (UniqueName: \"kubernetes.io/projected/a11f8148-52e7-47c7-8ff3-e9c172925ebe-kube-api-access-hn5rg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wsbdq\" (UID: \"a11f8148-52e7-47c7-8ff3-e9c172925ebe\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wsbdq" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.216137 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz2wp\" (UniqueName: \"kubernetes.io/projected/220c126b-fc41-4aa9-89ef-fb9ad27e9719-kube-api-access-fz2wp\") pod \"watcher-operator-controller-manager-6dd88c6f67-sdplr\" (UID: \"220c126b-fc41-4aa9-89ef-fb9ad27e9719\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.216261 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f7zt\" (UniqueName: \"kubernetes.io/projected/ea65018d-9031-45ab-89c3-2846e861d0a2-kube-api-access-6f7zt\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.216405 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.215945 4794 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.216700 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs podName:ea65018d-9031-45ab-89c3-2846e861d0a2 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:06.716685245 +0000 UTC m=+1135.472856063 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-bkrkg" (UID: "ea65018d-9031-45ab-89c3-2846e861d0a2") : secret "metrics-server-cert" not found Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.216585 4794 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.217104 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs podName:ea65018d-9031-45ab-89c3-2846e861d0a2 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:06.717093238 +0000 UTC m=+1135.473264056 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-bkrkg" (UID: "ea65018d-9031-45ab-89c3-2846e861d0a2") : secret "webhook-server-cert" not found Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.244527 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz2wp\" (UniqueName: \"kubernetes.io/projected/220c126b-fc41-4aa9-89ef-fb9ad27e9719-kube-api-access-fz2wp\") pod \"watcher-operator-controller-manager-6dd88c6f67-sdplr\" (UID: \"220c126b-fc41-4aa9-89ef-fb9ad27e9719\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.247265 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f7zt\" (UniqueName: \"kubernetes.io/projected/ea65018d-9031-45ab-89c3-2846e861d0a2-kube-api-access-6f7zt\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.276822 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn5rg\" (UniqueName: \"kubernetes.io/projected/a11f8148-52e7-47c7-8ff3-e9c172925ebe-kube-api-access-hn5rg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wsbdq\" (UID: \"a11f8148-52e7-47c7-8ff3-e9c172925ebe\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wsbdq" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.308885 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.334574 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.429176 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f664lg\" (UID: \"7929c2d3-601b-4c69-970b-a69550d9852c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.429487 4794 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.429548 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert podName:7929c2d3-601b-4c69-970b-a69550d9852c nodeName:}" failed. No retries permitted until 2026-03-10 10:03:07.429528995 +0000 UTC m=+1136.185699813 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f664lg" (UID: "7929c2d3-601b-4c69-970b-a69550d9852c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.432398 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-ldq59"] Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.438969 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842"] Mar 10 10:03:06 crc kubenswrapper[4794]: W0310 10:03:06.460886 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cc76a8e_a3ac_4499_ab5b_f3fba0d8702c.slice/crio-4a9dfa26d4e13221285762717aef25191a55642b6d1e73b9c5b25c8517e47d75 WatchSource:0}: Error finding container 4a9dfa26d4e13221285762717aef25191a55642b6d1e73b9c5b25c8517e47d75: Status 404 returned error can't find the container with id 4a9dfa26d4e13221285762717aef25191a55642b6d1e73b9c5b25c8517e47d75 Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.477088 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wsbdq" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.482709 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-zcsq9"] Mar 10 10:03:06 crc kubenswrapper[4794]: W0310 10:03:06.486217 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaea0d607_d1b3_4a15_993a_c571f49c1337.slice/crio-aba05ff4d134f503b2a9c0476692d922901819180c486ba7d91ac0bec3d95bb0 WatchSource:0}: Error finding container aba05ff4d134f503b2a9c0476692d922901819180c486ba7d91ac0bec3d95bb0: Status 404 returned error can't find the container with id aba05ff4d134f503b2a9c0476692d922901819180c486ba7d91ac0bec3d95bb0 Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.530786 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-2xrkm"] Mar 10 10:03:06 crc kubenswrapper[4794]: W0310 10:03:06.569408 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491021e8_371d_44ff_bc8b_6cb379531865.slice/crio-ce85562814ba46a8f0b6325d9479f232353a3ca5c4127d3bf241831fc7766664 WatchSource:0}: Error finding container ce85562814ba46a8f0b6325d9479f232353a3ca5c4127d3bf241831fc7766664: Status 404 returned error can't find the container with id ce85562814ba46a8f0b6325d9479f232353a3ca5c4127d3bf241831fc7766664 Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.658109 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-tbxps"] Mar 10 10:03:06 crc kubenswrapper[4794]: W0310 10:03:06.665856 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf67ba1e8_d8af_4850_8133_6c50df162861.slice/crio-187de6e8d3d42ebd70a198c0c4fd4a396167b67f09c0b62faceeee4cba7910a3 WatchSource:0}: Error finding container 187de6e8d3d42ebd70a198c0c4fd4a396167b67f09c0b62faceeee4cba7910a3: Status 404 returned error can't find the container with id 187de6e8d3d42ebd70a198c0c4fd4a396167b67f09c0b62faceeee4cba7910a3 Mar 10 10:03:06 crc kubenswrapper[4794]: W0310 10:03:06.666686 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63cc23d4_8955_4725_886e_b1379acf91dc.slice/crio-a9b9aab8f755195cf75cb820a1e16b779b3c1e822947e757ebbf7976803363b5 WatchSource:0}: Error finding container a9b9aab8f755195cf75cb820a1e16b779b3c1e822947e757ebbf7976803363b5: Status 404 returned error can't find the container with id a9b9aab8f755195cf75cb820a1e16b779b3c1e822947e757ebbf7976803363b5 Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.669452 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-pgqx8"] Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.676412 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-p2ghv"] Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.691700 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879"] Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.735877 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.735957 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.736094 4794 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.736139 4794 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.736174 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs podName:ea65018d-9031-45ab-89c3-2846e861d0a2 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:07.736155218 +0000 UTC m=+1136.492326036 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-bkrkg" (UID: "ea65018d-9031-45ab-89c3-2846e861d0a2") : secret "webhook-server-cert" not found Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.736192 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs podName:ea65018d-9031-45ab-89c3-2846e861d0a2 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:07.736184759 +0000 UTC m=+1136.492355707 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-bkrkg" (UID: "ea65018d-9031-45ab-89c3-2846e861d0a2") : secret "metrics-server-cert" not found Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.837120 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert\") pod \"infra-operator-controller-manager-5995f4446f-sf5hn\" (UID: \"1201bc13-f478-4ce1-9e86-3b1fcf9bcde1\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.837310 4794 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.837452 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert podName:1201bc13-f478-4ce1-9e86-3b1fcf9bcde1 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:08.837431106 +0000 UTC m=+1137.593601924 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert") pod "infra-operator-controller-manager-5995f4446f-sf5hn" (UID: "1201bc13-f478-4ce1-9e86-3b1fcf9bcde1") : secret "infra-operator-webhook-server-cert" not found Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.851560 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-5nt4h"] Mar 10 10:03:06 crc kubenswrapper[4794]: W0310 10:03:06.854258 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c79bcc_08f3_4c8a_aa30_ce9db5e7bd27.slice/crio-432e06b4fbbc6db2b3bfa36f6a6998f986d5eccb6fe6adc5583335d43f487e35 WatchSource:0}: Error finding container 432e06b4fbbc6db2b3bfa36f6a6998f986d5eccb6fe6adc5583335d43f487e35: Status 404 returned error can't find the container with id 432e06b4fbbc6db2b3bfa36f6a6998f986d5eccb6fe6adc5583335d43f487e35 Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.862927 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-xqghl"] Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.882137 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-d9zzm"] Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.895137 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-d8hqc"] Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.932965 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-5nt4h" event={"ID":"f6c79bcc-08f3-4c8a-aa30-ce9db5e7bd27","Type":"ContainerStarted","Data":"432e06b4fbbc6db2b3bfa36f6a6998f986d5eccb6fe6adc5583335d43f487e35"} Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.937407 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2xrkm" event={"ID":"491021e8-371d-44ff-bc8b-6cb379531865","Type":"ContainerStarted","Data":"ce85562814ba46a8f0b6325d9479f232353a3ca5c4127d3bf241831fc7766664"} Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.938274 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zcsq9" event={"ID":"8c9a844a-91eb-4909-961a-79ab54ac592c","Type":"ContainerStarted","Data":"42059ec190eef32296bdcb4b86f1de1faf6b6102852eddadfa0ecb863c33e1e0"} Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.942444 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-p2ghv" event={"ID":"f67ba1e8-d8af-4850-8133-6c50df162861","Type":"ContainerStarted","Data":"187de6e8d3d42ebd70a198c0c4fd4a396167b67f09c0b62faceeee4cba7910a3"} Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.943211 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-22vcf" event={"ID":"690c6868-b841-43ec-9a82-42f6b2250153","Type":"ContainerStarted","Data":"6cbe9b864b100292efee7e61bb8fed1b5b525c301b74dfd0d3a4dc74d102188d"} Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.943913 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-d9zzm" 
event={"ID":"51d5cfa9-c743-4fe4-8965-e88568a7e266","Type":"ContainerStarted","Data":"9f7ee629cd59f37ecbfef21210e743033730ba3dca57ba00451e55f216d8de25"} Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.944887 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879" event={"ID":"f69e77cb-6b5c-4caf-a7de-7604ba460682","Type":"ContainerStarted","Data":"fb482cad4f50fcd328cc9312da86239fad4d94e6166728ab8468630f3d1046ff"} Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.946763 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842" event={"ID":"aea0d607-d1b3-4a15-993a-c571f49c1337","Type":"ContainerStarted","Data":"aba05ff4d134f503b2a9c0476692d922901819180c486ba7d91ac0bec3d95bb0"} Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.947967 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-ldq59" event={"ID":"6cc76a8e-a3ac-4499-ab5b-f3fba0d8702c","Type":"ContainerStarted","Data":"4a9dfa26d4e13221285762717aef25191a55642b6d1e73b9c5b25c8517e47d75"} Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.954267 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-pgqx8" event={"ID":"f22ad8cb-2f75-4695-b030-67a2991aa07c","Type":"ContainerStarted","Data":"cd255f53956b6552718f7453c2218f85d8bbab7369c9cb94d327323f9773de73"} Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.955283 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-tbxps" event={"ID":"63cc23d4-8955-4725-886e-b1379acf91dc","Type":"ContainerStarted","Data":"a9b9aab8f755195cf75cb820a1e16b779b3c1e822947e757ebbf7976803363b5"} Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.956105 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-xqghl" event={"ID":"8dc7b4da-0fd5-489a-a3c6-8c613fb3c5a9","Type":"ContainerStarted","Data":"2cf182c8a33ba4fb258e10adb9b23b88db3353a5ab2a98c9b55897188dc6d80a"} Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.956775 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-d8hqc" event={"ID":"cd0343b5-41b0-44c4-8c72-997df328e4ef","Type":"ContainerStarted","Data":"2d36fe8e004e623f957f4b55021e9bd19c65d2116f482d42a62f20ba475a0378"} Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.965306 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp"] Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.991405 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df"] Mar 10 10:03:06 crc kubenswrapper[4794]: I0310 10:03:06.996009 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh"] Mar 10 10:03:06 crc kubenswrapper[4794]: W0310 10:03:06.996696 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ebeb7da_d08d_4e4a_a294_b5d0ce38d07f.slice/crio-9c280d55e0535af0b41f8e3cd3488732cec29071b8b7097e4cedf23611ca9ecc WatchSource:0}: Error finding container 
9c280d55e0535af0b41f8e3cd3488732cec29071b8b7097e4cedf23611ca9ecc: Status 404 returned error can't find the container with id 9c280d55e0535af0b41f8e3cd3488732cec29071b8b7097e4cedf23611ca9ecc Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.997215 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-smh67,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-xn8mp_openstack-operators(a5511434-20de-4512-91d2-be8a49738d22): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.998643 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp" podUID="a5511434-20de-4512-91d2-be8a49738d22" Mar 10 10:03:06 crc kubenswrapper[4794]: E0310 10:03:06.999411 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p575q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-gv6df_openstack-operators(2ebeb7da-d08d-4e4a-a294-b5d0ce38d07f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.001006 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df" podUID="2ebeb7da-d08d-4e4a-a294-b5d0ce38d07f" Mar 10 10:03:07 crc kubenswrapper[4794]: I0310 10:03:07.001682 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z"] Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.004898 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bx922,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-s7nkh_openstack-operators(0f893a2c-14c6-4d56-a798-30e94b0e89af): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.006285 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh" podUID="0f893a2c-14c6-4d56-a798-30e94b0e89af" Mar 10 10:03:07 crc kubenswrapper[4794]: I0310 10:03:07.006441 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-6664x"] Mar 10 10:03:07 crc kubenswrapper[4794]: W0310 10:03:07.011705 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f503ef3_9c39_4afb_a266_431c3a44d21e.slice/crio-3fb7daeddac853eac687b50b854ab7d8d7a4806a0d5a3bc764a91e23e598c0ee WatchSource:0}: Error finding container 3fb7daeddac853eac687b50b854ab7d8d7a4806a0d5a3bc764a91e23e598c0ee: Status 404 returned error can't find the container with id 3fb7daeddac853eac687b50b854ab7d8d7a4806a0d5a3bc764a91e23e598c0ee Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.013119 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dflz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-h7m6z_openstack-operators(5f503ef3-9c39-4afb-a266-431c3a44d21e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.014867 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z" podUID="5f503ef3-9c39-4afb-a266-431c3a44d21e" Mar 10 10:03:07 crc kubenswrapper[4794]: W0310 10:03:07.016993 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd5bf891_8f83_47c7_9d66_06a814fcc5ee.slice/crio-fce8ab27f85202a115b1371173f1c60d552cdf4f9cdd2149e83fbe0b74925141 WatchSource:0}: Error finding container fce8ab27f85202a115b1371173f1c60d552cdf4f9cdd2149e83fbe0b74925141: Status 404 returned error can't find the container with id fce8ab27f85202a115b1371173f1c60d552cdf4f9cdd2149e83fbe0b74925141 Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.022243 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7v8zl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-6664x_openstack-operators(dd5bf891-8f83-47c7-9d66-06a814fcc5ee): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.023428 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6664x" podUID="dd5bf891-8f83-47c7-9d66-06a814fcc5ee" Mar 10 10:03:07 crc kubenswrapper[4794]: I0310 10:03:07.119798 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr"] Mar 10 10:03:07 crc kubenswrapper[4794]: I0310 10:03:07.124123 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wsbdq"] Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.125303 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fz2wp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-sdplr_openstack-operators(220c126b-fc41-4aa9-89ef-fb9ad27e9719): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.126408 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr" podUID="220c126b-fc41-4aa9-89ef-fb9ad27e9719" Mar 10 10:03:07 crc kubenswrapper[4794]: I0310 10:03:07.454724 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f664lg\" (UID: \"7929c2d3-601b-4c69-970b-a69550d9852c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.454932 4794 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.455107 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert podName:7929c2d3-601b-4c69-970b-a69550d9852c nodeName:}" failed. 
No retries permitted until 2026-03-10 10:03:09.455090772 +0000 UTC m=+1138.211261590 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f664lg" (UID: "7929c2d3-601b-4c69-970b-a69550d9852c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 10:03:07 crc kubenswrapper[4794]: I0310 10:03:07.759140 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:07 crc kubenswrapper[4794]: I0310 10:03:07.759269 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.759413 4794 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.759458 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs podName:ea65018d-9031-45ab-89c3-2846e861d0a2 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:09.759443573 +0000 UTC m=+1138.515614391 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-bkrkg" (UID: "ea65018d-9031-45ab-89c3-2846e861d0a2") : secret "webhook-server-cert" not found Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.759813 4794 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.759844 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs podName:ea65018d-9031-45ab-89c3-2846e861d0a2 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:09.759837406 +0000 UTC m=+1138.516008224 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-bkrkg" (UID: "ea65018d-9031-45ab-89c3-2846e861d0a2") : secret "metrics-server-cert" not found Mar 10 10:03:07 crc kubenswrapper[4794]: I0310 10:03:07.971666 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp" event={"ID":"a5511434-20de-4512-91d2-be8a49738d22","Type":"ContainerStarted","Data":"a120325879bd49fe39dfbedc70fe60641d0e7ace6b145b696fb3eee460da4ea0"} Mar 10 10:03:07 crc kubenswrapper[4794]: I0310 10:03:07.974530 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh" event={"ID":"0f893a2c-14c6-4d56-a798-30e94b0e89af","Type":"ContainerStarted","Data":"8dbc8a6f62cbb5a3a50a3642d3970e9d3c6cf88d4cbae918a3dda5adb5159efb"} Mar 10 10:03:07 crc kubenswrapper[4794]: I0310 10:03:07.977176 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr" event={"ID":"220c126b-fc41-4aa9-89ef-fb9ad27e9719","Type":"ContainerStarted","Data":"25b4cf1cc4e197f0422ceb4195fe43f316dc842382dd3437a5b32c97bec9b823"} Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.977775 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh" podUID="0f893a2c-14c6-4d56-a798-30e94b0e89af" Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.977994 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp" podUID="a5511434-20de-4512-91d2-be8a49738d22" Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.978158 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr" podUID="220c126b-fc41-4aa9-89ef-fb9ad27e9719" Mar 10 10:03:07 crc kubenswrapper[4794]: I0310 10:03:07.983997 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z" event={"ID":"5f503ef3-9c39-4afb-a266-431c3a44d21e","Type":"ContainerStarted","Data":"3fb7daeddac853eac687b50b854ab7d8d7a4806a0d5a3bc764a91e23e598c0ee"} Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.988563 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z" 
podUID="5f503ef3-9c39-4afb-a266-431c3a44d21e" Mar 10 10:03:07 crc kubenswrapper[4794]: I0310 10:03:07.994032 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df" event={"ID":"2ebeb7da-d08d-4e4a-a294-b5d0ce38d07f","Type":"ContainerStarted","Data":"9c280d55e0535af0b41f8e3cd3488732cec29071b8b7097e4cedf23611ca9ecc"} Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.995502 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df" podUID="2ebeb7da-d08d-4e4a-a294-b5d0ce38d07f" Mar 10 10:03:07 crc kubenswrapper[4794]: I0310 10:03:07.997496 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6664x" event={"ID":"dd5bf891-8f83-47c7-9d66-06a814fcc5ee","Type":"ContainerStarted","Data":"fce8ab27f85202a115b1371173f1c60d552cdf4f9cdd2149e83fbe0b74925141"} Mar 10 10:03:07 crc kubenswrapper[4794]: E0310 10:03:07.999801 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6664x" podUID="dd5bf891-8f83-47c7-9d66-06a814fcc5ee" Mar 10 10:03:08 crc kubenswrapper[4794]: I0310 10:03:08.011282 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wsbdq" event={"ID":"a11f8148-52e7-47c7-8ff3-e9c172925ebe","Type":"ContainerStarted","Data":"ce83c21732a9db75f4f07be6732228ec69d37c88909b9d0c17acf40ba9827533"} Mar 10 10:03:08 crc kubenswrapper[4794]: I0310 10:03:08.881223 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert\") pod \"infra-operator-controller-manager-5995f4446f-sf5hn\" (UID: \"1201bc13-f478-4ce1-9e86-3b1fcf9bcde1\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" Mar 10 10:03:08 crc kubenswrapper[4794]: E0310 10:03:08.881407 4794 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 10:03:08 crc kubenswrapper[4794]: E0310 10:03:08.881612 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert podName:1201bc13-f478-4ce1-9e86-3b1fcf9bcde1 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:12.881593511 +0000 UTC m=+1141.637764329 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert") pod "infra-operator-controller-manager-5995f4446f-sf5hn" (UID: "1201bc13-f478-4ce1-9e86-3b1fcf9bcde1") : secret "infra-operator-webhook-server-cert" not found Mar 10 10:03:09 crc kubenswrapper[4794]: E0310 10:03:09.021455 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df" podUID="2ebeb7da-d08d-4e4a-a294-b5d0ce38d07f" Mar 10 10:03:09 crc kubenswrapper[4794]: E0310 10:03:09.021467 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z" podUID="5f503ef3-9c39-4afb-a266-431c3a44d21e" Mar 10 10:03:09 crc kubenswrapper[4794]: E0310 10:03:09.021584 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp" podUID="a5511434-20de-4512-91d2-be8a49738d22" Mar 10 10:03:09 crc kubenswrapper[4794]: E0310 10:03:09.021948 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6664x" podUID="dd5bf891-8f83-47c7-9d66-06a814fcc5ee" Mar 10 10:03:09 crc kubenswrapper[4794]: E0310 10:03:09.022037 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh" podUID="0f893a2c-14c6-4d56-a798-30e94b0e89af" Mar 10 10:03:09 crc kubenswrapper[4794]: E0310 10:03:09.022288 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr" podUID="220c126b-fc41-4aa9-89ef-fb9ad27e9719" Mar 10 10:03:09 crc kubenswrapper[4794]: I0310 10:03:09.491777 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f664lg\" (UID: \"7929c2d3-601b-4c69-970b-a69550d9852c\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" Mar 10 10:03:09 crc kubenswrapper[4794]: E0310 10:03:09.492020 4794 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 10:03:09 crc kubenswrapper[4794]: E0310 10:03:09.492077 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert podName:7929c2d3-601b-4c69-970b-a69550d9852c nodeName:}" failed. No retries permitted until 2026-03-10 10:03:13.49205985 +0000 UTC m=+1142.248230668 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f664lg" (UID: "7929c2d3-601b-4c69-970b-a69550d9852c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 10:03:09 crc kubenswrapper[4794]: I0310 10:03:09.796539 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:09 crc kubenswrapper[4794]: I0310 10:03:09.796659 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:09 crc kubenswrapper[4794]: E0310 10:03:09.796806 4794 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 10:03:09 crc kubenswrapper[4794]: E0310 10:03:09.796864 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs podName:ea65018d-9031-45ab-89c3-2846e861d0a2 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:13.796846905 +0000 UTC m=+1142.553017723 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-bkrkg" (UID: "ea65018d-9031-45ab-89c3-2846e861d0a2") : secret "webhook-server-cert" not found Mar 10 10:03:09 crc kubenswrapper[4794]: E0310 10:03:09.796918 4794 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 10:03:09 crc kubenswrapper[4794]: E0310 10:03:09.796943 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs podName:ea65018d-9031-45ab-89c3-2846e861d0a2 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:13.796935488 +0000 UTC m=+1142.553106306 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-bkrkg" (UID: "ea65018d-9031-45ab-89c3-2846e861d0a2") : secret "metrics-server-cert" not found Mar 10 10:03:12 crc kubenswrapper[4794]: I0310 10:03:12.886422 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert\") pod \"infra-operator-controller-manager-5995f4446f-sf5hn\" (UID: \"1201bc13-f478-4ce1-9e86-3b1fcf9bcde1\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" Mar 10 10:03:12 crc kubenswrapper[4794]: E0310 10:03:12.886619 4794 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 10:03:12 crc kubenswrapper[4794]: E0310 10:03:12.886903 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert podName:1201bc13-f478-4ce1-9e86-3b1fcf9bcde1 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:20.886881263 +0000 UTC m=+1149.643052151 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert") pod "infra-operator-controller-manager-5995f4446f-sf5hn" (UID: "1201bc13-f478-4ce1-9e86-3b1fcf9bcde1") : secret "infra-operator-webhook-server-cert" not found Mar 10 10:03:13 crc kubenswrapper[4794]: I0310 10:03:13.498131 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f664lg\" (UID: \"7929c2d3-601b-4c69-970b-a69550d9852c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" Mar 10 10:03:13 crc kubenswrapper[4794]: E0310 10:03:13.498295 4794 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 10:03:13 crc kubenswrapper[4794]: E0310 10:03:13.498512 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert podName:7929c2d3-601b-4c69-970b-a69550d9852c nodeName:}" failed. No retries permitted until 2026-03-10 10:03:21.498489609 +0000 UTC m=+1150.254660447 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f664lg" (UID: "7929c2d3-601b-4c69-970b-a69550d9852c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 10:03:13 crc kubenswrapper[4794]: I0310 10:03:13.801484 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:13 crc kubenswrapper[4794]: I0310 10:03:13.801558 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:13 crc kubenswrapper[4794]: E0310 10:03:13.801647 4794 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 10:03:13 crc kubenswrapper[4794]: E0310 10:03:13.801707 4794 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 10:03:13 crc kubenswrapper[4794]: E0310 10:03:13.801717 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs podName:ea65018d-9031-45ab-89c3-2846e861d0a2 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:21.801700605 +0000 UTC m=+1150.557871423 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-bkrkg" (UID: "ea65018d-9031-45ab-89c3-2846e861d0a2") : secret "webhook-server-cert" not found Mar 10 10:03:13 crc kubenswrapper[4794]: E0310 10:03:13.801806 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs podName:ea65018d-9031-45ab-89c3-2846e861d0a2 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:21.801794598 +0000 UTC m=+1150.557965416 (durationBeforeRetry 8s). 
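Note: the retry delays in the nestedpendingoperations.go entries above double on each failed attempt: the 10:03:09 round schedules its retry in 4s, the 10:03:12-13 round in 8s, and (further down, at 10:03:21) 16s. A minimal Go sketch of that doubling-with-cap policy follows; the base delay and upper bound here are illustrative assumptions, not kubelet's exact constants (its real logic lives in nestedpendingoperations.go and the exponentialbackoff package).

    // Sketch of the exponential retry delay visible above (… 4s -> 8s -> 16s).
    package main

    import (
        "fmt"
        "time"
    )

    func nextDelay(prev time.Duration) time.Duration {
        const base = 500 * time.Millisecond // assumed initial delay
        const maxDelay = 2 * time.Minute    // assumed upper bound
        if prev <= 0 {
            return base
        }
        if d := prev * 2; d < maxDelay {
            return d
        }
        return maxDelay
    }

    func main() {
        var d time.Duration
        for i := 0; i < 9; i++ {
            d = nextDelay(d)
            fmt.Println(d) // 500ms, 1s, 2s, 4s, 8s, 16s, ...
        }
    }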
Mar 10 10:03:19 crc kubenswrapper[4794]: E0310 10:03:19.683981 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:7c0da25380c91ffd1940d75eaa71b6842a6a4cf4056e62d6b0d237897b74e4d9"
Mar 10 10:03:19 crc kubenswrapper[4794]: E0310 10:03:19.684478 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:7c0da25380c91ffd1940d75eaa71b6842a6a4cf4056e62d6b0d237897b74e4d9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x9rrr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-984cd4dcf-vh842_openstack-operators(aea0d607-d1b3-4a15-993a-c571f49c1337): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 10 10:03:19 crc kubenswrapper[4794]: E0310 10:03:19.685642 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842" podUID="aea0d607-d1b3-4a15-993a-c571f49c1337"
Mar 10 10:03:20 crc kubenswrapper[4794]: E0310 10:03:20.169375 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:7c0da25380c91ffd1940d75eaa71b6842a6a4cf4056e62d6b0d237897b74e4d9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842" podUID="aea0d607-d1b3-4a15-993a-c571f49c1337"
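Note: the "Unhandled Error" line above is kubelet printing the pod's container spec with Go's %v verb, which is why it reads as a Go struct literal. Decoded into the usual client-side form, the relevant fields look roughly like the sketch below. This is an illustrative reconstruction using the k8s.io/api types, with values transcribed from the dump; it is not taken from the operator's source.

    // Reconstruction of the "manager" container spec from the %v dump above.
    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/api/resource"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    var manager = corev1.Container{
        Name:    "manager",
        Image:   "quay.io/openstack-k8s-operators/cinder-operator@sha256:7c0da25380c91ffd1940d75eaa71b6842a6a4cf4056e62d6b0d237897b74e4d9",
        Command: []string{"/manager"},
        Args:    []string{"--leader-elect", "--health-probe-bind-address=:8081", "--metrics-bind-address=127.0.0.1:8080"},
        Env: []corev1.EnvVar{
            {Name: "LEASE_DURATION", Value: "30"},
            {Name: "RENEW_DEADLINE", Value: "20"},
            {Name: "RETRY_PERIOD", Value: "5"},
            {Name: "ENABLE_WEBHOOKS", Value: "false"},
            {Name: "METRICS_CERTS", Value: "false"},
        },
        Resources: corev1.ResourceRequirements{
            // "{{500 -3} {} 500m DecimalSI}" is the internal form of 500m CPU;
            // "{{536870912 0} {} BinarySI}" is 512Mi, "{{268435456 0} ...}" is 256Mi.
            Limits: corev1.ResourceList{
                corev1.ResourceCPU:    resource.MustParse("500m"),
                corev1.ResourceMemory: resource.MustParse("512Mi"),
            },
            Requests: corev1.ResourceList{
                corev1.ResourceCPU:    resource.MustParse("10m"),
                corev1.ResourceMemory: resource.MustParse("256Mi"),
            },
        },
        LivenessProbe: &corev1.Probe{
            ProbeHandler:        corev1.ProbeHandler{HTTPGet: &corev1.HTTPGetAction{Path: "/healthz", Port: intstr.FromInt(8081)}},
            InitialDelaySeconds: 15, TimeoutSeconds: 1, PeriodSeconds: 20, SuccessThreshold: 1, FailureThreshold: 3,
        },
        ReadinessProbe: &corev1.Probe{
            ProbeHandler:        corev1.ProbeHandler{HTTPGet: &corev1.HTTPGetAction{Path: "/readyz", Port: intstr.FromInt(8081)}},
            InitialDelaySeconds: 5, TimeoutSeconds: 1, PeriodSeconds: 10, SuccessThreshold: 1, FailureThreshold: 3,
        },
    }

    func main() { fmt.Println(manager.Name, manager.Image) }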
podUID="aea0d607-d1b3-4a15-993a-c571f49c1337" Mar 10 10:03:20 crc kubenswrapper[4794]: E0310 10:03:20.169375 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:7c0da25380c91ffd1940d75eaa71b6842a6a4cf4056e62d6b0d237897b74e4d9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842" podUID="aea0d607-d1b3-4a15-993a-c571f49c1337" Mar 10 10:03:20 crc kubenswrapper[4794]: I0310 10:03:20.910571 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert\") pod \"infra-operator-controller-manager-5995f4446f-sf5hn\" (UID: \"1201bc13-f478-4ce1-9e86-3b1fcf9bcde1\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" Mar 10 10:03:20 crc kubenswrapper[4794]: I0310 10:03:20.916324 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1201bc13-f478-4ce1-9e86-3b1fcf9bcde1-cert\") pod \"infra-operator-controller-manager-5995f4446f-sf5hn\" (UID: \"1201bc13-f478-4ce1-9e86-3b1fcf9bcde1\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" Mar 10 10:03:21 crc kubenswrapper[4794]: I0310 10:03:21.009903 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kfdrb" Mar 10 10:03:21 crc kubenswrapper[4794]: I0310 10:03:21.019267 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" Mar 10 10:03:21 crc kubenswrapper[4794]: E0310 10:03:21.341201 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 10 10:03:21 crc kubenswrapper[4794]: E0310 10:03:21.341680 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-98wcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-p2ghv_openstack-operators(f67ba1e8-d8af-4850-8133-6c50df162861): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 10:03:21 crc kubenswrapper[4794]: E0310 10:03:21.342897 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-p2ghv" podUID="f67ba1e8-d8af-4850-8133-6c50df162861" Mar 10 10:03:21 crc kubenswrapper[4794]: I0310 10:03:21.520833 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f664lg\" (UID: \"7929c2d3-601b-4c69-970b-a69550d9852c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" Mar 10 10:03:21 crc kubenswrapper[4794]: I0310 10:03:21.533001 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7929c2d3-601b-4c69-970b-a69550d9852c-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f664lg\" (UID: \"7929c2d3-601b-4c69-970b-a69550d9852c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" Mar 10 10:03:21 crc kubenswrapper[4794]: I0310 10:03:21.642403 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-d84td" Mar 10 10:03:21 crc kubenswrapper[4794]: I0310 10:03:21.649794 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" Mar 10 10:03:21 crc kubenswrapper[4794]: I0310 10:03:21.824682 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:21 crc kubenswrapper[4794]: I0310 10:03:21.824789 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:21 crc kubenswrapper[4794]: E0310 10:03:21.824803 4794 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 10:03:21 crc kubenswrapper[4794]: E0310 10:03:21.824858 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs podName:ea65018d-9031-45ab-89c3-2846e861d0a2 nodeName:}" failed. No retries permitted until 2026-03-10 10:03:37.824843215 +0000 UTC m=+1166.581014033 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-bkrkg" (UID: "ea65018d-9031-45ab-89c3-2846e861d0a2") : secret "webhook-server-cert" not found Mar 10 10:03:21 crc kubenswrapper[4794]: I0310 10:03:21.828593 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" Mar 10 10:03:22 crc kubenswrapper[4794]: E0310 10:03:22.184530 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-p2ghv" podUID="f67ba1e8-d8af-4850-8133-6c50df162861" Mar 10 10:03:22 crc kubenswrapper[4794]: I0310 10:03:22.967026 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:03:22 crc kubenswrapper[4794]: I0310 10:03:22.967354 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:03:22 crc 
kubenswrapper[4794]: I0310 10:03:22.967399 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 10:03:22 crc kubenswrapper[4794]: I0310 10:03:22.968102 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"961f68351db8d19b5c0a0d1359e0a0bfe3a6d383630ab326fdce756a36734d0e"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 10:03:22 crc kubenswrapper[4794]: I0310 10:03:22.968184 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://961f68351db8d19b5c0a0d1359e0a0bfe3a6d383630ab326fdce756a36734d0e" gracePeriod=600 Mar 10 10:03:23 crc kubenswrapper[4794]: I0310 10:03:23.189166 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="961f68351db8d19b5c0a0d1359e0a0bfe3a6d383630ab326fdce756a36734d0e" exitCode=0 Mar 10 10:03:23 crc kubenswrapper[4794]: I0310 10:03:23.189215 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"961f68351db8d19b5c0a0d1359e0a0bfe3a6d383630ab326fdce756a36734d0e"} Mar 10 10:03:23 crc kubenswrapper[4794]: I0310 10:03:23.189252 4794 scope.go:117] "RemoveContainer" containerID="900615b0bd1702fdf79917b75d57707d4e97b8f262a88b05aa4883f6d0d20891" Mar 10 10:03:23 crc kubenswrapper[4794]: I0310 10:03:23.868460 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn"] Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.009611 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg"] Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.217779 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df" event={"ID":"2ebeb7da-d08d-4e4a-a294-b5d0ce38d07f","Type":"ContainerStarted","Data":"2664d30ac606774f5394e7695b5ce2521fdc3a78da2d34de8a6a84efcbe2d81b"} Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.219965 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df" Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.225757 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-tbxps" event={"ID":"63cc23d4-8955-4725-886e-b1379acf91dc","Type":"ContainerStarted","Data":"33003087317a595d30efb712cfead685372f1ec126694dc27e9faf7ad906bbcd"} Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.226039 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-tbxps" Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.232558 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-xqghl" 
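Note: the machine-config-daemon restart above is the standard liveness flow: the prober GETs http://127.0.0.1:8798/health, a refused connection counts as a failure, and once the failure threshold is reached kubelet kills the container (the gracePeriod=600 comes from the pod spec) and starts a replacement. A minimal stdlib-only sketch of one such HTTP check follows; kubelet's real prober is k8s.io/kubernetes/pkg/probe/http, so this only approximates its success criteria.

    // Minimal sketch of an HTTP liveness check like the one that failed above.
    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    func probeOnce(url string, timeout time.Duration) error {
        client := &http.Client{Timeout: timeout}
        resp, err := client.Get(url)
        if err != nil {
            return err // e.g. "connect: connection refused", as in the log
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 400 {
            return fmt.Errorf("unhealthy status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        if err := probeOnce("http://127.0.0.1:8798/health", time.Second); err != nil {
            fmt.Println("Liveness probe status=failure:", err)
        }
    }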
event={"ID":"8dc7b4da-0fd5-489a-a3c6-8c613fb3c5a9","Type":"ContainerStarted","Data":"20db462316b1287b129a0d5a738afd68560cb3cce715eff13dd8633945fa2760"} Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.232691 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-xqghl" Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.246468 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" event={"ID":"1201bc13-f478-4ce1-9e86-3b1fcf9bcde1","Type":"ContainerStarted","Data":"7f26de549b14abfe09cb6fa5cc7161e82abc02aa8315dd69d454ffdcdd52ddb8"} Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.260540 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wsbdq" event={"ID":"a11f8148-52e7-47c7-8ff3-e9c172925ebe","Type":"ContainerStarted","Data":"f021638a12f0b25fd4d7705421e6f90ace66191baf1ea05c9eb4b160649bf213"} Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.268237 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-xqghl" podStartSLOduration=2.935012732 podStartE2EDuration="19.268221308s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.873411885 +0000 UTC m=+1135.629582703" lastFinishedPulling="2026-03-10 10:03:23.206620461 +0000 UTC m=+1151.962791279" observedRunningTime="2026-03-10 10:03:24.264899574 +0000 UTC m=+1153.021070392" watchObservedRunningTime="2026-03-10 10:03:24.268221308 +0000 UTC m=+1153.024392126" Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.271451 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df" podStartSLOduration=2.9930761649999997 podStartE2EDuration="19.271437689s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.999287646 +0000 UTC m=+1135.755458464" lastFinishedPulling="2026-03-10 10:03:23.27764917 +0000 UTC m=+1152.033819988" observedRunningTime="2026-03-10 10:03:24.2478858 +0000 UTC m=+1153.004056608" watchObservedRunningTime="2026-03-10 10:03:24.271437689 +0000 UTC m=+1153.027608507" Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.294605 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879" event={"ID":"f69e77cb-6b5c-4caf-a7de-7604ba460682","Type":"ContainerStarted","Data":"a342789e3896e8d0bebb7c79a89bdd89bb4b7bd212024f590d62d3b60a552f65"} Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.295541 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879" Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.300358 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-tbxps" podStartSLOduration=3.706060512 podStartE2EDuration="19.300325015s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.668522646 +0000 UTC m=+1135.424693464" lastFinishedPulling="2026-03-10 10:03:22.262787149 +0000 UTC m=+1151.018957967" observedRunningTime="2026-03-10 10:03:24.299923964 +0000 UTC m=+1153.056094782" watchObservedRunningTime="2026-03-10 
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.271451 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df" podStartSLOduration=2.9930761649999997 podStartE2EDuration="19.271437689s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.999287646 +0000 UTC m=+1135.755458464" lastFinishedPulling="2026-03-10 10:03:23.27764917 +0000 UTC m=+1152.033819988" observedRunningTime="2026-03-10 10:03:24.2478858 +0000 UTC m=+1153.004056608" watchObservedRunningTime="2026-03-10 10:03:24.271437689 +0000 UTC m=+1153.027608507"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.294605 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879" event={"ID":"f69e77cb-6b5c-4caf-a7de-7604ba460682","Type":"ContainerStarted","Data":"a342789e3896e8d0bebb7c79a89bdd89bb4b7bd212024f590d62d3b60a552f65"}
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.295541 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.300358 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-tbxps" podStartSLOduration=3.706060512 podStartE2EDuration="19.300325015s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.668522646 +0000 UTC m=+1135.424693464" lastFinishedPulling="2026-03-10 10:03:22.262787149 +0000 UTC m=+1151.018957967" observedRunningTime="2026-03-10 10:03:24.299923964 +0000 UTC m=+1153.056094782" watchObservedRunningTime="2026-03-10 10:03:24.300325015 +0000 UTC m=+1153.056495833"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.333858 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wsbdq" podStartSLOduration=3.253026343 podStartE2EDuration="19.333842087s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:07.123428942 +0000 UTC m=+1135.879599760" lastFinishedPulling="2026-03-10 10:03:23.204244686 +0000 UTC m=+1151.960415504" observedRunningTime="2026-03-10 10:03:24.331671839 +0000 UTC m=+1153.087842657" watchObservedRunningTime="2026-03-10 10:03:24.333842087 +0000 UTC m=+1153.090012905"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.337527 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr" event={"ID":"220c126b-fc41-4aa9-89ef-fb9ad27e9719","Type":"ContainerStarted","Data":"cf7b0d62b7c2ae9d0e45e5b085fdb1a05546fd9ef196a94cf3e9d7ba5114ddc6"}
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.339535 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.350495 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zcsq9" event={"ID":"8c9a844a-91eb-4909-961a-79ab54ac592c","Type":"ContainerStarted","Data":"99efb122539d8dc28f4c6cd070ccae590ec7972e01d9a9609f823d6a5749467b"}
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.351946 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zcsq9"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.370540 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-d9zzm" event={"ID":"51d5cfa9-c743-4fe4-8965-e88568a7e266","Type":"ContainerStarted","Data":"6bcf8534443311bc09323baa3d2fb28071f98c655f848a538b770c40bb6c3f4e"}
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.371524 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-d9zzm"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.376804 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879" podStartSLOduration=3.7996914 podStartE2EDuration="19.376789455s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.683604899 +0000 UTC m=+1135.439775717" lastFinishedPulling="2026-03-10 10:03:22.260702954 +0000 UTC m=+1151.016873772" observedRunningTime="2026-03-10 10:03:24.371949814 +0000 UTC m=+1153.128120642" watchObservedRunningTime="2026-03-10 10:03:24.376789455 +0000 UTC m=+1153.132960273"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.406553 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"4fbbb2d33125ccb00592b9b895dbb76529b93f7f4dbc98756e86d7dc556b940a"}
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.425340 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr" podStartSLOduration=3.250245035 podStartE2EDuration="19.425310288s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:07.124998581 +0000 UTC m=+1135.881169399" lastFinishedPulling="2026-03-10 10:03:23.300063834 +0000 UTC m=+1152.056234652" observedRunningTime="2026-03-10 10:03:24.421571801 +0000 UTC m=+1153.177742619" watchObservedRunningTime="2026-03-10 10:03:24.425310288 +0000 UTC m=+1153.181481106"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.437205 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2xrkm" event={"ID":"491021e8-371d-44ff-bc8b-6cb379531865","Type":"ContainerStarted","Data":"8f502808dd1f366536247e68141756dc3c3b79390642601fe3db49c6420552f2"}
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.437685 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2xrkm"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.460716 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-d8hqc" event={"ID":"cd0343b5-41b0-44c4-8c72-997df328e4ef","Type":"ContainerStarted","Data":"e89b8504652bd24e4d8582a8977874a089e9f1835e75d4f45f73cd3a96d6a089"}
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.461000 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-d8hqc"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.465687 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zcsq9" podStartSLOduration=5.235403036 podStartE2EDuration="20.465669215s" podCreationTimestamp="2026-03-10 10:03:04 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.528292685 +0000 UTC m=+1135.284463503" lastFinishedPulling="2026-03-10 10:03:21.758558864 +0000 UTC m=+1150.514729682" observedRunningTime="2026-03-10 10:03:24.456542819 +0000 UTC m=+1153.212713637" watchObservedRunningTime="2026-03-10 10:03:24.465669215 +0000 UTC m=+1153.221840033"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.478486 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-d9zzm" podStartSLOduration=4.124268492 podStartE2EDuration="20.478463347s" podCreationTimestamp="2026-03-10 10:03:04 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.87866635 +0000 UTC m=+1135.634837168" lastFinishedPulling="2026-03-10 10:03:23.232861205 +0000 UTC m=+1151.989032023" observedRunningTime="2026-03-10 10:03:24.475824584 +0000 UTC m=+1153.231995412" watchObservedRunningTime="2026-03-10 10:03:24.478463347 +0000 UTC m=+1153.234634165"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.486604 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh" event={"ID":"0f893a2c-14c6-4d56-a798-30e94b0e89af","Type":"ContainerStarted","Data":"9f10b7dbf51c05504aa3a63c33a63bd7e8a1e14a9d5392b991edc430c948355e"}
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.487378 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.491439 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" event={"ID":"7929c2d3-601b-4c69-970b-a69550d9852c","Type":"ContainerStarted","Data":"beb40bb70e550abb91022f130ac5f05529591de2bc784eb4a824ebce6ea08f67"}
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.511017 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-22vcf" event={"ID":"690c6868-b841-43ec-9a82-42f6b2250153","Type":"ContainerStarted","Data":"2240ce19e8d1cd3f8373eeebeafd7a370bbba0abdf037e254d93a72b9d875d92"}
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.511806 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-22vcf"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.528187 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-5nt4h" event={"ID":"f6c79bcc-08f3-4c8a-aa30-ce9db5e7bd27","Type":"ContainerStarted","Data":"22b520b6be55e4aef229983e5d507ece0120ba809d324534c44c14e2feaf7821"}
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.528990 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-5nt4h"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.548679 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-pgqx8" event={"ID":"f22ad8cb-2f75-4695-b030-67a2991aa07c","Type":"ContainerStarted","Data":"5a6297d087f39e70ed50516246a43e634e68c29422970fa99956f3ca430273cf"}
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.549402 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-pgqx8"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.560855 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp" event={"ID":"a5511434-20de-4512-91d2-be8a49738d22","Type":"ContainerStarted","Data":"78a63b6c3ec790251a3ae2197cab4a37c65113d2432e65bacc0f5660f0791fef"}
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.561675 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.577605 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-d8hqc" podStartSLOduration=4.185065464 podStartE2EDuration="19.577591057s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.869053269 +0000 UTC m=+1135.625224087" lastFinishedPulling="2026-03-10 10:03:22.261578832 +0000 UTC m=+1151.017749680" observedRunningTime="2026-03-10 10:03:24.538113059 +0000 UTC m=+1153.294283877" watchObservedRunningTime="2026-03-10 10:03:24.577591057 +0000 UTC m=+1153.333761875"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.591349 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-ldq59" event={"ID":"6cc76a8e-a3ac-4499-ab5b-f3fba0d8702c","Type":"ContainerStarted","Data":"bf4957221655b00f818f340b3641f5fae534a1534c6690a8de5d2a586da3eac0"}
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.592163 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-ldq59"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.598646 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2xrkm" podStartSLOduration=5.855583228 podStartE2EDuration="20.598629548s" podCreationTimestamp="2026-03-10 10:03:04 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.575373592 +0000 UTC m=+1135.331544410" lastFinishedPulling="2026-03-10 10:03:21.318419912 +0000 UTC m=+1150.074590730" observedRunningTime="2026-03-10 10:03:24.575788171 +0000 UTC m=+1153.331958989" watchObservedRunningTime="2026-03-10 10:03:24.598629548 +0000 UTC m=+1153.354800366"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.657067 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh" podStartSLOduration=3.332104666 podStartE2EDuration="19.657046991s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:07.00482174 +0000 UTC m=+1135.760992558" lastFinishedPulling="2026-03-10 10:03:23.329764055 +0000 UTC m=+1152.085934883" observedRunningTime="2026-03-10 10:03:24.647155011 +0000 UTC m=+1153.403325829" watchObservedRunningTime="2026-03-10 10:03:24.657046991 +0000 UTC m=+1153.413217799"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.681192 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-ldq59" podStartSLOduration=4.886480784 podStartE2EDuration="20.681176748s" podCreationTimestamp="2026-03-10 10:03:04 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.466866017 +0000 UTC m=+1135.223036835" lastFinishedPulling="2026-03-10 10:03:22.261561981 +0000 UTC m=+1151.017732799" observedRunningTime="2026-03-10 10:03:24.680859969 +0000 UTC m=+1153.437030787" watchObservedRunningTime="2026-03-10 10:03:24.681176748 +0000 UTC m=+1153.437347566"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.765319 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-22vcf" podStartSLOduration=6.409839163 podStartE2EDuration="20.765290528s" podCreationTimestamp="2026-03-10 10:03:04 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.240968367 +0000 UTC m=+1134.997139185" lastFinishedPulling="2026-03-10 10:03:20.596419722 +0000 UTC m=+1149.352590550" observedRunningTime="2026-03-10 10:03:24.714769792 +0000 UTC m=+1153.470940610" watchObservedRunningTime="2026-03-10 10:03:24.765290528 +0000 UTC m=+1153.521461346"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.774274 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-5nt4h" podStartSLOduration=4.377740062 podStartE2EDuration="19.77425708s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.864179006 +0000 UTC m=+1135.620349824" lastFinishedPulling="2026-03-10 10:03:22.260696024 +0000 UTC m=+1151.016866842" observedRunningTime="2026-03-10 10:03:24.743886367 +0000 UTC m=+1153.500057185" watchObservedRunningTime="2026-03-10 10:03:24.77425708 +0000 UTC m=+1153.530427908"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.787878 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp" podStartSLOduration=3.455242121 podStartE2EDuration="19.787863257s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.997077787 +0000 UTC m=+1135.753248605" lastFinishedPulling="2026-03-10 10:03:23.329698923 +0000 UTC m=+1152.085869741" observedRunningTime="2026-03-10 10:03:24.773675541 +0000 UTC m=+1153.529846359" watchObservedRunningTime="2026-03-10 10:03:24.787863257 +0000 UTC m=+1153.544034075"
Mar 10 10:03:24 crc kubenswrapper[4794]: I0310 10:03:24.810065 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-pgqx8" podStartSLOduration=4.221272761 podStartE2EDuration="19.810049653s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.672777569 +0000 UTC m=+1135.428948387" lastFinishedPulling="2026-03-10 10:03:22.261554461 +0000 UTC m=+1151.017725279" observedRunningTime="2026-03-10 10:03:24.804719936 +0000 UTC m=+1153.560890754" watchObservedRunningTime="2026-03-10 10:03:24.810049653 +0000 UTC m=+1153.566220471"
Mar 10 10:03:30 crc kubenswrapper[4794]: I0310 10:03:30.667751 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" event={"ID":"1201bc13-f478-4ce1-9e86-3b1fcf9bcde1","Type":"ContainerStarted","Data":"f47ec4494cdf656fc47d3492d53f3e94f62295ede02daa9b19ffdc1138193a39"}
Mar 10 10:03:30 crc kubenswrapper[4794]: I0310 10:03:30.668313 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn"
Mar 10 10:03:30 crc kubenswrapper[4794]: I0310 10:03:30.669631 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6664x" event={"ID":"dd5bf891-8f83-47c7-9d66-06a814fcc5ee","Type":"ContainerStarted","Data":"4bb20883370da71731cf61976cc8365b245ccafe501fb298e60fe94e99156b1d"}
Mar 10 10:03:30 crc kubenswrapper[4794]: I0310 10:03:30.669819 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6664x"
Mar 10 10:03:30 crc kubenswrapper[4794]: I0310 10:03:30.670937 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z" event={"ID":"5f503ef3-9c39-4afb-a266-431c3a44d21e","Type":"ContainerStarted","Data":"34bc308791aba1b2a85e27e43b6378f38228eb44332d94a630f76a22357cedea"}
Mar 10 10:03:30 crc kubenswrapper[4794]: I0310 10:03:30.671114 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z"
Mar 10 10:03:30 crc kubenswrapper[4794]: I0310 10:03:30.672303 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" event={"ID":"7929c2d3-601b-4c69-970b-a69550d9852c","Type":"ContainerStarted","Data":"ad5941c762d02a87636d677159e732f42f974f381af0c696b971d9a5da17065f"}
Mar 10 10:03:30 crc kubenswrapper[4794]: I0310 10:03:30.672672 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg"
Mar 10 10:03:30 crc kubenswrapper[4794]: I0310 10:03:30.692683 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn" podStartSLOduration=21.085265861 podStartE2EDuration="26.692664394s" podCreationTimestamp="2026-03-10 10:03:04 +0000 UTC" firstStartedPulling="2026-03-10 10:03:23.898210696 +0000 UTC m=+1152.654381544" lastFinishedPulling="2026-03-10 10:03:29.505609259 +0000 UTC m=+1158.261780077" observedRunningTime="2026-03-10 10:03:30.689752852 +0000 UTC m=+1159.445923670" watchObservedRunningTime="2026-03-10 10:03:30.692664394 +0000 UTC m=+1159.448835222"
Mar 10 10:03:30 crc kubenswrapper[4794]: I0310 10:03:30.724904 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg" podStartSLOduration=20.235025232 podStartE2EDuration="25.724886596s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:24.018175341 +0000 UTC m=+1152.774346159" lastFinishedPulling="2026-03-10 10:03:29.508036705 +0000 UTC m=+1158.264207523" observedRunningTime="2026-03-10 10:03:30.720810287 +0000 UTC m=+1159.476981115" watchObservedRunningTime="2026-03-10 10:03:30.724886596 +0000 UTC m=+1159.481057414"
Mar 10 10:03:30 crc kubenswrapper[4794]: I0310 10:03:30.740955 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6664x" podStartSLOduration=3.280079033 podStartE2EDuration="25.740939239s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:07.02203175 +0000 UTC m=+1135.778202558" lastFinishedPulling="2026-03-10 10:03:29.482891946 +0000 UTC m=+1158.239062764" observedRunningTime="2026-03-10 10:03:30.736115638 +0000 UTC m=+1159.492286456" watchObservedRunningTime="2026-03-10 10:03:30.740939239 +0000 UTC m=+1159.497110047"
Mar 10 10:03:30 crc kubenswrapper[4794]: I0310 10:03:30.757365 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z" podStartSLOduration=3.280008839 podStartE2EDuration="25.757320763s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:07.012981195 +0000 UTC m=+1135.769152013" lastFinishedPulling="2026-03-10 10:03:29.490293119 +0000 UTC m=+1158.246463937" observedRunningTime="2026-03-10 10:03:30.753635797 +0000 UTC m=+1159.509806615" watchObservedRunningTime="2026-03-10 10:03:30.757320763 +0000 UTC m=+1159.513491601"
Mar 10 10:03:35 crc kubenswrapper[4794]: I0310 10:03:35.237853 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-22vcf"
Mar 10 10:03:35 crc kubenswrapper[4794]: I0310 10:03:35.253999 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zcsq9"
Mar 10 10:03:35 crc kubenswrapper[4794]: I0310 10:03:35.325932 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2xrkm"
Mar 10 10:03:35 crc kubenswrapper[4794]: I0310 10:03:35.368756 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-ldq59"
Mar 10 10:03:35 crc kubenswrapper[4794]: I0310 10:03:35.443759 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-tbxps"
Mar 10 10:03:35 crc kubenswrapper[4794]: I0310 10:03:35.493542 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-zb879"
Mar 10 10:03:35 crc kubenswrapper[4794]: I0310 10:03:35.688818 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-5nt4h"
Mar 10 10:03:35 crc kubenswrapper[4794]: I0310 10:03:35.690920 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-d9zzm"
Mar 10 10:03:35 crc kubenswrapper[4794]: I0310 10:03:35.692546 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-pgqx8"
Mar 10 10:03:35 crc kubenswrapper[4794]: I0310 10:03:35.762501 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-xqghl"
Mar 10 10:03:35 crc kubenswrapper[4794]: I0310 10:03:35.820374 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-d8hqc"
Mar 10 10:03:36 crc kubenswrapper[4794]: I0310 10:03:36.013757 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xn8mp"
Mar 10 10:03:36 crc kubenswrapper[4794]: I0310 10:03:36.105802 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7nkh"
Mar 10 10:03:36 crc kubenswrapper[4794]: I0310 10:03:36.166678 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6664x"
Mar 10 10:03:36 crc kubenswrapper[4794]: I0310 10:03:36.181234 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-h7m6z"
Mar 10 10:03:36 crc kubenswrapper[4794]: I0310 10:03:36.312642 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-gv6df"
Mar 10 10:03:36 crc kubenswrapper[4794]: I0310 10:03:36.340318 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-sdplr"
Mar 10 10:03:37 crc kubenswrapper[4794]: I0310 10:03:37.883157 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg"
Mar 10 10:03:37 crc kubenswrapper[4794]: I0310 10:03:37.889042 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea65018d-9031-45ab-89c3-2846e861d0a2-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-bkrkg\" (UID: \"ea65018d-9031-45ab-89c3-2846e861d0a2\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg"
Mar 10 10:03:37 crc kubenswrapper[4794]: I0310 10:03:37.912100 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-bg7s8"
Mar 10 10:03:37 crc kubenswrapper[4794]: I0310 10:03:37.920949 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg"
Mar 10 10:03:38 crc kubenswrapper[4794]: I0310 10:03:38.347709 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg"]
Mar 10 10:03:38 crc kubenswrapper[4794]: I0310 10:03:38.733167 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" event={"ID":"ea65018d-9031-45ab-89c3-2846e861d0a2","Type":"ContainerStarted","Data":"c69fd8bf620a6fa8de5a227e22058c5916a2316a651a76681d3f964ece0104d5"}
Mar 10 10:03:41 crc kubenswrapper[4794]: I0310 10:03:41.027987 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sf5hn"
Mar 10 10:03:41 crc kubenswrapper[4794]: I0310 10:03:41.657837 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f664lg"
Mar 10 10:03:42 crc kubenswrapper[4794]: I0310 10:03:42.759480 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-p2ghv" event={"ID":"f67ba1e8-d8af-4850-8133-6c50df162861","Type":"ContainerStarted","Data":"210bb1c32a4457593ae34a7e32543a8eb4e7a8f5dccb030e204511e2ec127f65"}
Mar 10 10:03:42 crc kubenswrapper[4794]: I0310 10:03:42.760088 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-p2ghv"
Mar 10 10:03:42 crc kubenswrapper[4794]: I0310 10:03:42.760472 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" event={"ID":"ea65018d-9031-45ab-89c3-2846e861d0a2","Type":"ContainerStarted","Data":"67d6384d324936007e4bd38869cdfc86486905ae9ee30e79704e98aa1fc6e160"}
Mar 10 10:03:42 crc kubenswrapper[4794]: I0310 10:03:42.760892 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg"
Mar 10 10:03:42 crc kubenswrapper[4794]: I0310 10:03:42.762025 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842" event={"ID":"aea0d607-d1b3-4a15-993a-c571f49c1337","Type":"ContainerStarted","Data":"03f9ed270be36a7dc5433068b7e62f42ddf8c4a61d4e1bac53591a43fc2eec56"}
Mar 10 10:03:42 crc kubenswrapper[4794]: I0310 10:03:42.762215 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842"
Mar 10 10:03:42 crc kubenswrapper[4794]: I0310 10:03:42.774121 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-p2ghv" podStartSLOduration=2.079106942 podStartE2EDuration="37.77410018s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.667992799 +0000 UTC m=+1135.424163607" lastFinishedPulling="2026-03-10 10:03:42.362985987 +0000 UTC m=+1171.119156845" observedRunningTime="2026-03-10 10:03:42.772182169 +0000 UTC m=+1171.528352987" watchObservedRunningTime="2026-03-10 10:03:42.77410018 +0000 UTC m=+1171.530270998"
Mar 10 10:03:42 crc kubenswrapper[4794]: I0310 10:03:42.806593 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-bkrkg" podStartSLOduration=37.806560778 podStartE2EDuration="37.806560778s" podCreationTimestamp="2026-03-10 10:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:03:42.801306293 +0000 UTC m=+1171.557477141" watchObservedRunningTime="2026-03-10 10:03:42.806560778 +0000 UTC m=+1171.562731616"
Mar 10 10:03:42 crc kubenswrapper[4794]: I0310 10:03:42.836782 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-vh842" podStartSLOduration=2.986076603 podStartE2EDuration="38.836760486s" podCreationTimestamp="2026-03-10 10:03:04 +0000 UTC" firstStartedPulling="2026-03-10 10:03:06.511301412 +0000 UTC m=+1135.267472230" lastFinishedPulling="2026-03-10 10:03:42.361985275 +0000 UTC m=+1171.118156113" observedRunningTime="2026-03-10 10:03:42.830818529 +0000 UTC m=+1171.586989337" watchObservedRunningTime="2026-03-10 10:03:42.836760486 +0000 UTC m=+1171.592931324"
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552284-4ph2z" Mar 10 10:04:00 crc kubenswrapper[4794]: I0310 10:04:00.159729 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:04:00 crc kubenswrapper[4794]: I0310 10:04:00.160386 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:04:00 crc kubenswrapper[4794]: I0310 10:04:00.160817 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:04:00 crc kubenswrapper[4794]: I0310 10:04:00.162877 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552284-4ph2z"] Mar 10 10:04:00 crc kubenswrapper[4794]: I0310 10:04:00.210911 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t87f\" (UniqueName: \"kubernetes.io/projected/bf038607-8e68-452a-a8af-162d60ea7061-kube-api-access-4t87f\") pod \"auto-csr-approver-29552284-4ph2z\" (UID: \"bf038607-8e68-452a-a8af-162d60ea7061\") " pod="openshift-infra/auto-csr-approver-29552284-4ph2z" Mar 10 10:04:00 crc kubenswrapper[4794]: I0310 10:04:00.311868 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t87f\" (UniqueName: \"kubernetes.io/projected/bf038607-8e68-452a-a8af-162d60ea7061-kube-api-access-4t87f\") pod \"auto-csr-approver-29552284-4ph2z\" (UID: \"bf038607-8e68-452a-a8af-162d60ea7061\") " pod="openshift-infra/auto-csr-approver-29552284-4ph2z" Mar 10 10:04:00 crc kubenswrapper[4794]: I0310 10:04:00.334498 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t87f\" (UniqueName: \"kubernetes.io/projected/bf038607-8e68-452a-a8af-162d60ea7061-kube-api-access-4t87f\") pod \"auto-csr-approver-29552284-4ph2z\" (UID: \"bf038607-8e68-452a-a8af-162d60ea7061\") " pod="openshift-infra/auto-csr-approver-29552284-4ph2z" Mar 10 10:04:00 crc kubenswrapper[4794]: I0310 10:04:00.481005 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552284-4ph2z" Mar 10 10:04:00 crc kubenswrapper[4794]: I0310 10:04:00.889803 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552284-4ph2z"] Mar 10 10:04:00 crc kubenswrapper[4794]: W0310 10:04:00.894047 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf038607_8e68_452a_a8af_162d60ea7061.slice/crio-11c5bf9c65c272c17c7cd8db8eb6ec40577ca11bbbf688e09dcadfd374f0485c WatchSource:0}: Error finding container 11c5bf9c65c272c17c7cd8db8eb6ec40577ca11bbbf688e09dcadfd374f0485c: Status 404 returned error can't find the container with id 11c5bf9c65c272c17c7cd8db8eb6ec40577ca11bbbf688e09dcadfd374f0485c Mar 10 10:04:00 crc kubenswrapper[4794]: I0310 10:04:00.916784 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552284-4ph2z" event={"ID":"bf038607-8e68-452a-a8af-162d60ea7061","Type":"ContainerStarted","Data":"11c5bf9c65c272c17c7cd8db8eb6ec40577ca11bbbf688e09dcadfd374f0485c"} Mar 10 10:04:02 crc kubenswrapper[4794]: I0310 10:04:02.931276 4794 generic.go:334] "Generic (PLEG): container finished" podID="bf038607-8e68-452a-a8af-162d60ea7061" containerID="d7d3ebbfdeb38990606a0487ab1bcb1e035cb5d8c6aaa0c6504e779a44a224c7" exitCode=0 Mar 10 10:04:02 crc kubenswrapper[4794]: I0310 10:04:02.931373 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552284-4ph2z" event={"ID":"bf038607-8e68-452a-a8af-162d60ea7061","Type":"ContainerDied","Data":"d7d3ebbfdeb38990606a0487ab1bcb1e035cb5d8c6aaa0c6504e779a44a224c7"} Mar 10 10:04:04 crc kubenswrapper[4794]: I0310 10:04:04.224506 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552284-4ph2z" Mar 10 10:04:04 crc kubenswrapper[4794]: I0310 10:04:04.279949 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t87f\" (UniqueName: \"kubernetes.io/projected/bf038607-8e68-452a-a8af-162d60ea7061-kube-api-access-4t87f\") pod \"bf038607-8e68-452a-a8af-162d60ea7061\" (UID: \"bf038607-8e68-452a-a8af-162d60ea7061\") " Mar 10 10:04:04 crc kubenswrapper[4794]: I0310 10:04:04.285125 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf038607-8e68-452a-a8af-162d60ea7061-kube-api-access-4t87f" (OuterVolumeSpecName: "kube-api-access-4t87f") pod "bf038607-8e68-452a-a8af-162d60ea7061" (UID: "bf038607-8e68-452a-a8af-162d60ea7061"). InnerVolumeSpecName "kube-api-access-4t87f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:04:04 crc kubenswrapper[4794]: I0310 10:04:04.381681 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t87f\" (UniqueName: \"kubernetes.io/projected/bf038607-8e68-452a-a8af-162d60ea7061-kube-api-access-4t87f\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:04 crc kubenswrapper[4794]: I0310 10:04:04.948028 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552284-4ph2z" event={"ID":"bf038607-8e68-452a-a8af-162d60ea7061","Type":"ContainerDied","Data":"11c5bf9c65c272c17c7cd8db8eb6ec40577ca11bbbf688e09dcadfd374f0485c"} Mar 10 10:04:04 crc kubenswrapper[4794]: I0310 10:04:04.948061 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11c5bf9c65c272c17c7cd8db8eb6ec40577ca11bbbf688e09dcadfd374f0485c" Mar 10 10:04:04 crc kubenswrapper[4794]: I0310 10:04:04.948098 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552284-4ph2z" Mar 10 10:04:05 crc kubenswrapper[4794]: I0310 10:04:05.309731 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552278-4gp6d"] Mar 10 10:04:05 crc kubenswrapper[4794]: I0310 10:04:05.315884 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552278-4gp6d"] Mar 10 10:04:06 crc kubenswrapper[4794]: I0310 10:04:06.008324 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aad5d471-4640-4a77-9b35-85fb21b3e2de" path="/var/lib/kubelet/pods/aad5d471-4640-4a77-9b35-85fb21b3e2de/volumes" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.664213 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-hsx96"] Mar 10 10:04:14 crc kubenswrapper[4794]: E0310 10:04:14.665119 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf038607-8e68-452a-a8af-162d60ea7061" containerName="oc" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.665136 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf038607-8e68-452a-a8af-162d60ea7061" containerName="oc" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.665301 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf038607-8e68-452a-a8af-162d60ea7061" containerName="oc" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.666177 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-hsx96" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.672779 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.672855 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bgm4r" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.672892 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.672896 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.675785 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-hsx96"] Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.725853 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-sxl8t"] Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.726950 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.729541 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.735842 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c64598-18fa-4a47-8bb0-45c5b7c93845-config\") pod \"dnsmasq-dns-589db6c89c-hsx96\" (UID: \"76c64598-18fa-4a47-8bb0-45c5b7c93845\") " pod="openstack/dnsmasq-dns-589db6c89c-hsx96" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.735921 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsflg\" (UniqueName: \"kubernetes.io/projected/76c64598-18fa-4a47-8bb0-45c5b7c93845-kube-api-access-nsflg\") pod \"dnsmasq-dns-589db6c89c-hsx96\" (UID: \"76c64598-18fa-4a47-8bb0-45c5b7c93845\") " pod="openstack/dnsmasq-dns-589db6c89c-hsx96" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.741570 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-sxl8t"] Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.837270 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c64598-18fa-4a47-8bb0-45c5b7c93845-config\") pod \"dnsmasq-dns-589db6c89c-hsx96\" (UID: \"76c64598-18fa-4a47-8bb0-45c5b7c93845\") " pod="openstack/dnsmasq-dns-589db6c89c-hsx96" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.837311 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0295a29c-dbe2-4603-862a-80463076f1e3-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-sxl8t\" (UID: \"0295a29c-dbe2-4603-862a-80463076f1e3\") " pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.837351 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjtj7\" (UniqueName: \"kubernetes.io/projected/0295a29c-dbe2-4603-862a-80463076f1e3-kube-api-access-xjtj7\") pod \"dnsmasq-dns-86bbd886cf-sxl8t\" (UID: \"0295a29c-dbe2-4603-862a-80463076f1e3\") " 
pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.837377 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsflg\" (UniqueName: \"kubernetes.io/projected/76c64598-18fa-4a47-8bb0-45c5b7c93845-kube-api-access-nsflg\") pod \"dnsmasq-dns-589db6c89c-hsx96\" (UID: \"76c64598-18fa-4a47-8bb0-45c5b7c93845\") " pod="openstack/dnsmasq-dns-589db6c89c-hsx96" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.837438 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0295a29c-dbe2-4603-862a-80463076f1e3-config\") pod \"dnsmasq-dns-86bbd886cf-sxl8t\" (UID: \"0295a29c-dbe2-4603-862a-80463076f1e3\") " pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.839295 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c64598-18fa-4a47-8bb0-45c5b7c93845-config\") pod \"dnsmasq-dns-589db6c89c-hsx96\" (UID: \"76c64598-18fa-4a47-8bb0-45c5b7c93845\") " pod="openstack/dnsmasq-dns-589db6c89c-hsx96" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.861373 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsflg\" (UniqueName: \"kubernetes.io/projected/76c64598-18fa-4a47-8bb0-45c5b7c93845-kube-api-access-nsflg\") pod \"dnsmasq-dns-589db6c89c-hsx96\" (UID: \"76c64598-18fa-4a47-8bb0-45c5b7c93845\") " pod="openstack/dnsmasq-dns-589db6c89c-hsx96" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.938738 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0295a29c-dbe2-4603-862a-80463076f1e3-config\") pod \"dnsmasq-dns-86bbd886cf-sxl8t\" (UID: \"0295a29c-dbe2-4603-862a-80463076f1e3\") " pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.939142 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0295a29c-dbe2-4603-862a-80463076f1e3-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-sxl8t\" (UID: \"0295a29c-dbe2-4603-862a-80463076f1e3\") " pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.939192 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjtj7\" (UniqueName: \"kubernetes.io/projected/0295a29c-dbe2-4603-862a-80463076f1e3-kube-api-access-xjtj7\") pod \"dnsmasq-dns-86bbd886cf-sxl8t\" (UID: \"0295a29c-dbe2-4603-862a-80463076f1e3\") " pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.939757 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0295a29c-dbe2-4603-862a-80463076f1e3-config\") pod \"dnsmasq-dns-86bbd886cf-sxl8t\" (UID: \"0295a29c-dbe2-4603-862a-80463076f1e3\") " pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.940282 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0295a29c-dbe2-4603-862a-80463076f1e3-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-sxl8t\" (UID: \"0295a29c-dbe2-4603-862a-80463076f1e3\") " pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.957240 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjtj7\" (UniqueName: \"kubernetes.io/projected/0295a29c-dbe2-4603-862a-80463076f1e3-kube-api-access-xjtj7\") pod \"dnsmasq-dns-86bbd886cf-sxl8t\" (UID: \"0295a29c-dbe2-4603-862a-80463076f1e3\") " pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" Mar 10 10:04:14 crc kubenswrapper[4794]: I0310 10:04:14.986380 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-hsx96" Mar 10 10:04:15 crc kubenswrapper[4794]: I0310 10:04:15.050453 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" Mar 10 10:04:15 crc kubenswrapper[4794]: I0310 10:04:15.396211 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-hsx96"] Mar 10 10:04:15 crc kubenswrapper[4794]: I0310 10:04:15.407363 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 10:04:15 crc kubenswrapper[4794]: I0310 10:04:15.500278 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-sxl8t"] Mar 10 10:04:15 crc kubenswrapper[4794]: W0310 10:04:15.502494 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0295a29c_dbe2_4603_862a_80463076f1e3.slice/crio-2af6480990b143962cf000989449ac2bdae8a33961dd2be3f1ec9cd30d0067c3 WatchSource:0}: Error finding container 2af6480990b143962cf000989449ac2bdae8a33961dd2be3f1ec9cd30d0067c3: Status 404 returned error can't find the container with id 2af6480990b143962cf000989449ac2bdae8a33961dd2be3f1ec9cd30d0067c3 Mar 10 10:04:16 crc kubenswrapper[4794]: I0310 10:04:16.023618 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" event={"ID":"0295a29c-dbe2-4603-862a-80463076f1e3","Type":"ContainerStarted","Data":"2af6480990b143962cf000989449ac2bdae8a33961dd2be3f1ec9cd30d0067c3"} Mar 10 10:04:16 crc kubenswrapper[4794]: I0310 10:04:16.024986 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-hsx96" event={"ID":"76c64598-18fa-4a47-8bb0-45c5b7c93845","Type":"ContainerStarted","Data":"d690140bb36b6cdb97cd9fd4dc90306b57eccce1562de40144ab0a9aa55cdbc4"} Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.187103 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-hsx96"] Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.210770 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-mcwck"] Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.211836 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.226618 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-mcwck"] Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.275059 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps8n5\" (UniqueName: \"kubernetes.io/projected/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-kube-api-access-ps8n5\") pod \"dnsmasq-dns-78cb4465c9-mcwck\" (UID: \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\") " pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.275125 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-config\") pod \"dnsmasq-dns-78cb4465c9-mcwck\" (UID: \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\") " pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.275146 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-mcwck\" (UID: \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\") " pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.427244 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps8n5\" (UniqueName: \"kubernetes.io/projected/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-kube-api-access-ps8n5\") pod \"dnsmasq-dns-78cb4465c9-mcwck\" (UID: \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\") " pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.427388 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-config\") pod \"dnsmasq-dns-78cb4465c9-mcwck\" (UID: \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\") " pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.427414 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-mcwck\" (UID: \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\") " pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.429188 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-mcwck\" (UID: \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\") " pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.429717 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-config\") pod \"dnsmasq-dns-78cb4465c9-mcwck\" (UID: \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\") " pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.465518 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps8n5\" (UniqueName: 
\"kubernetes.io/projected/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-kube-api-access-ps8n5\") pod \"dnsmasq-dns-78cb4465c9-mcwck\" (UID: \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\") " pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.471165 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-sxl8t"] Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.506537 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-24sxt"] Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.507850 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.519501 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-24sxt"] Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.545368 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.630890 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x2nw\" (UniqueName: \"kubernetes.io/projected/5666224e-6fa7-45b5-bdf5-12d699e536ba-kube-api-access-9x2nw\") pod \"dnsmasq-dns-7c47bcb9f9-24sxt\" (UID: \"5666224e-6fa7-45b5-bdf5-12d699e536ba\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.631241 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5666224e-6fa7-45b5-bdf5-12d699e536ba-config\") pod \"dnsmasq-dns-7c47bcb9f9-24sxt\" (UID: \"5666224e-6fa7-45b5-bdf5-12d699e536ba\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.631264 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5666224e-6fa7-45b5-bdf5-12d699e536ba-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-24sxt\" (UID: \"5666224e-6fa7-45b5-bdf5-12d699e536ba\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.732878 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x2nw\" (UniqueName: \"kubernetes.io/projected/5666224e-6fa7-45b5-bdf5-12d699e536ba-kube-api-access-9x2nw\") pod \"dnsmasq-dns-7c47bcb9f9-24sxt\" (UID: \"5666224e-6fa7-45b5-bdf5-12d699e536ba\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.732953 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5666224e-6fa7-45b5-bdf5-12d699e536ba-config\") pod \"dnsmasq-dns-7c47bcb9f9-24sxt\" (UID: \"5666224e-6fa7-45b5-bdf5-12d699e536ba\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.732979 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5666224e-6fa7-45b5-bdf5-12d699e536ba-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-24sxt\" (UID: \"5666224e-6fa7-45b5-bdf5-12d699e536ba\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.734110 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5666224e-6fa7-45b5-bdf5-12d699e536ba-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-24sxt\" (UID: \"5666224e-6fa7-45b5-bdf5-12d699e536ba\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.735247 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5666224e-6fa7-45b5-bdf5-12d699e536ba-config\") pod \"dnsmasq-dns-7c47bcb9f9-24sxt\" (UID: \"5666224e-6fa7-45b5-bdf5-12d699e536ba\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.756914 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x2nw\" (UniqueName: \"kubernetes.io/projected/5666224e-6fa7-45b5-bdf5-12d699e536ba-kube-api-access-9x2nw\") pod \"dnsmasq-dns-7c47bcb9f9-24sxt\" (UID: \"5666224e-6fa7-45b5-bdf5-12d699e536ba\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.836897 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-mcwck"] Mar 10 10:04:17 crc kubenswrapper[4794]: I0310 10:04:17.838179 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:17 crc kubenswrapper[4794]: W0310 10:04:17.857471 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49c3ebb8_c668_4ca8_ad87_06ab860ac32d.slice/crio-599cab58e2a00da157953867d7a25389863887a7818246fb5306a52233fb1af0 WatchSource:0}: Error finding container 599cab58e2a00da157953867d7a25389863887a7818246fb5306a52233fb1af0: Status 404 returned error can't find the container with id 599cab58e2a00da157953867d7a25389863887a7818246fb5306a52233fb1af0 Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.042057 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" event={"ID":"49c3ebb8-c668-4ca8-ad87-06ab860ac32d","Type":"ContainerStarted","Data":"599cab58e2a00da157953867d7a25389863887a7818246fb5306a52233fb1af0"} Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.282220 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-24sxt"] Mar 10 10:04:18 crc kubenswrapper[4794]: W0310 10:04:18.291215 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5666224e_6fa7_45b5_bdf5_12d699e536ba.slice/crio-1cbe9b65b1c0ca10424133c7082fc752aadc10d6c858f8aa4a622667bc4fc493 WatchSource:0}: Error finding container 1cbe9b65b1c0ca10424133c7082fc752aadc10d6c858f8aa4a622667bc4fc493: Status 404 returned error can't find the container with id 1cbe9b65b1c0ca10424133c7082fc752aadc10d6c858f8aa4a622667bc4fc493 Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.364060 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.365210 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.367882 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.368072 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.368111 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.368125 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.368161 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.368128 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-b68wd" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.368266 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.385813 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.456878 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.456945 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.457030 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/598e06ed-3156-4e09-976e-4dda0e35afc2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.457122 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5gx7\" (UniqueName: \"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-kube-api-access-g5gx7\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.457145 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.457166 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.457195 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.457215 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.457419 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.457462 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/598e06ed-3156-4e09-976e-4dda0e35afc2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.457551 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.564829 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.564902 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.564929 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.564982 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/598e06ed-3156-4e09-976e-4dda0e35afc2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.565031 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5gx7\" (UniqueName: \"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-kube-api-access-g5gx7\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.565077 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.565096 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.565117 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.565148 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.565162 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.565177 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/598e06ed-3156-4e09-976e-4dda0e35afc2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.567236 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.567252 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.567865 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.568303 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.568413 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.568626 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.576294 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.580719 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.593993 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5gx7\" (UniqueName: \"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-kube-api-access-g5gx7\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.600001 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/598e06ed-3156-4e09-976e-4dda0e35afc2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.604002 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/598e06ed-3156-4e09-976e-4dda0e35afc2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.618204 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.648622 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.650425 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.654758 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.654839 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.654758 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.655053 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ppbqf" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.655102 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.655124 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.655231 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.667662 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.700320 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.767655 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.767722 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.767775 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.767814 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.767851 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27qt6\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-kube-api-access-27qt6\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.767876 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.767908 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.767933 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.767960 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc 
kubenswrapper[4794]: I0310 10:04:18.768039 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.768084 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.869744 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.869816 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.869859 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27qt6\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-kube-api-access-27qt6\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.869886 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.869921 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.869944 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.869969 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.870006 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.870028 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.870064 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.870094 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.870583 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.871360 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.871969 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.872598 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.872830 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.873602 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.873958 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.875773 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.877404 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.890537 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.893527 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27qt6\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-kube-api-access-27qt6\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.903355 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " pod="openstack/rabbitmq-server-0" Mar 10 10:04:18 crc kubenswrapper[4794]: I0310 10:04:18.993380 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.053657 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" event={"ID":"5666224e-6fa7-45b5-bdf5-12d699e536ba","Type":"ContainerStarted","Data":"1cbe9b65b1c0ca10424133c7082fc752aadc10d6c858f8aa4a622667bc4fc493"} Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.685857 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.686923 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.694863 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.696024 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.698731 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-lwkxx" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.699102 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.701954 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.717374 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.786326 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-config-data-default\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.786405 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.786428 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.786567 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.786687 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-kolla-config\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.786746 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjbsb\" (UniqueName: \"kubernetes.io/projected/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-kube-api-access-qjbsb\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.786852 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.787012 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.888369 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.888427 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.888491 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-config-data-default\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.888522 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.888565 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.888594 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.888646 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-kolla-config\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.888669 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjbsb\" (UniqueName: \"kubernetes.io/projected/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-kube-api-access-qjbsb\") pod \"openstack-galera-0\" (UID: 
\"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.889205 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.891353 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.903769 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.904211 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.904383 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-config-data-default\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.907135 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-kolla-config\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.907376 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.909861 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:19 crc kubenswrapper[4794]: I0310 10:04:19.910695 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjbsb\" (UniqueName: \"kubernetes.io/projected/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-kube-api-access-qjbsb\") pod \"openstack-galera-0\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " pod="openstack/openstack-galera-0" Mar 10 10:04:20 crc kubenswrapper[4794]: I0310 10:04:20.018458 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.278097 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.281275 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.285083 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.290733 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.291415 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.292026 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-8gwvc" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.300897 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.418102 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gwsk\" (UniqueName: \"kubernetes.io/projected/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-kube-api-access-2gwsk\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.418142 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.418163 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.418207 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.418268 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.418298 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.418453 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.418495 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.520133 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.520202 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.520229 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.520263 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.520280 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.520315 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gwsk\" (UniqueName: \"kubernetes.io/projected/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-kube-api-access-2gwsk\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.520358 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.520376 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.521420 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.521865 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.521955 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.522647 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.522647 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.525508 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.529251 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.536962 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gwsk\" (UniqueName: \"kubernetes.io/projected/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-kube-api-access-2gwsk\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc 
kubenswrapper[4794]: I0310 10:04:21.545500 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.576560 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.577773 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.591235 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.591735 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.592083 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.592234 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-bchcd" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.622915 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.723637 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71ee8a8d-89a0-495f-925b-071e52449063-kolla-config\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.723688 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/71ee8a8d-89a0-495f-925b-071e52449063-memcached-tls-certs\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.723711 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71ee8a8d-89a0-495f-925b-071e52449063-config-data\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.723949 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ee8a8d-89a0-495f-925b-071e52449063-combined-ca-bundle\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.724009 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5grc\" (UniqueName: \"kubernetes.io/projected/71ee8a8d-89a0-495f-925b-071e52449063-kube-api-access-p5grc\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.825433 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/71ee8a8d-89a0-495f-925b-071e52449063-combined-ca-bundle\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.825479 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5grc\" (UniqueName: \"kubernetes.io/projected/71ee8a8d-89a0-495f-925b-071e52449063-kube-api-access-p5grc\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.825519 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71ee8a8d-89a0-495f-925b-071e52449063-kolla-config\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.825550 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/71ee8a8d-89a0-495f-925b-071e52449063-memcached-tls-certs\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.825565 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71ee8a8d-89a0-495f-925b-071e52449063-config-data\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.826233 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71ee8a8d-89a0-495f-925b-071e52449063-config-data\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.827709 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71ee8a8d-89a0-495f-925b-071e52449063-kolla-config\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.829513 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ee8a8d-89a0-495f-925b-071e52449063-combined-ca-bundle\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.831376 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/71ee8a8d-89a0-495f-925b-071e52449063-memcached-tls-certs\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.852259 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5grc\" (UniqueName: \"kubernetes.io/projected/71ee8a8d-89a0-495f-925b-071e52449063-kube-api-access-p5grc\") pod \"memcached-0\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") " pod="openstack/memcached-0" Mar 10 10:04:21 crc kubenswrapper[4794]: I0310 10:04:21.913705 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 10 10:04:23 crc kubenswrapper[4794]: I0310 10:04:23.666282 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 10:04:23 crc kubenswrapper[4794]: I0310 10:04:23.667992 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 10:04:23 crc kubenswrapper[4794]: I0310 10:04:23.670480 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-b6rs7" Mar 10 10:04:23 crc kubenswrapper[4794]: I0310 10:04:23.676093 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 10:04:23 crc kubenswrapper[4794]: I0310 10:04:23.763254 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zjqk\" (UniqueName: \"kubernetes.io/projected/57b6ddc9-2a57-41f1-a6c8-d0be37b88252-kube-api-access-5zjqk\") pod \"kube-state-metrics-0\" (UID: \"57b6ddc9-2a57-41f1-a6c8-d0be37b88252\") " pod="openstack/kube-state-metrics-0" Mar 10 10:04:23 crc kubenswrapper[4794]: I0310 10:04:23.865060 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zjqk\" (UniqueName: \"kubernetes.io/projected/57b6ddc9-2a57-41f1-a6c8-d0be37b88252-kube-api-access-5zjqk\") pod \"kube-state-metrics-0\" (UID: \"57b6ddc9-2a57-41f1-a6c8-d0be37b88252\") " pod="openstack/kube-state-metrics-0" Mar 10 10:04:23 crc kubenswrapper[4794]: I0310 10:04:23.891795 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zjqk\" (UniqueName: \"kubernetes.io/projected/57b6ddc9-2a57-41f1-a6c8-d0be37b88252-kube-api-access-5zjqk\") pod \"kube-state-metrics-0\" (UID: \"57b6ddc9-2a57-41f1-a6c8-d0be37b88252\") " pod="openstack/kube-state-metrics-0" Mar 10 10:04:23 crc kubenswrapper[4794]: I0310 10:04:23.987581 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 10:04:25 crc kubenswrapper[4794]: I0310 10:04:25.132534 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.371208 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fvs8j"] Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.378925 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.382949 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-8552m"] Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.384504 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.393315 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-n6djd" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.393496 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.393593 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.394007 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fvs8j"] Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.400056 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8552m"] Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.510567 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-scripts\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.510614 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-run\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.510641 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-etc-ovs\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.510662 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-run\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.510681 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-ovn-controller-tls-certs\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.510702 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-run-ovn\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.510722 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-log\") pod \"ovn-controller-ovs-8552m\" (UID: 
\"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.510737 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl2kw\" (UniqueName: \"kubernetes.io/projected/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-kube-api-access-bl2kw\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.510754 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-scripts\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.510778 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-lib\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.510800 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-combined-ca-bundle\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.510868 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lddn\" (UniqueName: \"kubernetes.io/projected/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-kube-api-access-2lddn\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.511031 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-log-ovn\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.612087 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-log-ovn\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.612201 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-scripts\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.612233 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-run\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 
10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.612266 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-etc-ovs\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.612515 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-ovn-controller-tls-certs\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.612541 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-run\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.612668 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-log-ovn\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.612879 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-run\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.613258 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-run\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.613307 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-run-ovn\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.613430 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-run-ovn\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.613446 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-etc-ovs\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.613463 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-log\") pod \"ovn-controller-ovs-8552m\" (UID: 
\"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.613481 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl2kw\" (UniqueName: \"kubernetes.io/projected/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-kube-api-access-bl2kw\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.613502 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-scripts\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.613523 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-lib\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.613543 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lddn\" (UniqueName: \"kubernetes.io/projected/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-kube-api-access-2lddn\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.613557 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-combined-ca-bundle\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.613574 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-log\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.613726 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-lib\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.614614 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-scripts\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.618087 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-ovn-controller-tls-certs\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.618764 4794 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-combined-ca-bundle\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.621285 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-scripts\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.629286 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lddn\" (UniqueName: \"kubernetes.io/projected/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-kube-api-access-2lddn\") pod \"ovn-controller-fvs8j\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.642782 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl2kw\" (UniqueName: \"kubernetes.io/projected/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-kube-api-access-bl2kw\") pod \"ovn-controller-ovs-8552m\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.715272 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:26 crc kubenswrapper[4794]: I0310 10:04:26.722304 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.155729 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.156864 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.160646 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.160943 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.161258 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.162346 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-8mqg9" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.162436 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.184944 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.237727 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.238014 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9bcv\" (UniqueName: \"kubernetes.io/projected/9575e254-d696-4a8a-b84f-c8f36d746ff8-kube-api-access-c9bcv\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.238164 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.238359 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9575e254-d696-4a8a-b84f-c8f36d746ff8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.238513 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9575e254-d696-4a8a-b84f-c8f36d746ff8-config\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.238654 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.238884 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/9575e254-d696-4a8a-b84f-c8f36d746ff8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.239018 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.340655 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.341041 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9575e254-d696-4a8a-b84f-c8f36d746ff8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.341742 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9575e254-d696-4a8a-b84f-c8f36d746ff8-config\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.341885 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.342059 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9575e254-d696-4a8a-b84f-c8f36d746ff8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.342175 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.342412 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.342523 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9bcv\" (UniqueName: \"kubernetes.io/projected/9575e254-d696-4a8a-b84f-c8f36d746ff8-kube-api-access-c9bcv\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 
10:04:28.342562 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9575e254-d696-4a8a-b84f-c8f36d746ff8-config\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.341317 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.343421 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9575e254-d696-4a8a-b84f-c8f36d746ff8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.341532 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9575e254-d696-4a8a-b84f-c8f36d746ff8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.347882 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.348600 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.361355 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9bcv\" (UniqueName: \"kubernetes.io/projected/9575e254-d696-4a8a-b84f-c8f36d746ff8-kube-api-access-c9bcv\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.362907 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.366046 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:28 crc kubenswrapper[4794]: I0310 10:04:28.480922 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:29 crc kubenswrapper[4794]: W0310 10:04:29.246743 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod598e06ed_3156_4e09_976e_4dda0e35afc2.slice/crio-04250b32ee461c407bf2af76f554ef3ce4c4f1ccb97739e259bbd8a00dce17b4 WatchSource:0}: Error finding container 04250b32ee461c407bf2af76f554ef3ce4c4f1ccb97739e259bbd8a00dce17b4: Status 404 returned error can't find the container with id 04250b32ee461c407bf2af76f554ef3ce4c4f1ccb97739e259bbd8a00dce17b4 Mar 10 10:04:29 crc kubenswrapper[4794]: I0310 10:04:29.772916 4794 scope.go:117] "RemoveContainer" containerID="7d7da57ffe1e7fa8cf7ebd979d4e92467de34825c2105a784c90fecff649917b" Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.144546 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"598e06ed-3156-4e09-976e-4dda0e35afc2","Type":"ContainerStarted","Data":"04250b32ee461c407bf2af76f554ef3ce4c4f1ccb97739e259bbd8a00dce17b4"} Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.927790 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.929974 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.932170 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-c4t9q" Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.932650 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.933533 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.936134 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.940100 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.985856 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fcb4385-7603-4d75-8c41-23f457fcae25-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.986087 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.986242 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fcb4385-7603-4d75-8c41-23f457fcae25-config\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.986764 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.986949 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5gzb\" (UniqueName: \"kubernetes.io/projected/1fcb4385-7603-4d75-8c41-23f457fcae25-kube-api-access-s5gzb\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.987157 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1fcb4385-7603-4d75-8c41-23f457fcae25-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.987353 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:30 crc kubenswrapper[4794]: I0310 10:04:30.987449 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.089023 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.089422 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fcb4385-7603-4d75-8c41-23f457fcae25-config\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.089592 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.089685 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5gzb\" (UniqueName: \"kubernetes.io/projected/1fcb4385-7603-4d75-8c41-23f457fcae25-kube-api-access-s5gzb\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.089762 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1fcb4385-7603-4d75-8c41-23f457fcae25-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " 
pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.089864 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.089942 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.090035 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fcb4385-7603-4d75-8c41-23f457fcae25-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.089640 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.091092 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1fcb4385-7603-4d75-8c41-23f457fcae25-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.091499 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fcb4385-7603-4d75-8c41-23f457fcae25-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.092136 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fcb4385-7603-4d75-8c41-23f457fcae25-config\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.097964 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.102922 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.103774 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-combined-ca-bundle\") pod 
\"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.119102 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.124817 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5gzb\" (UniqueName: \"kubernetes.io/projected/1fcb4385-7603-4d75-8c41-23f457fcae25-kube-api-access-s5gzb\") pod \"ovsdbserver-sb-0\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.263950 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:31 crc kubenswrapper[4794]: E0310 10:04:31.438564 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 10 10:04:31 crc kubenswrapper[4794]: E0310 10:04:31.438786 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xjtj7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86bbd886cf-sxl8t_openstack(0295a29c-dbe2-4603-862a-80463076f1e3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 10:04:31 crc kubenswrapper[4794]: E0310 10:04:31.439996 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" podUID="0295a29c-dbe2-4603-862a-80463076f1e3" Mar 10 10:04:31 crc kubenswrapper[4794]: E0310 10:04:31.446502 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 10 10:04:31 crc kubenswrapper[4794]: E0310 10:04:31.447260 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nsflg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-589db6c89c-hsx96_openstack(76c64598-18fa-4a47-8bb0-45c5b7c93845): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 10:04:31 crc kubenswrapper[4794]: E0310 10:04:31.448479 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-589db6c89c-hsx96" podUID="76c64598-18fa-4a47-8bb0-45c5b7c93845" Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.787115 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 10:04:31 crc kubenswrapper[4794]: W0310 10:04:31.795510 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b6b3a2f_0f4d_4b7f_a507_450f0ffff42b.slice/crio-7dab7f9e9746bf17b56c4b83e051254e9901df841f512cb3effa974e3664d6fa WatchSource:0}: Error finding container 7dab7f9e9746bf17b56c4b83e051254e9901df841f512cb3effa974e3664d6fa: Status 404 returned error can't find the container with id 7dab7f9e9746bf17b56c4b83e051254e9901df841f512cb3effa974e3664d6fa Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.903411 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 10:04:31 crc kubenswrapper[4794]: I0310 10:04:31.911019 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 10:04:31 crc kubenswrapper[4794]: W0310 10:04:31.960271 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f82e49b_0e4d_4cf5_8213_b30edcae94d4.slice/crio-d384067b285f1fd56157911934d42a055836cecf1f196917852bff1fa1e15975 WatchSource:0}: Error finding container d384067b285f1fd56157911934d42a055836cecf1f196917852bff1fa1e15975: Status 404 returned error can't find 
the container with id d384067b285f1fd56157911934d42a055836cecf1f196917852bff1fa1e15975 Mar 10 10:04:31 crc kubenswrapper[4794]: W0310 10:04:31.960651 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda45381ea_b5d8_49aa_b4b8_ab372b39b0d3.slice/crio-0149e10f7e01ef9d92ad296ac2010105b1999fbb92de901a0dc1173a78ddb7ab WatchSource:0}: Error finding container 0149e10f7e01ef9d92ad296ac2010105b1999fbb92de901a0dc1173a78ddb7ab: Status 404 returned error can't find the container with id 0149e10f7e01ef9d92ad296ac2010105b1999fbb92de901a0dc1173a78ddb7ab Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.062254 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.088026 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.102578 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fvs8j"] Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.158214 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 10:04:32 crc kubenswrapper[4794]: W0310 10:04:32.200545 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fcb4385_7603_4d75_8c41_23f457fcae25.slice/crio-0aaa2584749680bc8e1a1c028dcb6ca7fc34ef2cd230a42cb8f6e7a9c56b9f8b WatchSource:0}: Error finding container 0aaa2584749680bc8e1a1c028dcb6ca7fc34ef2cd230a42cb8f6e7a9c56b9f8b: Status 404 returned error can't find the container with id 0aaa2584749680bc8e1a1c028dcb6ca7fc34ef2cd230a42cb8f6e7a9c56b9f8b Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.202189 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b","Type":"ContainerStarted","Data":"7dab7f9e9746bf17b56c4b83e051254e9901df841f512cb3effa974e3664d6fa"} Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.205044 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3","Type":"ContainerStarted","Data":"0149e10f7e01ef9d92ad296ac2010105b1999fbb92de901a0dc1173a78ddb7ab"} Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.206739 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"57b6ddc9-2a57-41f1-a6c8-d0be37b88252","Type":"ContainerStarted","Data":"3b36d3081398e6b67192ac531cd0a42f8ac5118b45044f79104e02d527f5122b"} Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.208443 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"71ee8a8d-89a0-495f-925b-071e52449063","Type":"ContainerStarted","Data":"6fa8d1949110896a40b5b1cfbf46eb189d5e00b9d4d7ed875abf5ab436383ec5"} Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.210027 4794 generic.go:334] "Generic (PLEG): container finished" podID="5666224e-6fa7-45b5-bdf5-12d699e536ba" containerID="f2bf2635b5ddf00ea3dd401977963877c3ace0f44b2659adff6ed91cf277948e" exitCode=0 Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.210976 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" 
event={"ID":"5666224e-6fa7-45b5-bdf5-12d699e536ba","Type":"ContainerDied","Data":"f2bf2635b5ddf00ea3dd401977963877c3ace0f44b2659adff6ed91cf277948e"} Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.225119 4794 generic.go:334] "Generic (PLEG): container finished" podID="49c3ebb8-c668-4ca8-ad87-06ab860ac32d" containerID="4ec04b84342ee6d8d5ec0946a38ae45839bda64075a6ef1b336b052126fe59d1" exitCode=0 Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.225211 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" event={"ID":"49c3ebb8-c668-4ca8-ad87-06ab860ac32d","Type":"ContainerDied","Data":"4ec04b84342ee6d8d5ec0946a38ae45839bda64075a6ef1b336b052126fe59d1"} Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.243180 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7f82e49b-0e4d-4cf5-8213-b30edcae94d4","Type":"ContainerStarted","Data":"d384067b285f1fd56157911934d42a055836cecf1f196917852bff1fa1e15975"} Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.267973 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvs8j" event={"ID":"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd","Type":"ContainerStarted","Data":"200f1a127dbcf778780d3267801e67f85028cd39740627d3c9a68d705833a8d9"} Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.388828 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8552m"] Mar 10 10:04:32 crc kubenswrapper[4794]: W0310 10:04:32.431436 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fbaedcd_8c1d_452e_a36b_1a12e47f48d1.slice/crio-ba97aebd530c2adb07290b9a30334411952d96573f9a64ca5c288c8642ed252e WatchSource:0}: Error finding container ba97aebd530c2adb07290b9a30334411952d96573f9a64ca5c288c8642ed252e: Status 404 returned error can't find the container with id ba97aebd530c2adb07290b9a30334411952d96573f9a64ca5c288c8642ed252e Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.602471 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.623081 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0295a29c-dbe2-4603-862a-80463076f1e3-config\") pod \"0295a29c-dbe2-4603-862a-80463076f1e3\" (UID: \"0295a29c-dbe2-4603-862a-80463076f1e3\") " Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.623203 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjtj7\" (UniqueName: \"kubernetes.io/projected/0295a29c-dbe2-4603-862a-80463076f1e3-kube-api-access-xjtj7\") pod \"0295a29c-dbe2-4603-862a-80463076f1e3\" (UID: \"0295a29c-dbe2-4603-862a-80463076f1e3\") " Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.623262 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0295a29c-dbe2-4603-862a-80463076f1e3-dns-svc\") pod \"0295a29c-dbe2-4603-862a-80463076f1e3\" (UID: \"0295a29c-dbe2-4603-862a-80463076f1e3\") " Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.623973 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0295a29c-dbe2-4603-862a-80463076f1e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0295a29c-dbe2-4603-862a-80463076f1e3" (UID: "0295a29c-dbe2-4603-862a-80463076f1e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.624345 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0295a29c-dbe2-4603-862a-80463076f1e3-config" (OuterVolumeSpecName: "config") pod "0295a29c-dbe2-4603-862a-80463076f1e3" (UID: "0295a29c-dbe2-4603-862a-80463076f1e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.639919 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0295a29c-dbe2-4603-862a-80463076f1e3-kube-api-access-xjtj7" (OuterVolumeSpecName: "kube-api-access-xjtj7") pod "0295a29c-dbe2-4603-862a-80463076f1e3" (UID: "0295a29c-dbe2-4603-862a-80463076f1e3"). InnerVolumeSpecName "kube-api-access-xjtj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.666715 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-hsx96" Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.724630 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsflg\" (UniqueName: \"kubernetes.io/projected/76c64598-18fa-4a47-8bb0-45c5b7c93845-kube-api-access-nsflg\") pod \"76c64598-18fa-4a47-8bb0-45c5b7c93845\" (UID: \"76c64598-18fa-4a47-8bb0-45c5b7c93845\") " Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.724775 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c64598-18fa-4a47-8bb0-45c5b7c93845-config\") pod \"76c64598-18fa-4a47-8bb0-45c5b7c93845\" (UID: \"76c64598-18fa-4a47-8bb0-45c5b7c93845\") " Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.725094 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0295a29c-dbe2-4603-862a-80463076f1e3-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.725106 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjtj7\" (UniqueName: \"kubernetes.io/projected/0295a29c-dbe2-4603-862a-80463076f1e3-kube-api-access-xjtj7\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.725115 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0295a29c-dbe2-4603-862a-80463076f1e3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.725187 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c64598-18fa-4a47-8bb0-45c5b7c93845-config" (OuterVolumeSpecName: "config") pod "76c64598-18fa-4a47-8bb0-45c5b7c93845" (UID: "76c64598-18fa-4a47-8bb0-45c5b7c93845"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.727044 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76c64598-18fa-4a47-8bb0-45c5b7c93845-kube-api-access-nsflg" (OuterVolumeSpecName: "kube-api-access-nsflg") pod "76c64598-18fa-4a47-8bb0-45c5b7c93845" (UID: "76c64598-18fa-4a47-8bb0-45c5b7c93845"). InnerVolumeSpecName "kube-api-access-nsflg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.826129 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c64598-18fa-4a47-8bb0-45c5b7c93845-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:32 crc kubenswrapper[4794]: I0310 10:04:32.826160 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsflg\" (UniqueName: \"kubernetes.io/projected/76c64598-18fa-4a47-8bb0-45c5b7c93845-kube-api-access-nsflg\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.278634 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.280717 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.280803 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-sxl8t" event={"ID":"0295a29c-dbe2-4603-862a-80463076f1e3","Type":"ContainerDied","Data":"2af6480990b143962cf000989449ac2bdae8a33961dd2be3f1ec9cd30d0067c3"} Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.283117 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" event={"ID":"5666224e-6fa7-45b5-bdf5-12d699e536ba","Type":"ContainerStarted","Data":"f929dfa0b410a2845a0438d6ee1fce4cbf21d85fbbf58a18716d08b6b11bcb83"} Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.283292 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.285889 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8552m" event={"ID":"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1","Type":"ContainerStarted","Data":"ba97aebd530c2adb07290b9a30334411952d96573f9a64ca5c288c8642ed252e"} Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.287980 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" event={"ID":"49c3ebb8-c668-4ca8-ad87-06ab860ac32d","Type":"ContainerStarted","Data":"25d113d97dd3f14c83e19c6c1468b205ad5abd476c1b8e9ac8583bb0445db8a6"} Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.288619 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.289451 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1fcb4385-7603-4d75-8c41-23f457fcae25","Type":"ContainerStarted","Data":"0aaa2584749680bc8e1a1c028dcb6ca7fc34ef2cd230a42cb8f6e7a9c56b9f8b"} Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.290418 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-hsx96" event={"ID":"76c64598-18fa-4a47-8bb0-45c5b7c93845","Type":"ContainerDied","Data":"d690140bb36b6cdb97cd9fd4dc90306b57eccce1562de40144ab0a9aa55cdbc4"} Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.290460 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-hsx96" Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.300159 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" podStartSLOduration=3.034747827 podStartE2EDuration="16.3001443s" podCreationTimestamp="2026-03-10 10:04:17 +0000 UTC" firstStartedPulling="2026-03-10 10:04:18.293712846 +0000 UTC m=+1207.049883664" lastFinishedPulling="2026-03-10 10:04:31.559109319 +0000 UTC m=+1220.315280137" observedRunningTime="2026-03-10 10:04:33.299527311 +0000 UTC m=+1222.055698119" watchObservedRunningTime="2026-03-10 10:04:33.3001443 +0000 UTC m=+1222.056315118" Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.319428 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" podStartSLOduration=2.519701543 podStartE2EDuration="16.319409565s" podCreationTimestamp="2026-03-10 10:04:17 +0000 UTC" firstStartedPulling="2026-03-10 10:04:17.862048379 +0000 UTC m=+1206.618219197" lastFinishedPulling="2026-03-10 10:04:31.661756401 +0000 UTC m=+1220.417927219" observedRunningTime="2026-03-10 10:04:33.319099565 +0000 UTC m=+1222.075270383" watchObservedRunningTime="2026-03-10 10:04:33.319409565 +0000 UTC m=+1222.075580383" Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.357466 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-sxl8t"] Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.377501 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-sxl8t"] Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.390001 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-hsx96"] Mar 10 10:04:33 crc kubenswrapper[4794]: I0310 10:04:33.398113 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-hsx96"] Mar 10 10:04:34 crc kubenswrapper[4794]: I0310 10:04:34.010836 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0295a29c-dbe2-4603-862a-80463076f1e3" path="/var/lib/kubelet/pods/0295a29c-dbe2-4603-862a-80463076f1e3/volumes" Mar 10 10:04:34 crc kubenswrapper[4794]: I0310 10:04:34.011226 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76c64598-18fa-4a47-8bb0-45c5b7c93845" path="/var/lib/kubelet/pods/76c64598-18fa-4a47-8bb0-45c5b7c93845/volumes" Mar 10 10:04:37 crc kubenswrapper[4794]: I0310 10:04:37.330843 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9575e254-d696-4a8a-b84f-c8f36d746ff8","Type":"ContainerStarted","Data":"8e38a5eade5e988013fb78ab62479c4344b7f756e8e7babc55613c3221471a07"} Mar 10 10:04:37 crc kubenswrapper[4794]: I0310 10:04:37.547238 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:37 crc kubenswrapper[4794]: I0310 10:04:37.840540 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:37 crc kubenswrapper[4794]: I0310 10:04:37.895276 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-mcwck"] Mar 10 10:04:38 crc kubenswrapper[4794]: I0310 10:04:38.338339 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" podUID="49c3ebb8-c668-4ca8-ad87-06ab860ac32d" 
containerName="dnsmasq-dns" containerID="cri-o://25d113d97dd3f14c83e19c6c1468b205ad5abd476c1b8e9ac8583bb0445db8a6" gracePeriod=10 Mar 10 10:04:39 crc kubenswrapper[4794]: I0310 10:04:39.347826 4794 generic.go:334] "Generic (PLEG): container finished" podID="49c3ebb8-c668-4ca8-ad87-06ab860ac32d" containerID="25d113d97dd3f14c83e19c6c1468b205ad5abd476c1b8e9ac8583bb0445db8a6" exitCode=0 Mar 10 10:04:39 crc kubenswrapper[4794]: I0310 10:04:39.347910 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" event={"ID":"49c3ebb8-c668-4ca8-ad87-06ab860ac32d","Type":"ContainerDied","Data":"25d113d97dd3f14c83e19c6c1468b205ad5abd476c1b8e9ac8583bb0445db8a6"} Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.077618 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-p9r7k"] Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.078810 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.091899 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.095059 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-p9r7k"] Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.195697 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86dbfc8fbf-5jfbf"] Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.197474 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.202240 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.216581 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86dbfc8fbf-5jfbf"] Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.260737 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f68cd69f-6fe2-4189-ad03-9593a4e94337-config\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.260801 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f68cd69f-6fe2-4189-ad03-9593a4e94337-combined-ca-bundle\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.260840 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztmcd\" (UniqueName: \"kubernetes.io/projected/f68cd69f-6fe2-4189-ad03-9593a4e94337-kube-api-access-ztmcd\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.260874 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f68cd69f-6fe2-4189-ad03-9593a4e94337-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.262034 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f68cd69f-6fe2-4189-ad03-9593a4e94337-ovn-rundir\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.262108 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f68cd69f-6fe2-4189-ad03-9593a4e94337-ovs-rundir\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.298612 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86dbfc8fbf-5jfbf"] Mar 10 10:04:40 crc kubenswrapper[4794]: E0310 10:04:40.299231 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-cfjvw ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" podUID="ddb22c27-27d2-49d5-a23c-3646342b284d" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.322706 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-4qcnz"] Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.324227 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.328010 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.340407 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-4qcnz"] Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.370230 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-dns-svc\") pod \"dnsmasq-dns-86dbfc8fbf-5jfbf\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.370313 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f68cd69f-6fe2-4189-ad03-9593a4e94337-config\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.370365 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-ovsdbserver-sb\") pod \"dnsmasq-dns-86dbfc8fbf-5jfbf\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.371505 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f68cd69f-6fe2-4189-ad03-9593a4e94337-combined-ca-bundle\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.371566 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztmcd\" (UniqueName: \"kubernetes.io/projected/f68cd69f-6fe2-4189-ad03-9593a4e94337-kube-api-access-ztmcd\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.371594 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfjvw\" (UniqueName: \"kubernetes.io/projected/ddb22c27-27d2-49d5-a23c-3646342b284d-kube-api-access-cfjvw\") pod \"dnsmasq-dns-86dbfc8fbf-5jfbf\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.371648 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f68cd69f-6fe2-4189-ad03-9593a4e94337-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.371682 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f68cd69f-6fe2-4189-ad03-9593a4e94337-ovn-rundir\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " 
pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.371708 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-config\") pod \"dnsmasq-dns-86dbfc8fbf-5jfbf\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.371762 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f68cd69f-6fe2-4189-ad03-9593a4e94337-ovs-rundir\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.372102 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f68cd69f-6fe2-4189-ad03-9593a4e94337-ovs-rundir\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.372174 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f68cd69f-6fe2-4189-ad03-9593a4e94337-ovn-rundir\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.379310 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f68cd69f-6fe2-4189-ad03-9593a4e94337-config\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.383573 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.387526 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f68cd69f-6fe2-4189-ad03-9593a4e94337-combined-ca-bundle\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.388036 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f68cd69f-6fe2-4189-ad03-9593a4e94337-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.390698 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztmcd\" (UniqueName: \"kubernetes.io/projected/f68cd69f-6fe2-4189-ad03-9593a4e94337-kube-api-access-ztmcd\") pod \"ovn-controller-metrics-p9r7k\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.409658 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.451379 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.473291 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfjvw\" (UniqueName: \"kubernetes.io/projected/ddb22c27-27d2-49d5-a23c-3646342b284d-kube-api-access-cfjvw\") pod \"dnsmasq-dns-86dbfc8fbf-5jfbf\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.473577 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-ovsdbserver-nb\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.473602 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-dns-svc\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.473629 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-ovsdbserver-sb\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.473651 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-config\") pod \"dnsmasq-dns-86dbfc8fbf-5jfbf\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.473673 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krtv5\" (UniqueName: \"kubernetes.io/projected/4d676f27-d254-46a3-afb6-7ecb637b61be-kube-api-access-krtv5\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.473919 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-dns-svc\") pod \"dnsmasq-dns-86dbfc8fbf-5jfbf\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.473977 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-config\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.474051 4794 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-ovsdbserver-sb\") pod \"dnsmasq-dns-86dbfc8fbf-5jfbf\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.474388 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-config\") pod \"dnsmasq-dns-86dbfc8fbf-5jfbf\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.475094 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-ovsdbserver-sb\") pod \"dnsmasq-dns-86dbfc8fbf-5jfbf\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.475152 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-dns-svc\") pod \"dnsmasq-dns-86dbfc8fbf-5jfbf\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.490144 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfjvw\" (UniqueName: \"kubernetes.io/projected/ddb22c27-27d2-49d5-a23c-3646342b284d-kube-api-access-cfjvw\") pod \"dnsmasq-dns-86dbfc8fbf-5jfbf\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.575032 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-config\") pod \"ddb22c27-27d2-49d5-a23c-3646342b284d\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.575442 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-config\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.575478 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-config" (OuterVolumeSpecName: "config") pod "ddb22c27-27d2-49d5-a23c-3646342b284d" (UID: "ddb22c27-27d2-49d5-a23c-3646342b284d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.575523 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-ovsdbserver-nb\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.575543 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-dns-svc\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.575570 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-ovsdbserver-sb\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.575599 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krtv5\" (UniqueName: \"kubernetes.io/projected/4d676f27-d254-46a3-afb6-7ecb637b61be-kube-api-access-krtv5\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.576216 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.576308 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-config\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.576422 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-ovsdbserver-nb\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.576485 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-dns-svc\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.576840 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-ovsdbserver-sb\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.592218 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krtv5\" (UniqueName: 
\"kubernetes.io/projected/4d676f27-d254-46a3-afb6-7ecb637b61be-kube-api-access-krtv5\") pod \"dnsmasq-dns-659ddb758c-4qcnz\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.680753 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-ovsdbserver-sb\") pod \"ddb22c27-27d2-49d5-a23c-3646342b284d\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.680951 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-dns-svc\") pod \"ddb22c27-27d2-49d5-a23c-3646342b284d\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.680978 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfjvw\" (UniqueName: \"kubernetes.io/projected/ddb22c27-27d2-49d5-a23c-3646342b284d-kube-api-access-cfjvw\") pod \"ddb22c27-27d2-49d5-a23c-3646342b284d\" (UID: \"ddb22c27-27d2-49d5-a23c-3646342b284d\") " Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.681232 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ddb22c27-27d2-49d5-a23c-3646342b284d" (UID: "ddb22c27-27d2-49d5-a23c-3646342b284d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.681613 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ddb22c27-27d2-49d5-a23c-3646342b284d" (UID: "ddb22c27-27d2-49d5-a23c-3646342b284d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.688199 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddb22c27-27d2-49d5-a23c-3646342b284d-kube-api-access-cfjvw" (OuterVolumeSpecName: "kube-api-access-cfjvw") pod "ddb22c27-27d2-49d5-a23c-3646342b284d" (UID: "ddb22c27-27d2-49d5-a23c-3646342b284d"). InnerVolumeSpecName "kube-api-access-cfjvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.749264 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.781685 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.781725 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddb22c27-27d2-49d5-a23c-3646342b284d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:40 crc kubenswrapper[4794]: I0310 10:04:40.781743 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfjvw\" (UniqueName: \"kubernetes.io/projected/ddb22c27-27d2-49d5-a23c-3646342b284d-kube-api-access-cfjvw\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:41 crc kubenswrapper[4794]: I0310 10:04:41.399501 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dbfc8fbf-5jfbf" Mar 10 10:04:41 crc kubenswrapper[4794]: I0310 10:04:41.468982 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86dbfc8fbf-5jfbf"] Mar 10 10:04:41 crc kubenswrapper[4794]: I0310 10:04:41.478280 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86dbfc8fbf-5jfbf"] Mar 10 10:04:41 crc kubenswrapper[4794]: I0310 10:04:41.940471 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.009638 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddb22c27-27d2-49d5-a23c-3646342b284d" path="/var/lib/kubelet/pods/ddb22c27-27d2-49d5-a23c-3646342b284d/volumes" Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.103085 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps8n5\" (UniqueName: \"kubernetes.io/projected/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-kube-api-access-ps8n5\") pod \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\" (UID: \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\") " Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.103221 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-config\") pod \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\" (UID: \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\") " Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.103276 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-dns-svc\") pod \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\" (UID: \"49c3ebb8-c668-4ca8-ad87-06ab860ac32d\") " Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.107883 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-kube-api-access-ps8n5" (OuterVolumeSpecName: "kube-api-access-ps8n5") pod "49c3ebb8-c668-4ca8-ad87-06ab860ac32d" (UID: "49c3ebb8-c668-4ca8-ad87-06ab860ac32d"). InnerVolumeSpecName "kube-api-access-ps8n5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.133201 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-config" (OuterVolumeSpecName: "config") pod "49c3ebb8-c668-4ca8-ad87-06ab860ac32d" (UID: "49c3ebb8-c668-4ca8-ad87-06ab860ac32d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.135555 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "49c3ebb8-c668-4ca8-ad87-06ab860ac32d" (UID: "49c3ebb8-c668-4ca8-ad87-06ab860ac32d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.205218 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps8n5\" (UniqueName: \"kubernetes.io/projected/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-kube-api-access-ps8n5\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.205259 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.205275 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49c3ebb8-c668-4ca8-ad87-06ab860ac32d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.419111 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" event={"ID":"49c3ebb8-c668-4ca8-ad87-06ab860ac32d","Type":"ContainerDied","Data":"599cab58e2a00da157953867d7a25389863887a7818246fb5306a52233fb1af0"} Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.419368 4794 scope.go:117] "RemoveContainer" containerID="25d113d97dd3f14c83e19c6c1468b205ad5abd476c1b8e9ac8583bb0445db8a6" Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.419161 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-mcwck" Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.474180 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-mcwck"] Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.497275 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-mcwck"] Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.590174 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-p9r7k"] Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.613146 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-4qcnz"] Mar 10 10:04:42 crc kubenswrapper[4794]: I0310 10:04:42.895641 4794 scope.go:117] "RemoveContainer" containerID="4ec04b84342ee6d8d5ec0946a38ae45839bda64075a6ef1b336b052126fe59d1" Mar 10 10:04:43 crc kubenswrapper[4794]: I0310 10:04:43.431697 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" event={"ID":"4d676f27-d254-46a3-afb6-7ecb637b61be","Type":"ContainerStarted","Data":"917109063d9a885d2757c65955824d1118f430f8c8ac6572f7b092d1b9a6cb04"} Mar 10 10:04:43 crc kubenswrapper[4794]: I0310 10:04:43.435004 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7f82e49b-0e4d-4cf5-8213-b30edcae94d4","Type":"ContainerStarted","Data":"1a56c890345feacde6604baef18e26d73094a63129df6f970f2058bcdb56403e"} Mar 10 10:04:43 crc kubenswrapper[4794]: I0310 10:04:43.436500 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1fcb4385-7603-4d75-8c41-23f457fcae25","Type":"ContainerStarted","Data":"09e39ed4ef8c86442deb9f63034c2d652a3971bc2b7caf7ccc84b744457e17cf"} Mar 10 10:04:43 crc kubenswrapper[4794]: I0310 10:04:43.437727 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p9r7k" event={"ID":"f68cd69f-6fe2-4189-ad03-9593a4e94337","Type":"ContainerStarted","Data":"1df7f3a49ec37831f1f34354206d89199e20087c4d0704329a60ba21a2b8f285"} Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.009720 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c3ebb8-c668-4ca8-ad87-06ab860ac32d" path="/var/lib/kubelet/pods/49c3ebb8-c668-4ca8-ad87-06ab860ac32d/volumes" Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.448449 4794 generic.go:334] "Generic (PLEG): container finished" podID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerID="f77929cd0c4936dedf4ae780db89375e59fd10d1eb9a6d9a9595f0dfbea94734" exitCode=0 Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.448491 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8552m" event={"ID":"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1","Type":"ContainerDied","Data":"f77929cd0c4936dedf4ae780db89375e59fd10d1eb9a6d9a9595f0dfbea94734"} Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.452592 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b","Type":"ContainerStarted","Data":"00869dd26164d229d447f55eb54f29605e2bcbe41e9378c894e892c957a95916"} Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.455274 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3","Type":"ContainerStarted","Data":"a840482fd73ba7f63de99b82bc1c4a4c3093d855770bd6ce8ca9c72f090ea3e7"} Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.457636 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvs8j" event={"ID":"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd","Type":"ContainerStarted","Data":"9ee666a84a51e6fb61d9cd9eaf3a5940b75beb1c27f7f0412bcb937917781eb8"} Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.458125 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-fvs8j" Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.487188 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"598e06ed-3156-4e09-976e-4dda0e35afc2","Type":"ContainerStarted","Data":"d12835c0bc7b99c0b1d3a6ef8ae9785eba0f3a435f78fe3cb1b5f9e6c4bd6dea"} Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.496176 4794 generic.go:334] "Generic (PLEG): container finished" podID="4d676f27-d254-46a3-afb6-7ecb637b61be" containerID="1a4a358cfdd7b0bd2ed2421fcb052432a6e3fcbe1ac515c3592ed116f809c2ce" exitCode=0 Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.496262 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" event={"ID":"4d676f27-d254-46a3-afb6-7ecb637b61be","Type":"ContainerDied","Data":"1a4a358cfdd7b0bd2ed2421fcb052432a6e3fcbe1ac515c3592ed116f809c2ce"} Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.511903 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9575e254-d696-4a8a-b84f-c8f36d746ff8","Type":"ContainerStarted","Data":"568dc6ea988e0568a2d3d813b291ac89a4bc785eb376be0c5d4fc65492cc33f4"} Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.513738 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"57b6ddc9-2a57-41f1-a6c8-d0be37b88252","Type":"ContainerStarted","Data":"b7b6d56bae7bead8aea36dc9b247f1241a2b6e20d19a3ee1d788129133d85a48"} Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.514496 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.523581 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"71ee8a8d-89a0-495f-925b-071e52449063","Type":"ContainerStarted","Data":"2cb4ade39b3ccc7065ae84be53a791f98290d564a27b4334ca648c8b39c8ca95"} Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.523626 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.604775 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-fvs8j" podStartSLOduration=8.44727968 podStartE2EDuration="18.604757224s" podCreationTimestamp="2026-03-10 10:04:26 +0000 UTC" firstStartedPulling="2026-03-10 10:04:32.096012389 +0000 UTC m=+1220.852183197" lastFinishedPulling="2026-03-10 10:04:42.253489913 +0000 UTC m=+1231.009660741" observedRunningTime="2026-03-10 10:04:44.577582475 +0000 UTC m=+1233.333753283" watchObservedRunningTime="2026-03-10 10:04:44.604757224 +0000 UTC m=+1233.360928042" Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.651699 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" 
podStartSLOduration=14.110826687 podStartE2EDuration="23.651674618s" podCreationTimestamp="2026-03-10 10:04:21 +0000 UTC" firstStartedPulling="2026-03-10 10:04:32.075751193 +0000 UTC m=+1220.831922011" lastFinishedPulling="2026-03-10 10:04:41.616599124 +0000 UTC m=+1230.372769942" observedRunningTime="2026-03-10 10:04:44.645590545 +0000 UTC m=+1233.401761353" watchObservedRunningTime="2026-03-10 10:04:44.651674618 +0000 UTC m=+1233.407845436" Mar 10 10:04:44 crc kubenswrapper[4794]: I0310 10:04:44.674459 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.710021739 podStartE2EDuration="21.674441688s" podCreationTimestamp="2026-03-10 10:04:23 +0000 UTC" firstStartedPulling="2026-03-10 10:04:32.054104274 +0000 UTC m=+1220.810275092" lastFinishedPulling="2026-03-10 10:04:43.018524223 +0000 UTC m=+1231.774695041" observedRunningTime="2026-03-10 10:04:44.668436328 +0000 UTC m=+1233.424607166" watchObservedRunningTime="2026-03-10 10:04:44.674441688 +0000 UTC m=+1233.430612506" Mar 10 10:04:45 crc kubenswrapper[4794]: I0310 10:04:45.533825 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8552m" event={"ID":"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1","Type":"ContainerStarted","Data":"760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb"} Mar 10 10:04:45 crc kubenswrapper[4794]: I0310 10:04:45.538026 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" event={"ID":"4d676f27-d254-46a3-afb6-7ecb637b61be","Type":"ContainerStarted","Data":"9d26bd5b81721ffcbfeba1b8441c9226b29c9ef39b18c49b5750cccfa3e56b31"} Mar 10 10:04:45 crc kubenswrapper[4794]: I0310 10:04:45.557949 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" podStartSLOduration=5.557928166 podStartE2EDuration="5.557928166s" podCreationTimestamp="2026-03-10 10:04:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:04:45.555431927 +0000 UTC m=+1234.311602755" watchObservedRunningTime="2026-03-10 10:04:45.557928166 +0000 UTC m=+1234.314098984" Mar 10 10:04:45 crc kubenswrapper[4794]: I0310 10:04:45.750589 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:47 crc kubenswrapper[4794]: I0310 10:04:47.555977 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1fcb4385-7603-4d75-8c41-23f457fcae25","Type":"ContainerStarted","Data":"7e4285317b0405a3de83fe6a6261bf54c91dad51c10abc1e8c435712c795e03f"} Mar 10 10:04:47 crc kubenswrapper[4794]: I0310 10:04:47.558610 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9575e254-d696-4a8a-b84f-c8f36d746ff8","Type":"ContainerStarted","Data":"7c722e31acce807ad3627a24a3d71255a5f9253c9aaa80d754038f1c69a3dfc4"} Mar 10 10:04:47 crc kubenswrapper[4794]: I0310 10:04:47.560754 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p9r7k" event={"ID":"f68cd69f-6fe2-4189-ad03-9593a4e94337","Type":"ContainerStarted","Data":"f4db0913dfb22b8fdf8a9875e0693880b022c639ea96ea1251770012a7e71a8f"} Mar 10 10:04:47 crc kubenswrapper[4794]: I0310 10:04:47.564729 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8552m" 
event={"ID":"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1","Type":"ContainerStarted","Data":"b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e"} Mar 10 10:04:47 crc kubenswrapper[4794]: I0310 10:04:47.565514 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:47 crc kubenswrapper[4794]: I0310 10:04:47.565552 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:04:47 crc kubenswrapper[4794]: I0310 10:04:47.567471 4794 generic.go:334] "Generic (PLEG): container finished" podID="2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" containerID="00869dd26164d229d447f55eb54f29605e2bcbe41e9378c894e892c957a95916" exitCode=0 Mar 10 10:04:47 crc kubenswrapper[4794]: I0310 10:04:47.567531 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b","Type":"ContainerDied","Data":"00869dd26164d229d447f55eb54f29605e2bcbe41e9378c894e892c957a95916"} Mar 10 10:04:47 crc kubenswrapper[4794]: I0310 10:04:47.570605 4794 generic.go:334] "Generic (PLEG): container finished" podID="7f82e49b-0e4d-4cf5-8213-b30edcae94d4" containerID="1a56c890345feacde6604baef18e26d73094a63129df6f970f2058bcdb56403e" exitCode=0 Mar 10 10:04:47 crc kubenswrapper[4794]: I0310 10:04:47.571351 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7f82e49b-0e4d-4cf5-8213-b30edcae94d4","Type":"ContainerDied","Data":"1a56c890345feacde6604baef18e26d73094a63129df6f970f2058bcdb56403e"} Mar 10 10:04:47 crc kubenswrapper[4794]: I0310 10:04:47.592477 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.275487362 podStartE2EDuration="18.592458286s" podCreationTimestamp="2026-03-10 10:04:29 +0000 UTC" firstStartedPulling="2026-03-10 10:04:32.204200294 +0000 UTC m=+1220.960371112" lastFinishedPulling="2026-03-10 10:04:46.521171218 +0000 UTC m=+1235.277342036" observedRunningTime="2026-03-10 10:04:47.584456463 +0000 UTC m=+1236.340627301" watchObservedRunningTime="2026-03-10 10:04:47.592458286 +0000 UTC m=+1236.348629104" Mar 10 10:04:47 crc kubenswrapper[4794]: I0310 10:04:47.630969 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-p9r7k" podStartSLOduration=4.022929694 podStartE2EDuration="7.630947603s" podCreationTimestamp="2026-03-10 10:04:40 +0000 UTC" firstStartedPulling="2026-03-10 10:04:42.895908334 +0000 UTC m=+1231.652079162" lastFinishedPulling="2026-03-10 10:04:46.503926253 +0000 UTC m=+1235.260097071" observedRunningTime="2026-03-10 10:04:47.613235163 +0000 UTC m=+1236.369405991" watchObservedRunningTime="2026-03-10 10:04:47.630947603 +0000 UTC m=+1236.387118431" Mar 10 10:04:47 crc kubenswrapper[4794]: I0310 10:04:47.723639 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-8552m" podStartSLOduration=12.02619937 podStartE2EDuration="21.723619884s" podCreationTimestamp="2026-03-10 10:04:26 +0000 UTC" firstStartedPulling="2026-03-10 10:04:32.433936395 +0000 UTC m=+1221.190107213" lastFinishedPulling="2026-03-10 10:04:42.131356909 +0000 UTC m=+1230.887527727" observedRunningTime="2026-03-10 10:04:47.718047018 +0000 UTC m=+1236.474217846" watchObservedRunningTime="2026-03-10 10:04:47.723619884 +0000 UTC m=+1236.479790702" Mar 10 10:04:47 crc kubenswrapper[4794]: I0310 10:04:47.748750 4794 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.524331474 podStartE2EDuration="20.748730718s" podCreationTimestamp="2026-03-10 10:04:27 +0000 UTC" firstStartedPulling="2026-03-10 10:04:37.282842573 +0000 UTC m=+1226.039013431" lastFinishedPulling="2026-03-10 10:04:46.507241857 +0000 UTC m=+1235.263412675" observedRunningTime="2026-03-10 10:04:47.737540914 +0000 UTC m=+1236.493711732" watchObservedRunningTime="2026-03-10 10:04:47.748730718 +0000 UTC m=+1236.504901536" Mar 10 10:04:48 crc kubenswrapper[4794]: I0310 10:04:48.481110 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:48 crc kubenswrapper[4794]: I0310 10:04:48.581893 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7f82e49b-0e4d-4cf5-8213-b30edcae94d4","Type":"ContainerStarted","Data":"c4c1209870650aefcffd90e94778414a3fb571c20ef3e08baeadd14915b7e8d6"} Mar 10 10:04:48 crc kubenswrapper[4794]: I0310 10:04:48.586060 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b","Type":"ContainerStarted","Data":"4ee46b9ad300bd7296d35f0791b2f32932975b8845ba6b96c9a8eff329eb83f5"} Mar 10 10:04:48 crc kubenswrapper[4794]: I0310 10:04:48.611800 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.836679708 podStartE2EDuration="28.611778231s" podCreationTimestamp="2026-03-10 10:04:20 +0000 UTC" firstStartedPulling="2026-03-10 10:04:31.96316568 +0000 UTC m=+1220.719336498" lastFinishedPulling="2026-03-10 10:04:41.738264203 +0000 UTC m=+1230.494435021" observedRunningTime="2026-03-10 10:04:48.606132072 +0000 UTC m=+1237.362302900" watchObservedRunningTime="2026-03-10 10:04:48.611778231 +0000 UTC m=+1237.367949059" Mar 10 10:04:48 crc kubenswrapper[4794]: I0310 10:04:48.631748 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.397377894999998 podStartE2EDuration="30.631722231s" podCreationTimestamp="2026-03-10 10:04:18 +0000 UTC" firstStartedPulling="2026-03-10 10:04:31.801276669 +0000 UTC m=+1220.557447497" lastFinishedPulling="2026-03-10 10:04:42.035621015 +0000 UTC m=+1230.791791833" observedRunningTime="2026-03-10 10:04:48.630858404 +0000 UTC m=+1237.387029232" watchObservedRunningTime="2026-03-10 10:04:48.631722231 +0000 UTC m=+1237.387893059" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.264626 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.308255 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.481699 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.525503 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.594965 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.630913 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ovsdbserver-sb-0" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.631500 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.890392 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 10 10:04:49 crc kubenswrapper[4794]: E0310 10:04:49.890770 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c3ebb8-c668-4ca8-ad87-06ab860ac32d" containerName="init" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.890790 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c3ebb8-c668-4ca8-ad87-06ab860ac32d" containerName="init" Mar 10 10:04:49 crc kubenswrapper[4794]: E0310 10:04:49.890820 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c3ebb8-c668-4ca8-ad87-06ab860ac32d" containerName="dnsmasq-dns" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.890829 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c3ebb8-c668-4ca8-ad87-06ab860ac32d" containerName="dnsmasq-dns" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.891028 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="49c3ebb8-c668-4ca8-ad87-06ab860ac32d" containerName="dnsmasq-dns" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.891989 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.894720 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.894745 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.895044 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-z6hk2" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.911640 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.915140 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.946485 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a1da9f1-f33d-4327-b899-b5a38c6990d8-scripts\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.946548 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.946777 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.946808 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqg4h\" (UniqueName: \"kubernetes.io/projected/3a1da9f1-f33d-4327-b899-b5a38c6990d8-kube-api-access-hqg4h\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.946869 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a1da9f1-f33d-4327-b899-b5a38c6990d8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.946941 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:49 crc kubenswrapper[4794]: I0310 10:04:49.946986 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a1da9f1-f33d-4327-b899-b5a38c6990d8-config\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.019303 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.019366 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.048238 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.048281 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqg4h\" (UniqueName: \"kubernetes.io/projected/3a1da9f1-f33d-4327-b899-b5a38c6990d8-kube-api-access-hqg4h\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.048365 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a1da9f1-f33d-4327-b899-b5a38c6990d8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.048426 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.048460 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a1da9f1-f33d-4327-b899-b5a38c6990d8-config\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 
10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.048501 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a1da9f1-f33d-4327-b899-b5a38c6990d8-scripts\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.048540 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.050617 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a1da9f1-f33d-4327-b899-b5a38c6990d8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.051325 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a1da9f1-f33d-4327-b899-b5a38c6990d8-scripts\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.054029 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a1da9f1-f33d-4327-b899-b5a38c6990d8-config\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.054219 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.054257 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.055111 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.065693 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqg4h\" (UniqueName: \"kubernetes.io/projected/3a1da9f1-f33d-4327-b899-b5a38c6990d8-kube-api-access-hqg4h\") pod \"ovn-northd-0\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") " pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.215778 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.689043 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 10:04:50 crc kubenswrapper[4794]: W0310 10:04:50.695412 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a1da9f1_f33d_4327_b899_b5a38c6990d8.slice/crio-2855314e11c632caa2a2618519660341ead19e4cd33121471f0e27f25c3b5915 WatchSource:0}: Error finding container 2855314e11c632caa2a2618519660341ead19e4cd33121471f0e27f25c3b5915: Status 404 returned error can't find the container with id 2855314e11c632caa2a2618519660341ead19e4cd33121471f0e27f25c3b5915 Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.751498 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.803404 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-24sxt"] Mar 10 10:04:50 crc kubenswrapper[4794]: I0310 10:04:50.803737 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" podUID="5666224e-6fa7-45b5-bdf5-12d699e536ba" containerName="dnsmasq-dns" containerID="cri-o://f929dfa0b410a2845a0438d6ee1fce4cbf21d85fbbf58a18716d08b6b11bcb83" gracePeriod=10 Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.293871 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.474688 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5666224e-6fa7-45b5-bdf5-12d699e536ba-config\") pod \"5666224e-6fa7-45b5-bdf5-12d699e536ba\" (UID: \"5666224e-6fa7-45b5-bdf5-12d699e536ba\") " Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.475052 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5666224e-6fa7-45b5-bdf5-12d699e536ba-dns-svc\") pod \"5666224e-6fa7-45b5-bdf5-12d699e536ba\" (UID: \"5666224e-6fa7-45b5-bdf5-12d699e536ba\") " Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.475093 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x2nw\" (UniqueName: \"kubernetes.io/projected/5666224e-6fa7-45b5-bdf5-12d699e536ba-kube-api-access-9x2nw\") pod \"5666224e-6fa7-45b5-bdf5-12d699e536ba\" (UID: \"5666224e-6fa7-45b5-bdf5-12d699e536ba\") " Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.483012 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5666224e-6fa7-45b5-bdf5-12d699e536ba-kube-api-access-9x2nw" (OuterVolumeSpecName: "kube-api-access-9x2nw") pod "5666224e-6fa7-45b5-bdf5-12d699e536ba" (UID: "5666224e-6fa7-45b5-bdf5-12d699e536ba"). InnerVolumeSpecName "kube-api-access-9x2nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.518361 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5666224e-6fa7-45b5-bdf5-12d699e536ba-config" (OuterVolumeSpecName: "config") pod "5666224e-6fa7-45b5-bdf5-12d699e536ba" (UID: "5666224e-6fa7-45b5-bdf5-12d699e536ba"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.525867 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5666224e-6fa7-45b5-bdf5-12d699e536ba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5666224e-6fa7-45b5-bdf5-12d699e536ba" (UID: "5666224e-6fa7-45b5-bdf5-12d699e536ba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.576971 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5666224e-6fa7-45b5-bdf5-12d699e536ba-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.577006 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5666224e-6fa7-45b5-bdf5-12d699e536ba-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.577017 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x2nw\" (UniqueName: \"kubernetes.io/projected/5666224e-6fa7-45b5-bdf5-12d699e536ba-kube-api-access-9x2nw\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.612582 4794 generic.go:334] "Generic (PLEG): container finished" podID="5666224e-6fa7-45b5-bdf5-12d699e536ba" containerID="f929dfa0b410a2845a0438d6ee1fce4cbf21d85fbbf58a18716d08b6b11bcb83" exitCode=0 Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.612640 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.612637 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" event={"ID":"5666224e-6fa7-45b5-bdf5-12d699e536ba","Type":"ContainerDied","Data":"f929dfa0b410a2845a0438d6ee1fce4cbf21d85fbbf58a18716d08b6b11bcb83"} Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.612684 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-24sxt" event={"ID":"5666224e-6fa7-45b5-bdf5-12d699e536ba","Type":"ContainerDied","Data":"1cbe9b65b1c0ca10424133c7082fc752aadc10d6c858f8aa4a622667bc4fc493"} Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.612702 4794 scope.go:117] "RemoveContainer" containerID="f929dfa0b410a2845a0438d6ee1fce4cbf21d85fbbf58a18716d08b6b11bcb83" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.614440 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3a1da9f1-f33d-4327-b899-b5a38c6990d8","Type":"ContainerStarted","Data":"2855314e11c632caa2a2618519660341ead19e4cd33121471f0e27f25c3b5915"} Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.623892 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.623926 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.646605 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-24sxt"] Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.651932 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-24sxt"] Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 
10:04:51.660162 4794 scope.go:117] "RemoveContainer" containerID="f2bf2635b5ddf00ea3dd401977963877c3ace0f44b2659adff6ed91cf277948e" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.736578 4794 scope.go:117] "RemoveContainer" containerID="f929dfa0b410a2845a0438d6ee1fce4cbf21d85fbbf58a18716d08b6b11bcb83" Mar 10 10:04:51 crc kubenswrapper[4794]: E0310 10:04:51.737279 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f929dfa0b410a2845a0438d6ee1fce4cbf21d85fbbf58a18716d08b6b11bcb83\": container with ID starting with f929dfa0b410a2845a0438d6ee1fce4cbf21d85fbbf58a18716d08b6b11bcb83 not found: ID does not exist" containerID="f929dfa0b410a2845a0438d6ee1fce4cbf21d85fbbf58a18716d08b6b11bcb83" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.737313 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f929dfa0b410a2845a0438d6ee1fce4cbf21d85fbbf58a18716d08b6b11bcb83"} err="failed to get container status \"f929dfa0b410a2845a0438d6ee1fce4cbf21d85fbbf58a18716d08b6b11bcb83\": rpc error: code = NotFound desc = could not find container \"f929dfa0b410a2845a0438d6ee1fce4cbf21d85fbbf58a18716d08b6b11bcb83\": container with ID starting with f929dfa0b410a2845a0438d6ee1fce4cbf21d85fbbf58a18716d08b6b11bcb83 not found: ID does not exist" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.737362 4794 scope.go:117] "RemoveContainer" containerID="f2bf2635b5ddf00ea3dd401977963877c3ace0f44b2659adff6ed91cf277948e" Mar 10 10:04:51 crc kubenswrapper[4794]: E0310 10:04:51.737794 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2bf2635b5ddf00ea3dd401977963877c3ace0f44b2659adff6ed91cf277948e\": container with ID starting with f2bf2635b5ddf00ea3dd401977963877c3ace0f44b2659adff6ed91cf277948e not found: ID does not exist" containerID="f2bf2635b5ddf00ea3dd401977963877c3ace0f44b2659adff6ed91cf277948e" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.737837 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2bf2635b5ddf00ea3dd401977963877c3ace0f44b2659adff6ed91cf277948e"} err="failed to get container status \"f2bf2635b5ddf00ea3dd401977963877c3ace0f44b2659adff6ed91cf277948e\": rpc error: code = NotFound desc = could not find container \"f2bf2635b5ddf00ea3dd401977963877c3ace0f44b2659adff6ed91cf277948e\": container with ID starting with f2bf2635b5ddf00ea3dd401977963877c3ace0f44b2659adff6ed91cf277948e not found: ID does not exist" Mar 10 10:04:51 crc kubenswrapper[4794]: I0310 10:04:51.914543 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 10 10:04:52 crc kubenswrapper[4794]: I0310 10:04:52.007949 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5666224e-6fa7-45b5-bdf5-12d699e536ba" path="/var/lib/kubelet/pods/5666224e-6fa7-45b5-bdf5-12d699e536ba/volumes" Mar 10 10:04:52 crc kubenswrapper[4794]: I0310 10:04:52.622803 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3a1da9f1-f33d-4327-b899-b5a38c6990d8","Type":"ContainerStarted","Data":"804d445e998cb87856a9e74959cb49d74352b3c669f7c0cfc5587e7a39d5a062"} Mar 10 10:04:52 crc kubenswrapper[4794]: I0310 10:04:52.623518 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 10 10:04:52 crc kubenswrapper[4794]: I0310 10:04:52.623557 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3a1da9f1-f33d-4327-b899-b5a38c6990d8","Type":"ContainerStarted","Data":"a27a1a3aea44b5edb2651ff49b0fcc031f55a8ebaeb3807a56fb618e61e23ded"} Mar 10 10:04:52 crc kubenswrapper[4794]: I0310 10:04:52.653745 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.616566642 podStartE2EDuration="3.653717721s" podCreationTimestamp="2026-03-10 10:04:49 +0000 UTC" firstStartedPulling="2026-03-10 10:04:50.70103439 +0000 UTC m=+1239.457205208" lastFinishedPulling="2026-03-10 10:04:51.738185469 +0000 UTC m=+1240.494356287" observedRunningTime="2026-03-10 10:04:52.643986913 +0000 UTC m=+1241.400157731" watchObservedRunningTime="2026-03-10 10:04:52.653717721 +0000 UTC m=+1241.409888569" Mar 10 10:04:52 crc kubenswrapper[4794]: I0310 10:04:52.656363 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 10 10:04:52 crc kubenswrapper[4794]: I0310 10:04:52.729819 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.006237 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mbt9m"] Mar 10 10:04:53 crc kubenswrapper[4794]: E0310 10:04:53.007729 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5666224e-6fa7-45b5-bdf5-12d699e536ba" containerName="dnsmasq-dns" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.007850 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5666224e-6fa7-45b5-bdf5-12d699e536ba" containerName="dnsmasq-dns" Mar 10 10:04:53 crc kubenswrapper[4794]: E0310 10:04:53.007954 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5666224e-6fa7-45b5-bdf5-12d699e536ba" containerName="init" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.008043 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5666224e-6fa7-45b5-bdf5-12d699e536ba" containerName="init" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.008316 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5666224e-6fa7-45b5-bdf5-12d699e536ba" containerName="dnsmasq-dns" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.009044 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mbt9m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.013261 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d7d5-account-create-update-2vwdw"] Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.014248 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d7d5-account-create-update-2vwdw" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.015845 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.021967 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mbt9m"] Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.029306 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d7d5-account-create-update-2vwdw"] Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.099603 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34215f04-95c7-4644-8ecc-70147aa8c100-operator-scripts\") pod \"keystone-db-create-mbt9m\" (UID: \"34215f04-95c7-4644-8ecc-70147aa8c100\") " pod="openstack/keystone-db-create-mbt9m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.099713 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a812a094-c8c4-48cf-82c4-97fc75f2774f-operator-scripts\") pod \"keystone-d7d5-account-create-update-2vwdw\" (UID: \"a812a094-c8c4-48cf-82c4-97fc75f2774f\") " pod="openstack/keystone-d7d5-account-create-update-2vwdw" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.099838 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k96z\" (UniqueName: \"kubernetes.io/projected/a812a094-c8c4-48cf-82c4-97fc75f2774f-kube-api-access-7k96z\") pod \"keystone-d7d5-account-create-update-2vwdw\" (UID: \"a812a094-c8c4-48cf-82c4-97fc75f2774f\") " pod="openstack/keystone-d7d5-account-create-update-2vwdw" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.099991 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j62l9\" (UniqueName: \"kubernetes.io/projected/34215f04-95c7-4644-8ecc-70147aa8c100-kube-api-access-j62l9\") pod \"keystone-db-create-mbt9m\" (UID: \"34215f04-95c7-4644-8ecc-70147aa8c100\") " pod="openstack/keystone-db-create-mbt9m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.115388 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-dcx5m"] Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.116563 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dcx5m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.123672 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dcx5m"] Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.201115 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k96z\" (UniqueName: \"kubernetes.io/projected/a812a094-c8c4-48cf-82c4-97fc75f2774f-kube-api-access-7k96z\") pod \"keystone-d7d5-account-create-update-2vwdw\" (UID: \"a812a094-c8c4-48cf-82c4-97fc75f2774f\") " pod="openstack/keystone-d7d5-account-create-update-2vwdw" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.201221 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j62l9\" (UniqueName: \"kubernetes.io/projected/34215f04-95c7-4644-8ecc-70147aa8c100-kube-api-access-j62l9\") pod \"keystone-db-create-mbt9m\" (UID: \"34215f04-95c7-4644-8ecc-70147aa8c100\") " pod="openstack/keystone-db-create-mbt9m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.201294 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34215f04-95c7-4644-8ecc-70147aa8c100-operator-scripts\") pod \"keystone-db-create-mbt9m\" (UID: \"34215f04-95c7-4644-8ecc-70147aa8c100\") " pod="openstack/keystone-db-create-mbt9m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.201347 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a812a094-c8c4-48cf-82c4-97fc75f2774f-operator-scripts\") pod \"keystone-d7d5-account-create-update-2vwdw\" (UID: \"a812a094-c8c4-48cf-82c4-97fc75f2774f\") " pod="openstack/keystone-d7d5-account-create-update-2vwdw" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.201385 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6x2f\" (UniqueName: \"kubernetes.io/projected/71cc55d3-bfda-4469-ad6c-6c5c0357360a-kube-api-access-f6x2f\") pod \"placement-db-create-dcx5m\" (UID: \"71cc55d3-bfda-4469-ad6c-6c5c0357360a\") " pod="openstack/placement-db-create-dcx5m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.201435 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71cc55d3-bfda-4469-ad6c-6c5c0357360a-operator-scripts\") pod \"placement-db-create-dcx5m\" (UID: \"71cc55d3-bfda-4469-ad6c-6c5c0357360a\") " pod="openstack/placement-db-create-dcx5m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.202271 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a812a094-c8c4-48cf-82c4-97fc75f2774f-operator-scripts\") pod \"keystone-d7d5-account-create-update-2vwdw\" (UID: \"a812a094-c8c4-48cf-82c4-97fc75f2774f\") " pod="openstack/keystone-d7d5-account-create-update-2vwdw" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.202303 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34215f04-95c7-4644-8ecc-70147aa8c100-operator-scripts\") pod \"keystone-db-create-mbt9m\" (UID: \"34215f04-95c7-4644-8ecc-70147aa8c100\") " pod="openstack/keystone-db-create-mbt9m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.206307 4794 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/placement-a23e-account-create-update-4mdw7"] Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.207351 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a23e-account-create-update-4mdw7" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.209586 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.214145 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a23e-account-create-update-4mdw7"] Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.224792 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j62l9\" (UniqueName: \"kubernetes.io/projected/34215f04-95c7-4644-8ecc-70147aa8c100-kube-api-access-j62l9\") pod \"keystone-db-create-mbt9m\" (UID: \"34215f04-95c7-4644-8ecc-70147aa8c100\") " pod="openstack/keystone-db-create-mbt9m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.237299 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k96z\" (UniqueName: \"kubernetes.io/projected/a812a094-c8c4-48cf-82c4-97fc75f2774f-kube-api-access-7k96z\") pod \"keystone-d7d5-account-create-update-2vwdw\" (UID: \"a812a094-c8c4-48cf-82c4-97fc75f2774f\") " pod="openstack/keystone-d7d5-account-create-update-2vwdw" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.303219 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71cc55d3-bfda-4469-ad6c-6c5c0357360a-operator-scripts\") pod \"placement-db-create-dcx5m\" (UID: \"71cc55d3-bfda-4469-ad6c-6c5c0357360a\") " pod="openstack/placement-db-create-dcx5m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.303280 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbbrq\" (UniqueName: \"kubernetes.io/projected/d1e8a45c-ff29-42fa-9999-5ba419470fd5-kube-api-access-sbbrq\") pod \"placement-a23e-account-create-update-4mdw7\" (UID: \"d1e8a45c-ff29-42fa-9999-5ba419470fd5\") " pod="openstack/placement-a23e-account-create-update-4mdw7" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.303453 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e8a45c-ff29-42fa-9999-5ba419470fd5-operator-scripts\") pod \"placement-a23e-account-create-update-4mdw7\" (UID: \"d1e8a45c-ff29-42fa-9999-5ba419470fd5\") " pod="openstack/placement-a23e-account-create-update-4mdw7" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.303535 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6x2f\" (UniqueName: \"kubernetes.io/projected/71cc55d3-bfda-4469-ad6c-6c5c0357360a-kube-api-access-f6x2f\") pod \"placement-db-create-dcx5m\" (UID: \"71cc55d3-bfda-4469-ad6c-6c5c0357360a\") " pod="openstack/placement-db-create-dcx5m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.304352 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71cc55d3-bfda-4469-ad6c-6c5c0357360a-operator-scripts\") pod \"placement-db-create-dcx5m\" (UID: \"71cc55d3-bfda-4469-ad6c-6c5c0357360a\") " pod="openstack/placement-db-create-dcx5m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.318838 
4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6x2f\" (UniqueName: \"kubernetes.io/projected/71cc55d3-bfda-4469-ad6c-6c5c0357360a-kube-api-access-f6x2f\") pod \"placement-db-create-dcx5m\" (UID: \"71cc55d3-bfda-4469-ad6c-6c5c0357360a\") " pod="openstack/placement-db-create-dcx5m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.324781 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mbt9m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.344420 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d7d5-account-create-update-2vwdw" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.407095 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e8a45c-ff29-42fa-9999-5ba419470fd5-operator-scripts\") pod \"placement-a23e-account-create-update-4mdw7\" (UID: \"d1e8a45c-ff29-42fa-9999-5ba419470fd5\") " pod="openstack/placement-a23e-account-create-update-4mdw7" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.407391 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbbrq\" (UniqueName: \"kubernetes.io/projected/d1e8a45c-ff29-42fa-9999-5ba419470fd5-kube-api-access-sbbrq\") pod \"placement-a23e-account-create-update-4mdw7\" (UID: \"d1e8a45c-ff29-42fa-9999-5ba419470fd5\") " pod="openstack/placement-a23e-account-create-update-4mdw7" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.412678 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e8a45c-ff29-42fa-9999-5ba419470fd5-operator-scripts\") pod \"placement-a23e-account-create-update-4mdw7\" (UID: \"d1e8a45c-ff29-42fa-9999-5ba419470fd5\") " pod="openstack/placement-a23e-account-create-update-4mdw7" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.430937 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dcx5m" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.434789 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbbrq\" (UniqueName: \"kubernetes.io/projected/d1e8a45c-ff29-42fa-9999-5ba419470fd5-kube-api-access-sbbrq\") pod \"placement-a23e-account-create-update-4mdw7\" (UID: \"d1e8a45c-ff29-42fa-9999-5ba419470fd5\") " pod="openstack/placement-a23e-account-create-update-4mdw7" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.535808 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a23e-account-create-update-4mdw7" Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.775871 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mbt9m"] Mar 10 10:04:53 crc kubenswrapper[4794]: I0310 10:04:53.916136 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d7d5-account-create-update-2vwdw"] Mar 10 10:04:53 crc kubenswrapper[4794]: W0310 10:04:53.936535 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda812a094_c8c4_48cf_82c4_97fc75f2774f.slice/crio-2b32958d0b64f99223e6bf006d92e2ba8ca402f67f28d185a618bdbf717a94e8 WatchSource:0}: Error finding container 2b32958d0b64f99223e6bf006d92e2ba8ca402f67f28d185a618bdbf717a94e8: Status 404 returned error can't find the container with id 2b32958d0b64f99223e6bf006d92e2ba8ca402f67f28d185a618bdbf717a94e8 Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.065273 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dcx5m"] Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.095653 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.104995 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58df884995-wr8ms"] Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.106212 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.159435 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58df884995-wr8ms"] Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.257269 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-config\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.257351 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-ovsdbserver-sb\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.257405 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-dns-svc\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.257426 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-ovsdbserver-nb\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.257471 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-spbpj\" (UniqueName: \"kubernetes.io/projected/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-kube-api-access-spbpj\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.292054 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a23e-account-create-update-4mdw7"] Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.359469 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-ovsdbserver-sb\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.359610 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-dns-svc\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.359669 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-ovsdbserver-nb\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.359726 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spbpj\" (UniqueName: \"kubernetes.io/projected/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-kube-api-access-spbpj\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.359858 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-config\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.361379 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-dns-svc\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.361429 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-config\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.363214 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-ovsdbserver-nb\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 
10:04:54.367619 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-ovsdbserver-sb\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.385723 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spbpj\" (UniqueName: \"kubernetes.io/projected/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-kube-api-access-spbpj\") pod \"dnsmasq-dns-58df884995-wr8ms\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.444110 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.647789 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a23e-account-create-update-4mdw7" event={"ID":"d1e8a45c-ff29-42fa-9999-5ba419470fd5","Type":"ContainerStarted","Data":"8072401a35535b1f6ddb0b3f54e261bef2fc583f3b26f982227dd7c9cad3c0ff"} Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.649799 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dcx5m" event={"ID":"71cc55d3-bfda-4469-ad6c-6c5c0357360a","Type":"ContainerStarted","Data":"8d0146b1fe6aebb86ceb3b4c055564825a0bd4b041da3364fefe0c8d956a6d6c"} Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.649848 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dcx5m" event={"ID":"71cc55d3-bfda-4469-ad6c-6c5c0357360a","Type":"ContainerStarted","Data":"5f85c793e0de75d0c82bb95e2d7b3c0985b24209abdb643a7dd5764380c22002"} Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.660577 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d7d5-account-create-update-2vwdw" event={"ID":"a812a094-c8c4-48cf-82c4-97fc75f2774f","Type":"ContainerStarted","Data":"4801069e3ec4ee45b0400bd772935ecc7c61384c2fafb5723da3323b36de9214"} Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.660633 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d7d5-account-create-update-2vwdw" event={"ID":"a812a094-c8c4-48cf-82c4-97fc75f2774f","Type":"ContainerStarted","Data":"2b32958d0b64f99223e6bf006d92e2ba8ca402f67f28d185a618bdbf717a94e8"} Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.672376 4794 generic.go:334] "Generic (PLEG): container finished" podID="34215f04-95c7-4644-8ecc-70147aa8c100" containerID="e91b423fcccb339cdeb18818c51cf8ecd72433c4ea29ae829ab212dc3d885f87" exitCode=0 Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.672431 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mbt9m" event={"ID":"34215f04-95c7-4644-8ecc-70147aa8c100","Type":"ContainerDied","Data":"e91b423fcccb339cdeb18818c51cf8ecd72433c4ea29ae829ab212dc3d885f87"} Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.672460 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mbt9m" event={"ID":"34215f04-95c7-4644-8ecc-70147aa8c100","Type":"ContainerStarted","Data":"fcf375297d86aef077102a23dbbb3a301f4a3f2117f6c9ee1d36204807aedce3"} Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.677527 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-db-create-dcx5m" podStartSLOduration=1.67750991 podStartE2EDuration="1.67750991s" podCreationTimestamp="2026-03-10 10:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:04:54.668898188 +0000 UTC m=+1243.425069006" watchObservedRunningTime="2026-03-10 10:04:54.67750991 +0000 UTC m=+1243.433680728" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.685251 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-d7d5-account-create-update-2vwdw" podStartSLOduration=2.685232884 podStartE2EDuration="2.685232884s" podCreationTimestamp="2026-03-10 10:04:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:04:54.685009457 +0000 UTC m=+1243.441180275" watchObservedRunningTime="2026-03-10 10:04:54.685232884 +0000 UTC m=+1243.441403702" Mar 10 10:04:54 crc kubenswrapper[4794]: I0310 10:04:54.852438 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58df884995-wr8ms"] Mar 10 10:04:54 crc kubenswrapper[4794]: W0310 10:04:54.852703 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d9d50dc_009d_40eb_9dc5_4c1ffc2ef40a.slice/crio-a41ecb82175a17bdf15e0e28d9ba0d9a934f24a98329058e4ed90c99e9c8f8ce WatchSource:0}: Error finding container a41ecb82175a17bdf15e0e28d9ba0d9a934f24a98329058e4ed90c99e9c8f8ce: Status 404 returned error can't find the container with id a41ecb82175a17bdf15e0e28d9ba0d9a934f24a98329058e4ed90c99e9c8f8ce Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.259835 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.267414 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.269716 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.269853 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.270008 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-78xsv" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.270121 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.304150 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.374111 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-lock\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.374154 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.374205 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.374246 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-cache\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.374264 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dx2x\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-kube-api-access-2dx2x\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.374279 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.475726 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.475787 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dx2x\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-kube-api-access-2dx2x\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.475807 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-cache\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.475822 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.475889 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-lock\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.475916 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.476210 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.476377 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-cache\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.476420 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-lock\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: E0310 10:04:55.476733 4794 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 10:04:55 crc kubenswrapper[4794]: E0310 10:04:55.476766 4794 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 10:04:55 crc kubenswrapper[4794]: E0310 10:04:55.476833 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift podName:6cbde6fd-ed6b-49a8-96ae-642b15a1802b nodeName:}" failed. No retries permitted until 2026-03-10 10:04:55.976810486 +0000 UTC m=+1244.732981394 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift") pod "swift-storage-0" (UID: "6cbde6fd-ed6b-49a8-96ae-642b15a1802b") : configmap "swift-ring-files" not found Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.499499 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.502865 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dx2x\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-kube-api-access-2dx2x\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.504099 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.682692 4794 generic.go:334] "Generic (PLEG): container finished" podID="d1e8a45c-ff29-42fa-9999-5ba419470fd5" containerID="e28e928400da89d2d5a42bae0c7bca5c24862a455b433690c6883af7e54ea226" exitCode=0 Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.682775 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a23e-account-create-update-4mdw7" event={"ID":"d1e8a45c-ff29-42fa-9999-5ba419470fd5","Type":"ContainerDied","Data":"e28e928400da89d2d5a42bae0c7bca5c24862a455b433690c6883af7e54ea226"} Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.684615 4794 generic.go:334] "Generic (PLEG): container finished" podID="71cc55d3-bfda-4469-ad6c-6c5c0357360a" containerID="8d0146b1fe6aebb86ceb3b4c055564825a0bd4b041da3364fefe0c8d956a6d6c" exitCode=0 Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.684732 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dcx5m" event={"ID":"71cc55d3-bfda-4469-ad6c-6c5c0357360a","Type":"ContainerDied","Data":"8d0146b1fe6aebb86ceb3b4c055564825a0bd4b041da3364fefe0c8d956a6d6c"} Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.686300 4794 generic.go:334] "Generic (PLEG): container finished" podID="0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a" containerID="70e8baac06ba32920a4cbdf0279be6a6858547f3c37e9d1983091d760290b72d" exitCode=0 Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.686418 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-wr8ms" event={"ID":"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a","Type":"ContainerDied","Data":"70e8baac06ba32920a4cbdf0279be6a6858547f3c37e9d1983091d760290b72d"} Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.686498 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-wr8ms" event={"ID":"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a","Type":"ContainerStarted","Data":"a41ecb82175a17bdf15e0e28d9ba0d9a934f24a98329058e4ed90c99e9c8f8ce"} Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.688451 4794 generic.go:334] "Generic (PLEG): container finished" podID="a812a094-c8c4-48cf-82c4-97fc75f2774f" 
containerID="4801069e3ec4ee45b0400bd772935ecc7c61384c2fafb5723da3323b36de9214" exitCode=0 Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.688568 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d7d5-account-create-update-2vwdw" event={"ID":"a812a094-c8c4-48cf-82c4-97fc75f2774f","Type":"ContainerDied","Data":"4801069e3ec4ee45b0400bd772935ecc7c61384c2fafb5723da3323b36de9214"} Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.971182 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mbt9m" Mar 10 10:04:55 crc kubenswrapper[4794]: I0310 10:04:55.989883 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:55 crc kubenswrapper[4794]: E0310 10:04:55.990068 4794 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 10:04:55 crc kubenswrapper[4794]: E0310 10:04:55.990096 4794 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 10:04:55 crc kubenswrapper[4794]: E0310 10:04:55.990151 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift podName:6cbde6fd-ed6b-49a8-96ae-642b15a1802b nodeName:}" failed. No retries permitted until 2026-03-10 10:04:56.9901341 +0000 UTC m=+1245.746304918 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift") pod "swift-storage-0" (UID: "6cbde6fd-ed6b-49a8-96ae-642b15a1802b") : configmap "swift-ring-files" not found Mar 10 10:04:56 crc kubenswrapper[4794]: I0310 10:04:56.090961 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j62l9\" (UniqueName: \"kubernetes.io/projected/34215f04-95c7-4644-8ecc-70147aa8c100-kube-api-access-j62l9\") pod \"34215f04-95c7-4644-8ecc-70147aa8c100\" (UID: \"34215f04-95c7-4644-8ecc-70147aa8c100\") " Mar 10 10:04:56 crc kubenswrapper[4794]: I0310 10:04:56.091426 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34215f04-95c7-4644-8ecc-70147aa8c100-operator-scripts\") pod \"34215f04-95c7-4644-8ecc-70147aa8c100\" (UID: \"34215f04-95c7-4644-8ecc-70147aa8c100\") " Mar 10 10:04:56 crc kubenswrapper[4794]: I0310 10:04:56.091793 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34215f04-95c7-4644-8ecc-70147aa8c100-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "34215f04-95c7-4644-8ecc-70147aa8c100" (UID: "34215f04-95c7-4644-8ecc-70147aa8c100"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:56 crc kubenswrapper[4794]: I0310 10:04:56.091969 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34215f04-95c7-4644-8ecc-70147aa8c100-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:56 crc kubenswrapper[4794]: I0310 10:04:56.097600 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34215f04-95c7-4644-8ecc-70147aa8c100-kube-api-access-j62l9" (OuterVolumeSpecName: "kube-api-access-j62l9") pod "34215f04-95c7-4644-8ecc-70147aa8c100" (UID: "34215f04-95c7-4644-8ecc-70147aa8c100"). InnerVolumeSpecName "kube-api-access-j62l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:04:56 crc kubenswrapper[4794]: I0310 10:04:56.194666 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j62l9\" (UniqueName: \"kubernetes.io/projected/34215f04-95c7-4644-8ecc-70147aa8c100-kube-api-access-j62l9\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:56 crc kubenswrapper[4794]: I0310 10:04:56.351170 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:56 crc kubenswrapper[4794]: I0310 10:04:56.448183 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 10 10:04:56 crc kubenswrapper[4794]: I0310 10:04:56.697363 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mbt9m" Mar 10 10:04:56 crc kubenswrapper[4794]: I0310 10:04:56.698114 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mbt9m" event={"ID":"34215f04-95c7-4644-8ecc-70147aa8c100","Type":"ContainerDied","Data":"fcf375297d86aef077102a23dbbb3a301f4a3f2117f6c9ee1d36204807aedce3"} Mar 10 10:04:56 crc kubenswrapper[4794]: I0310 10:04:56.698155 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcf375297d86aef077102a23dbbb3a301f4a3f2117f6c9ee1d36204807aedce3" Mar 10 10:04:56 crc kubenswrapper[4794]: I0310 10:04:56.702540 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-wr8ms" event={"ID":"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a","Type":"ContainerStarted","Data":"658379ccd63c03a7b70ce7eaceec34bae80606e2f0afc083861a3090282f3ec4"} Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.006803 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:57 crc kubenswrapper[4794]: E0310 10:04:57.006984 4794 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 10:04:57 crc kubenswrapper[4794]: E0310 10:04:57.007007 4794 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 10:04:57 crc kubenswrapper[4794]: E0310 10:04:57.007062 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift podName:6cbde6fd-ed6b-49a8-96ae-642b15a1802b nodeName:}" failed. 
No retries permitted until 2026-03-10 10:04:59.007045918 +0000 UTC m=+1247.763216736 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift") pod "swift-storage-0" (UID: "6cbde6fd-ed6b-49a8-96ae-642b15a1802b") : configmap "swift-ring-files" not found Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.021683 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58df884995-wr8ms" podStartSLOduration=3.021667251 podStartE2EDuration="3.021667251s" podCreationTimestamp="2026-03-10 10:04:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:04:56.72501982 +0000 UTC m=+1245.481190638" watchObservedRunningTime="2026-03-10 10:04:57.021667251 +0000 UTC m=+1245.777838069" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.180088 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d7d5-account-create-update-2vwdw" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.187787 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dcx5m" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.195092 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a23e-account-create-update-4mdw7" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.310317 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbbrq\" (UniqueName: \"kubernetes.io/projected/d1e8a45c-ff29-42fa-9999-5ba419470fd5-kube-api-access-sbbrq\") pod \"d1e8a45c-ff29-42fa-9999-5ba419470fd5\" (UID: \"d1e8a45c-ff29-42fa-9999-5ba419470fd5\") " Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.311094 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k96z\" (UniqueName: \"kubernetes.io/projected/a812a094-c8c4-48cf-82c4-97fc75f2774f-kube-api-access-7k96z\") pod \"a812a094-c8c4-48cf-82c4-97fc75f2774f\" (UID: \"a812a094-c8c4-48cf-82c4-97fc75f2774f\") " Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.311126 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71cc55d3-bfda-4469-ad6c-6c5c0357360a-operator-scripts\") pod \"71cc55d3-bfda-4469-ad6c-6c5c0357360a\" (UID: \"71cc55d3-bfda-4469-ad6c-6c5c0357360a\") " Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.311211 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a812a094-c8c4-48cf-82c4-97fc75f2774f-operator-scripts\") pod \"a812a094-c8c4-48cf-82c4-97fc75f2774f\" (UID: \"a812a094-c8c4-48cf-82c4-97fc75f2774f\") " Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.311274 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e8a45c-ff29-42fa-9999-5ba419470fd5-operator-scripts\") pod \"d1e8a45c-ff29-42fa-9999-5ba419470fd5\" (UID: \"d1e8a45c-ff29-42fa-9999-5ba419470fd5\") " Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.311723 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71cc55d3-bfda-4469-ad6c-6c5c0357360a-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "71cc55d3-bfda-4469-ad6c-6c5c0357360a" (UID: "71cc55d3-bfda-4469-ad6c-6c5c0357360a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.311735 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e8a45c-ff29-42fa-9999-5ba419470fd5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d1e8a45c-ff29-42fa-9999-5ba419470fd5" (UID: "d1e8a45c-ff29-42fa-9999-5ba419470fd5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.312126 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a812a094-c8c4-48cf-82c4-97fc75f2774f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a812a094-c8c4-48cf-82c4-97fc75f2774f" (UID: "a812a094-c8c4-48cf-82c4-97fc75f2774f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.312213 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6x2f\" (UniqueName: \"kubernetes.io/projected/71cc55d3-bfda-4469-ad6c-6c5c0357360a-kube-api-access-f6x2f\") pod \"71cc55d3-bfda-4469-ad6c-6c5c0357360a\" (UID: \"71cc55d3-bfda-4469-ad6c-6c5c0357360a\") " Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.312911 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71cc55d3-bfda-4469-ad6c-6c5c0357360a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.312927 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a812a094-c8c4-48cf-82c4-97fc75f2774f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.312936 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e8a45c-ff29-42fa-9999-5ba419470fd5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.316694 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71cc55d3-bfda-4469-ad6c-6c5c0357360a-kube-api-access-f6x2f" (OuterVolumeSpecName: "kube-api-access-f6x2f") pod "71cc55d3-bfda-4469-ad6c-6c5c0357360a" (UID: "71cc55d3-bfda-4469-ad6c-6c5c0357360a"). InnerVolumeSpecName "kube-api-access-f6x2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.317937 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a812a094-c8c4-48cf-82c4-97fc75f2774f-kube-api-access-7k96z" (OuterVolumeSpecName: "kube-api-access-7k96z") pod "a812a094-c8c4-48cf-82c4-97fc75f2774f" (UID: "a812a094-c8c4-48cf-82c4-97fc75f2774f"). InnerVolumeSpecName "kube-api-access-7k96z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.318117 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e8a45c-ff29-42fa-9999-5ba419470fd5-kube-api-access-sbbrq" (OuterVolumeSpecName: "kube-api-access-sbbrq") pod "d1e8a45c-ff29-42fa-9999-5ba419470fd5" (UID: "d1e8a45c-ff29-42fa-9999-5ba419470fd5"). InnerVolumeSpecName "kube-api-access-sbbrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.413956 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6x2f\" (UniqueName: \"kubernetes.io/projected/71cc55d3-bfda-4469-ad6c-6c5c0357360a-kube-api-access-f6x2f\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.413980 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbbrq\" (UniqueName: \"kubernetes.io/projected/d1e8a45c-ff29-42fa-9999-5ba419470fd5-kube-api-access-sbbrq\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.413990 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k96z\" (UniqueName: \"kubernetes.io/projected/a812a094-c8c4-48cf-82c4-97fc75f2774f-kube-api-access-7k96z\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.711604 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dcx5m" event={"ID":"71cc55d3-bfda-4469-ad6c-6c5c0357360a","Type":"ContainerDied","Data":"5f85c793e0de75d0c82bb95e2d7b3c0985b24209abdb643a7dd5764380c22002"} Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.711846 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f85c793e0de75d0c82bb95e2d7b3c0985b24209abdb643a7dd5764380c22002" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.711620 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dcx5m" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.713221 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d7d5-account-create-update-2vwdw" event={"ID":"a812a094-c8c4-48cf-82c4-97fc75f2774f","Type":"ContainerDied","Data":"2b32958d0b64f99223e6bf006d92e2ba8ca402f67f28d185a618bdbf717a94e8"} Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.713324 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b32958d0b64f99223e6bf006d92e2ba8ca402f67f28d185a618bdbf717a94e8" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.713292 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d7d5-account-create-update-2vwdw" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.714902 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a23e-account-create-update-4mdw7" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.719412 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a23e-account-create-update-4mdw7" event={"ID":"d1e8a45c-ff29-42fa-9999-5ba419470fd5","Type":"ContainerDied","Data":"8072401a35535b1f6ddb0b3f54e261bef2fc583f3b26f982227dd7c9cad3c0ff"} Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.719469 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8072401a35535b1f6ddb0b3f54e261bef2fc583f3b26f982227dd7c9cad3c0ff" Mar 10 10:04:57 crc kubenswrapper[4794]: I0310 10:04:57.719499 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.636686 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-cnjz9"] Mar 10 10:04:58 crc kubenswrapper[4794]: E0310 10:04:58.637380 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc55d3-bfda-4469-ad6c-6c5c0357360a" containerName="mariadb-database-create" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.637396 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc55d3-bfda-4469-ad6c-6c5c0357360a" containerName="mariadb-database-create" Mar 10 10:04:58 crc kubenswrapper[4794]: E0310 10:04:58.637433 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e8a45c-ff29-42fa-9999-5ba419470fd5" containerName="mariadb-account-create-update" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.637441 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e8a45c-ff29-42fa-9999-5ba419470fd5" containerName="mariadb-account-create-update" Mar 10 10:04:58 crc kubenswrapper[4794]: E0310 10:04:58.637459 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a812a094-c8c4-48cf-82c4-97fc75f2774f" containerName="mariadb-account-create-update" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.637468 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a812a094-c8c4-48cf-82c4-97fc75f2774f" containerName="mariadb-account-create-update" Mar 10 10:04:58 crc kubenswrapper[4794]: E0310 10:04:58.637480 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34215f04-95c7-4644-8ecc-70147aa8c100" containerName="mariadb-database-create" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.637488 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="34215f04-95c7-4644-8ecc-70147aa8c100" containerName="mariadb-database-create" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.637702 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a812a094-c8c4-48cf-82c4-97fc75f2774f" containerName="mariadb-account-create-update" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.637723 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cc55d3-bfda-4469-ad6c-6c5c0357360a" containerName="mariadb-database-create" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.637736 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e8a45c-ff29-42fa-9999-5ba419470fd5" containerName="mariadb-account-create-update" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.637749 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="34215f04-95c7-4644-8ecc-70147aa8c100" containerName="mariadb-database-create" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 
10:04:58.638505 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cnjz9" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.640889 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.648703 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cnjz9"] Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.734835 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f997533c-caab-4b81-9f03-cef498327517-operator-scripts\") pod \"root-account-create-update-cnjz9\" (UID: \"f997533c-caab-4b81-9f03-cef498327517\") " pod="openstack/root-account-create-update-cnjz9" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.734887 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2rxz\" (UniqueName: \"kubernetes.io/projected/f997533c-caab-4b81-9f03-cef498327517-kube-api-access-x2rxz\") pod \"root-account-create-update-cnjz9\" (UID: \"f997533c-caab-4b81-9f03-cef498327517\") " pod="openstack/root-account-create-update-cnjz9" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.835914 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f997533c-caab-4b81-9f03-cef498327517-operator-scripts\") pod \"root-account-create-update-cnjz9\" (UID: \"f997533c-caab-4b81-9f03-cef498327517\") " pod="openstack/root-account-create-update-cnjz9" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.836015 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2rxz\" (UniqueName: \"kubernetes.io/projected/f997533c-caab-4b81-9f03-cef498327517-kube-api-access-x2rxz\") pod \"root-account-create-update-cnjz9\" (UID: \"f997533c-caab-4b81-9f03-cef498327517\") " pod="openstack/root-account-create-update-cnjz9" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.836776 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f997533c-caab-4b81-9f03-cef498327517-operator-scripts\") pod \"root-account-create-update-cnjz9\" (UID: \"f997533c-caab-4b81-9f03-cef498327517\") " pod="openstack/root-account-create-update-cnjz9" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.855493 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2rxz\" (UniqueName: \"kubernetes.io/projected/f997533c-caab-4b81-9f03-cef498327517-kube-api-access-x2rxz\") pod \"root-account-create-update-cnjz9\" (UID: \"f997533c-caab-4b81-9f03-cef498327517\") " pod="openstack/root-account-create-update-cnjz9" Mar 10 10:04:58 crc kubenswrapper[4794]: I0310 10:04:58.956766 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cnjz9" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.039132 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:04:59 crc kubenswrapper[4794]: E0310 10:04:59.039349 4794 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 10:04:59 crc kubenswrapper[4794]: E0310 10:04:59.039547 4794 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 10:04:59 crc kubenswrapper[4794]: E0310 10:04:59.039615 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift podName:6cbde6fd-ed6b-49a8-96ae-642b15a1802b nodeName:}" failed. No retries permitted until 2026-03-10 10:05:03.039593704 +0000 UTC m=+1251.795764522 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift") pod "swift-storage-0" (UID: "6cbde6fd-ed6b-49a8-96ae-642b15a1802b") : configmap "swift-ring-files" not found Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.174508 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-qg7cr"] Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.175744 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.178811 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.178999 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.179142 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.194927 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qg7cr"] Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.210351 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vb95j"] Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.211473 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.218559 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-qg7cr"] Mar 10 10:04:59 crc kubenswrapper[4794]: E0310 10:04:59.219277 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-vvvgh ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-qg7cr" podUID="8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.237109 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vb95j"] Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.246705 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-swiftconf\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.246782 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-ring-data-devices\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.246810 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvvgh\" (UniqueName: \"kubernetes.io/projected/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-kube-api-access-vvvgh\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.246840 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-scripts\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.246871 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-combined-ca-bundle\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.246907 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-dispersionconf\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.246922 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-etc-swift\") pod \"swift-ring-rebalance-qg7cr\" (UID: 
\"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.348416 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-swiftconf\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.348504 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/605a052b-1adf-461d-baf7-1a30a69d8de7-etc-swift\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.348554 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msnkx\" (UniqueName: \"kubernetes.io/projected/605a052b-1adf-461d-baf7-1a30a69d8de7-kube-api-access-msnkx\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.348597 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-ring-data-devices\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.348623 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/605a052b-1adf-461d-baf7-1a30a69d8de7-scripts\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.348656 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvvgh\" (UniqueName: \"kubernetes.io/projected/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-kube-api-access-vvvgh\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.348693 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-scripts\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.348731 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-combined-ca-bundle\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.348759 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-combined-ca-bundle\") pod \"swift-ring-rebalance-qg7cr\" (UID: 
\"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.348792 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-swiftconf\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.348820 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/605a052b-1adf-461d-baf7-1a30a69d8de7-ring-data-devices\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.348951 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-dispersionconf\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.349003 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-etc-swift\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.349071 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-dispersionconf\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.349461 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-etc-swift\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.349490 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-ring-data-devices\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.349595 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-scripts\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.353499 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-dispersionconf\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc 
kubenswrapper[4794]: I0310 10:04:59.354645 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-combined-ca-bundle\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.354886 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-swiftconf\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.378114 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvvgh\" (UniqueName: \"kubernetes.io/projected/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-kube-api-access-vvvgh\") pod \"swift-ring-rebalance-qg7cr\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.413860 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cnjz9"] Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.450629 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-swiftconf\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.450816 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/605a052b-1adf-461d-baf7-1a30a69d8de7-ring-data-devices\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.450930 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-dispersionconf\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.451031 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/605a052b-1adf-461d-baf7-1a30a69d8de7-etc-swift\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.451121 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msnkx\" (UniqueName: \"kubernetes.io/projected/605a052b-1adf-461d-baf7-1a30a69d8de7-kube-api-access-msnkx\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.451282 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/605a052b-1adf-461d-baf7-1a30a69d8de7-scripts\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " 
pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.451416 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-combined-ca-bundle\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.452180 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/605a052b-1adf-461d-baf7-1a30a69d8de7-scripts\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.452273 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/605a052b-1adf-461d-baf7-1a30a69d8de7-etc-swift\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.452701 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/605a052b-1adf-461d-baf7-1a30a69d8de7-ring-data-devices\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.454630 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-swiftconf\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.454751 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-dispersionconf\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.455324 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-combined-ca-bundle\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.474068 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msnkx\" (UniqueName: \"kubernetes.io/projected/605a052b-1adf-461d-baf7-1a30a69d8de7-kube-api-access-msnkx\") pod \"swift-ring-rebalance-vb95j\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") " pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.538153 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vb95j" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.730611 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.732000 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cnjz9" event={"ID":"f997533c-caab-4b81-9f03-cef498327517","Type":"ContainerStarted","Data":"93f84ceecd4613c693b9db23e3eaa377c57bb4bbb5320eba6b3f1970cdfa1c72"} Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.732035 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cnjz9" event={"ID":"f997533c-caab-4b81-9f03-cef498327517","Type":"ContainerStarted","Data":"bc85d750cbcbd840fc9738b373198473a4911ddbeaf9994d1cdada4c62663a84"} Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.742601 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.752166 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-cnjz9" podStartSLOduration=1.752147187 podStartE2EDuration="1.752147187s" podCreationTimestamp="2026-03-10 10:04:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:04:59.749158192 +0000 UTC m=+1248.505329040" watchObservedRunningTime="2026-03-10 10:04:59.752147187 +0000 UTC m=+1248.508318035" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.857796 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-scripts\") pod \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.858107 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-dispersionconf\") pod \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.858157 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-swiftconf\") pod \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.858207 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-etc-swift\") pod \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.858232 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvvgh\" (UniqueName: \"kubernetes.io/projected/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-kube-api-access-vvvgh\") pod \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.858316 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-ring-data-devices\") pod \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\" (UID: 
\"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.858367 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-combined-ca-bundle\") pod \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\" (UID: \"8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d\") " Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.858428 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-scripts" (OuterVolumeSpecName: "scripts") pod "8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d" (UID: "8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.858718 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.858795 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d" (UID: "8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.859182 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d" (UID: "8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.861810 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-kube-api-access-vvvgh" (OuterVolumeSpecName: "kube-api-access-vvvgh") pod "8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d" (UID: "8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d"). InnerVolumeSpecName "kube-api-access-vvvgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.862065 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d" (UID: "8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.864450 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d" (UID: "8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.864684 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d" (UID: "8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.960482 4794 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.960516 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.960526 4794 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.960535 4794 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.960543 4794 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.960553 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvvgh\" (UniqueName: \"kubernetes.io/projected/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d-kube-api-access-vvvgh\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:59 crc kubenswrapper[4794]: I0310 10:04:59.973690 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vb95j"] Mar 10 10:05:00 crc kubenswrapper[4794]: I0310 10:05:00.741491 4794 generic.go:334] "Generic (PLEG): container finished" podID="f997533c-caab-4b81-9f03-cef498327517" containerID="93f84ceecd4613c693b9db23e3eaa377c57bb4bbb5320eba6b3f1970cdfa1c72" exitCode=0 Mar 10 10:05:00 crc kubenswrapper[4794]: I0310 10:05:00.741552 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cnjz9" event={"ID":"f997533c-caab-4b81-9f03-cef498327517","Type":"ContainerDied","Data":"93f84ceecd4613c693b9db23e3eaa377c57bb4bbb5320eba6b3f1970cdfa1c72"} Mar 10 10:05:00 crc kubenswrapper[4794]: I0310 10:05:00.743431 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qg7cr" Mar 10 10:05:00 crc kubenswrapper[4794]: I0310 10:05:00.743438 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vb95j" event={"ID":"605a052b-1adf-461d-baf7-1a30a69d8de7","Type":"ContainerStarted","Data":"bf3a081702df539ed4ae4432b8e9f82ca7dc54125bd2f38bf996d25bc14c9343"} Mar 10 10:05:00 crc kubenswrapper[4794]: I0310 10:05:00.797791 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-qg7cr"] Mar 10 10:05:00 crc kubenswrapper[4794]: I0310 10:05:00.835506 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-qg7cr"] Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.008571 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d" path="/var/lib/kubelet/pods/8fb6d40c-f4cb-4aaf-92c6-f0bea1bbf04d/volumes" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.046680 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-jtsc6"] Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.047958 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jtsc6" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.067259 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jtsc6"] Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.096727 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gmdp\" (UniqueName: \"kubernetes.io/projected/510d7963-19bc-4b66-8c32-0b7b92c8e7ad-kube-api-access-8gmdp\") pod \"glance-db-create-jtsc6\" (UID: \"510d7963-19bc-4b66-8c32-0b7b92c8e7ad\") " pod="openstack/glance-db-create-jtsc6" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.096949 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/510d7963-19bc-4b66-8c32-0b7b92c8e7ad-operator-scripts\") pod \"glance-db-create-jtsc6\" (UID: \"510d7963-19bc-4b66-8c32-0b7b92c8e7ad\") " pod="openstack/glance-db-create-jtsc6" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.149369 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-7da2-account-create-update-dtltz"] Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.150867 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7da2-account-create-update-dtltz" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.158701 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7da2-account-create-update-dtltz"] Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.195004 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.198852 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nw9w\" (UniqueName: \"kubernetes.io/projected/f3cfd303-3976-4400-8c37-e64a4bed85f2-kube-api-access-9nw9w\") pod \"glance-7da2-account-create-update-dtltz\" (UID: \"f3cfd303-3976-4400-8c37-e64a4bed85f2\") " pod="openstack/glance-7da2-account-create-update-dtltz" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.198950 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3cfd303-3976-4400-8c37-e64a4bed85f2-operator-scripts\") pod \"glance-7da2-account-create-update-dtltz\" (UID: \"f3cfd303-3976-4400-8c37-e64a4bed85f2\") " pod="openstack/glance-7da2-account-create-update-dtltz" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.199065 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/510d7963-19bc-4b66-8c32-0b7b92c8e7ad-operator-scripts\") pod \"glance-db-create-jtsc6\" (UID: \"510d7963-19bc-4b66-8c32-0b7b92c8e7ad\") " pod="openstack/glance-db-create-jtsc6" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.199137 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gmdp\" (UniqueName: \"kubernetes.io/projected/510d7963-19bc-4b66-8c32-0b7b92c8e7ad-kube-api-access-8gmdp\") pod \"glance-db-create-jtsc6\" (UID: \"510d7963-19bc-4b66-8c32-0b7b92c8e7ad\") " pod="openstack/glance-db-create-jtsc6" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.199866 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/510d7963-19bc-4b66-8c32-0b7b92c8e7ad-operator-scripts\") pod \"glance-db-create-jtsc6\" (UID: \"510d7963-19bc-4b66-8c32-0b7b92c8e7ad\") " pod="openstack/glance-db-create-jtsc6" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.239097 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gmdp\" (UniqueName: \"kubernetes.io/projected/510d7963-19bc-4b66-8c32-0b7b92c8e7ad-kube-api-access-8gmdp\") pod \"glance-db-create-jtsc6\" (UID: \"510d7963-19bc-4b66-8c32-0b7b92c8e7ad\") " pod="openstack/glance-db-create-jtsc6" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.299787 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nw9w\" (UniqueName: \"kubernetes.io/projected/f3cfd303-3976-4400-8c37-e64a4bed85f2-kube-api-access-9nw9w\") pod \"glance-7da2-account-create-update-dtltz\" (UID: \"f3cfd303-3976-4400-8c37-e64a4bed85f2\") " pod="openstack/glance-7da2-account-create-update-dtltz" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.299832 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3cfd303-3976-4400-8c37-e64a4bed85f2-operator-scripts\") pod 
\"glance-7da2-account-create-update-dtltz\" (UID: \"f3cfd303-3976-4400-8c37-e64a4bed85f2\") " pod="openstack/glance-7da2-account-create-update-dtltz" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.305681 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3cfd303-3976-4400-8c37-e64a4bed85f2-operator-scripts\") pod \"glance-7da2-account-create-update-dtltz\" (UID: \"f3cfd303-3976-4400-8c37-e64a4bed85f2\") " pod="openstack/glance-7da2-account-create-update-dtltz" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.317027 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nw9w\" (UniqueName: \"kubernetes.io/projected/f3cfd303-3976-4400-8c37-e64a4bed85f2-kube-api-access-9nw9w\") pod \"glance-7da2-account-create-update-dtltz\" (UID: \"f3cfd303-3976-4400-8c37-e64a4bed85f2\") " pod="openstack/glance-7da2-account-create-update-dtltz" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.362517 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cnjz9" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.363958 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jtsc6" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.507904 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2rxz\" (UniqueName: \"kubernetes.io/projected/f997533c-caab-4b81-9f03-cef498327517-kube-api-access-x2rxz\") pod \"f997533c-caab-4b81-9f03-cef498327517\" (UID: \"f997533c-caab-4b81-9f03-cef498327517\") " Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.508987 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f997533c-caab-4b81-9f03-cef498327517-operator-scripts\") pod \"f997533c-caab-4b81-9f03-cef498327517\" (UID: \"f997533c-caab-4b81-9f03-cef498327517\") " Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.509554 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f997533c-caab-4b81-9f03-cef498327517-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f997533c-caab-4b81-9f03-cef498327517" (UID: "f997533c-caab-4b81-9f03-cef498327517"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.509906 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f997533c-caab-4b81-9f03-cef498327517-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.512552 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f997533c-caab-4b81-9f03-cef498327517-kube-api-access-x2rxz" (OuterVolumeSpecName: "kube-api-access-x2rxz") pod "f997533c-caab-4b81-9f03-cef498327517" (UID: "f997533c-caab-4b81-9f03-cef498327517"). InnerVolumeSpecName "kube-api-access-x2rxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.516216 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7da2-account-create-update-dtltz" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.611678 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2rxz\" (UniqueName: \"kubernetes.io/projected/f997533c-caab-4b81-9f03-cef498327517-kube-api-access-x2rxz\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.760992 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cnjz9" event={"ID":"f997533c-caab-4b81-9f03-cef498327517","Type":"ContainerDied","Data":"bc85d750cbcbd840fc9738b373198473a4911ddbeaf9994d1cdada4c62663a84"} Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.761049 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc85d750cbcbd840fc9738b373198473a4911ddbeaf9994d1cdada4c62663a84" Mar 10 10:05:02 crc kubenswrapper[4794]: I0310 10:05:02.761128 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cnjz9" Mar 10 10:05:03 crc kubenswrapper[4794]: I0310 10:05:03.120760 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:05:03 crc kubenswrapper[4794]: E0310 10:05:03.120967 4794 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 10:05:03 crc kubenswrapper[4794]: E0310 10:05:03.120994 4794 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 10:05:03 crc kubenswrapper[4794]: E0310 10:05:03.121060 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift podName:6cbde6fd-ed6b-49a8-96ae-642b15a1802b nodeName:}" failed. No retries permitted until 2026-03-10 10:05:11.121037973 +0000 UTC m=+1259.877208791 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift") pod "swift-storage-0" (UID: "6cbde6fd-ed6b-49a8-96ae-642b15a1802b") : configmap "swift-ring-files" not found Mar 10 10:05:03 crc kubenswrapper[4794]: I0310 10:05:03.769367 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vb95j" event={"ID":"605a052b-1adf-461d-baf7-1a30a69d8de7","Type":"ContainerStarted","Data":"19f6cd89c73f96aeb050448147fc3ab69b927ded2c3caffb20293da23bbe586e"} Mar 10 10:05:03 crc kubenswrapper[4794]: I0310 10:05:03.815916 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7da2-account-create-update-dtltz"] Mar 10 10:05:03 crc kubenswrapper[4794]: I0310 10:05:03.823945 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vb95j" podStartSLOduration=1.43558522 podStartE2EDuration="4.823928651s" podCreationTimestamp="2026-03-10 10:04:59 +0000 UTC" firstStartedPulling="2026-03-10 10:04:59.982202622 +0000 UTC m=+1248.738373460" lastFinishedPulling="2026-03-10 10:05:03.370546063 +0000 UTC m=+1252.126716891" observedRunningTime="2026-03-10 10:05:03.803873316 +0000 UTC m=+1252.560044154" watchObservedRunningTime="2026-03-10 10:05:03.823928651 +0000 UTC m=+1252.580099469" Mar 10 10:05:03 crc kubenswrapper[4794]: I0310 10:05:03.852581 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jtsc6"] Mar 10 10:05:03 crc kubenswrapper[4794]: W0310 10:05:03.853658 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod510d7963_19bc_4b66_8c32_0b7b92c8e7ad.slice/crio-6b638b1ddbe57e1cc4949e19ce72436e197db7687793b05ff13ff64c30b54e04 WatchSource:0}: Error finding container 6b638b1ddbe57e1cc4949e19ce72436e197db7687793b05ff13ff64c30b54e04: Status 404 returned error can't find the container with id 6b638b1ddbe57e1cc4949e19ce72436e197db7687793b05ff13ff64c30b54e04 Mar 10 10:05:04 crc kubenswrapper[4794]: I0310 10:05:04.445660 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:05:04 crc kubenswrapper[4794]: I0310 10:05:04.523637 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-4qcnz"] Mar 10 10:05:04 crc kubenswrapper[4794]: I0310 10:05:04.524286 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" podUID="4d676f27-d254-46a3-afb6-7ecb637b61be" containerName="dnsmasq-dns" containerID="cri-o://9d26bd5b81721ffcbfeba1b8441c9226b29c9ef39b18c49b5750cccfa3e56b31" gracePeriod=10 Mar 10 10:05:04 crc kubenswrapper[4794]: I0310 10:05:04.779630 4794 generic.go:334] "Generic (PLEG): container finished" podID="f3cfd303-3976-4400-8c37-e64a4bed85f2" containerID="0b930984fbf109b32ed28954bcbccc55e0338f35e6b3df584a5cad91d11280de" exitCode=0 Mar 10 10:05:04 crc kubenswrapper[4794]: I0310 10:05:04.779687 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7da2-account-create-update-dtltz" event={"ID":"f3cfd303-3976-4400-8c37-e64a4bed85f2","Type":"ContainerDied","Data":"0b930984fbf109b32ed28954bcbccc55e0338f35e6b3df584a5cad91d11280de"} Mar 10 10:05:04 crc kubenswrapper[4794]: I0310 10:05:04.779709 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7da2-account-create-update-dtltz" 
event={"ID":"f3cfd303-3976-4400-8c37-e64a4bed85f2","Type":"ContainerStarted","Data":"82cc98e7acb1bbe09fa0df35d1c8dcc10892b6adbfbf048e7047abdf1733663f"} Mar 10 10:05:04 crc kubenswrapper[4794]: I0310 10:05:04.783473 4794 generic.go:334] "Generic (PLEG): container finished" podID="4d676f27-d254-46a3-afb6-7ecb637b61be" containerID="9d26bd5b81721ffcbfeba1b8441c9226b29c9ef39b18c49b5750cccfa3e56b31" exitCode=0 Mar 10 10:05:04 crc kubenswrapper[4794]: I0310 10:05:04.783509 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" event={"ID":"4d676f27-d254-46a3-afb6-7ecb637b61be","Type":"ContainerDied","Data":"9d26bd5b81721ffcbfeba1b8441c9226b29c9ef39b18c49b5750cccfa3e56b31"} Mar 10 10:05:04 crc kubenswrapper[4794]: I0310 10:05:04.785730 4794 generic.go:334] "Generic (PLEG): container finished" podID="510d7963-19bc-4b66-8c32-0b7b92c8e7ad" containerID="4c601955445959dac10e09eee2e220072dd4672a1b804c16d5f6f9003c83d0ab" exitCode=0 Mar 10 10:05:04 crc kubenswrapper[4794]: I0310 10:05:04.785829 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jtsc6" event={"ID":"510d7963-19bc-4b66-8c32-0b7b92c8e7ad","Type":"ContainerDied","Data":"4c601955445959dac10e09eee2e220072dd4672a1b804c16d5f6f9003c83d0ab"} Mar 10 10:05:04 crc kubenswrapper[4794]: I0310 10:05:04.785879 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jtsc6" event={"ID":"510d7963-19bc-4b66-8c32-0b7b92c8e7ad","Type":"ContainerStarted","Data":"6b638b1ddbe57e1cc4949e19ce72436e197db7687793b05ff13ff64c30b54e04"} Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.013493 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.068034 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-config\") pod \"4d676f27-d254-46a3-afb6-7ecb637b61be\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.068154 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-ovsdbserver-nb\") pod \"4d676f27-d254-46a3-afb6-7ecb637b61be\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.068195 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-ovsdbserver-sb\") pod \"4d676f27-d254-46a3-afb6-7ecb637b61be\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.068235 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krtv5\" (UniqueName: \"kubernetes.io/projected/4d676f27-d254-46a3-afb6-7ecb637b61be-kube-api-access-krtv5\") pod \"4d676f27-d254-46a3-afb6-7ecb637b61be\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.068275 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-dns-svc\") pod \"4d676f27-d254-46a3-afb6-7ecb637b61be\" (UID: \"4d676f27-d254-46a3-afb6-7ecb637b61be\") " Mar 10 10:05:05 crc 
kubenswrapper[4794]: I0310 10:05:05.093513 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d676f27-d254-46a3-afb6-7ecb637b61be-kube-api-access-krtv5" (OuterVolumeSpecName: "kube-api-access-krtv5") pod "4d676f27-d254-46a3-afb6-7ecb637b61be" (UID: "4d676f27-d254-46a3-afb6-7ecb637b61be"). InnerVolumeSpecName "kube-api-access-krtv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.121017 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4d676f27-d254-46a3-afb6-7ecb637b61be" (UID: "4d676f27-d254-46a3-afb6-7ecb637b61be"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.125772 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d676f27-d254-46a3-afb6-7ecb637b61be" (UID: "4d676f27-d254-46a3-afb6-7ecb637b61be"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.127056 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-config" (OuterVolumeSpecName: "config") pod "4d676f27-d254-46a3-afb6-7ecb637b61be" (UID: "4d676f27-d254-46a3-afb6-7ecb637b61be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.131816 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4d676f27-d254-46a3-afb6-7ecb637b61be" (UID: "4d676f27-d254-46a3-afb6-7ecb637b61be"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.170372 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.170409 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.170423 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.170434 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krtv5\" (UniqueName: \"kubernetes.io/projected/4d676f27-d254-46a3-afb6-7ecb637b61be-kube-api-access-krtv5\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.170445 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d676f27-d254-46a3-afb6-7ecb637b61be-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.304862 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cnjz9"] Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.310973 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-cnjz9"] Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.797418 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" event={"ID":"4d676f27-d254-46a3-afb6-7ecb637b61be","Type":"ContainerDied","Data":"917109063d9a885d2757c65955824d1118f430f8c8ac6572f7b092d1b9a6cb04"} Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.797491 4794 scope.go:117] "RemoveContainer" containerID="9d26bd5b81721ffcbfeba1b8441c9226b29c9ef39b18c49b5750cccfa3e56b31" Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.797623 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-4qcnz" Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.835925 4794 scope.go:117] "RemoveContainer" containerID="1a4a358cfdd7b0bd2ed2421fcb052432a6e3fcbe1ac515c3592ed116f809c2ce" Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.840820 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-4qcnz"] Mar 10 10:05:05 crc kubenswrapper[4794]: I0310 10:05:05.852302 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-4qcnz"] Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.031149 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d676f27-d254-46a3-afb6-7ecb637b61be" path="/var/lib/kubelet/pods/4d676f27-d254-46a3-afb6-7ecb637b61be/volumes" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.038395 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f997533c-caab-4b81-9f03-cef498327517" path="/var/lib/kubelet/pods/f997533c-caab-4b81-9f03-cef498327517/volumes" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.235439 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7da2-account-create-update-dtltz" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.241950 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jtsc6" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.296039 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/510d7963-19bc-4b66-8c32-0b7b92c8e7ad-operator-scripts\") pod \"510d7963-19bc-4b66-8c32-0b7b92c8e7ad\" (UID: \"510d7963-19bc-4b66-8c32-0b7b92c8e7ad\") " Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.296151 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3cfd303-3976-4400-8c37-e64a4bed85f2-operator-scripts\") pod \"f3cfd303-3976-4400-8c37-e64a4bed85f2\" (UID: \"f3cfd303-3976-4400-8c37-e64a4bed85f2\") " Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.296225 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nw9w\" (UniqueName: \"kubernetes.io/projected/f3cfd303-3976-4400-8c37-e64a4bed85f2-kube-api-access-9nw9w\") pod \"f3cfd303-3976-4400-8c37-e64a4bed85f2\" (UID: \"f3cfd303-3976-4400-8c37-e64a4bed85f2\") " Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.296277 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gmdp\" (UniqueName: \"kubernetes.io/projected/510d7963-19bc-4b66-8c32-0b7b92c8e7ad-kube-api-access-8gmdp\") pod \"510d7963-19bc-4b66-8c32-0b7b92c8e7ad\" (UID: \"510d7963-19bc-4b66-8c32-0b7b92c8e7ad\") " Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.296573 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/510d7963-19bc-4b66-8c32-0b7b92c8e7ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "510d7963-19bc-4b66-8c32-0b7b92c8e7ad" (UID: "510d7963-19bc-4b66-8c32-0b7b92c8e7ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.296707 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3cfd303-3976-4400-8c37-e64a4bed85f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f3cfd303-3976-4400-8c37-e64a4bed85f2" (UID: "f3cfd303-3976-4400-8c37-e64a4bed85f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.296778 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/510d7963-19bc-4b66-8c32-0b7b92c8e7ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.301386 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3cfd303-3976-4400-8c37-e64a4bed85f2-kube-api-access-9nw9w" (OuterVolumeSpecName: "kube-api-access-9nw9w") pod "f3cfd303-3976-4400-8c37-e64a4bed85f2" (UID: "f3cfd303-3976-4400-8c37-e64a4bed85f2"). InnerVolumeSpecName "kube-api-access-9nw9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.301889 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/510d7963-19bc-4b66-8c32-0b7b92c8e7ad-kube-api-access-8gmdp" (OuterVolumeSpecName: "kube-api-access-8gmdp") pod "510d7963-19bc-4b66-8c32-0b7b92c8e7ad" (UID: "510d7963-19bc-4b66-8c32-0b7b92c8e7ad"). InnerVolumeSpecName "kube-api-access-8gmdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.397699 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nw9w\" (UniqueName: \"kubernetes.io/projected/f3cfd303-3976-4400-8c37-e64a4bed85f2-kube-api-access-9nw9w\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.397726 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gmdp\" (UniqueName: \"kubernetes.io/projected/510d7963-19bc-4b66-8c32-0b7b92c8e7ad-kube-api-access-8gmdp\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.397738 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3cfd303-3976-4400-8c37-e64a4bed85f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.813104 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7da2-account-create-update-dtltz" event={"ID":"f3cfd303-3976-4400-8c37-e64a4bed85f2","Type":"ContainerDied","Data":"82cc98e7acb1bbe09fa0df35d1c8dcc10892b6adbfbf048e7047abdf1733663f"} Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.813136 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82cc98e7acb1bbe09fa0df35d1c8dcc10892b6adbfbf048e7047abdf1733663f" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.813166 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7da2-account-create-update-dtltz" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.818962 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jtsc6" event={"ID":"510d7963-19bc-4b66-8c32-0b7b92c8e7ad","Type":"ContainerDied","Data":"6b638b1ddbe57e1cc4949e19ce72436e197db7687793b05ff13ff64c30b54e04"} Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.818994 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b638b1ddbe57e1cc4949e19ce72436e197db7687793b05ff13ff64c30b54e04" Mar 10 10:05:06 crc kubenswrapper[4794]: I0310 10:05:06.819038 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-jtsc6" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.278391 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.351549 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-g67ml"] Mar 10 10:05:10 crc kubenswrapper[4794]: E0310 10:05:10.351900 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f997533c-caab-4b81-9f03-cef498327517" containerName="mariadb-account-create-update" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.351924 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f997533c-caab-4b81-9f03-cef498327517" containerName="mariadb-account-create-update" Mar 10 10:05:10 crc kubenswrapper[4794]: E0310 10:05:10.351940 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d676f27-d254-46a3-afb6-7ecb637b61be" containerName="dnsmasq-dns" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.351949 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d676f27-d254-46a3-afb6-7ecb637b61be" containerName="dnsmasq-dns" Mar 10 10:05:10 crc kubenswrapper[4794]: E0310 10:05:10.351965 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3cfd303-3976-4400-8c37-e64a4bed85f2" containerName="mariadb-account-create-update" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.351973 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3cfd303-3976-4400-8c37-e64a4bed85f2" containerName="mariadb-account-create-update" Mar 10 10:05:10 crc kubenswrapper[4794]: E0310 10:05:10.351986 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="510d7963-19bc-4b66-8c32-0b7b92c8e7ad" containerName="mariadb-database-create" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.351994 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="510d7963-19bc-4b66-8c32-0b7b92c8e7ad" containerName="mariadb-database-create" Mar 10 10:05:10 crc kubenswrapper[4794]: E0310 10:05:10.352025 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d676f27-d254-46a3-afb6-7ecb637b61be" containerName="init" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.352032 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d676f27-d254-46a3-afb6-7ecb637b61be" containerName="init" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.352192 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3cfd303-3976-4400-8c37-e64a4bed85f2" containerName="mariadb-account-create-update" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.352212 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="510d7963-19bc-4b66-8c32-0b7b92c8e7ad" containerName="mariadb-database-create" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.352224 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f997533c-caab-4b81-9f03-cef498327517" containerName="mariadb-account-create-update" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.352235 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d676f27-d254-46a3-afb6-7ecb637b61be" containerName="dnsmasq-dns" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.352677 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-g67ml" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.355229 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.359873 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-g67ml"] Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.469354 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/691c983b-4d1e-406d-bd02-358e7a635547-operator-scripts\") pod \"root-account-create-update-g67ml\" (UID: \"691c983b-4d1e-406d-bd02-358e7a635547\") " pod="openstack/root-account-create-update-g67ml" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.469405 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrdrz\" (UniqueName: \"kubernetes.io/projected/691c983b-4d1e-406d-bd02-358e7a635547-kube-api-access-vrdrz\") pod \"root-account-create-update-g67ml\" (UID: \"691c983b-4d1e-406d-bd02-358e7a635547\") " pod="openstack/root-account-create-update-g67ml" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.571715 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/691c983b-4d1e-406d-bd02-358e7a635547-operator-scripts\") pod \"root-account-create-update-g67ml\" (UID: \"691c983b-4d1e-406d-bd02-358e7a635547\") " pod="openstack/root-account-create-update-g67ml" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.571776 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrdrz\" (UniqueName: \"kubernetes.io/projected/691c983b-4d1e-406d-bd02-358e7a635547-kube-api-access-vrdrz\") pod \"root-account-create-update-g67ml\" (UID: \"691c983b-4d1e-406d-bd02-358e7a635547\") " pod="openstack/root-account-create-update-g67ml" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.572658 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/691c983b-4d1e-406d-bd02-358e7a635547-operator-scripts\") pod \"root-account-create-update-g67ml\" (UID: \"691c983b-4d1e-406d-bd02-358e7a635547\") " pod="openstack/root-account-create-update-g67ml" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.592742 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrdrz\" (UniqueName: \"kubernetes.io/projected/691c983b-4d1e-406d-bd02-358e7a635547-kube-api-access-vrdrz\") pod \"root-account-create-update-g67ml\" (UID: \"691c983b-4d1e-406d-bd02-358e7a635547\") " pod="openstack/root-account-create-update-g67ml" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.672228 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-g67ml" Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.859622 4794 generic.go:334] "Generic (PLEG): container finished" podID="605a052b-1adf-461d-baf7-1a30a69d8de7" containerID="19f6cd89c73f96aeb050448147fc3ab69b927ded2c3caffb20293da23bbe586e" exitCode=0 Mar 10 10:05:10 crc kubenswrapper[4794]: I0310 10:05:10.859721 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vb95j" event={"ID":"605a052b-1adf-461d-baf7-1a30a69d8de7","Type":"ContainerDied","Data":"19f6cd89c73f96aeb050448147fc3ab69b927ded2c3caffb20293da23bbe586e"} Mar 10 10:05:11 crc kubenswrapper[4794]: W0310 10:05:11.094421 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod691c983b_4d1e_406d_bd02_358e7a635547.slice/crio-d073ae6460e18836d4c67cef4437bdbbadd5499b40789d90baf8228a2a046a38 WatchSource:0}: Error finding container d073ae6460e18836d4c67cef4437bdbbadd5499b40789d90baf8228a2a046a38: Status 404 returned error can't find the container with id d073ae6460e18836d4c67cef4437bdbbadd5499b40789d90baf8228a2a046a38 Mar 10 10:05:11 crc kubenswrapper[4794]: I0310 10:05:11.094655 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-g67ml"] Mar 10 10:05:11 crc kubenswrapper[4794]: I0310 10:05:11.182571 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:05:11 crc kubenswrapper[4794]: I0310 10:05:11.189466 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift\") pod \"swift-storage-0\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " pod="openstack/swift-storage-0" Mar 10 10:05:11 crc kubenswrapper[4794]: I0310 10:05:11.192998 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0"
Mar 10 10:05:11 crc kubenswrapper[4794]: W0310 10:05:11.768908 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cbde6fd_ed6b_49a8_96ae_642b15a1802b.slice/crio-98c7f33d0acbb031c2637b524355a2476576c8b482ba68322e539058c1cc15d7 WatchSource:0}: Error finding container 98c7f33d0acbb031c2637b524355a2476576c8b482ba68322e539058c1cc15d7: Status 404 returned error can't find the container with id 98c7f33d0acbb031c2637b524355a2476576c8b482ba68322e539058c1cc15d7
Mar 10 10:05:11 crc kubenswrapper[4794]: I0310 10:05:11.774101 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 10 10:05:11 crc kubenswrapper[4794]: I0310 10:05:11.870827 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"98c7f33d0acbb031c2637b524355a2476576c8b482ba68322e539058c1cc15d7"}
Mar 10 10:05:11 crc kubenswrapper[4794]: I0310 10:05:11.872755 4794 generic.go:334] "Generic (PLEG): container finished" podID="691c983b-4d1e-406d-bd02-358e7a635547" containerID="8d1ff7da9b574a00cbfb6c54f68bf954b4ed60e409ce634a6b3ccf1e121b7c9d" exitCode=0
Mar 10 10:05:11 crc kubenswrapper[4794]: I0310 10:05:11.873436 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g67ml" event={"ID":"691c983b-4d1e-406d-bd02-358e7a635547","Type":"ContainerDied","Data":"8d1ff7da9b574a00cbfb6c54f68bf954b4ed60e409ce634a6b3ccf1e121b7c9d"}
Mar 10 10:05:11 crc kubenswrapper[4794]: I0310 10:05:11.873516 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g67ml" event={"ID":"691c983b-4d1e-406d-bd02-358e7a635547","Type":"ContainerStarted","Data":"d073ae6460e18836d4c67cef4437bdbbadd5499b40789d90baf8228a2a046a38"}
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.225588 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vb95j"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.303259 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/605a052b-1adf-461d-baf7-1a30a69d8de7-ring-data-devices\") pod \"605a052b-1adf-461d-baf7-1a30a69d8de7\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") "
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.303398 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-combined-ca-bundle\") pod \"605a052b-1adf-461d-baf7-1a30a69d8de7\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") "
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.303450 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-swiftconf\") pod \"605a052b-1adf-461d-baf7-1a30a69d8de7\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") "
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.303487 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/605a052b-1adf-461d-baf7-1a30a69d8de7-scripts\") pod \"605a052b-1adf-461d-baf7-1a30a69d8de7\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") "
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.303505 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-dispersionconf\") pod \"605a052b-1adf-461d-baf7-1a30a69d8de7\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") "
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.303525 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msnkx\" (UniqueName: \"kubernetes.io/projected/605a052b-1adf-461d-baf7-1a30a69d8de7-kube-api-access-msnkx\") pod \"605a052b-1adf-461d-baf7-1a30a69d8de7\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") "
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.303548 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/605a052b-1adf-461d-baf7-1a30a69d8de7-etc-swift\") pod \"605a052b-1adf-461d-baf7-1a30a69d8de7\" (UID: \"605a052b-1adf-461d-baf7-1a30a69d8de7\") "
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.304025 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/605a052b-1adf-461d-baf7-1a30a69d8de7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "605a052b-1adf-461d-baf7-1a30a69d8de7" (UID: "605a052b-1adf-461d-baf7-1a30a69d8de7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.304983 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/605a052b-1adf-461d-baf7-1a30a69d8de7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "605a052b-1adf-461d-baf7-1a30a69d8de7" (UID: "605a052b-1adf-461d-baf7-1a30a69d8de7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.310480 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/605a052b-1adf-461d-baf7-1a30a69d8de7-kube-api-access-msnkx" (OuterVolumeSpecName: "kube-api-access-msnkx") pod "605a052b-1adf-461d-baf7-1a30a69d8de7" (UID: "605a052b-1adf-461d-baf7-1a30a69d8de7"). InnerVolumeSpecName "kube-api-access-msnkx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.325920 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "605a052b-1adf-461d-baf7-1a30a69d8de7" (UID: "605a052b-1adf-461d-baf7-1a30a69d8de7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.327651 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "605a052b-1adf-461d-baf7-1a30a69d8de7" (UID: "605a052b-1adf-461d-baf7-1a30a69d8de7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.328686 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "605a052b-1adf-461d-baf7-1a30a69d8de7" (UID: "605a052b-1adf-461d-baf7-1a30a69d8de7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.334695 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/605a052b-1adf-461d-baf7-1a30a69d8de7-scripts" (OuterVolumeSpecName: "scripts") pod "605a052b-1adf-461d-baf7-1a30a69d8de7" (UID: "605a052b-1adf-461d-baf7-1a30a69d8de7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.386808 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-gg5z8"]
Mar 10 10:05:12 crc kubenswrapper[4794]: E0310 10:05:12.387177 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="605a052b-1adf-461d-baf7-1a30a69d8de7" containerName="swift-ring-rebalance"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.387196 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="605a052b-1adf-461d-baf7-1a30a69d8de7" containerName="swift-ring-rebalance"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.387379 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="605a052b-1adf-461d-baf7-1a30a69d8de7" containerName="swift-ring-rebalance"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.387868 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gg5z8"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.391954 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v9fxn"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.392057 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.395464 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gg5z8"]
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.404833 4794 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/605a052b-1adf-461d-baf7-1a30a69d8de7-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.404862 4794 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/605a052b-1adf-461d-baf7-1a30a69d8de7-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.404872 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.404881 4794 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.404889 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/605a052b-1adf-461d-baf7-1a30a69d8de7-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.404898 4794 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/605a052b-1adf-461d-baf7-1a30a69d8de7-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.404907 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msnkx\" (UniqueName: \"kubernetes.io/projected/605a052b-1adf-461d-baf7-1a30a69d8de7-kube-api-access-msnkx\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.505762 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-config-data\") pod \"glance-db-sync-gg5z8\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " pod="openstack/glance-db-sync-gg5z8"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.505827 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-combined-ca-bundle\") pod \"glance-db-sync-gg5z8\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " pod="openstack/glance-db-sync-gg5z8"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.505882 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gkl8\" (UniqueName: \"kubernetes.io/projected/f7641d95-8aa6-43ad-a191-7d8356b29bac-kube-api-access-8gkl8\") pod \"glance-db-sync-gg5z8\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " pod="openstack/glance-db-sync-gg5z8"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.506268 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-db-sync-config-data\") pod \"glance-db-sync-gg5z8\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " pod="openstack/glance-db-sync-gg5z8"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.607603 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gkl8\" (UniqueName: \"kubernetes.io/projected/f7641d95-8aa6-43ad-a191-7d8356b29bac-kube-api-access-8gkl8\") pod \"glance-db-sync-gg5z8\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " pod="openstack/glance-db-sync-gg5z8"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.607704 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-db-sync-config-data\") pod \"glance-db-sync-gg5z8\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " pod="openstack/glance-db-sync-gg5z8"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.607743 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-config-data\") pod \"glance-db-sync-gg5z8\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " pod="openstack/glance-db-sync-gg5z8"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.607789 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-combined-ca-bundle\") pod \"glance-db-sync-gg5z8\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " pod="openstack/glance-db-sync-gg5z8"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.611154 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-db-sync-config-data\") pod \"glance-db-sync-gg5z8\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " pod="openstack/glance-db-sync-gg5z8"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.611170 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-combined-ca-bundle\") pod \"glance-db-sync-gg5z8\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " pod="openstack/glance-db-sync-gg5z8"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.611234 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-config-data\") pod \"glance-db-sync-gg5z8\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " pod="openstack/glance-db-sync-gg5z8"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.625884 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gkl8\" (UniqueName: \"kubernetes.io/projected/f7641d95-8aa6-43ad-a191-7d8356b29bac-kube-api-access-8gkl8\") pod \"glance-db-sync-gg5z8\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " pod="openstack/glance-db-sync-gg5z8"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.702766 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gg5z8"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.886674 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vb95j"
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.886757 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vb95j" event={"ID":"605a052b-1adf-461d-baf7-1a30a69d8de7","Type":"ContainerDied","Data":"bf3a081702df539ed4ae4432b8e9f82ca7dc54125bd2f38bf996d25bc14c9343"}
Mar 10 10:05:12 crc kubenswrapper[4794]: I0310 10:05:12.887184 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf3a081702df539ed4ae4432b8e9f82ca7dc54125bd2f38bf996d25bc14c9343"
Mar 10 10:05:13 crc kubenswrapper[4794]: I0310 10:05:13.177720 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-g67ml"
Mar 10 10:05:13 crc kubenswrapper[4794]: I0310 10:05:13.180062 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gg5z8"]
Mar 10 10:05:13 crc kubenswrapper[4794]: I0310 10:05:13.218313 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/691c983b-4d1e-406d-bd02-358e7a635547-operator-scripts\") pod \"691c983b-4d1e-406d-bd02-358e7a635547\" (UID: \"691c983b-4d1e-406d-bd02-358e7a635547\") "
Mar 10 10:05:13 crc kubenswrapper[4794]: I0310 10:05:13.218487 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrdrz\" (UniqueName: \"kubernetes.io/projected/691c983b-4d1e-406d-bd02-358e7a635547-kube-api-access-vrdrz\") pod \"691c983b-4d1e-406d-bd02-358e7a635547\" (UID: \"691c983b-4d1e-406d-bd02-358e7a635547\") "
Mar 10 10:05:13 crc kubenswrapper[4794]: I0310 10:05:13.219229 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/691c983b-4d1e-406d-bd02-358e7a635547-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "691c983b-4d1e-406d-bd02-358e7a635547" (UID: "691c983b-4d1e-406d-bd02-358e7a635547"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:05:13 crc kubenswrapper[4794]: I0310 10:05:13.224161 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/691c983b-4d1e-406d-bd02-358e7a635547-kube-api-access-vrdrz" (OuterVolumeSpecName: "kube-api-access-vrdrz") pod "691c983b-4d1e-406d-bd02-358e7a635547" (UID: "691c983b-4d1e-406d-bd02-358e7a635547"). InnerVolumeSpecName "kube-api-access-vrdrz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:05:13 crc kubenswrapper[4794]: I0310 10:05:13.320711 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/691c983b-4d1e-406d-bd02-358e7a635547-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:13 crc kubenswrapper[4794]: I0310 10:05:13.321008 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrdrz\" (UniqueName: \"kubernetes.io/projected/691c983b-4d1e-406d-bd02-358e7a635547-kube-api-access-vrdrz\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:13 crc kubenswrapper[4794]: I0310 10:05:13.894746 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"031388a5b0e7ab4e2d5a36045f55127c3e30f57424f750e9a01dea1da336e7c1"}
Mar 10 10:05:13 crc kubenswrapper[4794]: I0310 10:05:13.897183 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-g67ml"
Mar 10 10:05:13 crc kubenswrapper[4794]: I0310 10:05:13.897379 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g67ml" event={"ID":"691c983b-4d1e-406d-bd02-358e7a635547","Type":"ContainerDied","Data":"d073ae6460e18836d4c67cef4437bdbbadd5499b40789d90baf8228a2a046a38"}
Mar 10 10:05:13 crc kubenswrapper[4794]: I0310 10:05:13.897402 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d073ae6460e18836d4c67cef4437bdbbadd5499b40789d90baf8228a2a046a38"
Mar 10 10:05:13 crc kubenswrapper[4794]: I0310 10:05:13.898933 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gg5z8" event={"ID":"f7641d95-8aa6-43ad-a191-7d8356b29bac","Type":"ContainerStarted","Data":"ba5c54c87514d9d373f080956887440384a5df2a4fad5caf02ebc6be5e6f52fb"}
Mar 10 10:05:14 crc kubenswrapper[4794]: I0310 10:05:14.908256 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"d63f9f6d7599f958801005a6670033ad2c6f68cf62e9c0465c4d34044c669139"}
Mar 10 10:05:14 crc kubenswrapper[4794]: I0310 10:05:14.908585 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"32aa0630d82d05463514a0c8a463bab43aad2c33f2b887d13c9714f7761c76c2"}
Mar 10 10:05:14 crc kubenswrapper[4794]: I0310 10:05:14.908601 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"8cc99334a202e511edfdad4eef31c86941e591eb3f6215e5d7e786f323618184"}
Mar 10 10:05:15 crc kubenswrapper[4794]: I0310 10:05:15.917926 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"8e6a16e1a4b64e9512a2b7d0587b85ab036e797d9ba29438a7eebcaaa92c8d35"}
Mar 10 10:05:15 crc kubenswrapper[4794]: I0310 10:05:15.918369 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"74c2be2cb91c7946838edd2c68684383f49526b9001079db31859fa44514bdac"}
Mar 10 10:05:15 crc kubenswrapper[4794]: I0310 10:05:15.918382 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"46ff2925f4f5d1bf13f6a0203dfcdd7152671d57863769e69b695dd9c5e3fb06"}
Mar 10 10:05:15 crc kubenswrapper[4794]: I0310 10:05:15.918391 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"21625e79ea0985db8645fee5a87d8192fa2df9962f7552b6121df93fb96d3e7f"}
Mar 10 10:05:16 crc kubenswrapper[4794]: I0310 10:05:16.755144 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fvs8j" podUID="c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" containerName="ovn-controller" probeResult="failure" output=<
Mar 10 10:05:16 crc kubenswrapper[4794]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 10 10:05:16 crc kubenswrapper[4794]: >
Mar 10 10:05:16 crc kubenswrapper[4794]: I0310 10:05:16.760832 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8552m"
Mar 10 10:05:16 crc kubenswrapper[4794]: I0310 10:05:16.767195 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8552m"
Mar 10 10:05:16 crc kubenswrapper[4794]: I0310 10:05:16.930992 4794 generic.go:334] "Generic (PLEG): container finished" podID="a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" containerID="a840482fd73ba7f63de99b82bc1c4a4c3093d855770bd6ce8ca9c72f090ea3e7" exitCode=0
Mar 10 10:05:16 crc kubenswrapper[4794]: I0310 10:05:16.931106 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3","Type":"ContainerDied","Data":"a840482fd73ba7f63de99b82bc1c4a4c3093d855770bd6ce8ca9c72f090ea3e7"}
Mar 10 10:05:16 crc kubenswrapper[4794]: I0310 10:05:16.933290 4794 generic.go:334] "Generic (PLEG): container finished" podID="598e06ed-3156-4e09-976e-4dda0e35afc2" containerID="d12835c0bc7b99c0b1d3a6ef8ae9785eba0f3a435f78fe3cb1b5f9e6c4bd6dea" exitCode=0
Mar 10 10:05:16 crc kubenswrapper[4794]: I0310 10:05:16.933665 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"598e06ed-3156-4e09-976e-4dda0e35afc2","Type":"ContainerDied","Data":"d12835c0bc7b99c0b1d3a6ef8ae9785eba0f3a435f78fe3cb1b5f9e6c4bd6dea"}
Mar 10 10:05:16 crc kubenswrapper[4794]: I0310 10:05:16.991209 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fvs8j-config-b9l5x"]
Mar 10 10:05:16 crc kubenswrapper[4794]: E0310 10:05:16.991652 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691c983b-4d1e-406d-bd02-358e7a635547" containerName="mariadb-account-create-update"
Mar 10 10:05:16 crc kubenswrapper[4794]: I0310 10:05:16.991667 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="691c983b-4d1e-406d-bd02-358e7a635547" containerName="mariadb-account-create-update"
Mar 10 10:05:16 crc kubenswrapper[4794]: I0310 10:05:16.991883 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="691c983b-4d1e-406d-bd02-358e7a635547" containerName="mariadb-account-create-update"
Mar 10 10:05:16 crc kubenswrapper[4794]: I0310 10:05:16.992445 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:16 crc kubenswrapper[4794]: I0310 10:05:16.996813 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.014346 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fvs8j-config-b9l5x"]
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.087624 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nfdb\" (UniqueName: \"kubernetes.io/projected/bb8b5572-57dd-4c31-a86d-ee143ba44266-kube-api-access-8nfdb\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.087676 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-run\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.087756 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-run-ovn\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.087787 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-log-ovn\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.087803 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bb8b5572-57dd-4c31-a86d-ee143ba44266-additional-scripts\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.087858 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bb8b5572-57dd-4c31-a86d-ee143ba44266-scripts\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.188929 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-run-ovn\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.188995 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-log-ovn\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.189014 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bb8b5572-57dd-4c31-a86d-ee143ba44266-additional-scripts\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.189053 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bb8b5572-57dd-4c31-a86d-ee143ba44266-scripts\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.189107 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nfdb\" (UniqueName: \"kubernetes.io/projected/bb8b5572-57dd-4c31-a86d-ee143ba44266-kube-api-access-8nfdb\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.189140 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-run\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.189352 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-log-ovn\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.189382 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-run\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.189400 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-run-ovn\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.190284 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bb8b5572-57dd-4c31-a86d-ee143ba44266-additional-scripts\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.191146 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bb8b5572-57dd-4c31-a86d-ee143ba44266-scripts\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.223819 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nfdb\" (UniqueName: \"kubernetes.io/projected/bb8b5572-57dd-4c31-a86d-ee143ba44266-kube-api-access-8nfdb\") pod \"ovn-controller-fvs8j-config-b9l5x\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") " pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:17 crc kubenswrapper[4794]: I0310 10:05:17.316568 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:21 crc kubenswrapper[4794]: I0310 10:05:21.750469 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fvs8j" podUID="c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" containerName="ovn-controller" probeResult="failure" output=<
Mar 10 10:05:21 crc kubenswrapper[4794]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 10 10:05:21 crc kubenswrapper[4794]: >
Mar 10 10:05:26 crc kubenswrapper[4794]: I0310 10:05:26.310262 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fvs8j-config-b9l5x"]
Mar 10 10:05:26 crc kubenswrapper[4794]: W0310 10:05:26.323025 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb8b5572_57dd_4c31_a86d_ee143ba44266.slice/crio-59649f01875e2bafe2ff905677b5ec719300dfcd4b7800827c89e8186bcecd2f WatchSource:0}: Error finding container 59649f01875e2bafe2ff905677b5ec719300dfcd4b7800827c89e8186bcecd2f: Status 404 returned error can't find the container with id 59649f01875e2bafe2ff905677b5ec719300dfcd4b7800827c89e8186bcecd2f
Mar 10 10:05:26 crc kubenswrapper[4794]: I0310 10:05:26.767470 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-fvs8j"
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.027585 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"32f094fa9f1ca547ebae717b8b5951d1771b5fdb77dfeec605d769593501a6a7"}
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.027893 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"3b9e079a22cd5b6888eff2291538cfe9ec6e987ec470fa124da7e69bbac3f8c2"}
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.027907 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"250a700fa883215453330332d981b0fe632e9fa60d370e3a5759ce91865db4ab"}
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.027919 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"bc6074c0953ac28f265a3e17ebd69da6a9d931779c7686e644dc485527b0a8fb"}
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.027931 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"2190a44dadbbaa1a8486a80b4974382e233189c54458a517977ead0fca476329"}
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.032205 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3","Type":"ContainerStarted","Data":"b01da559f24b75afd94d0c65d373dcf5b4d3bb07708a3909a930cd454c72cc4d"}
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.032397 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.034037 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gg5z8" event={"ID":"f7641d95-8aa6-43ad-a191-7d8356b29bac","Type":"ContainerStarted","Data":"5999c3f70eba4e38ddc5fbb492eefb265a6b6a1bfc7c6e88e7609ef9105892a8"}
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.037545 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"598e06ed-3156-4e09-976e-4dda0e35afc2","Type":"ContainerStarted","Data":"145a1a04aa37c9588ebc88932f6178e631ace726a39e4a8d191e2883dc15e900"}
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.037772 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.039356 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvs8j-config-b9l5x" event={"ID":"bb8b5572-57dd-4c31-a86d-ee143ba44266","Type":"ContainerStarted","Data":"3de7e1e91921c511d1d59278200c580a671c8fa44d00a53740c8205df1d31761"}
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.039385 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvs8j-config-b9l5x" event={"ID":"bb8b5572-57dd-4c31-a86d-ee143ba44266","Type":"ContainerStarted","Data":"59649f01875e2bafe2ff905677b5ec719300dfcd4b7800827c89e8186bcecd2f"}
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.064666 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=60.09615379 podStartE2EDuration="1m10.064646232s" podCreationTimestamp="2026-03-10 10:04:17 +0000 UTC" firstStartedPulling="2026-03-10 10:04:31.963098278 +0000 UTC m=+1220.719269096" lastFinishedPulling="2026-03-10 10:04:41.93159071 +0000 UTC m=+1230.687761538" observedRunningTime="2026-03-10 10:05:27.054839512 +0000 UTC m=+1275.811010350" watchObservedRunningTime="2026-03-10 10:05:27.064646232 +0000 UTC m=+1275.820817050"
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.116086 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-fvs8j-config-b9l5x" podStartSLOduration=11.116065818 podStartE2EDuration="11.116065818s" podCreationTimestamp="2026-03-10 10:05:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:05:27.111123582 +0000 UTC m=+1275.867294410" watchObservedRunningTime="2026-03-10 10:05:27.116065818 +0000 UTC m=+1275.872236636"
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.116690 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-gg5z8" podStartSLOduration=2.358252241 podStartE2EDuration="15.116682628s" podCreationTimestamp="2026-03-10 10:05:12 +0000 UTC" firstStartedPulling="2026-03-10 10:05:13.193414335 +0000 UTC m=+1261.949585153" lastFinishedPulling="2026-03-10 10:05:25.951844722 +0000 UTC m=+1274.708015540" observedRunningTime="2026-03-10 10:05:27.08672092 +0000 UTC m=+1275.842891738" watchObservedRunningTime="2026-03-10 10:05:27.116682628 +0000 UTC m=+1275.872853456"
Mar 10 10:05:27 crc kubenswrapper[4794]: I0310 10:05:27.144927 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=57.778183263 podStartE2EDuration="1m10.144908051s" podCreationTimestamp="2026-03-10 10:04:17 +0000 UTC" firstStartedPulling="2026-03-10 10:04:29.249889377 +0000 UTC m=+1218.006060215" lastFinishedPulling="2026-03-10 10:04:41.616614185 +0000 UTC m=+1230.372785003" observedRunningTime="2026-03-10 10:05:27.144083744 +0000 UTC m=+1275.900254562" watchObservedRunningTime="2026-03-10 10:05:27.144908051 +0000 UTC m=+1275.901078869"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.051505 4794 generic.go:334] "Generic (PLEG): container finished" podID="bb8b5572-57dd-4c31-a86d-ee143ba44266" containerID="3de7e1e91921c511d1d59278200c580a671c8fa44d00a53740c8205df1d31761" exitCode=0
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.051612 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvs8j-config-b9l5x" event={"ID":"bb8b5572-57dd-4c31-a86d-ee143ba44266","Type":"ContainerDied","Data":"3de7e1e91921c511d1d59278200c580a671c8fa44d00a53740c8205df1d31761"}
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.064902 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"5c65c1b8cc623038c03f29280e90a94e0a70fec6174c721b69512958e9487bcb"}
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.064960 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerStarted","Data":"8f6939cc5c3159e417cd81d11fcb1a54fc7a6a362b3f62041010bad3ef1cdf82"}
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.118030 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.991727501 podStartE2EDuration="34.118007823s" podCreationTimestamp="2026-03-10 10:04:54 +0000 UTC" firstStartedPulling="2026-03-10 10:05:11.771398176 +0000 UTC m=+1260.527569014" lastFinishedPulling="2026-03-10 10:05:25.897678518 +0000 UTC m=+1274.653849336" observedRunningTime="2026-03-10 10:05:28.108611445 +0000 UTC m=+1276.864782263" watchObservedRunningTime="2026-03-10 10:05:28.118007823 +0000 UTC m=+1276.874178641"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.394137 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"]
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.395777 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.397663 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.407416 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"]
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.515099 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-dns-swift-storage-0\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.515226 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-ovsdbserver-nb\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.515280 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5jc2\" (UniqueName: \"kubernetes.io/projected/4c81ef37-04a4-486e-8bfe-8862e3514256-kube-api-access-z5jc2\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.515419 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-dns-svc\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.515465 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-config\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.515499 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-ovsdbserver-sb\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.617049 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-dns-svc\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.617096 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-config\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.617148 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-ovsdbserver-sb\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.617211 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-dns-swift-storage-0\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.617263 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-ovsdbserver-nb\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.617301 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5jc2\" (UniqueName: \"kubernetes.io/projected/4c81ef37-04a4-486e-8bfe-8862e3514256-kube-api-access-z5jc2\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.618173 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-ovsdbserver-sb\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.618258 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-config\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.618530 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-ovsdbserver-nb\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.618532 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-dns-swift-storage-0\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.618532 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-dns-svc\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.638230 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5jc2\" (UniqueName: \"kubernetes.io/projected/4c81ef37-04a4-486e-8bfe-8862e3514256-kube-api-access-z5jc2\") pod \"dnsmasq-dns-5bdcf4fccc-dmbdh\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:28 crc kubenswrapper[4794]: I0310 10:05:28.726668 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.163454 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"]
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.305890 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.428138 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-run\") pod \"bb8b5572-57dd-4c31-a86d-ee143ba44266\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") "
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.428224 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bb8b5572-57dd-4c31-a86d-ee143ba44266-scripts\") pod \"bb8b5572-57dd-4c31-a86d-ee143ba44266\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") "
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.428254 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bb8b5572-57dd-4c31-a86d-ee143ba44266-additional-scripts\") pod \"bb8b5572-57dd-4c31-a86d-ee143ba44266\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") "
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.428282 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-run-ovn\") pod \"bb8b5572-57dd-4c31-a86d-ee143ba44266\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") "
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.428288 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-run" (OuterVolumeSpecName: "var-run") pod "bb8b5572-57dd-4c31-a86d-ee143ba44266" (UID: "bb8b5572-57dd-4c31-a86d-ee143ba44266"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.428389 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-log-ovn\") pod \"bb8b5572-57dd-4c31-a86d-ee143ba44266\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") "
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.428401 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "bb8b5572-57dd-4c31-a86d-ee143ba44266" (UID: "bb8b5572-57dd-4c31-a86d-ee143ba44266"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.428469 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nfdb\" (UniqueName: \"kubernetes.io/projected/bb8b5572-57dd-4c31-a86d-ee143ba44266-kube-api-access-8nfdb\") pod \"bb8b5572-57dd-4c31-a86d-ee143ba44266\" (UID: \"bb8b5572-57dd-4c31-a86d-ee143ba44266\") "
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.428523 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "bb8b5572-57dd-4c31-a86d-ee143ba44266" (UID: "bb8b5572-57dd-4c31-a86d-ee143ba44266"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.428801 4794 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.428819 4794 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.429054 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8b5572-57dd-4c31-a86d-ee143ba44266-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "bb8b5572-57dd-4c31-a86d-ee143ba44266" (UID: "bb8b5572-57dd-4c31-a86d-ee143ba44266"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.429994 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8b5572-57dd-4c31-a86d-ee143ba44266-scripts" (OuterVolumeSpecName: "scripts") pod "bb8b5572-57dd-4c31-a86d-ee143ba44266" (UID: "bb8b5572-57dd-4c31-a86d-ee143ba44266"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.439889 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8b5572-57dd-4c31-a86d-ee143ba44266-kube-api-access-8nfdb" (OuterVolumeSpecName: "kube-api-access-8nfdb") pod "bb8b5572-57dd-4c31-a86d-ee143ba44266" (UID: "bb8b5572-57dd-4c31-a86d-ee143ba44266"). InnerVolumeSpecName "kube-api-access-8nfdb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.530300 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nfdb\" (UniqueName: \"kubernetes.io/projected/bb8b5572-57dd-4c31-a86d-ee143ba44266-kube-api-access-8nfdb\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.530358 4794 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bb8b5572-57dd-4c31-a86d-ee143ba44266-var-run\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.530370 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bb8b5572-57dd-4c31-a86d-ee143ba44266-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:29 crc kubenswrapper[4794]: I0310 10:05:29.530378 4794 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bb8b5572-57dd-4c31-a86d-ee143ba44266-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.083971 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvs8j-config-b9l5x" event={"ID":"bb8b5572-57dd-4c31-a86d-ee143ba44266","Type":"ContainerDied","Data":"59649f01875e2bafe2ff905677b5ec719300dfcd4b7800827c89e8186bcecd2f"}
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.084018 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59649f01875e2bafe2ff905677b5ec719300dfcd4b7800827c89e8186bcecd2f"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.084040 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvs8j-config-b9l5x"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.086295 4794 generic.go:334] "Generic (PLEG): container finished" podID="4c81ef37-04a4-486e-8bfe-8862e3514256" containerID="032e3981c05101d1f5fd31ac52721100475578399d8f015d14bd04d5f8f63fe7" exitCode=0
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.086370 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh" event={"ID":"4c81ef37-04a4-486e-8bfe-8862e3514256","Type":"ContainerDied","Data":"032e3981c05101d1f5fd31ac52721100475578399d8f015d14bd04d5f8f63fe7"}
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.086879 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh" event={"ID":"4c81ef37-04a4-486e-8bfe-8862e3514256","Type":"ContainerStarted","Data":"1d741f26c076cbe73af55886fe085d0e0064cac0749e0f3a6e74529fea14c3b4"}
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.411086 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fvs8j-config-b9l5x"]
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.419808 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fvs8j-config-b9l5x"]
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.513241 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fvs8j-config-5gfl2"]
Mar 10 10:05:30 crc kubenswrapper[4794]: E0310 10:05:30.513584 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8b5572-57dd-4c31-a86d-ee143ba44266" containerName="ovn-config"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.513597 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8b5572-57dd-4c31-a86d-ee143ba44266" containerName="ovn-config"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.513766 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8b5572-57dd-4c31-a86d-ee143ba44266" containerName="ovn-config"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.514285 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.516389 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.524702 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fvs8j-config-5gfl2"]
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.647796 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-run-ovn\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.647842 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-run\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.647861 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kmjn\" (UniqueName: \"kubernetes.io/projected/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-kube-api-access-9kmjn\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.647888 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-scripts\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.647914 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-additional-scripts\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.648016 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-log-ovn\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.749926 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-run-ovn\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.749976 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-run\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.750008 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kmjn\" (UniqueName: \"kubernetes.io/projected/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-kube-api-access-9kmjn\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.750047 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-scripts\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.750082 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-additional-scripts\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.750142 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-log-ovn\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.750198 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-run\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.750227 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-run-ovn\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.750305 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-log-ovn\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.750814 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-additional-scripts\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.752885 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-scripts\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.772177 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kmjn\" (UniqueName: \"kubernetes.io/projected/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-kube-api-access-9kmjn\") pod \"ovn-controller-fvs8j-config-5gfl2\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:30 crc kubenswrapper[4794]: I0310 10:05:30.831111 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvs8j-config-5gfl2"
Mar 10 10:05:31 crc kubenswrapper[4794]: I0310 10:05:31.097388 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh" event={"ID":"4c81ef37-04a4-486e-8bfe-8862e3514256","Type":"ContainerStarted","Data":"ce01843be5c5828e8979187dc904fe2a134b8a8f3591dc1be315393462a40d7d"}
Mar 10 10:05:31 crc kubenswrapper[4794]: I0310 10:05:31.097641 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"
Mar 10 10:05:31 crc kubenswrapper[4794]: I0310 10:05:31.119726 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh" podStartSLOduration=3.119704377 podStartE2EDuration="3.119704377s" podCreationTimestamp="2026-03-10 10:05:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:05:31.117000181 +0000 UTC m=+1279.873171019" watchObservedRunningTime="2026-03-10 10:05:31.119704377 +0000 UTC m=+1279.875875195"
Mar 10 10:05:31 crc kubenswrapper[4794]: I0310 10:05:31.297302 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fvs8j-config-5gfl2"]
Mar 10 10:05:31 crc kubenswrapper[4794]: W0310 10:05:31.306762 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07c66acc_45bc_440d_9cbc_c8f68f10cdd5.slice/crio-6074c681f86dd65a2128c3f8406742afefb135ccdea52ba5c2750e8d13fa769c WatchSource:0}: Error finding container 6074c681f86dd65a2128c3f8406742afefb135ccdea52ba5c2750e8d13fa769c: Status 404 returned error can't find the container with id 6074c681f86dd65a2128c3f8406742afefb135ccdea52ba5c2750e8d13fa769c
Mar 10 10:05:32 crc kubenswrapper[4794]: I0310 10:05:32.011510 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8b5572-57dd-4c31-a86d-ee143ba44266" path="/var/lib/kubelet/pods/bb8b5572-57dd-4c31-a86d-ee143ba44266/volumes"
Mar 10 10:05:32 crc kubenswrapper[4794]: I0310 10:05:32.106199 4794 generic.go:334] "Generic (PLEG): container finished" podID="07c66acc-45bc-440d-9cbc-c8f68f10cdd5" containerID="98685f94761e1e0498851e6b6d04b23abfedb6dc9cc964602015da98d51a50db" exitCode=0
Mar 10 10:05:32 crc kubenswrapper[4794]: I0310 10:05:32.106277 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvs8j-config-5gfl2"
event={"ID":"07c66acc-45bc-440d-9cbc-c8f68f10cdd5","Type":"ContainerDied","Data":"98685f94761e1e0498851e6b6d04b23abfedb6dc9cc964602015da98d51a50db"} Mar 10 10:05:32 crc kubenswrapper[4794]: I0310 10:05:32.106344 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvs8j-config-5gfl2" event={"ID":"07c66acc-45bc-440d-9cbc-c8f68f10cdd5","Type":"ContainerStarted","Data":"6074c681f86dd65a2128c3f8406742afefb135ccdea52ba5c2750e8d13fa769c"} Mar 10 10:05:33 crc kubenswrapper[4794]: I0310 10:05:33.494033 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvs8j-config-5gfl2" Mar 10 10:05:33 crc kubenswrapper[4794]: I0310 10:05:33.905387 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-run\") pod \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " Mar 10 10:05:33 crc kubenswrapper[4794]: I0310 10:05:33.905533 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-additional-scripts\") pod \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " Mar 10 10:05:33 crc kubenswrapper[4794]: I0310 10:05:33.905580 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kmjn\" (UniqueName: \"kubernetes.io/projected/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-kube-api-access-9kmjn\") pod \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " Mar 10 10:05:33 crc kubenswrapper[4794]: I0310 10:05:33.905683 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-scripts\") pod \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " Mar 10 10:05:33 crc kubenswrapper[4794]: I0310 10:05:33.905766 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-run-ovn\") pod \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " Mar 10 10:05:33 crc kubenswrapper[4794]: I0310 10:05:33.905859 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-log-ovn\") pod \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\" (UID: \"07c66acc-45bc-440d-9cbc-c8f68f10cdd5\") " Mar 10 10:05:33 crc kubenswrapper[4794]: I0310 10:05:33.906583 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-run" (OuterVolumeSpecName: "var-run") pod "07c66acc-45bc-440d-9cbc-c8f68f10cdd5" (UID: "07c66acc-45bc-440d-9cbc-c8f68f10cdd5"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:05:33 crc kubenswrapper[4794]: I0310 10:05:33.906674 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "07c66acc-45bc-440d-9cbc-c8f68f10cdd5" (UID: "07c66acc-45bc-440d-9cbc-c8f68f10cdd5"). 
InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:05:33 crc kubenswrapper[4794]: I0310 10:05:33.906695 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "07c66acc-45bc-440d-9cbc-c8f68f10cdd5" (UID: "07c66acc-45bc-440d-9cbc-c8f68f10cdd5"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:05:33 crc kubenswrapper[4794]: I0310 10:05:33.906779 4794 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:33 crc kubenswrapper[4794]: I0310 10:05:33.907349 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "07c66acc-45bc-440d-9cbc-c8f68f10cdd5" (UID: "07c66acc-45bc-440d-9cbc-c8f68f10cdd5"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:33 crc kubenswrapper[4794]: I0310 10:05:33.907788 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-scripts" (OuterVolumeSpecName: "scripts") pod "07c66acc-45bc-440d-9cbc-c8f68f10cdd5" (UID: "07c66acc-45bc-440d-9cbc-c8f68f10cdd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:33 crc kubenswrapper[4794]: I0310 10:05:33.912099 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-kube-api-access-9kmjn" (OuterVolumeSpecName: "kube-api-access-9kmjn") pod "07c66acc-45bc-440d-9cbc-c8f68f10cdd5" (UID: "07c66acc-45bc-440d-9cbc-c8f68f10cdd5"). InnerVolumeSpecName "kube-api-access-9kmjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:34 crc kubenswrapper[4794]: I0310 10:05:34.009478 4794 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:34 crc kubenswrapper[4794]: I0310 10:05:34.009532 4794 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:34 crc kubenswrapper[4794]: I0310 10:05:34.009555 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kmjn\" (UniqueName: \"kubernetes.io/projected/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-kube-api-access-9kmjn\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:34 crc kubenswrapper[4794]: I0310 10:05:34.009574 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:34 crc kubenswrapper[4794]: I0310 10:05:34.009592 4794 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/07c66acc-45bc-440d-9cbc-c8f68f10cdd5-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:34 crc kubenswrapper[4794]: I0310 10:05:34.125238 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvs8j-config-5gfl2" Mar 10 10:05:34 crc kubenswrapper[4794]: I0310 10:05:34.125257 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvs8j-config-5gfl2" event={"ID":"07c66acc-45bc-440d-9cbc-c8f68f10cdd5","Type":"ContainerDied","Data":"6074c681f86dd65a2128c3f8406742afefb135ccdea52ba5c2750e8d13fa769c"} Mar 10 10:05:34 crc kubenswrapper[4794]: I0310 10:05:34.125322 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6074c681f86dd65a2128c3f8406742afefb135ccdea52ba5c2750e8d13fa769c" Mar 10 10:05:34 crc kubenswrapper[4794]: I0310 10:05:34.127209 4794 generic.go:334] "Generic (PLEG): container finished" podID="f7641d95-8aa6-43ad-a191-7d8356b29bac" containerID="5999c3f70eba4e38ddc5fbb492eefb265a6b6a1bfc7c6e88e7609ef9105892a8" exitCode=0 Mar 10 10:05:34 crc kubenswrapper[4794]: I0310 10:05:34.127262 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gg5z8" event={"ID":"f7641d95-8aa6-43ad-a191-7d8356b29bac","Type":"ContainerDied","Data":"5999c3f70eba4e38ddc5fbb492eefb265a6b6a1bfc7c6e88e7609ef9105892a8"} Mar 10 10:05:34 crc kubenswrapper[4794]: I0310 10:05:34.567568 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fvs8j-config-5gfl2"] Mar 10 10:05:34 crc kubenswrapper[4794]: I0310 10:05:34.573993 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fvs8j-config-5gfl2"] Mar 10 10:05:35 crc kubenswrapper[4794]: I0310 10:05:35.503116 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gg5z8" Mar 10 10:05:35 crc kubenswrapper[4794]: I0310 10:05:35.632411 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-config-data\") pod \"f7641d95-8aa6-43ad-a191-7d8356b29bac\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " Mar 10 10:05:35 crc kubenswrapper[4794]: I0310 10:05:35.632470 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gkl8\" (UniqueName: \"kubernetes.io/projected/f7641d95-8aa6-43ad-a191-7d8356b29bac-kube-api-access-8gkl8\") pod \"f7641d95-8aa6-43ad-a191-7d8356b29bac\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " Mar 10 10:05:35 crc kubenswrapper[4794]: I0310 10:05:35.632512 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-combined-ca-bundle\") pod \"f7641d95-8aa6-43ad-a191-7d8356b29bac\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " Mar 10 10:05:35 crc kubenswrapper[4794]: I0310 10:05:35.632531 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-db-sync-config-data\") pod \"f7641d95-8aa6-43ad-a191-7d8356b29bac\" (UID: \"f7641d95-8aa6-43ad-a191-7d8356b29bac\") " Mar 10 10:05:35 crc kubenswrapper[4794]: I0310 10:05:35.642574 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f7641d95-8aa6-43ad-a191-7d8356b29bac" (UID: "f7641d95-8aa6-43ad-a191-7d8356b29bac"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:05:35 crc kubenswrapper[4794]: I0310 10:05:35.642812 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7641d95-8aa6-43ad-a191-7d8356b29bac-kube-api-access-8gkl8" (OuterVolumeSpecName: "kube-api-access-8gkl8") pod "f7641d95-8aa6-43ad-a191-7d8356b29bac" (UID: "f7641d95-8aa6-43ad-a191-7d8356b29bac"). InnerVolumeSpecName "kube-api-access-8gkl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:35 crc kubenswrapper[4794]: I0310 10:05:35.676215 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7641d95-8aa6-43ad-a191-7d8356b29bac" (UID: "f7641d95-8aa6-43ad-a191-7d8356b29bac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:05:35 crc kubenswrapper[4794]: I0310 10:05:35.700228 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-config-data" (OuterVolumeSpecName: "config-data") pod "f7641d95-8aa6-43ad-a191-7d8356b29bac" (UID: "f7641d95-8aa6-43ad-a191-7d8356b29bac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:05:35 crc kubenswrapper[4794]: I0310 10:05:35.734270 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:35 crc kubenswrapper[4794]: I0310 10:05:35.734311 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gkl8\" (UniqueName: \"kubernetes.io/projected/f7641d95-8aa6-43ad-a191-7d8356b29bac-kube-api-access-8gkl8\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:35 crc kubenswrapper[4794]: I0310 10:05:35.734345 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:35 crc kubenswrapper[4794]: I0310 10:05:35.734357 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7641d95-8aa6-43ad-a191-7d8356b29bac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:36 crc kubenswrapper[4794]: I0310 10:05:36.012573 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07c66acc-45bc-440d-9cbc-c8f68f10cdd5" path="/var/lib/kubelet/pods/07c66acc-45bc-440d-9cbc-c8f68f10cdd5/volumes" Mar 10 10:05:36 crc kubenswrapper[4794]: I0310 10:05:36.148029 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gg5z8" event={"ID":"f7641d95-8aa6-43ad-a191-7d8356b29bac","Type":"ContainerDied","Data":"ba5c54c87514d9d373f080956887440384a5df2a4fad5caf02ebc6be5e6f52fb"} Mar 10 10:05:36 crc kubenswrapper[4794]: I0310 10:05:36.148097 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba5c54c87514d9d373f080956887440384a5df2a4fad5caf02ebc6be5e6f52fb" Mar 10 10:05:36 crc kubenswrapper[4794]: I0310 10:05:36.148138 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gg5z8" Mar 10 10:05:36 crc kubenswrapper[4794]: I0310 10:05:36.791115 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"] Mar 10 10:05:36 crc kubenswrapper[4794]: I0310 10:05:36.791619 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh" podUID="4c81ef37-04a4-486e-8bfe-8862e3514256" containerName="dnsmasq-dns" containerID="cri-o://ce01843be5c5828e8979187dc904fe2a134b8a8f3591dc1be315393462a40d7d" gracePeriod=10 Mar 10 10:05:36 crc kubenswrapper[4794]: I0310 10:05:36.792711 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh" Mar 10 10:05:36 crc kubenswrapper[4794]: I0310 10:05:36.855767 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd445f5bc-9njwb"] Mar 10 10:05:36 crc kubenswrapper[4794]: E0310 10:05:36.856132 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c66acc-45bc-440d-9cbc-c8f68f10cdd5" containerName="ovn-config" Mar 10 10:05:36 crc kubenswrapper[4794]: I0310 10:05:36.856159 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c66acc-45bc-440d-9cbc-c8f68f10cdd5" containerName="ovn-config" Mar 10 10:05:36 crc kubenswrapper[4794]: E0310 10:05:36.856179 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7641d95-8aa6-43ad-a191-7d8356b29bac" containerName="glance-db-sync" Mar 10 10:05:36 crc kubenswrapper[4794]: I0310 10:05:36.856187 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7641d95-8aa6-43ad-a191-7d8356b29bac" containerName="glance-db-sync" Mar 10 10:05:36 crc kubenswrapper[4794]: I0310 10:05:36.856397 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c66acc-45bc-440d-9cbc-c8f68f10cdd5" containerName="ovn-config" Mar 10 10:05:36 crc kubenswrapper[4794]: I0310 10:05:36.856420 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7641d95-8aa6-43ad-a191-7d8356b29bac" containerName="glance-db-sync" Mar 10 10:05:36 crc kubenswrapper[4794]: I0310 10:05:36.857413 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:36 crc kubenswrapper[4794]: I0310 10:05:36.875320 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd445f5bc-9njwb"] Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.057554 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-dns-svc\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.057609 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.057631 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7wns\" (UniqueName: \"kubernetes.io/projected/69fccd5f-c6ca-4228-81ba-ee7ae103876a-kube-api-access-w7wns\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.057677 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.057706 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-config\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.057725 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.158068 4794 generic.go:334] "Generic (PLEG): container finished" podID="4c81ef37-04a4-486e-8bfe-8862e3514256" containerID="ce01843be5c5828e8979187dc904fe2a134b8a8f3591dc1be315393462a40d7d" exitCode=0 Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.158101 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh" event={"ID":"4c81ef37-04a4-486e-8bfe-8862e3514256","Type":"ContainerDied","Data":"ce01843be5c5828e8979187dc904fe2a134b8a8f3591dc1be315393462a40d7d"} Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.158135 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh" 
event={"ID":"4c81ef37-04a4-486e-8bfe-8862e3514256","Type":"ContainerDied","Data":"1d741f26c076cbe73af55886fe085d0e0064cac0749e0f3a6e74529fea14c3b4"} Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.158151 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d741f26c076cbe73af55886fe085d0e0064cac0749e0f3a6e74529fea14c3b4" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.158903 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.158959 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7wns\" (UniqueName: \"kubernetes.io/projected/69fccd5f-c6ca-4228-81ba-ee7ae103876a-kube-api-access-w7wns\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.159011 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.159057 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-config\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.159089 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.159170 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-dns-svc\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.159916 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.159957 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-config\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.160203 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.160205 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.160395 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-dns-svc\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.178256 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7wns\" (UniqueName: \"kubernetes.io/projected/69fccd5f-c6ca-4228-81ba-ee7ae103876a-kube-api-access-w7wns\") pod \"dnsmasq-dns-7fd445f5bc-9njwb\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.205057 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.293216 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.462492 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5jc2\" (UniqueName: \"kubernetes.io/projected/4c81ef37-04a4-486e-8bfe-8862e3514256-kube-api-access-z5jc2\") pod \"4c81ef37-04a4-486e-8bfe-8862e3514256\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.462802 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-dns-svc\") pod \"4c81ef37-04a4-486e-8bfe-8862e3514256\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.462939 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-config\") pod \"4c81ef37-04a4-486e-8bfe-8862e3514256\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.462955 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-ovsdbserver-sb\") pod \"4c81ef37-04a4-486e-8bfe-8862e3514256\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.462993 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-dns-swift-storage-0\") pod \"4c81ef37-04a4-486e-8bfe-8862e3514256\" (UID: 
\"4c81ef37-04a4-486e-8bfe-8862e3514256\") " Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.463016 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-ovsdbserver-nb\") pod \"4c81ef37-04a4-486e-8bfe-8862e3514256\" (UID: \"4c81ef37-04a4-486e-8bfe-8862e3514256\") " Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.470267 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c81ef37-04a4-486e-8bfe-8862e3514256-kube-api-access-z5jc2" (OuterVolumeSpecName: "kube-api-access-z5jc2") pod "4c81ef37-04a4-486e-8bfe-8862e3514256" (UID: "4c81ef37-04a4-486e-8bfe-8862e3514256"). InnerVolumeSpecName "kube-api-access-z5jc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.503983 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-config" (OuterVolumeSpecName: "config") pod "4c81ef37-04a4-486e-8bfe-8862e3514256" (UID: "4c81ef37-04a4-486e-8bfe-8862e3514256"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.505097 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4c81ef37-04a4-486e-8bfe-8862e3514256" (UID: "4c81ef37-04a4-486e-8bfe-8862e3514256"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.506389 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4c81ef37-04a4-486e-8bfe-8862e3514256" (UID: "4c81ef37-04a4-486e-8bfe-8862e3514256"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.508313 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4c81ef37-04a4-486e-8bfe-8862e3514256" (UID: "4c81ef37-04a4-486e-8bfe-8862e3514256"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.510700 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4c81ef37-04a4-486e-8bfe-8862e3514256" (UID: "4c81ef37-04a4-486e-8bfe-8862e3514256"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.564552 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.564584 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.564596 4794 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.564606 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.564615 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5jc2\" (UniqueName: \"kubernetes.io/projected/4c81ef37-04a4-486e-8bfe-8862e3514256-kube-api-access-z5jc2\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.564624 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c81ef37-04a4-486e-8bfe-8862e3514256-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:37 crc kubenswrapper[4794]: I0310 10:05:37.635066 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd445f5bc-9njwb"] Mar 10 10:05:37 crc kubenswrapper[4794]: W0310 10:05:37.638354 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69fccd5f_c6ca_4228_81ba_ee7ae103876a.slice/crio-9b08d65c991101e497d7f11e22c15b36bf6577f95e5100c52f75ed21a6d8a0f2 WatchSource:0}: Error finding container 9b08d65c991101e497d7f11e22c15b36bf6577f95e5100c52f75ed21a6d8a0f2: Status 404 returned error can't find the container with id 9b08d65c991101e497d7f11e22c15b36bf6577f95e5100c52f75ed21a6d8a0f2 Mar 10 10:05:38 crc kubenswrapper[4794]: I0310 10:05:38.166112 4794 generic.go:334] "Generic (PLEG): container finished" podID="69fccd5f-c6ca-4228-81ba-ee7ae103876a" containerID="538774c14100ed00991d4835f5659f6f2efefa4b4d07d2baab06ba0d13be29dc" exitCode=0 Mar 10 10:05:38 crc kubenswrapper[4794]: I0310 10:05:38.166207 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" event={"ID":"69fccd5f-c6ca-4228-81ba-ee7ae103876a","Type":"ContainerDied","Data":"538774c14100ed00991d4835f5659f6f2efefa4b4d07d2baab06ba0d13be29dc"} Mar 10 10:05:38 crc kubenswrapper[4794]: I0310 10:05:38.166734 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" event={"ID":"69fccd5f-c6ca-4228-81ba-ee7ae103876a","Type":"ContainerStarted","Data":"9b08d65c991101e497d7f11e22c15b36bf6577f95e5100c52f75ed21a6d8a0f2"} Mar 10 10:05:38 crc kubenswrapper[4794]: I0310 10:05:38.166837 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bdcf4fccc-dmbdh" Mar 10 10:05:38 crc kubenswrapper[4794]: I0310 10:05:38.221304 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"] Mar 10 10:05:38 crc kubenswrapper[4794]: I0310 10:05:38.229971 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bdcf4fccc-dmbdh"] Mar 10 10:05:38 crc kubenswrapper[4794]: I0310 10:05:38.703515 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:05:38 crc kubenswrapper[4794]: I0310 10:05:38.996527 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 10:05:39 crc kubenswrapper[4794]: I0310 10:05:39.175809 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" event={"ID":"69fccd5f-c6ca-4228-81ba-ee7ae103876a","Type":"ContainerStarted","Data":"83ea99766083c37e566fe16c905e9d98e5f40860228f0197c875205a8e8c07b7"} Mar 10 10:05:39 crc kubenswrapper[4794]: I0310 10:05:39.176785 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:39 crc kubenswrapper[4794]: I0310 10:05:39.194001 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" podStartSLOduration=3.193983773 podStartE2EDuration="3.193983773s" podCreationTimestamp="2026-03-10 10:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:05:39.19388393 +0000 UTC m=+1287.950054748" watchObservedRunningTime="2026-03-10 10:05:39.193983773 +0000 UTC m=+1287.950154591" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.009692 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c81ef37-04a4-486e-8bfe-8862e3514256" path="/var/lib/kubelet/pods/4c81ef37-04a4-486e-8bfe-8862e3514256/volumes" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.561710 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-9zpcn"] Mar 10 10:05:40 crc kubenswrapper[4794]: E0310 10:05:40.562420 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c81ef37-04a4-486e-8bfe-8862e3514256" containerName="init" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.562434 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c81ef37-04a4-486e-8bfe-8862e3514256" containerName="init" Mar 10 10:05:40 crc kubenswrapper[4794]: E0310 10:05:40.562463 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c81ef37-04a4-486e-8bfe-8862e3514256" containerName="dnsmasq-dns" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.562470 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c81ef37-04a4-486e-8bfe-8862e3514256" containerName="dnsmasq-dns" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.562637 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c81ef37-04a4-486e-8bfe-8862e3514256" containerName="dnsmasq-dns" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.563275 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-9zpcn" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.614587 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-9zpcn"] Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.689841 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-e78d-account-create-update-lzbkb"] Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.690772 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e78d-account-create-update-lzbkb" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.699370 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e78d-account-create-update-lzbkb"] Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.704669 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.711126 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksgmf\" (UniqueName: \"kubernetes.io/projected/6f6993ef-0bbf-49a3-a1cb-1dd304ba6564-kube-api-access-ksgmf\") pod \"cinder-db-create-9zpcn\" (UID: \"6f6993ef-0bbf-49a3-a1cb-1dd304ba6564\") " pod="openstack/cinder-db-create-9zpcn" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.711177 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f6993ef-0bbf-49a3-a1cb-1dd304ba6564-operator-scripts\") pod \"cinder-db-create-9zpcn\" (UID: \"6f6993ef-0bbf-49a3-a1cb-1dd304ba6564\") " pod="openstack/cinder-db-create-9zpcn" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.761838 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-r6tt9"] Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.763084 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-r6tt9" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.776553 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-r6tt9"] Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.785033 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d8db-account-create-update-bxfq5"] Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.786386 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d8db-account-create-update-bxfq5" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.792766 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.813101 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksgmf\" (UniqueName: \"kubernetes.io/projected/6f6993ef-0bbf-49a3-a1cb-1dd304ba6564-kube-api-access-ksgmf\") pod \"cinder-db-create-9zpcn\" (UID: \"6f6993ef-0bbf-49a3-a1cb-1dd304ba6564\") " pod="openstack/cinder-db-create-9zpcn" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.813162 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msncv\" (UniqueName: \"kubernetes.io/projected/56a0cbd8-93d4-4236-bf7d-434752b9d246-kube-api-access-msncv\") pod \"barbican-e78d-account-create-update-lzbkb\" (UID: \"56a0cbd8-93d4-4236-bf7d-434752b9d246\") " pod="openstack/barbican-e78d-account-create-update-lzbkb" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.813199 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f6993ef-0bbf-49a3-a1cb-1dd304ba6564-operator-scripts\") pod \"cinder-db-create-9zpcn\" (UID: \"6f6993ef-0bbf-49a3-a1cb-1dd304ba6564\") " pod="openstack/cinder-db-create-9zpcn" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.813270 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56a0cbd8-93d4-4236-bf7d-434752b9d246-operator-scripts\") pod \"barbican-e78d-account-create-update-lzbkb\" (UID: \"56a0cbd8-93d4-4236-bf7d-434752b9d246\") " pod="openstack/barbican-e78d-account-create-update-lzbkb" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.814230 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f6993ef-0bbf-49a3-a1cb-1dd304ba6564-operator-scripts\") pod \"cinder-db-create-9zpcn\" (UID: \"6f6993ef-0bbf-49a3-a1cb-1dd304ba6564\") " pod="openstack/cinder-db-create-9zpcn" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.820132 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d8db-account-create-update-bxfq5"] Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.851526 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksgmf\" (UniqueName: \"kubernetes.io/projected/6f6993ef-0bbf-49a3-a1cb-1dd304ba6564-kube-api-access-ksgmf\") pod \"cinder-db-create-9zpcn\" (UID: \"6f6993ef-0bbf-49a3-a1cb-1dd304ba6564\") " pod="openstack/cinder-db-create-9zpcn" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.878467 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-qvz28"] Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.879601 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qvz28" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.882944 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-9zpcn" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.891005 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qvz28"] Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.915209 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b22rk\" (UniqueName: \"kubernetes.io/projected/543f505d-cd2e-491c-bab0-33efb1f71f57-kube-api-access-b22rk\") pod \"barbican-db-create-r6tt9\" (UID: \"543f505d-cd2e-491c-bab0-33efb1f71f57\") " pod="openstack/barbican-db-create-r6tt9" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.915257 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msncv\" (UniqueName: \"kubernetes.io/projected/56a0cbd8-93d4-4236-bf7d-434752b9d246-kube-api-access-msncv\") pod \"barbican-e78d-account-create-update-lzbkb\" (UID: \"56a0cbd8-93d4-4236-bf7d-434752b9d246\") " pod="openstack/barbican-e78d-account-create-update-lzbkb" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.915299 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543f505d-cd2e-491c-bab0-33efb1f71f57-operator-scripts\") pod \"barbican-db-create-r6tt9\" (UID: \"543f505d-cd2e-491c-bab0-33efb1f71f57\") " pod="openstack/barbican-db-create-r6tt9" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.915320 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76332524-3f6c-4d9c-80a7-11f41ef04ade-operator-scripts\") pod \"cinder-d8db-account-create-update-bxfq5\" (UID: \"76332524-3f6c-4d9c-80a7-11f41ef04ade\") " pod="openstack/cinder-d8db-account-create-update-bxfq5" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.915379 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26tb4\" (UniqueName: \"kubernetes.io/projected/76332524-3f6c-4d9c-80a7-11f41ef04ade-kube-api-access-26tb4\") pod \"cinder-d8db-account-create-update-bxfq5\" (UID: \"76332524-3f6c-4d9c-80a7-11f41ef04ade\") " pod="openstack/cinder-d8db-account-create-update-bxfq5" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.915420 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56a0cbd8-93d4-4236-bf7d-434752b9d246-operator-scripts\") pod \"barbican-e78d-account-create-update-lzbkb\" (UID: \"56a0cbd8-93d4-4236-bf7d-434752b9d246\") " pod="openstack/barbican-e78d-account-create-update-lzbkb" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.916251 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56a0cbd8-93d4-4236-bf7d-434752b9d246-operator-scripts\") pod \"barbican-e78d-account-create-update-lzbkb\" (UID: \"56a0cbd8-93d4-4236-bf7d-434752b9d246\") " pod="openstack/barbican-e78d-account-create-update-lzbkb" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.936184 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-wq6b7"] Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.937242 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wq6b7" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.941747 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.941901 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4svgf" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.942082 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.942190 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.944041 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wq6b7"] Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.945482 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msncv\" (UniqueName: \"kubernetes.io/projected/56a0cbd8-93d4-4236-bf7d-434752b9d246-kube-api-access-msncv\") pod \"barbican-e78d-account-create-update-lzbkb\" (UID: \"56a0cbd8-93d4-4236-bf7d-434752b9d246\") " pod="openstack/barbican-e78d-account-create-update-lzbkb" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.994635 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-290e-account-create-update-g7rwh"] Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.996160 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-290e-account-create-update-g7rwh" Mar 10 10:05:40 crc kubenswrapper[4794]: I0310 10:05:40.998921 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.006387 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-290e-account-create-update-g7rwh"] Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.022667 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26tb4\" (UniqueName: \"kubernetes.io/projected/76332524-3f6c-4d9c-80a7-11f41ef04ade-kube-api-access-26tb4\") pod \"cinder-d8db-account-create-update-bxfq5\" (UID: \"76332524-3f6c-4d9c-80a7-11f41ef04ade\") " pod="openstack/cinder-d8db-account-create-update-bxfq5" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.023063 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e0ce91-6453-4818-ae1f-39a24f8e6a66-operator-scripts\") pod \"neutron-db-create-qvz28\" (UID: \"65e0ce91-6453-4818-ae1f-39a24f8e6a66\") " pod="openstack/neutron-db-create-qvz28" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.023117 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b22rk\" (UniqueName: \"kubernetes.io/projected/543f505d-cd2e-491c-bab0-33efb1f71f57-kube-api-access-b22rk\") pod \"barbican-db-create-r6tt9\" (UID: \"543f505d-cd2e-491c-bab0-33efb1f71f57\") " pod="openstack/barbican-db-create-r6tt9" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.023147 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr27h\" (UniqueName: \"kubernetes.io/projected/65e0ce91-6453-4818-ae1f-39a24f8e6a66-kube-api-access-gr27h\") pod \"neutron-db-create-qvz28\" 
(UID: \"65e0ce91-6453-4818-ae1f-39a24f8e6a66\") " pod="openstack/neutron-db-create-qvz28" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.023175 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543f505d-cd2e-491c-bab0-33efb1f71f57-operator-scripts\") pod \"barbican-db-create-r6tt9\" (UID: \"543f505d-cd2e-491c-bab0-33efb1f71f57\") " pod="openstack/barbican-db-create-r6tt9" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.023197 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76332524-3f6c-4d9c-80a7-11f41ef04ade-operator-scripts\") pod \"cinder-d8db-account-create-update-bxfq5\" (UID: \"76332524-3f6c-4d9c-80a7-11f41ef04ade\") " pod="openstack/cinder-d8db-account-create-update-bxfq5" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.023854 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76332524-3f6c-4d9c-80a7-11f41ef04ade-operator-scripts\") pod \"cinder-d8db-account-create-update-bxfq5\" (UID: \"76332524-3f6c-4d9c-80a7-11f41ef04ade\") " pod="openstack/cinder-d8db-account-create-update-bxfq5" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.024809 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543f505d-cd2e-491c-bab0-33efb1f71f57-operator-scripts\") pod \"barbican-db-create-r6tt9\" (UID: \"543f505d-cd2e-491c-bab0-33efb1f71f57\") " pod="openstack/barbican-db-create-r6tt9" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.025356 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e78d-account-create-update-lzbkb" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.059002 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b22rk\" (UniqueName: \"kubernetes.io/projected/543f505d-cd2e-491c-bab0-33efb1f71f57-kube-api-access-b22rk\") pod \"barbican-db-create-r6tt9\" (UID: \"543f505d-cd2e-491c-bab0-33efb1f71f57\") " pod="openstack/barbican-db-create-r6tt9" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.067834 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26tb4\" (UniqueName: \"kubernetes.io/projected/76332524-3f6c-4d9c-80a7-11f41ef04ade-kube-api-access-26tb4\") pod \"cinder-d8db-account-create-update-bxfq5\" (UID: \"76332524-3f6c-4d9c-80a7-11f41ef04ade\") " pod="openstack/cinder-d8db-account-create-update-bxfq5" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.078432 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-r6tt9" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.102559 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d8db-account-create-update-bxfq5" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.124408 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8drj\" (UniqueName: \"kubernetes.io/projected/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-kube-api-access-c8drj\") pod \"keystone-db-sync-wq6b7\" (UID: \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\") " pod="openstack/keystone-db-sync-wq6b7" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.124670 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-config-data\") pod \"keystone-db-sync-wq6b7\" (UID: \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\") " pod="openstack/keystone-db-sync-wq6b7" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.124789 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e0ce91-6453-4818-ae1f-39a24f8e6a66-operator-scripts\") pod \"neutron-db-create-qvz28\" (UID: \"65e0ce91-6453-4818-ae1f-39a24f8e6a66\") " pod="openstack/neutron-db-create-qvz28" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.124824 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e140de-d875-48f9-87dd-2ea5908121c9-operator-scripts\") pod \"neutron-290e-account-create-update-g7rwh\" (UID: \"27e140de-d875-48f9-87dd-2ea5908121c9\") " pod="openstack/neutron-290e-account-create-update-g7rwh" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.124876 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-combined-ca-bundle\") pod \"keystone-db-sync-wq6b7\" (UID: \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\") " pod="openstack/keystone-db-sync-wq6b7" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.124946 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brmvq\" (UniqueName: \"kubernetes.io/projected/27e140de-d875-48f9-87dd-2ea5908121c9-kube-api-access-brmvq\") pod \"neutron-290e-account-create-update-g7rwh\" (UID: \"27e140de-d875-48f9-87dd-2ea5908121c9\") " pod="openstack/neutron-290e-account-create-update-g7rwh" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.124995 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr27h\" (UniqueName: \"kubernetes.io/projected/65e0ce91-6453-4818-ae1f-39a24f8e6a66-kube-api-access-gr27h\") pod \"neutron-db-create-qvz28\" (UID: \"65e0ce91-6453-4818-ae1f-39a24f8e6a66\") " pod="openstack/neutron-db-create-qvz28" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.125795 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e0ce91-6453-4818-ae1f-39a24f8e6a66-operator-scripts\") pod \"neutron-db-create-qvz28\" (UID: \"65e0ce91-6453-4818-ae1f-39a24f8e6a66\") " pod="openstack/neutron-db-create-qvz28" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.145485 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr27h\" (UniqueName: 
\"kubernetes.io/projected/65e0ce91-6453-4818-ae1f-39a24f8e6a66-kube-api-access-gr27h\") pod \"neutron-db-create-qvz28\" (UID: \"65e0ce91-6453-4818-ae1f-39a24f8e6a66\") " pod="openstack/neutron-db-create-qvz28" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.226371 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e140de-d875-48f9-87dd-2ea5908121c9-operator-scripts\") pod \"neutron-290e-account-create-update-g7rwh\" (UID: \"27e140de-d875-48f9-87dd-2ea5908121c9\") " pod="openstack/neutron-290e-account-create-update-g7rwh" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.226437 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-combined-ca-bundle\") pod \"keystone-db-sync-wq6b7\" (UID: \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\") " pod="openstack/keystone-db-sync-wq6b7" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.226478 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brmvq\" (UniqueName: \"kubernetes.io/projected/27e140de-d875-48f9-87dd-2ea5908121c9-kube-api-access-brmvq\") pod \"neutron-290e-account-create-update-g7rwh\" (UID: \"27e140de-d875-48f9-87dd-2ea5908121c9\") " pod="openstack/neutron-290e-account-create-update-g7rwh" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.226513 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8drj\" (UniqueName: \"kubernetes.io/projected/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-kube-api-access-c8drj\") pod \"keystone-db-sync-wq6b7\" (UID: \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\") " pod="openstack/keystone-db-sync-wq6b7" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.226601 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-config-data\") pod \"keystone-db-sync-wq6b7\" (UID: \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\") " pod="openstack/keystone-db-sync-wq6b7" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.227591 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e140de-d875-48f9-87dd-2ea5908121c9-operator-scripts\") pod \"neutron-290e-account-create-update-g7rwh\" (UID: \"27e140de-d875-48f9-87dd-2ea5908121c9\") " pod="openstack/neutron-290e-account-create-update-g7rwh" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.232660 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-config-data\") pod \"keystone-db-sync-wq6b7\" (UID: \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\") " pod="openstack/keystone-db-sync-wq6b7" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.247778 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-combined-ca-bundle\") pod \"keystone-db-sync-wq6b7\" (UID: \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\") " pod="openstack/keystone-db-sync-wq6b7" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.249695 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8drj\" (UniqueName: 
\"kubernetes.io/projected/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-kube-api-access-c8drj\") pod \"keystone-db-sync-wq6b7\" (UID: \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\") " pod="openstack/keystone-db-sync-wq6b7" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.252523 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brmvq\" (UniqueName: \"kubernetes.io/projected/27e140de-d875-48f9-87dd-2ea5908121c9-kube-api-access-brmvq\") pod \"neutron-290e-account-create-update-g7rwh\" (UID: \"27e140de-d875-48f9-87dd-2ea5908121c9\") " pod="openstack/neutron-290e-account-create-update-g7rwh" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.328127 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qvz28" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.406672 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-9zpcn"] Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.413709 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wq6b7" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.418371 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-290e-account-create-update-g7rwh" Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.571744 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e78d-account-create-update-lzbkb"] Mar 10 10:05:41 crc kubenswrapper[4794]: W0310 10:05:41.582466 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56a0cbd8_93d4_4236_bf7d_434752b9d246.slice/crio-bdd01aa3fb15b3aba31545f455d3c0584f869450218e43795867bcaf2d16fdc3 WatchSource:0}: Error finding container bdd01aa3fb15b3aba31545f455d3c0584f869450218e43795867bcaf2d16fdc3: Status 404 returned error can't find the container with id bdd01aa3fb15b3aba31545f455d3c0584f869450218e43795867bcaf2d16fdc3 Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.630809 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d8db-account-create-update-bxfq5"] Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.706749 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-r6tt9"] Mar 10 10:05:41 crc kubenswrapper[4794]: W0310 10:05:41.713601 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod543f505d_cd2e_491c_bab0_33efb1f71f57.slice/crio-66a82ae556591211c781c1fc5ef1864f69d8c1c5211717ed7af8c65749526681 WatchSource:0}: Error finding container 66a82ae556591211c781c1fc5ef1864f69d8c1c5211717ed7af8c65749526681: Status 404 returned error can't find the container with id 66a82ae556591211c781c1fc5ef1864f69d8c1c5211717ed7af8c65749526681 Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.797838 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qvz28"] Mar 10 10:05:41 crc kubenswrapper[4794]: W0310 10:05:41.966906 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab5bb4b1_f7a5_4bfb_a2a1_dfb63b32fcbe.slice/crio-ea7fe1c15d20c96ac12d2e0fc764f9a019706b8f35b669cdeea17731cf234a0d WatchSource:0}: Error finding container ea7fe1c15d20c96ac12d2e0fc764f9a019706b8f35b669cdeea17731cf234a0d: Status 404 returned error can't find the 
container with id ea7fe1c15d20c96ac12d2e0fc764f9a019706b8f35b669cdeea17731cf234a0d Mar 10 10:05:41 crc kubenswrapper[4794]: I0310 10:05:41.970535 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wq6b7"] Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.072067 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-290e-account-create-update-g7rwh"] Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.214745 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wq6b7" event={"ID":"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe","Type":"ContainerStarted","Data":"ea7fe1c15d20c96ac12d2e0fc764f9a019706b8f35b669cdeea17731cf234a0d"} Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.216988 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qvz28" event={"ID":"65e0ce91-6453-4818-ae1f-39a24f8e6a66","Type":"ContainerStarted","Data":"e93ec00740ada705822837c538ffa9760f9d8c6961b642225b14741f8c12ddcc"} Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.217034 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qvz28" event={"ID":"65e0ce91-6453-4818-ae1f-39a24f8e6a66","Type":"ContainerStarted","Data":"2d65a374b118bd50bdf7c38e33fb2a48bdcc39ff394784b814218a43bea07aaa"} Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.221521 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9zpcn" event={"ID":"6f6993ef-0bbf-49a3-a1cb-1dd304ba6564","Type":"ContainerDied","Data":"5a33b91b527b1919053bf43cda8f3cb23a6d76721c506dda2a7bfbc73df9e4fb"} Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.221476 4794 generic.go:334] "Generic (PLEG): container finished" podID="6f6993ef-0bbf-49a3-a1cb-1dd304ba6564" containerID="5a33b91b527b1919053bf43cda8f3cb23a6d76721c506dda2a7bfbc73df9e4fb" exitCode=0 Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.221634 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9zpcn" event={"ID":"6f6993ef-0bbf-49a3-a1cb-1dd304ba6564","Type":"ContainerStarted","Data":"33d8060fd9109f2ab5881a07f06f751320e64f55522efc9666a915051965e514"} Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.223826 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-290e-account-create-update-g7rwh" event={"ID":"27e140de-d875-48f9-87dd-2ea5908121c9","Type":"ContainerStarted","Data":"c3b60a3a73e00466be814c01a4f63a17e51313cc634f5f6a4751159adfcaf712"} Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.228786 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r6tt9" event={"ID":"543f505d-cd2e-491c-bab0-33efb1f71f57","Type":"ContainerStarted","Data":"1e3c68f525250d31acefa52ae32155d2762006dff412a57a1e872f7a90a1f9ed"} Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.228839 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r6tt9" event={"ID":"543f505d-cd2e-491c-bab0-33efb1f71f57","Type":"ContainerStarted","Data":"66a82ae556591211c781c1fc5ef1864f69d8c1c5211717ed7af8c65749526681"} Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.233542 4794 generic.go:334] "Generic (PLEG): container finished" podID="76332524-3f6c-4d9c-80a7-11f41ef04ade" containerID="7e53340b15e3ba50ade3cd2da6c770411ad4bac15ca846d1a1785f37fda6ab2b" exitCode=0 Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.233635 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-d8db-account-create-update-bxfq5" event={"ID":"76332524-3f6c-4d9c-80a7-11f41ef04ade","Type":"ContainerDied","Data":"7e53340b15e3ba50ade3cd2da6c770411ad4bac15ca846d1a1785f37fda6ab2b"} Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.233709 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d8db-account-create-update-bxfq5" event={"ID":"76332524-3f6c-4d9c-80a7-11f41ef04ade","Type":"ContainerStarted","Data":"2754bec76fc337b6f09edf9060f446670b61210d2a5d894048f578ebf9d123bc"} Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.235287 4794 generic.go:334] "Generic (PLEG): container finished" podID="56a0cbd8-93d4-4236-bf7d-434752b9d246" containerID="ddda66dbb1d59fd19b03a30fbf32119c6446e237479d0ac3e5dc4dcefbdd15fb" exitCode=0 Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.235346 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e78d-account-create-update-lzbkb" event={"ID":"56a0cbd8-93d4-4236-bf7d-434752b9d246","Type":"ContainerDied","Data":"ddda66dbb1d59fd19b03a30fbf32119c6446e237479d0ac3e5dc4dcefbdd15fb"} Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.235375 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e78d-account-create-update-lzbkb" event={"ID":"56a0cbd8-93d4-4236-bf7d-434752b9d246","Type":"ContainerStarted","Data":"bdd01aa3fb15b3aba31545f455d3c0584f869450218e43795867bcaf2d16fdc3"} Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.246318 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-qvz28" podStartSLOduration=2.246301768 podStartE2EDuration="2.246301768s" podCreationTimestamp="2026-03-10 10:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:05:42.237853811 +0000 UTC m=+1290.994024629" watchObservedRunningTime="2026-03-10 10:05:42.246301768 +0000 UTC m=+1291.002472586" Mar 10 10:05:42 crc kubenswrapper[4794]: I0310 10:05:42.268874 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-r6tt9" podStartSLOduration=2.26885002 podStartE2EDuration="2.26885002s" podCreationTimestamp="2026-03-10 10:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:05:42.261813848 +0000 UTC m=+1291.017984666" watchObservedRunningTime="2026-03-10 10:05:42.26885002 +0000 UTC m=+1291.025020838" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.244261 4794 generic.go:334] "Generic (PLEG): container finished" podID="65e0ce91-6453-4818-ae1f-39a24f8e6a66" containerID="e93ec00740ada705822837c538ffa9760f9d8c6961b642225b14741f8c12ddcc" exitCode=0 Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.244382 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qvz28" event={"ID":"65e0ce91-6453-4818-ae1f-39a24f8e6a66","Type":"ContainerDied","Data":"e93ec00740ada705822837c538ffa9760f9d8c6961b642225b14741f8c12ddcc"} Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.245711 4794 generic.go:334] "Generic (PLEG): container finished" podID="27e140de-d875-48f9-87dd-2ea5908121c9" containerID="09cd8aa2d139fb0c80d695b9f0a857597678db3d626060120182e8665cb195f3" exitCode=0 Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.245792 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-290e-account-create-update-g7rwh" event={"ID":"27e140de-d875-48f9-87dd-2ea5908121c9","Type":"ContainerDied","Data":"09cd8aa2d139fb0c80d695b9f0a857597678db3d626060120182e8665cb195f3"} Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.247443 4794 generic.go:334] "Generic (PLEG): container finished" podID="543f505d-cd2e-491c-bab0-33efb1f71f57" containerID="1e3c68f525250d31acefa52ae32155d2762006dff412a57a1e872f7a90a1f9ed" exitCode=0 Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.247488 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r6tt9" event={"ID":"543f505d-cd2e-491c-bab0-33efb1f71f57","Type":"ContainerDied","Data":"1e3c68f525250d31acefa52ae32155d2762006dff412a57a1e872f7a90a1f9ed"} Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.624209 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e78d-account-create-update-lzbkb" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.744555 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9zpcn" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.751763 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d8db-account-create-update-bxfq5" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.785360 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56a0cbd8-93d4-4236-bf7d-434752b9d246-operator-scripts\") pod \"56a0cbd8-93d4-4236-bf7d-434752b9d246\" (UID: \"56a0cbd8-93d4-4236-bf7d-434752b9d246\") " Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.785495 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msncv\" (UniqueName: \"kubernetes.io/projected/56a0cbd8-93d4-4236-bf7d-434752b9d246-kube-api-access-msncv\") pod \"56a0cbd8-93d4-4236-bf7d-434752b9d246\" (UID: \"56a0cbd8-93d4-4236-bf7d-434752b9d246\") " Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.786272 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a0cbd8-93d4-4236-bf7d-434752b9d246-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56a0cbd8-93d4-4236-bf7d-434752b9d246" (UID: "56a0cbd8-93d4-4236-bf7d-434752b9d246"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.786969 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56a0cbd8-93d4-4236-bf7d-434752b9d246-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.809454 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a0cbd8-93d4-4236-bf7d-434752b9d246-kube-api-access-msncv" (OuterVolumeSpecName: "kube-api-access-msncv") pod "56a0cbd8-93d4-4236-bf7d-434752b9d246" (UID: "56a0cbd8-93d4-4236-bf7d-434752b9d246"). InnerVolumeSpecName "kube-api-access-msncv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.888032 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76332524-3f6c-4d9c-80a7-11f41ef04ade-operator-scripts\") pod \"76332524-3f6c-4d9c-80a7-11f41ef04ade\" (UID: \"76332524-3f6c-4d9c-80a7-11f41ef04ade\") " Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.888088 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f6993ef-0bbf-49a3-a1cb-1dd304ba6564-operator-scripts\") pod \"6f6993ef-0bbf-49a3-a1cb-1dd304ba6564\" (UID: \"6f6993ef-0bbf-49a3-a1cb-1dd304ba6564\") " Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.888132 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksgmf\" (UniqueName: \"kubernetes.io/projected/6f6993ef-0bbf-49a3-a1cb-1dd304ba6564-kube-api-access-ksgmf\") pod \"6f6993ef-0bbf-49a3-a1cb-1dd304ba6564\" (UID: \"6f6993ef-0bbf-49a3-a1cb-1dd304ba6564\") " Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.888160 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26tb4\" (UniqueName: \"kubernetes.io/projected/76332524-3f6c-4d9c-80a7-11f41ef04ade-kube-api-access-26tb4\") pod \"76332524-3f6c-4d9c-80a7-11f41ef04ade\" (UID: \"76332524-3f6c-4d9c-80a7-11f41ef04ade\") " Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.888596 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msncv\" (UniqueName: \"kubernetes.io/projected/56a0cbd8-93d4-4236-bf7d-434752b9d246-kube-api-access-msncv\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.889472 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6993ef-0bbf-49a3-a1cb-1dd304ba6564-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f6993ef-0bbf-49a3-a1cb-1dd304ba6564" (UID: "6f6993ef-0bbf-49a3-a1cb-1dd304ba6564"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.889605 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76332524-3f6c-4d9c-80a7-11f41ef04ade-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76332524-3f6c-4d9c-80a7-11f41ef04ade" (UID: "76332524-3f6c-4d9c-80a7-11f41ef04ade"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.892553 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76332524-3f6c-4d9c-80a7-11f41ef04ade-kube-api-access-26tb4" (OuterVolumeSpecName: "kube-api-access-26tb4") pod "76332524-3f6c-4d9c-80a7-11f41ef04ade" (UID: "76332524-3f6c-4d9c-80a7-11f41ef04ade"). InnerVolumeSpecName "kube-api-access-26tb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.892585 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f6993ef-0bbf-49a3-a1cb-1dd304ba6564-kube-api-access-ksgmf" (OuterVolumeSpecName: "kube-api-access-ksgmf") pod "6f6993ef-0bbf-49a3-a1cb-1dd304ba6564" (UID: "6f6993ef-0bbf-49a3-a1cb-1dd304ba6564"). 
InnerVolumeSpecName "kube-api-access-ksgmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.990863 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76332524-3f6c-4d9c-80a7-11f41ef04ade-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.990899 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f6993ef-0bbf-49a3-a1cb-1dd304ba6564-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.990908 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksgmf\" (UniqueName: \"kubernetes.io/projected/6f6993ef-0bbf-49a3-a1cb-1dd304ba6564-kube-api-access-ksgmf\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:43 crc kubenswrapper[4794]: I0310 10:05:43.990918 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26tb4\" (UniqueName: \"kubernetes.io/projected/76332524-3f6c-4d9c-80a7-11f41ef04ade-kube-api-access-26tb4\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:44 crc kubenswrapper[4794]: I0310 10:05:44.257606 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d8db-account-create-update-bxfq5" Mar 10 10:05:44 crc kubenswrapper[4794]: I0310 10:05:44.257629 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d8db-account-create-update-bxfq5" event={"ID":"76332524-3f6c-4d9c-80a7-11f41ef04ade","Type":"ContainerDied","Data":"2754bec76fc337b6f09edf9060f446670b61210d2a5d894048f578ebf9d123bc"} Mar 10 10:05:44 crc kubenswrapper[4794]: I0310 10:05:44.257713 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2754bec76fc337b6f09edf9060f446670b61210d2a5d894048f578ebf9d123bc" Mar 10 10:05:44 crc kubenswrapper[4794]: I0310 10:05:44.259196 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e78d-account-create-update-lzbkb" event={"ID":"56a0cbd8-93d4-4236-bf7d-434752b9d246","Type":"ContainerDied","Data":"bdd01aa3fb15b3aba31545f455d3c0584f869450218e43795867bcaf2d16fdc3"} Mar 10 10:05:44 crc kubenswrapper[4794]: I0310 10:05:44.259233 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdd01aa3fb15b3aba31545f455d3c0584f869450218e43795867bcaf2d16fdc3" Mar 10 10:05:44 crc kubenswrapper[4794]: I0310 10:05:44.259588 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e78d-account-create-update-lzbkb" Mar 10 10:05:44 crc kubenswrapper[4794]: I0310 10:05:44.263217 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9zpcn" event={"ID":"6f6993ef-0bbf-49a3-a1cb-1dd304ba6564","Type":"ContainerDied","Data":"33d8060fd9109f2ab5881a07f06f751320e64f55522efc9666a915051965e514"} Mar 10 10:05:44 crc kubenswrapper[4794]: I0310 10:05:44.263320 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33d8060fd9109f2ab5881a07f06f751320e64f55522efc9666a915051965e514" Mar 10 10:05:44 crc kubenswrapper[4794]: I0310 10:05:44.263443 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9zpcn" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.020917 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qvz28" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.027322 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-290e-account-create-update-g7rwh" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.039257 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-r6tt9" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.051089 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e140de-d875-48f9-87dd-2ea5908121c9-operator-scripts\") pod \"27e140de-d875-48f9-87dd-2ea5908121c9\" (UID: \"27e140de-d875-48f9-87dd-2ea5908121c9\") " Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.051226 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr27h\" (UniqueName: \"kubernetes.io/projected/65e0ce91-6453-4818-ae1f-39a24f8e6a66-kube-api-access-gr27h\") pod \"65e0ce91-6453-4818-ae1f-39a24f8e6a66\" (UID: \"65e0ce91-6453-4818-ae1f-39a24f8e6a66\") " Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.051306 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brmvq\" (UniqueName: \"kubernetes.io/projected/27e140de-d875-48f9-87dd-2ea5908121c9-kube-api-access-brmvq\") pod \"27e140de-d875-48f9-87dd-2ea5908121c9\" (UID: \"27e140de-d875-48f9-87dd-2ea5908121c9\") " Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.051357 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543f505d-cd2e-491c-bab0-33efb1f71f57-operator-scripts\") pod \"543f505d-cd2e-491c-bab0-33efb1f71f57\" (UID: \"543f505d-cd2e-491c-bab0-33efb1f71f57\") " Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.051380 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b22rk\" (UniqueName: \"kubernetes.io/projected/543f505d-cd2e-491c-bab0-33efb1f71f57-kube-api-access-b22rk\") pod \"543f505d-cd2e-491c-bab0-33efb1f71f57\" (UID: \"543f505d-cd2e-491c-bab0-33efb1f71f57\") " Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.051441 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e0ce91-6453-4818-ae1f-39a24f8e6a66-operator-scripts\") pod \"65e0ce91-6453-4818-ae1f-39a24f8e6a66\" (UID: \"65e0ce91-6453-4818-ae1f-39a24f8e6a66\") " Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.053147 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e0ce91-6453-4818-ae1f-39a24f8e6a66-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65e0ce91-6453-4818-ae1f-39a24f8e6a66" (UID: "65e0ce91-6453-4818-ae1f-39a24f8e6a66"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.053300 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543f505d-cd2e-491c-bab0-33efb1f71f57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "543f505d-cd2e-491c-bab0-33efb1f71f57" (UID: "543f505d-cd2e-491c-bab0-33efb1f71f57"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.057576 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27e140de-d875-48f9-87dd-2ea5908121c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "27e140de-d875-48f9-87dd-2ea5908121c9" (UID: "27e140de-d875-48f9-87dd-2ea5908121c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.066029 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543f505d-cd2e-491c-bab0-33efb1f71f57-kube-api-access-b22rk" (OuterVolumeSpecName: "kube-api-access-b22rk") pod "543f505d-cd2e-491c-bab0-33efb1f71f57" (UID: "543f505d-cd2e-491c-bab0-33efb1f71f57"). InnerVolumeSpecName "kube-api-access-b22rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.066084 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27e140de-d875-48f9-87dd-2ea5908121c9-kube-api-access-brmvq" (OuterVolumeSpecName: "kube-api-access-brmvq") pod "27e140de-d875-48f9-87dd-2ea5908121c9" (UID: "27e140de-d875-48f9-87dd-2ea5908121c9"). InnerVolumeSpecName "kube-api-access-brmvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.068830 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e0ce91-6453-4818-ae1f-39a24f8e6a66-kube-api-access-gr27h" (OuterVolumeSpecName: "kube-api-access-gr27h") pod "65e0ce91-6453-4818-ae1f-39a24f8e6a66" (UID: "65e0ce91-6453-4818-ae1f-39a24f8e6a66"). InnerVolumeSpecName "kube-api-access-gr27h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.153761 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr27h\" (UniqueName: \"kubernetes.io/projected/65e0ce91-6453-4818-ae1f-39a24f8e6a66-kube-api-access-gr27h\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.153799 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brmvq\" (UniqueName: \"kubernetes.io/projected/27e140de-d875-48f9-87dd-2ea5908121c9-kube-api-access-brmvq\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.153811 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543f505d-cd2e-491c-bab0-33efb1f71f57-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.153823 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b22rk\" (UniqueName: \"kubernetes.io/projected/543f505d-cd2e-491c-bab0-33efb1f71f57-kube-api-access-b22rk\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.153834 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e0ce91-6453-4818-ae1f-39a24f8e6a66-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.153846 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e140de-d875-48f9-87dd-2ea5908121c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.208473 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.284950 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58df884995-wr8ms"] Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.285206 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58df884995-wr8ms" podUID="0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a" containerName="dnsmasq-dns" containerID="cri-o://658379ccd63c03a7b70ce7eaceec34bae80606e2f0afc083861a3090282f3ec4" gracePeriod=10 Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.305502 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wq6b7" event={"ID":"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe","Type":"ContainerStarted","Data":"9bdd46d650f18fd8daa1d10e48bda92310a8b529240b1904c24330c1b16cecd5"} Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.308437 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qvz28" event={"ID":"65e0ce91-6453-4818-ae1f-39a24f8e6a66","Type":"ContainerDied","Data":"2d65a374b118bd50bdf7c38e33fb2a48bdcc39ff394784b814218a43bea07aaa"} Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.308470 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d65a374b118bd50bdf7c38e33fb2a48bdcc39ff394784b814218a43bea07aaa" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.308540 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qvz28" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.312527 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-290e-account-create-update-g7rwh" event={"ID":"27e140de-d875-48f9-87dd-2ea5908121c9","Type":"ContainerDied","Data":"c3b60a3a73e00466be814c01a4f63a17e51313cc634f5f6a4751159adfcaf712"} Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.312565 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3b60a3a73e00466be814c01a4f63a17e51313cc634f5f6a4751159adfcaf712" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.312640 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-290e-account-create-update-g7rwh" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.320584 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r6tt9" event={"ID":"543f505d-cd2e-491c-bab0-33efb1f71f57","Type":"ContainerDied","Data":"66a82ae556591211c781c1fc5ef1864f69d8c1c5211717ed7af8c65749526681"} Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.320614 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66a82ae556591211c781c1fc5ef1864f69d8c1c5211717ed7af8c65749526681" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.320663 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-r6tt9" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.343154 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-wq6b7" podStartSLOduration=2.413645389 podStartE2EDuration="7.343130096s" podCreationTimestamp="2026-03-10 10:05:40 +0000 UTC" firstStartedPulling="2026-03-10 10:05:41.968612826 +0000 UTC m=+1290.724783644" lastFinishedPulling="2026-03-10 10:05:46.898097533 +0000 UTC m=+1295.654268351" observedRunningTime="2026-03-10 10:05:47.338155239 +0000 UTC m=+1296.094326077" watchObservedRunningTime="2026-03-10 10:05:47.343130096 +0000 UTC m=+1296.099300914" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.670610 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.772311 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-ovsdbserver-sb\") pod \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.772375 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-dns-svc\") pod \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.772467 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-ovsdbserver-nb\") pod \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.773138 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spbpj\" (UniqueName: \"kubernetes.io/projected/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-kube-api-access-spbpj\") pod \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.773190 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-config\") pod \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\" (UID: \"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a\") " Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.779083 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-kube-api-access-spbpj" (OuterVolumeSpecName: "kube-api-access-spbpj") pod "0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a" (UID: "0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a"). InnerVolumeSpecName "kube-api-access-spbpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.809688 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-config" (OuterVolumeSpecName: "config") pod "0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a" (UID: "0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.810812 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a" (UID: "0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.812736 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a" (UID: "0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.813981 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a" (UID: "0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.878635 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.878679 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.878690 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.878713 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:47 crc kubenswrapper[4794]: I0310 10:05:47.878723 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spbpj\" (UniqueName: \"kubernetes.io/projected/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a-kube-api-access-spbpj\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:48 crc kubenswrapper[4794]: I0310 10:05:48.330204 4794 generic.go:334] "Generic (PLEG): container finished" podID="0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a" containerID="658379ccd63c03a7b70ce7eaceec34bae80606e2f0afc083861a3090282f3ec4" exitCode=0 Mar 10 10:05:48 crc kubenswrapper[4794]: I0310 10:05:48.330255 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-wr8ms" Mar 10 10:05:48 crc kubenswrapper[4794]: I0310 10:05:48.330250 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-wr8ms" event={"ID":"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a","Type":"ContainerDied","Data":"658379ccd63c03a7b70ce7eaceec34bae80606e2f0afc083861a3090282f3ec4"} Mar 10 10:05:48 crc kubenswrapper[4794]: I0310 10:05:48.330350 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-wr8ms" event={"ID":"0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a","Type":"ContainerDied","Data":"a41ecb82175a17bdf15e0e28d9ba0d9a934f24a98329058e4ed90c99e9c8f8ce"} Mar 10 10:05:48 crc kubenswrapper[4794]: I0310 10:05:48.330376 4794 scope.go:117] "RemoveContainer" containerID="658379ccd63c03a7b70ce7eaceec34bae80606e2f0afc083861a3090282f3ec4" Mar 10 10:05:48 crc kubenswrapper[4794]: I0310 10:05:48.374010 4794 scope.go:117] "RemoveContainer" containerID="70e8baac06ba32920a4cbdf0279be6a6858547f3c37e9d1983091d760290b72d" Mar 10 10:05:48 crc kubenswrapper[4794]: I0310 10:05:48.377871 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58df884995-wr8ms"] Mar 10 10:05:48 crc kubenswrapper[4794]: I0310 10:05:48.384950 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58df884995-wr8ms"] Mar 10 10:05:48 crc kubenswrapper[4794]: I0310 10:05:48.400500 4794 scope.go:117] "RemoveContainer" containerID="658379ccd63c03a7b70ce7eaceec34bae80606e2f0afc083861a3090282f3ec4" Mar 10 10:05:48 crc kubenswrapper[4794]: E0310 10:05:48.401171 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"658379ccd63c03a7b70ce7eaceec34bae80606e2f0afc083861a3090282f3ec4\": container with ID starting with 658379ccd63c03a7b70ce7eaceec34bae80606e2f0afc083861a3090282f3ec4 not found: ID does not exist" containerID="658379ccd63c03a7b70ce7eaceec34bae80606e2f0afc083861a3090282f3ec4" Mar 10 10:05:48 crc kubenswrapper[4794]: I0310 10:05:48.401201 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"658379ccd63c03a7b70ce7eaceec34bae80606e2f0afc083861a3090282f3ec4"} err="failed to get container status \"658379ccd63c03a7b70ce7eaceec34bae80606e2f0afc083861a3090282f3ec4\": rpc error: code = NotFound desc = could not find container \"658379ccd63c03a7b70ce7eaceec34bae80606e2f0afc083861a3090282f3ec4\": container with ID starting with 658379ccd63c03a7b70ce7eaceec34bae80606e2f0afc083861a3090282f3ec4 not found: ID does not exist" Mar 10 10:05:48 crc kubenswrapper[4794]: I0310 10:05:48.401222 4794 scope.go:117] "RemoveContainer" containerID="70e8baac06ba32920a4cbdf0279be6a6858547f3c37e9d1983091d760290b72d" Mar 10 10:05:48 crc kubenswrapper[4794]: E0310 10:05:48.402717 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70e8baac06ba32920a4cbdf0279be6a6858547f3c37e9d1983091d760290b72d\": container with ID starting with 70e8baac06ba32920a4cbdf0279be6a6858547f3c37e9d1983091d760290b72d not found: ID does not exist" containerID="70e8baac06ba32920a4cbdf0279be6a6858547f3c37e9d1983091d760290b72d" Mar 10 10:05:48 crc kubenswrapper[4794]: I0310 10:05:48.402798 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70e8baac06ba32920a4cbdf0279be6a6858547f3c37e9d1983091d760290b72d"} err="failed to get container status 
\"70e8baac06ba32920a4cbdf0279be6a6858547f3c37e9d1983091d760290b72d\": rpc error: code = NotFound desc = could not find container \"70e8baac06ba32920a4cbdf0279be6a6858547f3c37e9d1983091d760290b72d\": container with ID starting with 70e8baac06ba32920a4cbdf0279be6a6858547f3c37e9d1983091d760290b72d not found: ID does not exist" Mar 10 10:05:50 crc kubenswrapper[4794]: I0310 10:05:50.008225 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a" path="/var/lib/kubelet/pods/0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a/volumes" Mar 10 10:05:50 crc kubenswrapper[4794]: I0310 10:05:50.349040 4794 generic.go:334] "Generic (PLEG): container finished" podID="ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe" containerID="9bdd46d650f18fd8daa1d10e48bda92310a8b529240b1904c24330c1b16cecd5" exitCode=0 Mar 10 10:05:50 crc kubenswrapper[4794]: I0310 10:05:50.349098 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wq6b7" event={"ID":"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe","Type":"ContainerDied","Data":"9bdd46d650f18fd8daa1d10e48bda92310a8b529240b1904c24330c1b16cecd5"} Mar 10 10:05:51 crc kubenswrapper[4794]: I0310 10:05:51.682165 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wq6b7" Mar 10 10:05:51 crc kubenswrapper[4794]: I0310 10:05:51.737207 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-config-data\") pod \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\" (UID: \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\") " Mar 10 10:05:51 crc kubenswrapper[4794]: I0310 10:05:51.737423 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8drj\" (UniqueName: \"kubernetes.io/projected/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-kube-api-access-c8drj\") pod \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\" (UID: \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\") " Mar 10 10:05:51 crc kubenswrapper[4794]: I0310 10:05:51.737502 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-combined-ca-bundle\") pod \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\" (UID: \"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe\") " Mar 10 10:05:51 crc kubenswrapper[4794]: I0310 10:05:51.760626 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-kube-api-access-c8drj" (OuterVolumeSpecName: "kube-api-access-c8drj") pod "ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe" (UID: "ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe"). InnerVolumeSpecName "kube-api-access-c8drj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:51 crc kubenswrapper[4794]: I0310 10:05:51.766830 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe" (UID: "ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:05:51 crc kubenswrapper[4794]: I0310 10:05:51.803674 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-config-data" (OuterVolumeSpecName: "config-data") pod "ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe" (UID: "ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:05:51 crc kubenswrapper[4794]: I0310 10:05:51.839171 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:51 crc kubenswrapper[4794]: I0310 10:05:51.839201 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:51 crc kubenswrapper[4794]: I0310 10:05:51.839210 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8drj\" (UniqueName: \"kubernetes.io/projected/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe-kube-api-access-c8drj\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.364842 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wq6b7" event={"ID":"ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe","Type":"ContainerDied","Data":"ea7fe1c15d20c96ac12d2e0fc764f9a019706b8f35b669cdeea17731cf234a0d"} Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.364881 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea7fe1c15d20c96ac12d2e0fc764f9a019706b8f35b669cdeea17731cf234a0d" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.364929 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wq6b7" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.534511 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5879b95d97-lx6f2"] Mar 10 10:05:52 crc kubenswrapper[4794]: E0310 10:05:52.535013 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6993ef-0bbf-49a3-a1cb-1dd304ba6564" containerName="mariadb-database-create" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535039 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6993ef-0bbf-49a3-a1cb-1dd304ba6564" containerName="mariadb-database-create" Mar 10 10:05:52 crc kubenswrapper[4794]: E0310 10:05:52.535050 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe" containerName="keystone-db-sync" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535058 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe" containerName="keystone-db-sync" Mar 10 10:05:52 crc kubenswrapper[4794]: E0310 10:05:52.535077 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e140de-d875-48f9-87dd-2ea5908121c9" containerName="mariadb-account-create-update" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535085 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e140de-d875-48f9-87dd-2ea5908121c9" containerName="mariadb-account-create-update" Mar 10 10:05:52 crc kubenswrapper[4794]: E0310 10:05:52.535108 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a" containerName="init" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535117 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a" containerName="init" Mar 10 10:05:52 crc kubenswrapper[4794]: E0310 10:05:52.535133 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543f505d-cd2e-491c-bab0-33efb1f71f57" containerName="mariadb-database-create" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535141 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="543f505d-cd2e-491c-bab0-33efb1f71f57" containerName="mariadb-database-create" Mar 10 10:05:52 crc kubenswrapper[4794]: E0310 10:05:52.535152 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76332524-3f6c-4d9c-80a7-11f41ef04ade" containerName="mariadb-account-create-update" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535160 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="76332524-3f6c-4d9c-80a7-11f41ef04ade" containerName="mariadb-account-create-update" Mar 10 10:05:52 crc kubenswrapper[4794]: E0310 10:05:52.535171 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e0ce91-6453-4818-ae1f-39a24f8e6a66" containerName="mariadb-database-create" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535180 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e0ce91-6453-4818-ae1f-39a24f8e6a66" containerName="mariadb-database-create" Mar 10 10:05:52 crc kubenswrapper[4794]: E0310 10:05:52.535192 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a0cbd8-93d4-4236-bf7d-434752b9d246" containerName="mariadb-account-create-update" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535201 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a0cbd8-93d4-4236-bf7d-434752b9d246" containerName="mariadb-account-create-update" Mar 10 10:05:52 crc kubenswrapper[4794]: 
E0310 10:05:52.535216 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a" containerName="dnsmasq-dns" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535223 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a" containerName="dnsmasq-dns" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535446 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e0ce91-6453-4818-ae1f-39a24f8e6a66" containerName="mariadb-database-create" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535465 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe" containerName="keystone-db-sync" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535480 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="76332524-3f6c-4d9c-80a7-11f41ef04ade" containerName="mariadb-account-create-update" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535490 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f6993ef-0bbf-49a3-a1cb-1dd304ba6564" containerName="mariadb-database-create" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535503 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="543f505d-cd2e-491c-bab0-33efb1f71f57" containerName="mariadb-database-create" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535518 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="27e140de-d875-48f9-87dd-2ea5908121c9" containerName="mariadb-account-create-update" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535533 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d9d50dc-009d-40eb-9dc5-4c1ffc2ef40a" containerName="dnsmasq-dns" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.535545 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a0cbd8-93d4-4236-bf7d-434752b9d246" containerName="mariadb-account-create-update" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.536574 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.557787 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5879b95d97-lx6f2"] Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.594467 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qw4lv"] Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.595791 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.598725 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.599900 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.600227 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4svgf" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.600475 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.601723 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.610053 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qw4lv"] Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.654124 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2gm5\" (UniqueName: \"kubernetes.io/projected/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-kube-api-access-x2gm5\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.654177 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-ovsdbserver-nb\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.654218 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vzsb\" (UniqueName: \"kubernetes.io/projected/d871867e-0fb5-48df-9d55-f19a63d160ea-kube-api-access-5vzsb\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.654239 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-combined-ca-bundle\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.654261 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-config\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.654286 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-ovsdbserver-sb\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.654299 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-fernet-keys\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.654313 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-config-data\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.654343 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-dns-svc\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.654362 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-credential-keys\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.654379 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-scripts\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.654394 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-dns-swift-storage-0\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.756198 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-config\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.756258 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-ovsdbserver-sb\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.756284 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-fernet-keys\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.756301 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-config-data\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.756324 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-dns-svc\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.756363 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-credential-keys\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.756384 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-scripts\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.756409 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-dns-swift-storage-0\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.756468 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2gm5\" (UniqueName: \"kubernetes.io/projected/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-kube-api-access-x2gm5\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.756504 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-ovsdbserver-nb\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.756561 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vzsb\" (UniqueName: \"kubernetes.io/projected/d871867e-0fb5-48df-9d55-f19a63d160ea-kube-api-access-5vzsb\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.756592 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-combined-ca-bundle\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.757186 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-ovsdbserver-sb\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.757692 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-config\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.760404 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-credential-keys\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.761230 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-dns-svc\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.761245 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-dns-swift-storage-0\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.761913 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-ovsdbserver-nb\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.768273 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-combined-ca-bundle\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.776733 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-config-data\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.792194 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vzsb\" (UniqueName: \"kubernetes.io/projected/d871867e-0fb5-48df-9d55-f19a63d160ea-kube-api-access-5vzsb\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.792212 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-g7gmt"] Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.793770 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.796057 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-scripts\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.796730 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2gm5\" (UniqueName: \"kubernetes.io/projected/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-kube-api-access-x2gm5\") pod \"dnsmasq-dns-5879b95d97-lx6f2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.797472 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.797749 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.798773 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jv8qn" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.800110 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-fernet-keys\") pod \"keystone-bootstrap-qw4lv\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.823865 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-g7gmt"] Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.852209 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.857751 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-combined-ca-bundle\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.857780 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-db-sync-config-data\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.857835 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/821b338a-8a20-4d93-8dfa-28727da3ecba-etc-machine-id\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.857852 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pmml\" (UniqueName: \"kubernetes.io/projected/821b338a-8a20-4d93-8dfa-28727da3ecba-kube-api-access-2pmml\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.857896 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-config-data\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.857916 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-scripts\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.864573 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.866921 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.872702 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.872882 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.925292 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.925715 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.944173 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-hjnc2"] Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.959677 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hjnc2" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.962273 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.962581 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8qmcc" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.962983 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.963202 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/821b338a-8a20-4d93-8dfa-28727da3ecba-etc-machine-id\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.963246 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pmml\" (UniqueName: \"kubernetes.io/projected/821b338a-8a20-4d93-8dfa-28727da3ecba-kube-api-access-2pmml\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.963323 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-config-data\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.963369 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-scripts\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.963431 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-combined-ca-bundle\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.963457 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-db-sync-config-data\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.971682 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/821b338a-8a20-4d93-8dfa-28727da3ecba-etc-machine-id\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 
10:05:52.972729 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:05:52 crc kubenswrapper[4794]: I0310 10:05:52.972776 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.005851 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-config-data\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.016189 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-combined-ca-bundle\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.018481 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pmml\" (UniqueName: \"kubernetes.io/projected/821b338a-8a20-4d93-8dfa-28727da3ecba-kube-api-access-2pmml\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.031908 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-db-sync-config-data\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.037447 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-scripts\") pod \"cinder-db-sync-g7gmt\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.041476 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hjnc2"] Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.078723 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-c4b4g"] Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.080288 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.081102 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f62c40-641b-409f-ab29-ba30c14de2d8-log-httpd\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " 
pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.081175 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f62c40-641b-409f-ab29-ba30c14de2d8-run-httpd\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.081203 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.081268 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llmkn\" (UniqueName: \"kubernetes.io/projected/b9423b3f-3f25-484c-aacc-c83d78c2f731-kube-api-access-llmkn\") pod \"neutron-db-sync-hjnc2\" (UID: \"b9423b3f-3f25-484c-aacc-c83d78c2f731\") " pod="openstack/neutron-db-sync-hjnc2" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.081325 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-scripts\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.081398 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt7ks\" (UniqueName: \"kubernetes.io/projected/39f62c40-641b-409f-ab29-ba30c14de2d8-kube-api-access-nt7ks\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.081438 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-config-data\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.081520 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9423b3f-3f25-484c-aacc-c83d78c2f731-config\") pod \"neutron-db-sync-hjnc2\" (UID: \"b9423b3f-3f25-484c-aacc-c83d78c2f731\") " pod="openstack/neutron-db-sync-hjnc2" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.081568 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9423b3f-3f25-484c-aacc-c83d78c2f731-combined-ca-bundle\") pod \"neutron-db-sync-hjnc2\" (UID: \"b9423b3f-3f25-484c-aacc-c83d78c2f731\") " pod="openstack/neutron-db-sync-hjnc2" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.087832 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-c4b4g" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.094094 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vs4x7" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.096441 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.105050 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-c4b4g"] Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.117512 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5879b95d97-lx6f2"] Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.138105 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-v6v6k"] Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.141620 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.146009 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.146264 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-k9rlk" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.146576 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.159136 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-v6v6k"] Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.174635 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf6456ddf-8m6kc"] Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.176047 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.181800 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf6456ddf-8m6kc"] Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183259 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183291 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-config-data\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183317 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f62c40-641b-409f-ab29-ba30c14de2d8-log-httpd\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183357 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f62c40-641b-409f-ab29-ba30c14de2d8-run-httpd\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183373 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183433 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llmkn\" (UniqueName: \"kubernetes.io/projected/b9423b3f-3f25-484c-aacc-c83d78c2f731-kube-api-access-llmkn\") pod \"neutron-db-sync-hjnc2\" (UID: \"b9423b3f-3f25-484c-aacc-c83d78c2f731\") " pod="openstack/neutron-db-sync-hjnc2" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183490 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-combined-ca-bundle\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183523 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-logs\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183560 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-scripts\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 
10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183590 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt7ks\" (UniqueName: \"kubernetes.io/projected/39f62c40-641b-409f-ab29-ba30c14de2d8-kube-api-access-nt7ks\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183630 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-config-data\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183669 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zktwg\" (UniqueName: \"kubernetes.io/projected/22069ba2-0135-4559-9c7f-2d73ae0dd81a-kube-api-access-zktwg\") pod \"barbican-db-sync-c4b4g\" (UID: \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\") " pod="openstack/barbican-db-sync-c4b4g" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183701 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22069ba2-0135-4559-9c7f-2d73ae0dd81a-db-sync-config-data\") pod \"barbican-db-sync-c4b4g\" (UID: \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\") " pod="openstack/barbican-db-sync-c4b4g" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183758 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9423b3f-3f25-484c-aacc-c83d78c2f731-config\") pod \"neutron-db-sync-hjnc2\" (UID: \"b9423b3f-3f25-484c-aacc-c83d78c2f731\") " pod="openstack/neutron-db-sync-hjnc2" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183777 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-scripts\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183803 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22069ba2-0135-4559-9c7f-2d73ae0dd81a-combined-ca-bundle\") pod \"barbican-db-sync-c4b4g\" (UID: \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\") " pod="openstack/barbican-db-sync-c4b4g" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.183836 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9423b3f-3f25-484c-aacc-c83d78c2f731-combined-ca-bundle\") pod \"neutron-db-sync-hjnc2\" (UID: \"b9423b3f-3f25-484c-aacc-c83d78c2f731\") " pod="openstack/neutron-db-sync-hjnc2" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.184134 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp5c2\" (UniqueName: \"kubernetes.io/projected/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-kube-api-access-zp5c2\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.185246 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f62c40-641b-409f-ab29-ba30c14de2d8-log-httpd\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.187014 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f62c40-641b-409f-ab29-ba30c14de2d8-run-httpd\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.191901 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.192053 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9423b3f-3f25-484c-aacc-c83d78c2f731-combined-ca-bundle\") pod \"neutron-db-sync-hjnc2\" (UID: \"b9423b3f-3f25-484c-aacc-c83d78c2f731\") " pod="openstack/neutron-db-sync-hjnc2" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.198976 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-scripts\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.203591 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.206052 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-config-data\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.206368 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9423b3f-3f25-484c-aacc-c83d78c2f731-config\") pod \"neutron-db-sync-hjnc2\" (UID: \"b9423b3f-3f25-484c-aacc-c83d78c2f731\") " pod="openstack/neutron-db-sync-hjnc2" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.211394 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llmkn\" (UniqueName: \"kubernetes.io/projected/b9423b3f-3f25-484c-aacc-c83d78c2f731-kube-api-access-llmkn\") pod \"neutron-db-sync-hjnc2\" (UID: \"b9423b3f-3f25-484c-aacc-c83d78c2f731\") " pod="openstack/neutron-db-sync-hjnc2" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.235510 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt7ks\" (UniqueName: \"kubernetes.io/projected/39f62c40-641b-409f-ab29-ba30c14de2d8-kube-api-access-nt7ks\") pod \"ceilometer-0\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.285521 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.288789 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22069ba2-0135-4559-9c7f-2d73ae0dd81a-db-sync-config-data\") pod \"barbican-db-sync-c4b4g\" (UID: \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\") " pod="openstack/barbican-db-sync-c4b4g" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.288831 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-scripts\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.288849 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22069ba2-0135-4559-9c7f-2d73ae0dd81a-combined-ca-bundle\") pod \"barbican-db-sync-c4b4g\" (UID: \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\") " pod="openstack/barbican-db-sync-c4b4g" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.288876 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-dns-swift-storage-0\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.288925 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-config\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.288941 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp5c2\" (UniqueName: \"kubernetes.io/projected/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-kube-api-access-zp5c2\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.288965 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-config-data\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.289002 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-dns-svc\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.289018 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " 
pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.289034 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9d5t\" (UniqueName: \"kubernetes.io/projected/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-kube-api-access-p9d5t\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.289054 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.289071 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-combined-ca-bundle\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.289087 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-logs\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.289126 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zktwg\" (UniqueName: \"kubernetes.io/projected/22069ba2-0135-4559-9c7f-2d73ae0dd81a-kube-api-access-zktwg\") pod \"barbican-db-sync-c4b4g\" (UID: \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\") " pod="openstack/barbican-db-sync-c4b4g" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.290840 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-logs\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.296773 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.297508 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22069ba2-0135-4559-9c7f-2d73ae0dd81a-db-sync-config-data\") pod \"barbican-db-sync-c4b4g\" (UID: \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\") " pod="openstack/barbican-db-sync-c4b4g" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.302226 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-combined-ca-bundle\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.304741 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-config-data\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.305447 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22069ba2-0135-4559-9c7f-2d73ae0dd81a-combined-ca-bundle\") pod \"barbican-db-sync-c4b4g\" (UID: \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\") " pod="openstack/barbican-db-sync-c4b4g" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.306610 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-scripts\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.312572 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-hjnc2" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.313533 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp5c2\" (UniqueName: \"kubernetes.io/projected/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-kube-api-access-zp5c2\") pod \"placement-db-sync-v6v6k\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.355208 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zktwg\" (UniqueName: \"kubernetes.io/projected/22069ba2-0135-4559-9c7f-2d73ae0dd81a-kube-api-access-zktwg\") pod \"barbican-db-sync-c4b4g\" (UID: \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\") " pod="openstack/barbican-db-sync-c4b4g" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.390604 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-config\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.390741 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-dns-svc\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.390762 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.390784 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9d5t\" (UniqueName: \"kubernetes.io/projected/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-kube-api-access-p9d5t\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.390822 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.390922 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-dns-swift-storage-0\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.391924 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-dns-swift-storage-0\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 
10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.393315 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.394068 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-dns-svc\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.396091 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.396389 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-config\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.414671 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9d5t\" (UniqueName: \"kubernetes.io/projected/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-kube-api-access-p9d5t\") pod \"dnsmasq-dns-5bf6456ddf-8m6kc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.427752 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5879b95d97-lx6f2"] Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.431497 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-c4b4g" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.472021 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v6v6k" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.505148 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.539814 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qw4lv"] Mar 10 10:05:53 crc kubenswrapper[4794]: W0310 10:05:53.544376 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2af99ed8_10a0_4bc5_8561_1870c4ca7eb2.slice/crio-35fe115177598b35ebf27db2be8033850215ef13d4d483f04566836decdb1054 WatchSource:0}: Error finding container 35fe115177598b35ebf27db2be8033850215ef13d4d483f04566836decdb1054: Status 404 returned error can't find the container with id 35fe115177598b35ebf27db2be8033850215ef13d4d483f04566836decdb1054 Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.683064 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.684897 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.688452 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.688505 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.688744 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v9fxn" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.689582 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.696091 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.760936 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.774801 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.776142 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.791647 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.791860 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.801885 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbzqm\" (UniqueName: \"kubernetes.io/projected/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-kube-api-access-qbzqm\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.801965 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.802018 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-scripts\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.802045 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.802120 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-config-data\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.802153 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-logs\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.802171 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.802213 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903117 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbzqm\" (UniqueName: \"kubernetes.io/projected/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-kube-api-access-qbzqm\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903479 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903505 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903524 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpq8f\" (UniqueName: \"kubernetes.io/projected/3dc6f544-8e2f-4322-bbff-968278be4ce4-kube-api-access-hpq8f\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903546 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dc6f544-8e2f-4322-bbff-968278be4ce4-logs\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903569 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903606 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-scripts\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903622 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903648 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903682 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dc6f544-8e2f-4322-bbff-968278be4ce4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903721 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903740 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-config-data\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903762 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-logs\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903779 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903794 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.903825 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.905537 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.905862 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-logs\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.906064 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.910082 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.919712 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbzqm\" (UniqueName: \"kubernetes.io/projected/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-kube-api-access-qbzqm\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.919792 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-scripts\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.920553 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-config-data\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.925891 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.945267 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " pod="openstack/glance-default-external-api-0" Mar 10 10:05:53 crc kubenswrapper[4794]: I0310 10:05:53.959287 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.005676 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.005754 4794 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dc6f544-8e2f-4322-bbff-968278be4ce4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.005822 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.005874 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.005931 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.005956 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.005974 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpq8f\" (UniqueName: \"kubernetes.io/projected/3dc6f544-8e2f-4322-bbff-968278be4ce4-kube-api-access-hpq8f\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.005999 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dc6f544-8e2f-4322-bbff-968278be4ce4-logs\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.006361 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dc6f544-8e2f-4322-bbff-968278be4ce4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.006447 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dc6f544-8e2f-4322-bbff-968278be4ce4-logs\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.007003 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.009993 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.010208 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.011675 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.015865 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.023707 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpq8f\" (UniqueName: \"kubernetes.io/projected/3dc6f544-8e2f-4322-bbff-968278be4ce4-kube-api-access-hpq8f\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.038674 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.070732 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-g7gmt"] Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.138436 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.168122 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.189709 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf6456ddf-8m6kc"] Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.206415 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hjnc2"] Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.346555 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-c4b4g"] Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.355248 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-v6v6k"] Mar 10 10:05:54 crc kubenswrapper[4794]: W0310 10:05:54.377581 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb29878d_3dbb_496a_bd25_b6b4ff102b6f.slice/crio-93d359360c71a102874ef08a2070921c8871397780b23f51f208a0cbeb037303 WatchSource:0}: Error finding container 93d359360c71a102874ef08a2070921c8871397780b23f51f208a0cbeb037303: Status 404 returned error can't find the container with id 93d359360c71a102874ef08a2070921c8871397780b23f51f208a0cbeb037303 Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.405759 4794 generic.go:334] "Generic (PLEG): container finished" podID="2af99ed8-10a0-4bc5-8561-1870c4ca7eb2" containerID="ee798773b59853bbc58547ab1a97941d50c2e396368bb913165e55fc595c2413" exitCode=0 Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.405815 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" event={"ID":"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2","Type":"ContainerDied","Data":"ee798773b59853bbc58547ab1a97941d50c2e396368bb913165e55fc595c2413"} Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.405840 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" event={"ID":"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2","Type":"ContainerStarted","Data":"35fe115177598b35ebf27db2be8033850215ef13d4d483f04566836decdb1054"} Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.410418 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qw4lv" event={"ID":"d871867e-0fb5-48df-9d55-f19a63d160ea","Type":"ContainerStarted","Data":"9a8f466fe0f05d67d968612dd2a52437cb959e872211abcc0038fb9ddc671c06"} Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.410452 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qw4lv" event={"ID":"d871867e-0fb5-48df-9d55-f19a63d160ea","Type":"ContainerStarted","Data":"1b013488e2eca54120f64eb5c09f49a9105a58e44d9cb8be283000474063fc18"} Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.416058 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-c4b4g" event={"ID":"22069ba2-0135-4559-9c7f-2d73ae0dd81a","Type":"ContainerStarted","Data":"e022f301a846b17b394b61e4f02ca463d5f3b67d498580990434cf197a465694"} Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.418147 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f62c40-641b-409f-ab29-ba30c14de2d8","Type":"ContainerStarted","Data":"6d96c35d6454f521b3bcdd00592a057cad340a5e17bbf0b15a4ecc50c0381e17"} Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.420361 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" 
event={"ID":"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc","Type":"ContainerStarted","Data":"d8664381fb8ffc7759188bf1b875286ca51b8370eff32d73220ac805e5f24235"} Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.430757 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hjnc2" event={"ID":"b9423b3f-3f25-484c-aacc-c83d78c2f731","Type":"ContainerStarted","Data":"2c7df2f4bf734c24efa56d36143ca2786fa004791f8a96791a2ea61a83d8b718"} Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.439184 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g7gmt" event={"ID":"821b338a-8a20-4d93-8dfa-28727da3ecba","Type":"ContainerStarted","Data":"29d12262bda8eeedd7d36fd18f6f69ab2c366cde30f1a57401e1c6cb87a5fb04"} Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.440586 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v6v6k" event={"ID":"cb29878d-3dbb-496a-bd25-b6b4ff102b6f","Type":"ContainerStarted","Data":"93d359360c71a102874ef08a2070921c8871397780b23f51f208a0cbeb037303"} Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.446956 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qw4lv" podStartSLOduration=2.446940554 podStartE2EDuration="2.446940554s" podCreationTimestamp="2026-03-10 10:05:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:05:54.446443119 +0000 UTC m=+1303.202613947" watchObservedRunningTime="2026-03-10 10:05:54.446940554 +0000 UTC m=+1303.203111372" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.759463 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.822490 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.867184 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.936486 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2gm5\" (UniqueName: \"kubernetes.io/projected/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-kube-api-access-x2gm5\") pod \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.936533 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-ovsdbserver-sb\") pod \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.936609 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-config\") pod \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.936661 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-ovsdbserver-nb\") pod \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\" (UID: 
\"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.936701 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-dns-svc\") pod \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.936728 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-dns-swift-storage-0\") pod \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\" (UID: \"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2\") " Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.959061 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.967265 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-kube-api-access-x2gm5" (OuterVolumeSpecName: "kube-api-access-x2gm5") pod "2af99ed8-10a0-4bc5-8561-1870c4ca7eb2" (UID: "2af99ed8-10a0-4bc5-8561-1870c4ca7eb2"). InnerVolumeSpecName "kube-api-access-x2gm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:05:54 crc kubenswrapper[4794]: I0310 10:05:54.973305 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2af99ed8-10a0-4bc5-8561-1870c4ca7eb2" (UID: "2af99ed8-10a0-4bc5-8561-1870c4ca7eb2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.006499 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2af99ed8-10a0-4bc5-8561-1870c4ca7eb2" (UID: "2af99ed8-10a0-4bc5-8561-1870c4ca7eb2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.008392 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.012752 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-config" (OuterVolumeSpecName: "config") pod "2af99ed8-10a0-4bc5-8561-1870c4ca7eb2" (UID: "2af99ed8-10a0-4bc5-8561-1870c4ca7eb2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.044739 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2gm5\" (UniqueName: \"kubernetes.io/projected/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-kube-api-access-x2gm5\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.044766 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.044776 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.044784 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.045920 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2af99ed8-10a0-4bc5-8561-1870c4ca7eb2" (UID: "2af99ed8-10a0-4bc5-8561-1870c4ca7eb2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.061791 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2af99ed8-10a0-4bc5-8561-1870c4ca7eb2" (UID: "2af99ed8-10a0-4bc5-8561-1870c4ca7eb2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.146162 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.146186 4794 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.460230 4794 generic.go:334] "Generic (PLEG): container finished" podID="327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" containerID="b0a4fdbb275ad365cd528dfeb3598b4be49751b4aabd80a0da95803def06acbd" exitCode=0 Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.460393 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" event={"ID":"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc","Type":"ContainerDied","Data":"b0a4fdbb275ad365cd528dfeb3598b4be49751b4aabd80a0da95803def06acbd"} Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.466533 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a3c438b7-94dd-4c8b-a360-7d15a4cc5574","Type":"ContainerStarted","Data":"85495a6e1562a3c51041a05b2902ff7d7af667d7e398087743556918c08bd265"} Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.482771 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hjnc2" event={"ID":"b9423b3f-3f25-484c-aacc-c83d78c2f731","Type":"ContainerStarted","Data":"00ea922430a28d8905d03ab4683719c7f0ddb1de198f027cdcd8c2578d18e421"} Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.489443 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.491582 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5879b95d97-lx6f2" event={"ID":"2af99ed8-10a0-4bc5-8561-1870c4ca7eb2","Type":"ContainerDied","Data":"35fe115177598b35ebf27db2be8033850215ef13d4d483f04566836decdb1054"} Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.491691 4794 scope.go:117] "RemoveContainer" containerID="ee798773b59853bbc58547ab1a97941d50c2e396368bb913165e55fc595c2413" Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.508661 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-hjnc2" podStartSLOduration=3.508642098 podStartE2EDuration="3.508642098s" podCreationTimestamp="2026-03-10 10:05:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:05:55.498297112 +0000 UTC m=+1304.254467930" watchObservedRunningTime="2026-03-10 10:05:55.508642098 +0000 UTC m=+1304.264812916" Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.573426 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5879b95d97-lx6f2"] Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.590130 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5879b95d97-lx6f2"] Mar 10 10:05:55 crc kubenswrapper[4794]: I0310 10:05:55.679753 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 10:05:56 crc kubenswrapper[4794]: I0310 10:05:56.018586 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af99ed8-10a0-4bc5-8561-1870c4ca7eb2" path="/var/lib/kubelet/pods/2af99ed8-10a0-4bc5-8561-1870c4ca7eb2/volumes" Mar 10 10:05:56 crc kubenswrapper[4794]: I0310 10:05:56.524548 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3dc6f544-8e2f-4322-bbff-968278be4ce4","Type":"ContainerStarted","Data":"a7cbfbe6a35be0b6f9a28390aa2592275e86a7af9d0d985168c86608a6a223e2"} Mar 10 10:05:56 crc kubenswrapper[4794]: I0310 10:05:56.524766 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3dc6f544-8e2f-4322-bbff-968278be4ce4","Type":"ContainerStarted","Data":"5f68b3713ee70617c0ff70834850b2f3c67d5f678e67a3e1320e9ec704604e9d"} Mar 10 10:05:56 crc kubenswrapper[4794]: I0310 10:05:56.534913 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" event={"ID":"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc","Type":"ContainerStarted","Data":"11afbcbf2e1ea029e0946f9cfb8d80bf8c8658a761240173341846934dc9c6c5"} Mar 10 10:05:56 crc kubenswrapper[4794]: I0310 10:05:56.535408 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:05:56 crc kubenswrapper[4794]: I0310 10:05:56.540118 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a3c438b7-94dd-4c8b-a360-7d15a4cc5574","Type":"ContainerStarted","Data":"d845c380b80620a17bbd0643c40c5a7e9ffe41c8cafd50e97f42affa0549719e"} Mar 10 10:05:56 crc kubenswrapper[4794]: I0310 10:05:56.563216 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" podStartSLOduration=3.563194588 podStartE2EDuration="3.563194588s" 
podCreationTimestamp="2026-03-10 10:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:05:56.557640702 +0000 UTC m=+1305.313811520" watchObservedRunningTime="2026-03-10 10:05:56.563194588 +0000 UTC m=+1305.319365406" Mar 10 10:05:57 crc kubenswrapper[4794]: I0310 10:05:57.554554 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3dc6f544-8e2f-4322-bbff-968278be4ce4","Type":"ContainerStarted","Data":"ea6cf5e160274f28f608753fb25272872be1c23ae6c7d921e630fbe91e0e1a2c"} Mar 10 10:05:57 crc kubenswrapper[4794]: I0310 10:05:57.554626 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3dc6f544-8e2f-4322-bbff-968278be4ce4" containerName="glance-log" containerID="cri-o://a7cbfbe6a35be0b6f9a28390aa2592275e86a7af9d0d985168c86608a6a223e2" gracePeriod=30 Mar 10 10:05:57 crc kubenswrapper[4794]: I0310 10:05:57.554820 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3dc6f544-8e2f-4322-bbff-968278be4ce4" containerName="glance-httpd" containerID="cri-o://ea6cf5e160274f28f608753fb25272872be1c23ae6c7d921e630fbe91e0e1a2c" gracePeriod=30 Mar 10 10:05:57 crc kubenswrapper[4794]: I0310 10:05:57.560889 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a3c438b7-94dd-4c8b-a360-7d15a4cc5574" containerName="glance-log" containerID="cri-o://d845c380b80620a17bbd0643c40c5a7e9ffe41c8cafd50e97f42affa0549719e" gracePeriod=30 Mar 10 10:05:57 crc kubenswrapper[4794]: I0310 10:05:57.560973 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a3c438b7-94dd-4c8b-a360-7d15a4cc5574","Type":"ContainerStarted","Data":"eb956f9a8b192b4ab0fa534df486454443df48b244ccc2dc5ba908d9cdc4d94f"} Mar 10 10:05:57 crc kubenswrapper[4794]: I0310 10:05:57.561023 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a3c438b7-94dd-4c8b-a360-7d15a4cc5574" containerName="glance-httpd" containerID="cri-o://eb956f9a8b192b4ab0fa534df486454443df48b244ccc2dc5ba908d9cdc4d94f" gracePeriod=30 Mar 10 10:05:57 crc kubenswrapper[4794]: I0310 10:05:57.577865 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.577849294 podStartE2EDuration="5.577849294s" podCreationTimestamp="2026-03-10 10:05:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:05:57.575568592 +0000 UTC m=+1306.331739410" watchObservedRunningTime="2026-03-10 10:05:57.577849294 +0000 UTC m=+1306.334020112" Mar 10 10:05:57 crc kubenswrapper[4794]: I0310 10:05:57.596680 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.596661658 podStartE2EDuration="5.596661658s" podCreationTimestamp="2026-03-10 10:05:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:05:57.596459652 +0000 UTC m=+1306.352630480" watchObservedRunningTime="2026-03-10 10:05:57.596661658 +0000 UTC m=+1306.352832476" Mar 10 10:05:58 
crc kubenswrapper[4794]: I0310 10:05:58.571004 4794 generic.go:334] "Generic (PLEG): container finished" podID="a3c438b7-94dd-4c8b-a360-7d15a4cc5574" containerID="eb956f9a8b192b4ab0fa534df486454443df48b244ccc2dc5ba908d9cdc4d94f" exitCode=0 Mar 10 10:05:58 crc kubenswrapper[4794]: I0310 10:05:58.571406 4794 generic.go:334] "Generic (PLEG): container finished" podID="a3c438b7-94dd-4c8b-a360-7d15a4cc5574" containerID="d845c380b80620a17bbd0643c40c5a7e9ffe41c8cafd50e97f42affa0549719e" exitCode=143 Mar 10 10:05:58 crc kubenswrapper[4794]: I0310 10:05:58.571363 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a3c438b7-94dd-4c8b-a360-7d15a4cc5574","Type":"ContainerDied","Data":"eb956f9a8b192b4ab0fa534df486454443df48b244ccc2dc5ba908d9cdc4d94f"} Mar 10 10:05:58 crc kubenswrapper[4794]: I0310 10:05:58.571465 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a3c438b7-94dd-4c8b-a360-7d15a4cc5574","Type":"ContainerDied","Data":"d845c380b80620a17bbd0643c40c5a7e9ffe41c8cafd50e97f42affa0549719e"} Mar 10 10:05:58 crc kubenswrapper[4794]: I0310 10:05:58.573500 4794 generic.go:334] "Generic (PLEG): container finished" podID="d871867e-0fb5-48df-9d55-f19a63d160ea" containerID="9a8f466fe0f05d67d968612dd2a52437cb959e872211abcc0038fb9ddc671c06" exitCode=0 Mar 10 10:05:58 crc kubenswrapper[4794]: I0310 10:05:58.573521 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qw4lv" event={"ID":"d871867e-0fb5-48df-9d55-f19a63d160ea","Type":"ContainerDied","Data":"9a8f466fe0f05d67d968612dd2a52437cb959e872211abcc0038fb9ddc671c06"} Mar 10 10:05:58 crc kubenswrapper[4794]: I0310 10:05:58.576399 4794 generic.go:334] "Generic (PLEG): container finished" podID="3dc6f544-8e2f-4322-bbff-968278be4ce4" containerID="ea6cf5e160274f28f608753fb25272872be1c23ae6c7d921e630fbe91e0e1a2c" exitCode=0 Mar 10 10:05:58 crc kubenswrapper[4794]: I0310 10:05:58.576418 4794 generic.go:334] "Generic (PLEG): container finished" podID="3dc6f544-8e2f-4322-bbff-968278be4ce4" containerID="a7cbfbe6a35be0b6f9a28390aa2592275e86a7af9d0d985168c86608a6a223e2" exitCode=143 Mar 10 10:05:58 crc kubenswrapper[4794]: I0310 10:05:58.576436 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3dc6f544-8e2f-4322-bbff-968278be4ce4","Type":"ContainerDied","Data":"ea6cf5e160274f28f608753fb25272872be1c23ae6c7d921e630fbe91e0e1a2c"} Mar 10 10:05:58 crc kubenswrapper[4794]: I0310 10:05:58.576454 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3dc6f544-8e2f-4322-bbff-968278be4ce4","Type":"ContainerDied","Data":"a7cbfbe6a35be0b6f9a28390aa2592275e86a7af9d0d985168c86608a6a223e2"} Mar 10 10:06:00 crc kubenswrapper[4794]: I0310 10:06:00.123901 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552286-6rbvj"] Mar 10 10:06:00 crc kubenswrapper[4794]: E0310 10:06:00.125627 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af99ed8-10a0-4bc5-8561-1870c4ca7eb2" containerName="init" Mar 10 10:06:00 crc kubenswrapper[4794]: I0310 10:06:00.125653 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af99ed8-10a0-4bc5-8561-1870c4ca7eb2" containerName="init" Mar 10 10:06:00 crc kubenswrapper[4794]: I0310 10:06:00.125821 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af99ed8-10a0-4bc5-8561-1870c4ca7eb2" 
containerName="init" Mar 10 10:06:00 crc kubenswrapper[4794]: I0310 10:06:00.126336 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552286-6rbvj" Mar 10 10:06:00 crc kubenswrapper[4794]: I0310 10:06:00.130158 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:06:00 crc kubenswrapper[4794]: I0310 10:06:00.130519 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:06:00 crc kubenswrapper[4794]: I0310 10:06:00.132938 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:06:00 crc kubenswrapper[4794]: I0310 10:06:00.139395 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552286-6rbvj"] Mar 10 10:06:00 crc kubenswrapper[4794]: I0310 10:06:00.248624 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8whm\" (UniqueName: \"kubernetes.io/projected/f9689ae4-36b4-41d8-b75d-805a57b17041-kube-api-access-q8whm\") pod \"auto-csr-approver-29552286-6rbvj\" (UID: \"f9689ae4-36b4-41d8-b75d-805a57b17041\") " pod="openshift-infra/auto-csr-approver-29552286-6rbvj" Mar 10 10:06:00 crc kubenswrapper[4794]: I0310 10:06:00.350855 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8whm\" (UniqueName: \"kubernetes.io/projected/f9689ae4-36b4-41d8-b75d-805a57b17041-kube-api-access-q8whm\") pod \"auto-csr-approver-29552286-6rbvj\" (UID: \"f9689ae4-36b4-41d8-b75d-805a57b17041\") " pod="openshift-infra/auto-csr-approver-29552286-6rbvj" Mar 10 10:06:00 crc kubenswrapper[4794]: I0310 10:06:00.371291 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8whm\" (UniqueName: \"kubernetes.io/projected/f9689ae4-36b4-41d8-b75d-805a57b17041-kube-api-access-q8whm\") pod \"auto-csr-approver-29552286-6rbvj\" (UID: \"f9689ae4-36b4-41d8-b75d-805a57b17041\") " pod="openshift-infra/auto-csr-approver-29552286-6rbvj" Mar 10 10:06:00 crc kubenswrapper[4794]: I0310 10:06:00.453986 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552286-6rbvj" Mar 10 10:06:03 crc kubenswrapper[4794]: I0310 10:06:03.506499 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:06:03 crc kubenswrapper[4794]: I0310 10:06:03.599319 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd445f5bc-9njwb"] Mar 10 10:06:03 crc kubenswrapper[4794]: I0310 10:06:03.599601 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" podUID="69fccd5f-c6ca-4228-81ba-ee7ae103876a" containerName="dnsmasq-dns" containerID="cri-o://83ea99766083c37e566fe16c905e9d98e5f40860228f0197c875205a8e8c07b7" gracePeriod=10 Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.473516 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.483106 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.500715 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.546977 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-credential-keys\") pod \"d871867e-0fb5-48df-9d55-f19a63d160ea\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.547040 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vzsb\" (UniqueName: \"kubernetes.io/projected/d871867e-0fb5-48df-9d55-f19a63d160ea-kube-api-access-5vzsb\") pod \"d871867e-0fb5-48df-9d55-f19a63d160ea\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.547115 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-fernet-keys\") pod \"d871867e-0fb5-48df-9d55-f19a63d160ea\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.547159 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-config-data\") pod \"d871867e-0fb5-48df-9d55-f19a63d160ea\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.547228 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-scripts\") pod \"d871867e-0fb5-48df-9d55-f19a63d160ea\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.547286 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-combined-ca-bundle\") pod \"d871867e-0fb5-48df-9d55-f19a63d160ea\" (UID: \"d871867e-0fb5-48df-9d55-f19a63d160ea\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.559242 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d871867e-0fb5-48df-9d55-f19a63d160ea" (UID: "d871867e-0fb5-48df-9d55-f19a63d160ea"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.559267 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d871867e-0fb5-48df-9d55-f19a63d160ea-kube-api-access-5vzsb" (OuterVolumeSpecName: "kube-api-access-5vzsb") pod "d871867e-0fb5-48df-9d55-f19a63d160ea" (UID: "d871867e-0fb5-48df-9d55-f19a63d160ea"). InnerVolumeSpecName "kube-api-access-5vzsb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.565574 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-scripts" (OuterVolumeSpecName: "scripts") pod "d871867e-0fb5-48df-9d55-f19a63d160ea" (UID: "d871867e-0fb5-48df-9d55-f19a63d160ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.566996 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d871867e-0fb5-48df-9d55-f19a63d160ea" (UID: "d871867e-0fb5-48df-9d55-f19a63d160ea"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.575638 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d871867e-0fb5-48df-9d55-f19a63d160ea" (UID: "d871867e-0fb5-48df-9d55-f19a63d160ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.640632 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-config-data" (OuterVolumeSpecName: "config-data") pod "d871867e-0fb5-48df-9d55-f19a63d160ea" (UID: "d871867e-0fb5-48df-9d55-f19a63d160ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.648119 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.648298 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a3c438b7-94dd-4c8b-a360-7d15a4cc5574","Type":"ContainerDied","Data":"85495a6e1562a3c51041a05b2902ff7d7af667d7e398087743556918c08bd265"} Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.648388 4794 scope.go:117] "RemoveContainer" containerID="eb956f9a8b192b4ab0fa534df486454443df48b244ccc2dc5ba908d9cdc4d94f" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.653663 4794 generic.go:334] "Generic (PLEG): container finished" podID="69fccd5f-c6ca-4228-81ba-ee7ae103876a" containerID="83ea99766083c37e566fe16c905e9d98e5f40860228f0197c875205a8e8c07b7" exitCode=0 Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.653727 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" event={"ID":"69fccd5f-c6ca-4228-81ba-ee7ae103876a","Type":"ContainerDied","Data":"83ea99766083c37e566fe16c905e9d98e5f40860228f0197c875205a8e8c07b7"} Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.655463 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qw4lv" event={"ID":"d871867e-0fb5-48df-9d55-f19a63d160ea","Type":"ContainerDied","Data":"1b013488e2eca54120f64eb5c09f49a9105a58e44d9cb8be283000474063fc18"} Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.655487 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b013488e2eca54120f64eb5c09f49a9105a58e44d9cb8be283000474063fc18" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.655534 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qw4lv" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.661830 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-combined-ca-bundle\") pod \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.661976 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-combined-ca-bundle\") pod \"3dc6f544-8e2f-4322-bbff-968278be4ce4\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662016 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbzqm\" (UniqueName: \"kubernetes.io/projected/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-kube-api-access-qbzqm\") pod \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662040 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-public-tls-certs\") pod \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662127 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-scripts\") pod 
\"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662175 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dc6f544-8e2f-4322-bbff-968278be4ce4-httpd-run\") pod \"3dc6f544-8e2f-4322-bbff-968278be4ce4\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662198 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"3dc6f544-8e2f-4322-bbff-968278be4ce4\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662218 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-httpd-run\") pod \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662241 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-scripts\") pod \"3dc6f544-8e2f-4322-bbff-968278be4ce4\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662264 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-internal-tls-certs\") pod \"3dc6f544-8e2f-4322-bbff-968278be4ce4\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662300 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-config-data\") pod \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662332 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dc6f544-8e2f-4322-bbff-968278be4ce4-logs\") pod \"3dc6f544-8e2f-4322-bbff-968278be4ce4\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662380 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662434 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-logs\") pod \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\" (UID: \"a3c438b7-94dd-4c8b-a360-7d15a4cc5574\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662462 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpq8f\" (UniqueName: \"kubernetes.io/projected/3dc6f544-8e2f-4322-bbff-968278be4ce4-kube-api-access-hpq8f\") pod \"3dc6f544-8e2f-4322-bbff-968278be4ce4\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662488 
4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-config-data\") pod \"3dc6f544-8e2f-4322-bbff-968278be4ce4\" (UID: \"3dc6f544-8e2f-4322-bbff-968278be4ce4\") " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662961 4794 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662984 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.662995 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.663009 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.663071 4794 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d871867e-0fb5-48df-9d55-f19a63d160ea-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.663082 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vzsb\" (UniqueName: \"kubernetes.io/projected/d871867e-0fb5-48df-9d55-f19a63d160ea-kube-api-access-5vzsb\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.664216 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dc6f544-8e2f-4322-bbff-968278be4ce4-logs" (OuterVolumeSpecName: "logs") pod "3dc6f544-8e2f-4322-bbff-968278be4ce4" (UID: "3dc6f544-8e2f-4322-bbff-968278be4ce4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.664846 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a3c438b7-94dd-4c8b-a360-7d15a4cc5574" (UID: "a3c438b7-94dd-4c8b-a360-7d15a4cc5574"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.665265 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-logs" (OuterVolumeSpecName: "logs") pod "a3c438b7-94dd-4c8b-a360-7d15a4cc5574" (UID: "a3c438b7-94dd-4c8b-a360-7d15a4cc5574"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.668237 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dc6f544-8e2f-4322-bbff-968278be4ce4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3dc6f544-8e2f-4322-bbff-968278be4ce4" (UID: "3dc6f544-8e2f-4322-bbff-968278be4ce4"). InnerVolumeSpecName "httpd-run". 
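
The entries above trace kubelet's three-phase volume teardown: the reconciler marks each mount with "UnmountVolume started" (reconciler_common.go:159), the operation executor confirms the per-pod unmount with "UnmountVolume.TearDown succeeded" (operation_generator.go:803), and the reconciler finally records "Volume detached ... DevicePath \"\"" once the actual state of world is updated (reconciler_common.go:293). Below is a minimal sketch for checking that every volume completes the sequence; the file name kubelet.log is a hypothetical, assuming the journal has been saved locally with one entry per line.

```python
#!/usr/bin/env python3
"""Sketch: follow the three-phase volume teardown visible in this journal.

Assumptions not taken from the log itself: the journal is saved to
./kubelet.log with one entry per line, and matching on the literal
message strings is sufficient. Volumes are keyed by (pod UID, name).
"""
import re
import sys
from collections import defaultdict

PHASES = {
    "started":   re.compile(r'UnmountVolume started for volume \\?"(?P<vol>[^"\\]+)\\?".*?UID: \\?"(?P<uid>[0-9a-f-]+)'),
    "torn-down": re.compile(r'UnmountVolume\.TearDown succeeded for volume .*?\(OuterVolumeSpecName: "(?P<vol>[^"]+)"\) pod "[^"]*" \(UID: "(?P<uid>[0-9a-f-]+)"\)'),
    "detached":  re.compile(r'Volume detached for volume \\?"(?P<vol>[^"\\]+)\\?" \(UniqueName: \\?"[^"\\]*/(?P<uid>[0-9a-f-]+)-'),
}

def main(path: str = "kubelet.log") -> None:
    seen = defaultdict(set)  # (pod UID, volume name) -> phases observed
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            for phase, rx in PHASES.items():
                if m := rx.search(line):
                    seen[(m["uid"], m["vol"])].add(phase)
    # Caveat: local-volume PVs detach under the PV name (local-storage03/05-crc
    # here), not the pod-facing name, so those rows will show "detached" as
    # missing; a fuller tool would join the phases on UniqueName instead.
    for (uid, vol), phases in sorted(seen.items()):
        missing = {"started", "torn-down", "detached"} - phases
        state = "complete" if not missing else "missing: " + ", ".join(sorted(missing))
        print(f"pod {uid} volume {vol}: {state}")

if __name__ == "__main__":
    main(*sys.argv[1:])
```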
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.670162 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "a3c438b7-94dd-4c8b-a360-7d15a4cc5574" (UID: "a3c438b7-94dd-4c8b-a360-7d15a4cc5574"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.671952 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-scripts" (OuterVolumeSpecName: "scripts") pod "a3c438b7-94dd-4c8b-a360-7d15a4cc5574" (UID: "a3c438b7-94dd-4c8b-a360-7d15a4cc5574"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.673909 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dc6f544-8e2f-4322-bbff-968278be4ce4-kube-api-access-hpq8f" (OuterVolumeSpecName: "kube-api-access-hpq8f") pod "3dc6f544-8e2f-4322-bbff-968278be4ce4" (UID: "3dc6f544-8e2f-4322-bbff-968278be4ce4"). InnerVolumeSpecName "kube-api-access-hpq8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.677033 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-kube-api-access-qbzqm" (OuterVolumeSpecName: "kube-api-access-qbzqm") pod "a3c438b7-94dd-4c8b-a360-7d15a4cc5574" (UID: "a3c438b7-94dd-4c8b-a360-7d15a4cc5574"). InnerVolumeSpecName "kube-api-access-qbzqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.678368 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "3dc6f544-8e2f-4322-bbff-968278be4ce4" (UID: "3dc6f544-8e2f-4322-bbff-968278be4ce4"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.682405 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.682330 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3dc6f544-8e2f-4322-bbff-968278be4ce4","Type":"ContainerDied","Data":"5f68b3713ee70617c0ff70834850b2f3c67d5f678e67a3e1320e9ec704604e9d"} Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.684562 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-scripts" (OuterVolumeSpecName: "scripts") pod "3dc6f544-8e2f-4322-bbff-968278be4ce4" (UID: "3dc6f544-8e2f-4322-bbff-968278be4ce4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.702603 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dc6f544-8e2f-4322-bbff-968278be4ce4" (UID: "3dc6f544-8e2f-4322-bbff-968278be4ce4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.717638 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3dc6f544-8e2f-4322-bbff-968278be4ce4" (UID: "3dc6f544-8e2f-4322-bbff-968278be4ce4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.721219 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a3c438b7-94dd-4c8b-a360-7d15a4cc5574" (UID: "a3c438b7-94dd-4c8b-a360-7d15a4cc5574"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.727650 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-config-data" (OuterVolumeSpecName: "config-data") pod "3dc6f544-8e2f-4322-bbff-968278be4ce4" (UID: "3dc6f544-8e2f-4322-bbff-968278be4ce4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.733257 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3c438b7-94dd-4c8b-a360-7d15a4cc5574" (UID: "a3c438b7-94dd-4c8b-a360-7d15a4cc5574"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.761224 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-config-data" (OuterVolumeSpecName: "config-data") pod "a3c438b7-94dd-4c8b-a360-7d15a4cc5574" (UID: "a3c438b7-94dd-4c8b-a360-7d15a4cc5574"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764052 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764080 4794 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dc6f544-8e2f-4322-bbff-968278be4ce4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764115 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764129 4794 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764142 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764152 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764163 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764172 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dc6f544-8e2f-4322-bbff-968278be4ce4-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764185 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764195 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764204 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpq8f\" (UniqueName: \"kubernetes.io/projected/3dc6f544-8e2f-4322-bbff-968278be4ce4-kube-api-access-hpq8f\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764212 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764222 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764231 4794 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-qbzqm\" (UniqueName: \"kubernetes.io/projected/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-kube-api-access-qbzqm\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764239 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc6f544-8e2f-4322-bbff-968278be4ce4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.764246 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3c438b7-94dd-4c8b-a360-7d15a4cc5574-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.781322 4794 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.786629 4794 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.865373 4794 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.865408 4794 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.986805 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 10:06:04 crc kubenswrapper[4794]: I0310 10:06:04.992776 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.031505 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 10:06:05 crc kubenswrapper[4794]: E0310 10:06:05.031980 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c438b7-94dd-4c8b-a360-7d15a4cc5574" containerName="glance-httpd" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.031998 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c438b7-94dd-4c8b-a360-7d15a4cc5574" containerName="glance-httpd" Mar 10 10:06:05 crc kubenswrapper[4794]: E0310 10:06:05.032021 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc6f544-8e2f-4322-bbff-968278be4ce4" containerName="glance-httpd" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.032028 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc6f544-8e2f-4322-bbff-968278be4ce4" containerName="glance-httpd" Mar 10 10:06:05 crc kubenswrapper[4794]: E0310 10:06:05.032040 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d871867e-0fb5-48df-9d55-f19a63d160ea" containerName="keystone-bootstrap" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.032048 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d871867e-0fb5-48df-9d55-f19a63d160ea" containerName="keystone-bootstrap" Mar 10 10:06:05 crc kubenswrapper[4794]: E0310 10:06:05.032066 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c438b7-94dd-4c8b-a360-7d15a4cc5574" containerName="glance-log" Mar 10 10:06:05 crc 
kubenswrapper[4794]: I0310 10:06:05.032073 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c438b7-94dd-4c8b-a360-7d15a4cc5574" containerName="glance-log" Mar 10 10:06:05 crc kubenswrapper[4794]: E0310 10:06:05.032092 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc6f544-8e2f-4322-bbff-968278be4ce4" containerName="glance-log" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.032100 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc6f544-8e2f-4322-bbff-968278be4ce4" containerName="glance-log" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.032323 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d871867e-0fb5-48df-9d55-f19a63d160ea" containerName="keystone-bootstrap" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.032412 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c438b7-94dd-4c8b-a360-7d15a4cc5574" containerName="glance-log" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.032428 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc6f544-8e2f-4322-bbff-968278be4ce4" containerName="glance-log" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.032440 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc6f544-8e2f-4322-bbff-968278be4ce4" containerName="glance-httpd" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.032459 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c438b7-94dd-4c8b-a360-7d15a4cc5574" containerName="glance-httpd" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.033497 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.048518 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.071900 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.077739 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.078119 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.078156 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v9fxn" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.078161 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.097556 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.119678 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.121630 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.124250 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.125193 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.135648 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.174160 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.174288 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.174386 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/325754ce-6381-4bb4-9102-04933c1a928b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.174432 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.174451 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-config-data\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.174504 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-scripts\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.174519 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325754ce-6381-4bb4-9102-04933c1a928b-logs\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.175677 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qq7dd\" (UniqueName: \"kubernetes.io/projected/325754ce-6381-4bb4-9102-04933c1a928b-kube-api-access-qq7dd\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277192 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-config-data\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277253 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277285 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34993523-76a5-426f-a8bb-14466731fd21-logs\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277308 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-scripts\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277333 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325754ce-6381-4bb4-9102-04933c1a928b-logs\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277398 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq7dd\" (UniqueName: \"kubernetes.io/projected/325754ce-6381-4bb4-9102-04933c1a928b-kube-api-access-qq7dd\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277440 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277470 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277497 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bsbtw\" (UniqueName: \"kubernetes.io/projected/34993523-76a5-426f-a8bb-14466731fd21-kube-api-access-bsbtw\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277523 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-scripts\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277542 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277576 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277604 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34993523-76a5-426f-a8bb-14466731fd21-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277624 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277655 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/325754ce-6381-4bb4-9102-04933c1a928b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277678 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.277960 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.278791 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/325754ce-6381-4bb4-9102-04933c1a928b-logs\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.278884 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/325754ce-6381-4bb4-9102-04933c1a928b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.287703 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-scripts\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.287724 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.293040 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.296504 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-config-data\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.316082 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq7dd\" (UniqueName: \"kubernetes.io/projected/325754ce-6381-4bb4-9102-04933c1a928b-kube-api-access-qq7dd\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.346636 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.382904 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.382976 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.383017 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsbtw\" (UniqueName: \"kubernetes.io/projected/34993523-76a5-426f-a8bb-14466731fd21-kube-api-access-bsbtw\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.383049 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-scripts\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.383118 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34993523-76a5-426f-a8bb-14466731fd21-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.383145 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.383187 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.383209 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34993523-76a5-426f-a8bb-14466731fd21-logs\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.383783 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34993523-76a5-426f-a8bb-14466731fd21-logs\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.385601 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.387535 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34993523-76a5-426f-a8bb-14466731fd21-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 
10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.394072 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.396755 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.398482 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-scripts\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.403678 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.404777 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.406859 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsbtw\" (UniqueName: \"kubernetes.io/projected/34993523-76a5-426f-a8bb-14466731fd21-kube-api-access-bsbtw\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.426526 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.456961 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.646172 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qw4lv"] Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.653372 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qw4lv"] Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.752733 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-sz7rz"] Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.754190 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.756908 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.757070 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.757108 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.757447 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4svgf" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.758416 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.768219 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sz7rz"] Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.891246 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-combined-ca-bundle\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.891561 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-credential-keys\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.891609 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9686\" (UniqueName: \"kubernetes.io/projected/74a5421b-362d-437b-98ce-c11e44a2e6f0-kube-api-access-r9686\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.891634 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-scripts\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.891964 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-fernet-keys\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.892168 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-config-data\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.993811 4794 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-fernet-keys\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.993884 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-config-data\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.993928 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-combined-ca-bundle\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.993953 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-credential-keys\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.993999 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9686\" (UniqueName: \"kubernetes.io/projected/74a5421b-362d-437b-98ce-c11e44a2e6f0-kube-api-access-r9686\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:05 crc kubenswrapper[4794]: I0310 10:06:05.994023 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-scripts\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:06 crc kubenswrapper[4794]: I0310 10:06:06.003517 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-scripts\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:06 crc kubenswrapper[4794]: I0310 10:06:06.003726 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-combined-ca-bundle\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:06 crc kubenswrapper[4794]: I0310 10:06:06.004061 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-config-data\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:06 crc kubenswrapper[4794]: I0310 10:06:06.004986 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-credential-keys\") pod \"keystone-bootstrap-sz7rz\" (UID: 
\"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:06 crc kubenswrapper[4794]: I0310 10:06:06.009590 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-fernet-keys\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:06 crc kubenswrapper[4794]: I0310 10:06:06.012247 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9686\" (UniqueName: \"kubernetes.io/projected/74a5421b-362d-437b-98ce-c11e44a2e6f0-kube-api-access-r9686\") pod \"keystone-bootstrap-sz7rz\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:06 crc kubenswrapper[4794]: I0310 10:06:06.015622 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dc6f544-8e2f-4322-bbff-968278be4ce4" path="/var/lib/kubelet/pods/3dc6f544-8e2f-4322-bbff-968278be4ce4/volumes" Mar 10 10:06:06 crc kubenswrapper[4794]: I0310 10:06:06.018432 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c438b7-94dd-4c8b-a360-7d15a4cc5574" path="/var/lib/kubelet/pods/a3c438b7-94dd-4c8b-a360-7d15a4cc5574/volumes" Mar 10 10:06:06 crc kubenswrapper[4794]: I0310 10:06:06.019669 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d871867e-0fb5-48df-9d55-f19a63d160ea" path="/var/lib/kubelet/pods/d871867e-0fb5-48df-9d55-f19a63d160ea/volumes" Mar 10 10:06:06 crc kubenswrapper[4794]: I0310 10:06:06.070754 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:12 crc kubenswrapper[4794]: I0310 10:06:12.205867 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" podUID="69fccd5f-c6ca-4228-81ba-ee7ae103876a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Mar 10 10:06:12 crc kubenswrapper[4794]: I0310 10:06:12.751768 4794 generic.go:334] "Generic (PLEG): container finished" podID="b9423b3f-3f25-484c-aacc-c83d78c2f731" containerID="00ea922430a28d8905d03ab4683719c7f0ddb1de198f027cdcd8c2578d18e421" exitCode=0 Mar 10 10:06:12 crc kubenswrapper[4794]: I0310 10:06:12.751807 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hjnc2" event={"ID":"b9423b3f-3f25-484c-aacc-c83d78c2f731","Type":"ContainerDied","Data":"00ea922430a28d8905d03ab4683719c7f0ddb1de198f027cdcd8c2578d18e421"} Mar 10 10:06:13 crc kubenswrapper[4794]: E0310 10:06:13.476831 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d" Mar 10 10:06:13 crc kubenswrapper[4794]: E0310 10:06:13.476978 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zktwg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-c4b4g_openstack(22069ba2-0135-4559-9c7f-2d73ae0dd81a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 10:06:13 crc kubenswrapper[4794]: E0310 10:06:13.478262 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-c4b4g" podUID="22069ba2-0135-4559-9c7f-2d73ae0dd81a" Mar 10 10:06:13 crc kubenswrapper[4794]: E0310 10:06:13.760322 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d\\\"\"" pod="openstack/barbican-db-sync-c4b4g" podUID="22069ba2-0135-4559-9c7f-2d73ae0dd81a" Mar 10 10:06:13 crc kubenswrapper[4794]: I0310 10:06:13.894618 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552286-6rbvj"] Mar 10 10:06:14 crc kubenswrapper[4794]: W0310 10:06:14.733683 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9689ae4_36b4_41d8_b75d_805a57b17041.slice/crio-db11f739f3ec897eb3d136ae60edf781579a6ae9d6fd9016363bf2ba30310a74 WatchSource:0}: Error finding container db11f739f3ec897eb3d136ae60edf781579a6ae9d6fd9016363bf2ba30310a74: Status 404 returned error can't find the container with id db11f739f3ec897eb3d136ae60edf781579a6ae9d6fd9016363bf2ba30310a74 Mar 10 10:06:14 crc kubenswrapper[4794]: I0310 10:06:14.754153 4794 scope.go:117] "RemoveContainer" containerID="d845c380b80620a17bbd0643c40c5a7e9ffe41c8cafd50e97f42affa0549719e" Mar 10 10:06:14 crc kubenswrapper[4794]: E0310 10:06:14.762214 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context 
canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Mar 10 10:06:14 crc kubenswrapper[4794]: E0310 10:06:14.763281 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2pmml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-g7gmt_openstack(821b338a-8a20-4d93-8dfa-28727da3ecba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 10:06:14 crc kubenswrapper[4794]: E0310 10:06:14.764899 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-g7gmt" podUID="821b338a-8a20-4d93-8dfa-28727da3ecba" Mar 10 10:06:14 crc kubenswrapper[4794]: I0310 10:06:14.795418 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hjnc2" event={"ID":"b9423b3f-3f25-484c-aacc-c83d78c2f731","Type":"ContainerDied","Data":"2c7df2f4bf734c24efa56d36143ca2786fa004791f8a96791a2ea61a83d8b718"} Mar 
10 10:06:14 crc kubenswrapper[4794]: I0310 10:06:14.795678 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c7df2f4bf734c24efa56d36143ca2786fa004791f8a96791a2ea61a83d8b718" Mar 10 10:06:14 crc kubenswrapper[4794]: I0310 10:06:14.797449 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" event={"ID":"69fccd5f-c6ca-4228-81ba-ee7ae103876a","Type":"ContainerDied","Data":"9b08d65c991101e497d7f11e22c15b36bf6577f95e5100c52f75ed21a6d8a0f2"} Mar 10 10:06:14 crc kubenswrapper[4794]: I0310 10:06:14.797477 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b08d65c991101e497d7f11e22c15b36bf6577f95e5100c52f75ed21a6d8a0f2" Mar 10 10:06:14 crc kubenswrapper[4794]: I0310 10:06:14.798787 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552286-6rbvj" event={"ID":"f9689ae4-36b4-41d8-b75d-805a57b17041","Type":"ContainerStarted","Data":"db11f739f3ec897eb3d136ae60edf781579a6ae9d6fd9016363bf2ba30310a74"} Mar 10 10:06:14 crc kubenswrapper[4794]: I0310 10:06:14.977069 4794 scope.go:117] "RemoveContainer" containerID="ea6cf5e160274f28f608753fb25272872be1c23ae6c7d921e630fbe91e0e1a2c" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.106566 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.120113 4794 scope.go:117] "RemoveContainer" containerID="a7cbfbe6a35be0b6f9a28390aa2592275e86a7af9d0d985168c86608a6a223e2" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.129121 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hjnc2" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.155113 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7wns\" (UniqueName: \"kubernetes.io/projected/69fccd5f-c6ca-4228-81ba-ee7ae103876a-kube-api-access-w7wns\") pod \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.155167 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-ovsdbserver-sb\") pod \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.155211 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-dns-swift-storage-0\") pod \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.155280 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-config\") pod \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.155407 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-dns-svc\") pod \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\" (UID: 
\"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.155434 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-ovsdbserver-nb\") pod \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\" (UID: \"69fccd5f-c6ca-4228-81ba-ee7ae103876a\") " Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.174717 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69fccd5f-c6ca-4228-81ba-ee7ae103876a-kube-api-access-w7wns" (OuterVolumeSpecName: "kube-api-access-w7wns") pod "69fccd5f-c6ca-4228-81ba-ee7ae103876a" (UID: "69fccd5f-c6ca-4228-81ba-ee7ae103876a"). InnerVolumeSpecName "kube-api-access-w7wns". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.239061 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "69fccd5f-c6ca-4228-81ba-ee7ae103876a" (UID: "69fccd5f-c6ca-4228-81ba-ee7ae103876a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.244723 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69fccd5f-c6ca-4228-81ba-ee7ae103876a" (UID: "69fccd5f-c6ca-4228-81ba-ee7ae103876a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.246182 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "69fccd5f-c6ca-4228-81ba-ee7ae103876a" (UID: "69fccd5f-c6ca-4228-81ba-ee7ae103876a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.251090 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69fccd5f-c6ca-4228-81ba-ee7ae103876a" (UID: "69fccd5f-c6ca-4228-81ba-ee7ae103876a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.255869 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-config" (OuterVolumeSpecName: "config") pod "69fccd5f-c6ca-4228-81ba-ee7ae103876a" (UID: "69fccd5f-c6ca-4228-81ba-ee7ae103876a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.256949 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9423b3f-3f25-484c-aacc-c83d78c2f731-config\") pod \"b9423b3f-3f25-484c-aacc-c83d78c2f731\" (UID: \"b9423b3f-3f25-484c-aacc-c83d78c2f731\") " Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.257035 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9423b3f-3f25-484c-aacc-c83d78c2f731-combined-ca-bundle\") pod \"b9423b3f-3f25-484c-aacc-c83d78c2f731\" (UID: \"b9423b3f-3f25-484c-aacc-c83d78c2f731\") " Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.257222 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llmkn\" (UniqueName: \"kubernetes.io/projected/b9423b3f-3f25-484c-aacc-c83d78c2f731-kube-api-access-llmkn\") pod \"b9423b3f-3f25-484c-aacc-c83d78c2f731\" (UID: \"b9423b3f-3f25-484c-aacc-c83d78c2f731\") " Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.257719 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7wns\" (UniqueName: \"kubernetes.io/projected/69fccd5f-c6ca-4228-81ba-ee7ae103876a-kube-api-access-w7wns\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.257733 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.257743 4794 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.257751 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.257759 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.257767 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fccd5f-c6ca-4228-81ba-ee7ae103876a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.261406 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9423b3f-3f25-484c-aacc-c83d78c2f731-kube-api-access-llmkn" (OuterVolumeSpecName: "kube-api-access-llmkn") pod "b9423b3f-3f25-484c-aacc-c83d78c2f731" (UID: "b9423b3f-3f25-484c-aacc-c83d78c2f731"). InnerVolumeSpecName "kube-api-access-llmkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.264126 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sz7rz"] Mar 10 10:06:15 crc kubenswrapper[4794]: W0310 10:06:15.266550 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74a5421b_362d_437b_98ce_c11e44a2e6f0.slice/crio-ed735d8ea2cd1b7f855a538c37b457135c56eb28083c21334c689d4b13ccb3c4 WatchSource:0}: Error finding container ed735d8ea2cd1b7f855a538c37b457135c56eb28083c21334c689d4b13ccb3c4: Status 404 returned error can't find the container with id ed735d8ea2cd1b7f855a538c37b457135c56eb28083c21334c689d4b13ccb3c4 Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.273900 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.280926 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9423b3f-3f25-484c-aacc-c83d78c2f731-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9423b3f-3f25-484c-aacc-c83d78c2f731" (UID: "b9423b3f-3f25-484c-aacc-c83d78c2f731"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.281498 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9423b3f-3f25-484c-aacc-c83d78c2f731-config" (OuterVolumeSpecName: "config") pod "b9423b3f-3f25-484c-aacc-c83d78c2f731" (UID: "b9423b3f-3f25-484c-aacc-c83d78c2f731"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.358840 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9423b3f-3f25-484c-aacc-c83d78c2f731-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.358873 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llmkn\" (UniqueName: \"kubernetes.io/projected/b9423b3f-3f25-484c-aacc-c83d78c2f731-kube-api-access-llmkn\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.358882 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9423b3f-3f25-484c-aacc-c83d78c2f731-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.425222 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 10:06:15 crc kubenswrapper[4794]: W0310 10:06:15.429027 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34993523_76a5_426f_a8bb_14466731fd21.slice/crio-81de560fbdf656511469793bbc0079e4b031e9a1eaf3c0073a8d3b425a268416 WatchSource:0}: Error finding container 81de560fbdf656511469793bbc0079e4b031e9a1eaf3c0073a8d3b425a268416: Status 404 returned error can't find the container with id 81de560fbdf656511469793bbc0079e4b031e9a1eaf3c0073a8d3b425a268416 Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.817311 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v6v6k" 
event={"ID":"cb29878d-3dbb-496a-bd25-b6b4ff102b6f","Type":"ContainerStarted","Data":"12c0f08ec9fff9a449d27dcfe0c2a18494ac9de59ece4c141755f344ab722b72"} Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.819604 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34993523-76a5-426f-a8bb-14466731fd21","Type":"ContainerStarted","Data":"81de560fbdf656511469793bbc0079e4b031e9a1eaf3c0073a8d3b425a268416"} Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.822958 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f62c40-641b-409f-ab29-ba30c14de2d8","Type":"ContainerStarted","Data":"fbb0da4f7706c5131082ecadb64151d23fdfad08580f0036fc285ac4f92a3ab6"} Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.825778 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sz7rz" event={"ID":"74a5421b-362d-437b-98ce-c11e44a2e6f0","Type":"ContainerStarted","Data":"6afdec6836ae216dcd698ba4415c1d439445dabd433ed2a5a743f82c64c471d6"} Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.825850 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sz7rz" event={"ID":"74a5421b-362d-437b-98ce-c11e44a2e6f0","Type":"ContainerStarted","Data":"ed735d8ea2cd1b7f855a538c37b457135c56eb28083c21334c689d4b13ccb3c4"} Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.826033 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hjnc2" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.826731 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" Mar 10 10:06:15 crc kubenswrapper[4794]: E0310 10:06:15.827040 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-g7gmt" podUID="821b338a-8a20-4d93-8dfa-28727da3ecba" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.841174 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-v6v6k" podStartSLOduration=2.470400529 podStartE2EDuration="22.841152561s" podCreationTimestamp="2026-03-10 10:05:53 +0000 UTC" firstStartedPulling="2026-03-10 10:05:54.38325757 +0000 UTC m=+1303.139428388" lastFinishedPulling="2026-03-10 10:06:14.754009602 +0000 UTC m=+1323.510180420" observedRunningTime="2026-03-10 10:06:15.834041786 +0000 UTC m=+1324.590212614" watchObservedRunningTime="2026-03-10 10:06:15.841152561 +0000 UTC m=+1324.597323379" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.864065 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-sz7rz" podStartSLOduration=10.864044995 podStartE2EDuration="10.864044995s" podCreationTimestamp="2026-03-10 10:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:15.853623735 +0000 UTC m=+1324.609794553" watchObservedRunningTime="2026-03-10 10:06:15.864044995 +0000 UTC m=+1324.620215973" Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.891991 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7fd445f5bc-9njwb"] Mar 10 10:06:15 crc kubenswrapper[4794]: I0310 10:06:15.899256 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd445f5bc-9njwb"] Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.014713 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69fccd5f-c6ca-4228-81ba-ee7ae103876a" path="/var/lib/kubelet/pods/69fccd5f-c6ca-4228-81ba-ee7ae103876a/volumes" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.331260 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cbd95f657-9dw7b"] Mar 10 10:06:16 crc kubenswrapper[4794]: E0310 10:06:16.331613 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9423b3f-3f25-484c-aacc-c83d78c2f731" containerName="neutron-db-sync" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.331627 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9423b3f-3f25-484c-aacc-c83d78c2f731" containerName="neutron-db-sync" Mar 10 10:06:16 crc kubenswrapper[4794]: E0310 10:06:16.331647 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69fccd5f-c6ca-4228-81ba-ee7ae103876a" containerName="dnsmasq-dns" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.331653 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="69fccd5f-c6ca-4228-81ba-ee7ae103876a" containerName="dnsmasq-dns" Mar 10 10:06:16 crc kubenswrapper[4794]: E0310 10:06:16.331673 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69fccd5f-c6ca-4228-81ba-ee7ae103876a" containerName="init" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.331679 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="69fccd5f-c6ca-4228-81ba-ee7ae103876a" containerName="init" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.331824 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9423b3f-3f25-484c-aacc-c83d78c2f731" containerName="neutron-db-sync" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.331840 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="69fccd5f-c6ca-4228-81ba-ee7ae103876a" containerName="dnsmasq-dns" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.332632 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.368671 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cbd95f657-9dw7b"] Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.405269 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-config\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.405359 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-ovsdbserver-nb\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.405386 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-ovsdbserver-sb\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.405418 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-dns-swift-storage-0\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.405487 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwttg\" (UniqueName: \"kubernetes.io/projected/f89f48c1-02ee-42ca-8101-7b646136d21d-kube-api-access-zwttg\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.405507 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-dns-svc\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.488103 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-74b7765548-sk248"] Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.498723 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-74b7765548-sk248"] Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.498848 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.506864 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.506902 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.506985 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.507287 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8qmcc" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.507859 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwttg\" (UniqueName: \"kubernetes.io/projected/f89f48c1-02ee-42ca-8101-7b646136d21d-kube-api-access-zwttg\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.507891 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-dns-svc\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.507925 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-config\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.507957 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-ovsdbserver-nb\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.507980 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-ovsdbserver-sb\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.508007 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-dns-swift-storage-0\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.509185 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-dns-swift-storage-0\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.509293 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-ovsdbserver-nb\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.511120 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-ovsdbserver-sb\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.511257 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-config\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.511411 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-dns-svc\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.530398 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.546097 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwttg\" (UniqueName: \"kubernetes.io/projected/f89f48c1-02ee-42ca-8101-7b646136d21d-kube-api-access-zwttg\") pod \"dnsmasq-dns-6cbd95f657-9dw7b\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.613350 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pnzh\" (UniqueName: \"kubernetes.io/projected/652a5188-47b5-4235-8385-f9b9b1e3db2d-kube-api-access-4pnzh\") pod \"neutron-74b7765548-sk248\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.613617 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-combined-ca-bundle\") pod \"neutron-74b7765548-sk248\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.613652 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-ovndb-tls-certs\") pod \"neutron-74b7765548-sk248\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.613705 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-httpd-config\") pod \"neutron-74b7765548-sk248\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " 
pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.613791 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-config\") pod \"neutron-74b7765548-sk248\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.692997 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.718952 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-config\") pod \"neutron-74b7765548-sk248\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.719110 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pnzh\" (UniqueName: \"kubernetes.io/projected/652a5188-47b5-4235-8385-f9b9b1e3db2d-kube-api-access-4pnzh\") pod \"neutron-74b7765548-sk248\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.719212 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-combined-ca-bundle\") pod \"neutron-74b7765548-sk248\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.719251 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-ovndb-tls-certs\") pod \"neutron-74b7765548-sk248\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.719296 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-httpd-config\") pod \"neutron-74b7765548-sk248\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.723782 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-ovndb-tls-certs\") pod \"neutron-74b7765548-sk248\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.724866 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-combined-ca-bundle\") pod \"neutron-74b7765548-sk248\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.742962 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-httpd-config\") pod \"neutron-74b7765548-sk248\" (UID: 
\"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.743635 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-config\") pod \"neutron-74b7765548-sk248\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.749280 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pnzh\" (UniqueName: \"kubernetes.io/projected/652a5188-47b5-4235-8385-f9b9b1e3db2d-kube-api-access-4pnzh\") pod \"neutron-74b7765548-sk248\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.854825 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.855980 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552286-6rbvj" event={"ID":"f9689ae4-36b4-41d8-b75d-805a57b17041","Type":"ContainerStarted","Data":"44d4d4f0866a0ad74a78bd1b8960fddbed29b1a1d629ac2ff1cc5f7193fc557c"} Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.869611 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34993523-76a5-426f-a8bb-14466731fd21","Type":"ContainerStarted","Data":"6f74627ee5225f35dce01ac902c6a0fb721d52abe0c763bef1037f66809b4d95"} Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.869668 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34993523-76a5-426f-a8bb-14466731fd21","Type":"ContainerStarted","Data":"dfe796b758d9be52c77e22c0da16ab79aedbbd54085aff9d8b415d2185a274e8"} Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.872508 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552286-6rbvj" podStartSLOduration=15.725626807 podStartE2EDuration="16.872488855s" podCreationTimestamp="2026-03-10 10:06:00 +0000 UTC" firstStartedPulling="2026-03-10 10:06:14.76849783 +0000 UTC m=+1323.524668688" lastFinishedPulling="2026-03-10 10:06:15.915359928 +0000 UTC m=+1324.671530736" observedRunningTime="2026-03-10 10:06:16.868603713 +0000 UTC m=+1325.624774541" watchObservedRunningTime="2026-03-10 10:06:16.872488855 +0000 UTC m=+1325.628659673" Mar 10 10:06:16 crc kubenswrapper[4794]: I0310 10:06:16.900187 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=11.90016512 podStartE2EDuration="11.90016512s" podCreationTimestamp="2026-03-10 10:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:16.891533607 +0000 UTC m=+1325.647704425" watchObservedRunningTime="2026-03-10 10:06:16.90016512 +0000 UTC m=+1325.656335938" Mar 10 10:06:17 crc kubenswrapper[4794]: I0310 10:06:17.207369 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7fd445f5bc-9njwb" podUID="69fccd5f-c6ca-4228-81ba-ee7ae103876a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Mar 10 10:06:17 crc kubenswrapper[4794]: I0310 10:06:17.885457 4794 
generic.go:334] "Generic (PLEG): container finished" podID="f9689ae4-36b4-41d8-b75d-805a57b17041" containerID="44d4d4f0866a0ad74a78bd1b8960fddbed29b1a1d629ac2ff1cc5f7193fc557c" exitCode=0 Mar 10 10:06:17 crc kubenswrapper[4794]: I0310 10:06:17.885530 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552286-6rbvj" event={"ID":"f9689ae4-36b4-41d8-b75d-805a57b17041","Type":"ContainerDied","Data":"44d4d4f0866a0ad74a78bd1b8960fddbed29b1a1d629ac2ff1cc5f7193fc557c"} Mar 10 10:06:18 crc kubenswrapper[4794]: I0310 10:06:18.535099 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cbd95f657-9dw7b"] Mar 10 10:06:18 crc kubenswrapper[4794]: W0310 10:06:18.547327 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf89f48c1_02ee_42ca_8101_7b646136d21d.slice/crio-a6a5519e42059f7273e00990315ebdbe2d6747c76dee3dcbc5b570330032211e WatchSource:0}: Error finding container a6a5519e42059f7273e00990315ebdbe2d6747c76dee3dcbc5b570330032211e: Status 404 returned error can't find the container with id a6a5519e42059f7273e00990315ebdbe2d6747c76dee3dcbc5b570330032211e Mar 10 10:06:18 crc kubenswrapper[4794]: I0310 10:06:18.617612 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-74b7765548-sk248"] Mar 10 10:06:18 crc kubenswrapper[4794]: W0310 10:06:18.630815 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod652a5188_47b5_4235_8385_f9b9b1e3db2d.slice/crio-08d58ce08fe9b44e0e6151d616ba08d669daab57fe94928bc53d816d8f894a67 WatchSource:0}: Error finding container 08d58ce08fe9b44e0e6151d616ba08d669daab57fe94928bc53d816d8f894a67: Status 404 returned error can't find the container with id 08d58ce08fe9b44e0e6151d616ba08d669daab57fe94928bc53d816d8f894a67 Mar 10 10:06:18 crc kubenswrapper[4794]: I0310 10:06:18.937054 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f62c40-641b-409f-ab29-ba30c14de2d8","Type":"ContainerStarted","Data":"3a9d616e4d6b008bb7cbf6d2bbc20e8de0074f1639e7627b35604fe2fd3d69a4"} Mar 10 10:06:18 crc kubenswrapper[4794]: I0310 10:06:18.943985 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"325754ce-6381-4bb4-9102-04933c1a928b","Type":"ContainerStarted","Data":"0383025111f0e6982d3ca9019cd29a06bc4484fc4b9cbeafc472f00480493c35"} Mar 10 10:06:18 crc kubenswrapper[4794]: I0310 10:06:18.944052 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"325754ce-6381-4bb4-9102-04933c1a928b","Type":"ContainerStarted","Data":"e5f1ca83860db413e4dbafdb64456ad83868bbe4106fa6d57fe111a3132e1ed1"} Mar 10 10:06:18 crc kubenswrapper[4794]: I0310 10:06:18.955638 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74b7765548-sk248" event={"ID":"652a5188-47b5-4235-8385-f9b9b1e3db2d","Type":"ContainerStarted","Data":"a0b9dcaf358ff946d205c9bc8e9a9040d43717b794c1e68a54a11e2a563b14f0"} Mar 10 10:06:18 crc kubenswrapper[4794]: I0310 10:06:18.955697 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74b7765548-sk248" event={"ID":"652a5188-47b5-4235-8385-f9b9b1e3db2d","Type":"ContainerStarted","Data":"08d58ce08fe9b44e0e6151d616ba08d669daab57fe94928bc53d816d8f894a67"} Mar 10 10:06:18 crc kubenswrapper[4794]: I0310 10:06:18.982404 4794 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/neutron-66d985f6f7-q8rt6"] Mar 10 10:06:18 crc kubenswrapper[4794]: I0310 10:06:18.983798 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:18 crc kubenswrapper[4794]: I0310 10:06:18.989395 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 10 10:06:18 crc kubenswrapper[4794]: I0310 10:06:18.989569 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 10 10:06:18 crc kubenswrapper[4794]: I0310 10:06:18.991529 4794 generic.go:334] "Generic (PLEG): container finished" podID="cb29878d-3dbb-496a-bd25-b6b4ff102b6f" containerID="12c0f08ec9fff9a449d27dcfe0c2a18494ac9de59ece4c141755f344ab722b72" exitCode=0 Mar 10 10:06:18 crc kubenswrapper[4794]: I0310 10:06:18.991592 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v6v6k" event={"ID":"cb29878d-3dbb-496a-bd25-b6b4ff102b6f","Type":"ContainerDied","Data":"12c0f08ec9fff9a449d27dcfe0c2a18494ac9de59ece4c141755f344ab722b72"} Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.020744 4794 generic.go:334] "Generic (PLEG): container finished" podID="f89f48c1-02ee-42ca-8101-7b646136d21d" containerID="2e0c219d86c961651a397743750de2afecc05a22f55337936f8aacb3fa39b919" exitCode=0 Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.021399 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" event={"ID":"f89f48c1-02ee-42ca-8101-7b646136d21d","Type":"ContainerDied","Data":"2e0c219d86c961651a397743750de2afecc05a22f55337936f8aacb3fa39b919"} Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.021450 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" event={"ID":"f89f48c1-02ee-42ca-8101-7b646136d21d","Type":"ContainerStarted","Data":"a6a5519e42059f7273e00990315ebdbe2d6747c76dee3dcbc5b570330032211e"} Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.022397 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66d985f6f7-q8rt6"] Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.078493 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-config\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.078594 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-ovndb-tls-certs\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.078636 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-internal-tls-certs\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.078711 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-combined-ca-bundle\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.078747 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-public-tls-certs\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.078778 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zb47\" (UniqueName: \"kubernetes.io/projected/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-kube-api-access-7zb47\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.078850 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-httpd-config\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.181316 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-ovndb-tls-certs\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.181633 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-internal-tls-certs\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.181721 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-combined-ca-bundle\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.181756 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-public-tls-certs\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.181781 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zb47\" (UniqueName: \"kubernetes.io/projected/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-kube-api-access-7zb47\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.181815 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-httpd-config\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.181868 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-config\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.188366 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-httpd-config\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.191071 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-combined-ca-bundle\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.191792 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-public-tls-certs\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.192759 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-config\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.194197 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-ovndb-tls-certs\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.196957 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-internal-tls-certs\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.215379 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zb47\" (UniqueName: \"kubernetes.io/projected/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-kube-api-access-7zb47\") pod \"neutron-66d985f6f7-q8rt6\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.229404 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.432101 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552286-6rbvj" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.591312 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8whm\" (UniqueName: \"kubernetes.io/projected/f9689ae4-36b4-41d8-b75d-805a57b17041-kube-api-access-q8whm\") pod \"f9689ae4-36b4-41d8-b75d-805a57b17041\" (UID: \"f9689ae4-36b4-41d8-b75d-805a57b17041\") " Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.597500 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9689ae4-36b4-41d8-b75d-805a57b17041-kube-api-access-q8whm" (OuterVolumeSpecName: "kube-api-access-q8whm") pod "f9689ae4-36b4-41d8-b75d-805a57b17041" (UID: "f9689ae4-36b4-41d8-b75d-805a57b17041"). InnerVolumeSpecName "kube-api-access-q8whm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.693542 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8whm\" (UniqueName: \"kubernetes.io/projected/f9689ae4-36b4-41d8-b75d-805a57b17041-kube-api-access-q8whm\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.849439 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66d985f6f7-q8rt6"] Mar 10 10:06:19 crc kubenswrapper[4794]: W0310 10:06:19.854562 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb66eadd9_dea4_4c3c_aa14_494f04e0e8c7.slice/crio-bc0ffbfa7e16f1c8c83c431b4d4c9d19414ccd076bab65ecbf1699a767bb2a37 WatchSource:0}: Error finding container bc0ffbfa7e16f1c8c83c431b4d4c9d19414ccd076bab65ecbf1699a767bb2a37: Status 404 returned error can't find the container with id bc0ffbfa7e16f1c8c83c431b4d4c9d19414ccd076bab65ecbf1699a767bb2a37 Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.942483 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552280-m8kkn"] Mar 10 10:06:19 crc kubenswrapper[4794]: I0310 10:06:19.951192 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552280-m8kkn"] Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.013200 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="449e1884-a4e4-4b83-b831-ddbc4f598eff" path="/var/lib/kubelet/pods/449e1884-a4e4-4b83-b831-ddbc4f598eff/volumes" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.082957 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552286-6rbvj" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.083254 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552286-6rbvj" event={"ID":"f9689ae4-36b4-41d8-b75d-805a57b17041","Type":"ContainerDied","Data":"db11f739f3ec897eb3d136ae60edf781579a6ae9d6fd9016363bf2ba30310a74"} Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.083408 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db11f739f3ec897eb3d136ae60edf781579a6ae9d6fd9016363bf2ba30310a74" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.096708 4794 generic.go:334] "Generic (PLEG): container finished" podID="74a5421b-362d-437b-98ce-c11e44a2e6f0" containerID="6afdec6836ae216dcd698ba4415c1d439445dabd433ed2a5a743f82c64c471d6" exitCode=0 Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.096791 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sz7rz" event={"ID":"74a5421b-362d-437b-98ce-c11e44a2e6f0","Type":"ContainerDied","Data":"6afdec6836ae216dcd698ba4415c1d439445dabd433ed2a5a743f82c64c471d6"} Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.107345 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"325754ce-6381-4bb4-9102-04933c1a928b","Type":"ContainerStarted","Data":"b35130c90243c66a164c26598446bab5071d3dcbebe9e12292cadd9f5f9536f4"} Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.110460 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74b7765548-sk248" event={"ID":"652a5188-47b5-4235-8385-f9b9b1e3db2d","Type":"ContainerStarted","Data":"c90cc14f7dbebc115d43b4687a53f79c2412e4e1ff879557986772034e76699c"} Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.111207 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.125889 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" event={"ID":"f89f48c1-02ee-42ca-8101-7b646136d21d","Type":"ContainerStarted","Data":"63e0788ae87d6fcca50d60481e254936b7034a5d8794646feff3dcf7869e3b87"} Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.125983 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.128530 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66d985f6f7-q8rt6" event={"ID":"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7","Type":"ContainerStarted","Data":"bc0ffbfa7e16f1c8c83c431b4d4c9d19414ccd076bab65ecbf1699a767bb2a37"} Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.149759 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=16.149731992 podStartE2EDuration="16.149731992s" podCreationTimestamp="2026-03-10 10:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:20.146177921 +0000 UTC m=+1328.902348749" watchObservedRunningTime="2026-03-10 10:06:20.149731992 +0000 UTC m=+1328.905902840" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.173251 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-74b7765548-sk248" 
podStartSLOduration=4.173232516 podStartE2EDuration="4.173232516s" podCreationTimestamp="2026-03-10 10:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:20.167818515 +0000 UTC m=+1328.923989343" watchObservedRunningTime="2026-03-10 10:06:20.173232516 +0000 UTC m=+1328.929403364" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.185833 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" podStartSLOduration=4.185815954 podStartE2EDuration="4.185815954s" podCreationTimestamp="2026-03-10 10:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:20.185614998 +0000 UTC m=+1328.941785816" watchObservedRunningTime="2026-03-10 10:06:20.185815954 +0000 UTC m=+1328.941986772" Mar 10 10:06:20 crc kubenswrapper[4794]: E0310 10:06:20.340573 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9689ae4_36b4_41d8_b75d_805a57b17041.slice\": RecentStats: unable to find data in memory cache]" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.445957 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v6v6k" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.509729 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-scripts\") pod \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.509836 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp5c2\" (UniqueName: \"kubernetes.io/projected/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-kube-api-access-zp5c2\") pod \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.509966 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-combined-ca-bundle\") pod \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.509997 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-logs\") pod \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.510041 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-config-data\") pod \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\" (UID: \"cb29878d-3dbb-496a-bd25-b6b4ff102b6f\") " Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.511158 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-logs" (OuterVolumeSpecName: "logs") pod "cb29878d-3dbb-496a-bd25-b6b4ff102b6f" (UID: "cb29878d-3dbb-496a-bd25-b6b4ff102b6f"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.511688 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.515413 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-kube-api-access-zp5c2" (OuterVolumeSpecName: "kube-api-access-zp5c2") pod "cb29878d-3dbb-496a-bd25-b6b4ff102b6f" (UID: "cb29878d-3dbb-496a-bd25-b6b4ff102b6f"). InnerVolumeSpecName "kube-api-access-zp5c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.515895 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-scripts" (OuterVolumeSpecName: "scripts") pod "cb29878d-3dbb-496a-bd25-b6b4ff102b6f" (UID: "cb29878d-3dbb-496a-bd25-b6b4ff102b6f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.541295 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-config-data" (OuterVolumeSpecName: "config-data") pod "cb29878d-3dbb-496a-bd25-b6b4ff102b6f" (UID: "cb29878d-3dbb-496a-bd25-b6b4ff102b6f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.543488 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb29878d-3dbb-496a-bd25-b6b4ff102b6f" (UID: "cb29878d-3dbb-496a-bd25-b6b4ff102b6f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.613402 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.613446 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp5c2\" (UniqueName: \"kubernetes.io/projected/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-kube-api-access-zp5c2\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.613461 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:20 crc kubenswrapper[4794]: I0310 10:06:20.613471 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb29878d-3dbb-496a-bd25-b6b4ff102b6f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.155179 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v6v6k" event={"ID":"cb29878d-3dbb-496a-bd25-b6b4ff102b6f","Type":"ContainerDied","Data":"93d359360c71a102874ef08a2070921c8871397780b23f51f208a0cbeb037303"} Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.155547 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93d359360c71a102874ef08a2070921c8871397780b23f51f208a0cbeb037303" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.155613 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v6v6k" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.163429 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66d985f6f7-q8rt6" event={"ID":"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7","Type":"ContainerStarted","Data":"2a165f3b4e22ad3b365cba1bf564b6800c46f607f88d27558dfb8a8ce501c7b2"} Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.163481 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66d985f6f7-q8rt6" event={"ID":"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7","Type":"ContainerStarted","Data":"ac9789f1492e8d84ba992e18bc28a423157e539d3ae3b6e20dec1dc67aaa8e96"} Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.170642 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5fc5898d7d-hsl9p"] Mar 10 10:06:21 crc kubenswrapper[4794]: E0310 10:06:21.171077 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9689ae4-36b4-41d8-b75d-805a57b17041" containerName="oc" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.171094 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9689ae4-36b4-41d8-b75d-805a57b17041" containerName="oc" Mar 10 10:06:21 crc kubenswrapper[4794]: E0310 10:06:21.171110 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb29878d-3dbb-496a-bd25-b6b4ff102b6f" containerName="placement-db-sync" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.171117 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb29878d-3dbb-496a-bd25-b6b4ff102b6f" containerName="placement-db-sync" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.171321 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb29878d-3dbb-496a-bd25-b6b4ff102b6f" 
containerName="placement-db-sync" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.171366 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9689ae4-36b4-41d8-b75d-805a57b17041" containerName="oc" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.172428 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.175609 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.176018 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.176205 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.176429 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.176560 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-k9rlk" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.201234 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5fc5898d7d-hsl9p"] Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.214908 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-66d985f6f7-q8rt6" podStartSLOduration=3.214867886 podStartE2EDuration="3.214867886s" podCreationTimestamp="2026-03-10 10:06:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:21.187882633 +0000 UTC m=+1329.944053461" watchObservedRunningTime="2026-03-10 10:06:21.214867886 +0000 UTC m=+1329.971038704" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.224819 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-scripts\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.224866 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-config-data\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.224926 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-internal-tls-certs\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.224961 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26721832-e716-44b8-ab75-60ba0be9e511-logs\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " 
pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.225213 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrxx4\" (UniqueName: \"kubernetes.io/projected/26721832-e716-44b8-ab75-60ba0be9e511-kube-api-access-wrxx4\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.225249 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-combined-ca-bundle\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.225855 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-public-tls-certs\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.327934 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrxx4\" (UniqueName: \"kubernetes.io/projected/26721832-e716-44b8-ab75-60ba0be9e511-kube-api-access-wrxx4\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.328017 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-combined-ca-bundle\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.328106 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-public-tls-certs\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.328221 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-scripts\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.328247 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-config-data\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.328305 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-internal-tls-certs\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " 
pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.328362 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26721832-e716-44b8-ab75-60ba0be9e511-logs\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.329136 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26721832-e716-44b8-ab75-60ba0be9e511-logs\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.333810 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-scripts\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.335194 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-config-data\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.335187 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-combined-ca-bundle\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.335630 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-public-tls-certs\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.351119 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-internal-tls-certs\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.353442 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrxx4\" (UniqueName: \"kubernetes.io/projected/26721832-e716-44b8-ab75-60ba0be9e511-kube-api-access-wrxx4\") pod \"placement-5fc5898d7d-hsl9p\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.510721 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.535770 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.639768 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-scripts\") pod \"74a5421b-362d-437b-98ce-c11e44a2e6f0\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.639813 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-config-data\") pod \"74a5421b-362d-437b-98ce-c11e44a2e6f0\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.639833 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-combined-ca-bundle\") pod \"74a5421b-362d-437b-98ce-c11e44a2e6f0\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.639853 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-credential-keys\") pod \"74a5421b-362d-437b-98ce-c11e44a2e6f0\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.639870 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9686\" (UniqueName: \"kubernetes.io/projected/74a5421b-362d-437b-98ce-c11e44a2e6f0-kube-api-access-r9686\") pod \"74a5421b-362d-437b-98ce-c11e44a2e6f0\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.639970 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-fernet-keys\") pod \"74a5421b-362d-437b-98ce-c11e44a2e6f0\" (UID: \"74a5421b-362d-437b-98ce-c11e44a2e6f0\") " Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.644319 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "74a5421b-362d-437b-98ce-c11e44a2e6f0" (UID: "74a5421b-362d-437b-98ce-c11e44a2e6f0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.647483 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-scripts" (OuterVolumeSpecName: "scripts") pod "74a5421b-362d-437b-98ce-c11e44a2e6f0" (UID: "74a5421b-362d-437b-98ce-c11e44a2e6f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.647501 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "74a5421b-362d-437b-98ce-c11e44a2e6f0" (UID: "74a5421b-362d-437b-98ce-c11e44a2e6f0"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.650591 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a5421b-362d-437b-98ce-c11e44a2e6f0-kube-api-access-r9686" (OuterVolumeSpecName: "kube-api-access-r9686") pod "74a5421b-362d-437b-98ce-c11e44a2e6f0" (UID: "74a5421b-362d-437b-98ce-c11e44a2e6f0"). InnerVolumeSpecName "kube-api-access-r9686". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.672583 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74a5421b-362d-437b-98ce-c11e44a2e6f0" (UID: "74a5421b-362d-437b-98ce-c11e44a2e6f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.684535 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-config-data" (OuterVolumeSpecName: "config-data") pod "74a5421b-362d-437b-98ce-c11e44a2e6f0" (UID: "74a5421b-362d-437b-98ce-c11e44a2e6f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.741607 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.741641 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.741651 4794 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.741659 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.741667 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9686\" (UniqueName: \"kubernetes.io/projected/74a5421b-362d-437b-98ce-c11e44a2e6f0-kube-api-access-r9686\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:21 crc kubenswrapper[4794]: I0310 10:06:21.741675 4794 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/74a5421b-362d-437b-98ce-c11e44a2e6f0-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.073150 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5fc5898d7d-hsl9p"] Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.178124 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sz7rz" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.179085 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sz7rz" event={"ID":"74a5421b-362d-437b-98ce-c11e44a2e6f0","Type":"ContainerDied","Data":"ed735d8ea2cd1b7f855a538c37b457135c56eb28083c21334c689d4b13ccb3c4"} Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.179118 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed735d8ea2cd1b7f855a538c37b457135c56eb28083c21334c689d4b13ccb3c4" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.179379 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.227498 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-547f85b784-tn9hj"] Mar 10 10:06:22 crc kubenswrapper[4794]: E0310 10:06:22.227974 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a5421b-362d-437b-98ce-c11e44a2e6f0" containerName="keystone-bootstrap" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.227994 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a5421b-362d-437b-98ce-c11e44a2e6f0" containerName="keystone-bootstrap" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.228245 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="74a5421b-362d-437b-98ce-c11e44a2e6f0" containerName="keystone-bootstrap" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.228960 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.232745 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.233044 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.233205 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.233362 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4svgf" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.237112 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.237447 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.256155 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-credential-keys\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.256204 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-fernet-keys\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 
10:06:22.256305 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-config-data\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.256372 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-scripts\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.256414 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9hcf\" (UniqueName: \"kubernetes.io/projected/95ce97ce-b89c-4868-b9a8-48297e8e35e1-kube-api-access-b9hcf\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.256445 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-combined-ca-bundle\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.256526 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-internal-tls-certs\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.256565 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-public-tls-certs\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.262147 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-547f85b784-tn9hj"] Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.358043 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-internal-tls-certs\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.358124 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-public-tls-certs\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.358199 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-credential-keys\") 
pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.358225 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-fernet-keys\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.358281 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-config-data\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.358312 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-scripts\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.358355 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9hcf\" (UniqueName: \"kubernetes.io/projected/95ce97ce-b89c-4868-b9a8-48297e8e35e1-kube-api-access-b9hcf\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.358383 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-combined-ca-bundle\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.372368 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-public-tls-certs\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.374221 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-internal-tls-certs\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.377106 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-combined-ca-bundle\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.378659 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-scripts\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc 
kubenswrapper[4794]: I0310 10:06:22.379076 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-fernet-keys\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.380799 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-credential-keys\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.381533 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-config-data\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.400436 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9hcf\" (UniqueName: \"kubernetes.io/projected/95ce97ce-b89c-4868-b9a8-48297e8e35e1-kube-api-access-b9hcf\") pod \"keystone-547f85b784-tn9hj\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.563998 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.967310 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:06:22 crc kubenswrapper[4794]: I0310 10:06:22.967388 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.306089 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-68cfd4d846-drn7b"] Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.307584 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.349798 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68cfd4d846-drn7b"] Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.374351 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-public-tls-certs\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.374627 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-config-data\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.374750 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-internal-tls-certs\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.374832 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-scripts\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.374959 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecae27ed-535f-47c8-93e4-07baac3bc64c-logs\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.375061 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bglt2\" (UniqueName: \"kubernetes.io/projected/ecae27ed-535f-47c8-93e4-07baac3bc64c-kube-api-access-bglt2\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.375158 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-combined-ca-bundle\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.477205 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-scripts\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.477289 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecae27ed-535f-47c8-93e4-07baac3bc64c-logs\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.477411 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bglt2\" (UniqueName: \"kubernetes.io/projected/ecae27ed-535f-47c8-93e4-07baac3bc64c-kube-api-access-bglt2\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.477493 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-combined-ca-bundle\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.477609 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-public-tls-certs\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.477665 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-config-data\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.477710 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-internal-tls-certs\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.477915 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecae27ed-535f-47c8-93e4-07baac3bc64c-logs\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.481488 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-public-tls-certs\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.482218 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-internal-tls-certs\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.483222 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-config-data\") pod \"placement-68cfd4d846-drn7b\" 
(UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.483526 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-scripts\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.488951 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-combined-ca-bundle\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.495476 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bglt2\" (UniqueName: \"kubernetes.io/projected/ecae27ed-535f-47c8-93e4-07baac3bc64c-kube-api-access-bglt2\") pod \"placement-68cfd4d846-drn7b\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") " pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: I0310 10:06:23.643537 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:23 crc kubenswrapper[4794]: W0310 10:06:23.787036 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26721832_e716_44b8_ab75_60ba0be9e511.slice/crio-68a005ad997ff45c1f61c2a95324919e9b6b7079279883530aac59cabd6615dd WatchSource:0}: Error finding container 68a005ad997ff45c1f61c2a95324919e9b6b7079279883530aac59cabd6615dd: Status 404 returned error can't find the container with id 68a005ad997ff45c1f61c2a95324919e9b6b7079279883530aac59cabd6615dd Mar 10 10:06:24 crc kubenswrapper[4794]: I0310 10:06:24.212812 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f62c40-641b-409f-ab29-ba30c14de2d8","Type":"ContainerStarted","Data":"f61478060205d8fcc6cec95994753f9d66a103e3c2e6f3bb9a7e8201a71bbfe8"} Mar 10 10:06:24 crc kubenswrapper[4794]: I0310 10:06:24.215510 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fc5898d7d-hsl9p" event={"ID":"26721832-e716-44b8-ab75-60ba0be9e511","Type":"ContainerStarted","Data":"0ab88aa23c611ec45f1aa310e542bf00e0fbb47047d97131ad66ee91504c23d1"} Mar 10 10:06:24 crc kubenswrapper[4794]: I0310 10:06:24.215608 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fc5898d7d-hsl9p" event={"ID":"26721832-e716-44b8-ab75-60ba0be9e511","Type":"ContainerStarted","Data":"68a005ad997ff45c1f61c2a95324919e9b6b7079279883530aac59cabd6615dd"} Mar 10 10:06:24 crc kubenswrapper[4794]: I0310 10:06:24.312117 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68cfd4d846-drn7b"] Mar 10 10:06:24 crc kubenswrapper[4794]: W0310 10:06:24.312841 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecae27ed_535f_47c8_93e4_07baac3bc64c.slice/crio-bce16e9cc32c94b4d11d6002586ef136ea6b8659d3c91269d03e6df727d5a323 WatchSource:0}: Error finding container bce16e9cc32c94b4d11d6002586ef136ea6b8659d3c91269d03e6df727d5a323: Status 404 returned error can't find the container with id 
bce16e9cc32c94b4d11d6002586ef136ea6b8659d3c91269d03e6df727d5a323 Mar 10 10:06:24 crc kubenswrapper[4794]: I0310 10:06:24.398154 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-547f85b784-tn9hj"] Mar 10 10:06:24 crc kubenswrapper[4794]: W0310 10:06:24.405205 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95ce97ce_b89c_4868_b9a8_48297e8e35e1.slice/crio-337a49f733a79e868893a5f1af6eeadfb39f47c72873ba9f315bf933e2e0249a WatchSource:0}: Error finding container 337a49f733a79e868893a5f1af6eeadfb39f47c72873ba9f315bf933e2e0249a: Status 404 returned error can't find the container with id 337a49f733a79e868893a5f1af6eeadfb39f47c72873ba9f315bf933e2e0249a Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.226538 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fc5898d7d-hsl9p" event={"ID":"26721832-e716-44b8-ab75-60ba0be9e511","Type":"ContainerStarted","Data":"bfceb8ee56c6036817c0ce50263764b2e16edc3f0e44869dddf9c663e04937b2"} Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.226667 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.230983 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68cfd4d846-drn7b" event={"ID":"ecae27ed-535f-47c8-93e4-07baac3bc64c","Type":"ContainerStarted","Data":"5b36ab842a7613453f011b16a0ad771867c79a6397fa893991fe94b1caac6d43"} Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.231022 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68cfd4d846-drn7b" event={"ID":"ecae27ed-535f-47c8-93e4-07baac3bc64c","Type":"ContainerStarted","Data":"07b975998a50058fa523429e713118ddb3da394ec6e69a92bc5bda9b7839104f"} Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.231035 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68cfd4d846-drn7b" event={"ID":"ecae27ed-535f-47c8-93e4-07baac3bc64c","Type":"ContainerStarted","Data":"bce16e9cc32c94b4d11d6002586ef136ea6b8659d3c91269d03e6df727d5a323"} Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.231050 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.231081 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.234653 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-547f85b784-tn9hj" event={"ID":"95ce97ce-b89c-4868-b9a8-48297e8e35e1","Type":"ContainerStarted","Data":"ba7917d5c284239059542152be24644fc0561a4ad2613bc181bad5791fbc0849"} Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.234711 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-547f85b784-tn9hj" event={"ID":"95ce97ce-b89c-4868-b9a8-48297e8e35e1","Type":"ContainerStarted","Data":"337a49f733a79e868893a5f1af6eeadfb39f47c72873ba9f315bf933e2e0249a"} Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.234884 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.257579 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5fc5898d7d-hsl9p" podStartSLOduration=4.257561279 
podStartE2EDuration="4.257561279s" podCreationTimestamp="2026-03-10 10:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:25.246360645 +0000 UTC m=+1334.002531473" watchObservedRunningTime="2026-03-10 10:06:25.257561279 +0000 UTC m=+1334.013732097" Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.271233 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-68cfd4d846-drn7b" podStartSLOduration=2.271211641 podStartE2EDuration="2.271211641s" podCreationTimestamp="2026-03-10 10:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:25.267661998 +0000 UTC m=+1334.023832816" watchObservedRunningTime="2026-03-10 10:06:25.271211641 +0000 UTC m=+1334.027382479" Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.303111 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-547f85b784-tn9hj" podStartSLOduration=3.30309088 podStartE2EDuration="3.30309088s" podCreationTimestamp="2026-03-10 10:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:25.289122238 +0000 UTC m=+1334.045293096" watchObservedRunningTime="2026-03-10 10:06:25.30309088 +0000 UTC m=+1334.059261708" Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.398390 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.398779 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.427161 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.451607 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.457329 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.457474 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.505154 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 10:06:25 crc kubenswrapper[4794]: I0310 10:06:25.520705 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 10:06:26 crc kubenswrapper[4794]: I0310 10:06:26.244303 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 10:06:26 crc kubenswrapper[4794]: I0310 10:06:26.244361 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 10:06:26 crc kubenswrapper[4794]: I0310 10:06:26.244386 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:26 crc kubenswrapper[4794]: I0310 10:06:26.244664 
4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 10:06:26 crc kubenswrapper[4794]: I0310 10:06:26.244691 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 10:06:26 crc kubenswrapper[4794]: I0310 10:06:26.695533 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:26 crc kubenswrapper[4794]: I0310 10:06:26.775015 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf6456ddf-8m6kc"] Mar 10 10:06:26 crc kubenswrapper[4794]: I0310 10:06:26.775719 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" podUID="327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" containerName="dnsmasq-dns" containerID="cri-o://11afbcbf2e1ea029e0946f9cfb8d80bf8c8658a761240173341846934dc9c6c5" gracePeriod=10 Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.268444 4794 generic.go:334] "Generic (PLEG): container finished" podID="327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" containerID="11afbcbf2e1ea029e0946f9cfb8d80bf8c8658a761240173341846934dc9c6c5" exitCode=0 Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.268779 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" event={"ID":"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc","Type":"ContainerDied","Data":"11afbcbf2e1ea029e0946f9cfb8d80bf8c8658a761240173341846934dc9c6c5"} Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.268825 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" event={"ID":"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc","Type":"ContainerDied","Data":"d8664381fb8ffc7759188bf1b875286ca51b8370eff32d73220ac805e5f24235"} Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.268840 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8664381fb8ffc7759188bf1b875286ca51b8370eff32d73220ac805e5f24235" Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.346926 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.463058 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9d5t\" (UniqueName: \"kubernetes.io/projected/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-kube-api-access-p9d5t\") pod \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.463125 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-dns-swift-storage-0\") pod \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.463187 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-config\") pod \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.463221 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-ovsdbserver-sb\") pod \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.463491 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-dns-svc\") pod \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.463524 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-ovsdbserver-nb\") pod \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\" (UID: \"327bbff1-1be5-4d8f-9ef9-dfc71e5199cc\") " Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.491591 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-kube-api-access-p9d5t" (OuterVolumeSpecName: "kube-api-access-p9d5t") pod "327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" (UID: "327bbff1-1be5-4d8f-9ef9-dfc71e5199cc"). InnerVolumeSpecName "kube-api-access-p9d5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.557096 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" (UID: "327bbff1-1be5-4d8f-9ef9-dfc71e5199cc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.565849 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" (UID: "327bbff1-1be5-4d8f-9ef9-dfc71e5199cc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.569305 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.569346 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.569356 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9d5t\" (UniqueName: \"kubernetes.io/projected/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-kube-api-access-p9d5t\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.602777 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" (UID: "327bbff1-1be5-4d8f-9ef9-dfc71e5199cc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.602795 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" (UID: "327bbff1-1be5-4d8f-9ef9-dfc71e5199cc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.619743 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-config" (OuterVolumeSpecName: "config") pod "327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" (UID: "327bbff1-1be5-4d8f-9ef9-dfc71e5199cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.671166 4794 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.671203 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:27 crc kubenswrapper[4794]: I0310 10:06:27.671215 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:28 crc kubenswrapper[4794]: I0310 10:06:28.275181 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf6456ddf-8m6kc" Mar 10 10:06:28 crc kubenswrapper[4794]: I0310 10:06:28.304404 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf6456ddf-8m6kc"] Mar 10 10:06:28 crc kubenswrapper[4794]: I0310 10:06:28.308961 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf6456ddf-8m6kc"] Mar 10 10:06:28 crc kubenswrapper[4794]: I0310 10:06:28.391569 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 10:06:28 crc kubenswrapper[4794]: I0310 10:06:28.391695 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 10:06:28 crc kubenswrapper[4794]: I0310 10:06:28.576198 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 10:06:28 crc kubenswrapper[4794]: I0310 10:06:28.667235 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 10:06:28 crc kubenswrapper[4794]: I0310 10:06:28.667363 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 10:06:28 crc kubenswrapper[4794]: I0310 10:06:28.668011 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 10:06:30 crc kubenswrapper[4794]: I0310 10:06:30.010651 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" path="/var/lib/kubelet/pods/327bbff1-1be5-4d8f-9ef9-dfc71e5199cc/volumes" Mar 10 10:06:31 crc kubenswrapper[4794]: I0310 10:06:31.627047 4794 scope.go:117] "RemoveContainer" containerID="10faf05a6df463cbd3424d36728612a9072a29a6e68134f130f0004c2fc9cd93" Mar 10 10:06:34 crc kubenswrapper[4794]: I0310 10:06:34.346005 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f62c40-641b-409f-ab29-ba30c14de2d8","Type":"ContainerStarted","Data":"85c4519c5fd753452de74cca4d80c98f995ac40e8e9d071b9258e6a667b6e320"} Mar 10 10:06:34 crc kubenswrapper[4794]: I0310 10:06:34.346575 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 10:06:34 crc kubenswrapper[4794]: I0310 10:06:34.346125 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="ceilometer-central-agent" containerID="cri-o://fbb0da4f7706c5131082ecadb64151d23fdfad08580f0036fc285ac4f92a3ab6" gracePeriod=30 Mar 10 10:06:34 crc kubenswrapper[4794]: I0310 10:06:34.346162 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="proxy-httpd" containerID="cri-o://85c4519c5fd753452de74cca4d80c98f995ac40e8e9d071b9258e6a667b6e320" gracePeriod=30 Mar 10 10:06:34 crc kubenswrapper[4794]: I0310 10:06:34.346176 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="ceilometer-notification-agent" containerID="cri-o://3a9d616e4d6b008bb7cbf6d2bbc20e8de0074f1639e7627b35604fe2fd3d69a4" gracePeriod=30 Mar 10 10:06:34 crc kubenswrapper[4794]: I0310 10:06:34.346168 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="sg-core" containerID="cri-o://f61478060205d8fcc6cec95994753f9d66a103e3c2e6f3bb9a7e8201a71bbfe8" gracePeriod=30 Mar 10 10:06:34 crc kubenswrapper[4794]: I0310 10:06:34.353764 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-c4b4g" event={"ID":"22069ba2-0135-4559-9c7f-2d73ae0dd81a","Type":"ContainerStarted","Data":"fe8fb67a806020a24997b1d8ba664de48c197527d827a7320e86b917c0811868"} Mar 10 10:06:34 crc kubenswrapper[4794]: I0310 10:06:34.381462 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.316364704 podStartE2EDuration="42.381444867s" podCreationTimestamp="2026-03-10 10:05:52 +0000 UTC" firstStartedPulling="2026-03-10 10:05:53.962784653 +0000 UTC m=+1302.718955471" lastFinishedPulling="2026-03-10 10:06:34.027864816 +0000 UTC m=+1342.784035634" observedRunningTime="2026-03-10 10:06:34.368368204 +0000 UTC m=+1343.124539032" watchObservedRunningTime="2026-03-10 10:06:34.381444867 +0000 UTC m=+1343.137615685" Mar 10 10:06:34 crc kubenswrapper[4794]: I0310 10:06:34.393613 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-c4b4g" podStartSLOduration=2.738395621 podStartE2EDuration="42.393592182s" podCreationTimestamp="2026-03-10 10:05:52 +0000 UTC" firstStartedPulling="2026-03-10 10:05:54.369823615 +0000 UTC m=+1303.125994433" lastFinishedPulling="2026-03-10 10:06:34.025020176 +0000 UTC m=+1342.781190994" observedRunningTime="2026-03-10 10:06:34.385949409 +0000 UTC m=+1343.142120247" watchObservedRunningTime="2026-03-10 10:06:34.393592182 +0000 UTC m=+1343.149763000" Mar 10 10:06:35 crc kubenswrapper[4794]: I0310 10:06:35.382582 4794 generic.go:334] "Generic (PLEG): container finished" podID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerID="f61478060205d8fcc6cec95994753f9d66a103e3c2e6f3bb9a7e8201a71bbfe8" exitCode=2 Mar 10 10:06:35 crc kubenswrapper[4794]: I0310 10:06:35.383621 4794 generic.go:334] "Generic (PLEG): container finished" podID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerID="fbb0da4f7706c5131082ecadb64151d23fdfad08580f0036fc285ac4f92a3ab6" exitCode=0 Mar 10 10:06:35 crc kubenswrapper[4794]: I0310 10:06:35.382639 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f62c40-641b-409f-ab29-ba30c14de2d8","Type":"ContainerDied","Data":"f61478060205d8fcc6cec95994753f9d66a103e3c2e6f3bb9a7e8201a71bbfe8"} Mar 10 10:06:35 crc kubenswrapper[4794]: I0310 10:06:35.383842 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f62c40-641b-409f-ab29-ba30c14de2d8","Type":"ContainerDied","Data":"fbb0da4f7706c5131082ecadb64151d23fdfad08580f0036fc285ac4f92a3ab6"} Mar 10 10:06:35 crc kubenswrapper[4794]: I0310 10:06:35.387447 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g7gmt" event={"ID":"821b338a-8a20-4d93-8dfa-28727da3ecba","Type":"ContainerStarted","Data":"f34d846d29fa22b51cfd81f5eb8d939afff71854f04f27647f2aaabad501e40f"} Mar 10 10:06:35 crc kubenswrapper[4794]: I0310 10:06:35.425931 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-g7gmt" podStartSLOduration=3.480987664 podStartE2EDuration="43.425899547s" podCreationTimestamp="2026-03-10 10:05:52 +0000 UTC" firstStartedPulling="2026-03-10 10:05:54.082922012 +0000 UTC m=+1302.839092830" lastFinishedPulling="2026-03-10 10:06:34.027833875 +0000 
UTC m=+1342.784004713" observedRunningTime="2026-03-10 10:06:35.41147782 +0000 UTC m=+1344.167648648" watchObservedRunningTime="2026-03-10 10:06:35.425899547 +0000 UTC m=+1344.182070385" Mar 10 10:06:36 crc kubenswrapper[4794]: I0310 10:06:36.399058 4794 generic.go:334] "Generic (PLEG): container finished" podID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerID="3a9d616e4d6b008bb7cbf6d2bbc20e8de0074f1639e7627b35604fe2fd3d69a4" exitCode=0 Mar 10 10:06:36 crc kubenswrapper[4794]: I0310 10:06:36.399112 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f62c40-641b-409f-ab29-ba30c14de2d8","Type":"ContainerDied","Data":"3a9d616e4d6b008bb7cbf6d2bbc20e8de0074f1639e7627b35604fe2fd3d69a4"} Mar 10 10:06:37 crc kubenswrapper[4794]: I0310 10:06:37.411206 4794 generic.go:334] "Generic (PLEG): container finished" podID="22069ba2-0135-4559-9c7f-2d73ae0dd81a" containerID="fe8fb67a806020a24997b1d8ba664de48c197527d827a7320e86b917c0811868" exitCode=0 Mar 10 10:06:37 crc kubenswrapper[4794]: I0310 10:06:37.411253 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-c4b4g" event={"ID":"22069ba2-0135-4559-9c7f-2d73ae0dd81a","Type":"ContainerDied","Data":"fe8fb67a806020a24997b1d8ba664de48c197527d827a7320e86b917c0811868"} Mar 10 10:06:38 crc kubenswrapper[4794]: I0310 10:06:38.836047 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-c4b4g" Mar 10 10:06:38 crc kubenswrapper[4794]: I0310 10:06:38.985024 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22069ba2-0135-4559-9c7f-2d73ae0dd81a-combined-ca-bundle\") pod \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\" (UID: \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\") " Mar 10 10:06:38 crc kubenswrapper[4794]: I0310 10:06:38.985158 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zktwg\" (UniqueName: \"kubernetes.io/projected/22069ba2-0135-4559-9c7f-2d73ae0dd81a-kube-api-access-zktwg\") pod \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\" (UID: \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\") " Mar 10 10:06:38 crc kubenswrapper[4794]: I0310 10:06:38.985263 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22069ba2-0135-4559-9c7f-2d73ae0dd81a-db-sync-config-data\") pod \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\" (UID: \"22069ba2-0135-4559-9c7f-2d73ae0dd81a\") " Mar 10 10:06:38 crc kubenswrapper[4794]: I0310 10:06:38.990826 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22069ba2-0135-4559-9c7f-2d73ae0dd81a-kube-api-access-zktwg" (OuterVolumeSpecName: "kube-api-access-zktwg") pod "22069ba2-0135-4559-9c7f-2d73ae0dd81a" (UID: "22069ba2-0135-4559-9c7f-2d73ae0dd81a"). InnerVolumeSpecName "kube-api-access-zktwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:38 crc kubenswrapper[4794]: I0310 10:06:38.994741 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22069ba2-0135-4559-9c7f-2d73ae0dd81a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "22069ba2-0135-4559-9c7f-2d73ae0dd81a" (UID: "22069ba2-0135-4559-9c7f-2d73ae0dd81a"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.010642 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22069ba2-0135-4559-9c7f-2d73ae0dd81a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22069ba2-0135-4559-9c7f-2d73ae0dd81a" (UID: "22069ba2-0135-4559-9c7f-2d73ae0dd81a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.086985 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22069ba2-0135-4559-9c7f-2d73ae0dd81a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.087018 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zktwg\" (UniqueName: \"kubernetes.io/projected/22069ba2-0135-4559-9c7f-2d73ae0dd81a-kube-api-access-zktwg\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.087032 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22069ba2-0135-4559-9c7f-2d73ae0dd81a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.436464 4794 generic.go:334] "Generic (PLEG): container finished" podID="821b338a-8a20-4d93-8dfa-28727da3ecba" containerID="f34d846d29fa22b51cfd81f5eb8d939afff71854f04f27647f2aaabad501e40f" exitCode=0 Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.436551 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g7gmt" event={"ID":"821b338a-8a20-4d93-8dfa-28727da3ecba","Type":"ContainerDied","Data":"f34d846d29fa22b51cfd81f5eb8d939afff71854f04f27647f2aaabad501e40f"} Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.444215 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-c4b4g" event={"ID":"22069ba2-0135-4559-9c7f-2d73ae0dd81a","Type":"ContainerDied","Data":"e022f301a846b17b394b61e4f02ca463d5f3b67d498580990434cf197a465694"} Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.444248 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e022f301a846b17b394b61e4f02ca463d5f3b67d498580990434cf197a465694" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.444322 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-c4b4g" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.761259 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-b64684465-k4k4j"] Mar 10 10:06:39 crc kubenswrapper[4794]: E0310 10:06:39.762416 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" containerName="init" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.762472 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" containerName="init" Mar 10 10:06:39 crc kubenswrapper[4794]: E0310 10:06:39.762489 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22069ba2-0135-4559-9c7f-2d73ae0dd81a" containerName="barbican-db-sync" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.762496 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="22069ba2-0135-4559-9c7f-2d73ae0dd81a" containerName="barbican-db-sync" Mar 10 10:06:39 crc kubenswrapper[4794]: E0310 10:06:39.762642 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" containerName="dnsmasq-dns" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.762650 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" containerName="dnsmasq-dns" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.762978 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="327bbff1-1be5-4d8f-9ef9-dfc71e5199cc" containerName="dnsmasq-dns" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.763031 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="22069ba2-0135-4559-9c7f-2d73ae0dd81a" containerName="barbican-db-sync" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.764045 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.766436 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.766494 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.766724 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vs4x7" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.804382 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-config-data\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.804656 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98b35dea-060e-4b8d-9829-37357853a9c4-logs\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.804786 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-combined-ca-bundle\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.805293 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-config-data-custom\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.805478 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zkfk\" (UniqueName: \"kubernetes.io/projected/98b35dea-060e-4b8d-9829-37357853a9c4-kube-api-access-5zkfk\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.833010 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-b64684465-k4k4j"] Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.868742 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55c649d8d5-9bdqc"] Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.870623 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.881286 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-65fdc45d8b-2t64g"] Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.882806 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.890870 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.892914 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55c649d8d5-9bdqc"] Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906142 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-config\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906189 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-dns-svc\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906217 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-dns-swift-storage-0\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906249 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zkfk\" (UniqueName: \"kubernetes.io/projected/98b35dea-060e-4b8d-9829-37357853a9c4-kube-api-access-5zkfk\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906275 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-ovsdbserver-sb\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906304 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9g4h\" (UniqueName: \"kubernetes.io/projected/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-kube-api-access-v9g4h\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906325 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-config-data\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906458 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-logs\") pod 
\"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906486 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55k6f\" (UniqueName: \"kubernetes.io/projected/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-kube-api-access-55k6f\") pod \"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906516 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98b35dea-060e-4b8d-9829-37357853a9c4-logs\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906559 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-combined-ca-bundle\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906589 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-combined-ca-bundle\") pod \"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906625 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-ovsdbserver-nb\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906651 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-config-data-custom\") pod \"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906693 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-config-data\") pod \"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.906728 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-config-data-custom\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 
10:06:39.907138 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98b35dea-060e-4b8d-9829-37357853a9c4-logs\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.916674 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-config-data-custom\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.916774 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-combined-ca-bundle\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.918199 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-config-data\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.918249 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-65fdc45d8b-2t64g"] Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.928273 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zkfk\" (UniqueName: \"kubernetes.io/projected/98b35dea-060e-4b8d-9829-37357853a9c4-kube-api-access-5zkfk\") pod \"barbican-worker-b64684465-k4k4j\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.993154 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-65464978fd-hh68d"] Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.994980 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:39 crc kubenswrapper[4794]: I0310 10:06:39.999780 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009227 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b25jx\" (UniqueName: \"kubernetes.io/projected/7f12c556-db49-48d8-aad0-981c8d746bb6-kube-api-access-b25jx\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009288 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-combined-ca-bundle\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009315 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-config-data\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009357 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-config-data-custom\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009394 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-config\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009447 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-dns-svc\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009471 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-dns-swift-storage-0\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009504 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-ovsdbserver-sb\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009537 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v9g4h\" (UniqueName: \"kubernetes.io/projected/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-kube-api-access-v9g4h\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009559 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-logs\") pod \"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009582 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55k6f\" (UniqueName: \"kubernetes.io/projected/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-kube-api-access-55k6f\") pod \"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009627 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-combined-ca-bundle\") pod \"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009648 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-ovsdbserver-nb\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009665 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-config-data-custom\") pod \"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009694 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-config-data\") pod \"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.009727 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f12c556-db49-48d8-aad0-981c8d746bb6-logs\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.010733 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-config\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 
10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.010882 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-logs\") pod \"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.013179 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-ovsdbserver-nb\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.014704 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-dns-svc\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.015298 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-ovsdbserver-sb\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.017226 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-dns-swift-storage-0\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.017468 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-combined-ca-bundle\") pod \"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.024200 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-65464978fd-hh68d"] Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.027261 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-config-data-custom\") pod \"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.040568 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55k6f\" (UniqueName: \"kubernetes.io/projected/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-kube-api-access-55k6f\") pod \"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.042645 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-config-data\") pod \"barbican-keystone-listener-65fdc45d8b-2t64g\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.047527 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9g4h\" (UniqueName: \"kubernetes.io/projected/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-kube-api-access-v9g4h\") pod \"dnsmasq-dns-55c649d8d5-9bdqc\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.090616 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.110238 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f12c556-db49-48d8-aad0-981c8d746bb6-logs\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.110290 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25jx\" (UniqueName: \"kubernetes.io/projected/7f12c556-db49-48d8-aad0-981c8d746bb6-kube-api-access-b25jx\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.110367 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-combined-ca-bundle\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.110387 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-config-data\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.110404 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-config-data-custom\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.111714 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f12c556-db49-48d8-aad0-981c8d746bb6-logs\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.114088 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-config-data-custom\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc 
kubenswrapper[4794]: I0310 10:06:40.114955 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-combined-ca-bundle\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.116058 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-config-data\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.129097 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b25jx\" (UniqueName: \"kubernetes.io/projected/7f12c556-db49-48d8-aad0-981c8d746bb6-kube-api-access-b25jx\") pod \"barbican-api-65464978fd-hh68d\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.201704 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.227840 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.413971 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.594258 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-b64684465-k4k4j"] Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.763168 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55c649d8d5-9bdqc"] Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.841629 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.844851 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-65fdc45d8b-2t64g"] Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.965900 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-scripts\") pod \"821b338a-8a20-4d93-8dfa-28727da3ecba\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.966058 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-combined-ca-bundle\") pod \"821b338a-8a20-4d93-8dfa-28727da3ecba\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.966104 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/821b338a-8a20-4d93-8dfa-28727da3ecba-etc-machine-id\") pod \"821b338a-8a20-4d93-8dfa-28727da3ecba\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.966136 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pmml\" (UniqueName: \"kubernetes.io/projected/821b338a-8a20-4d93-8dfa-28727da3ecba-kube-api-access-2pmml\") pod \"821b338a-8a20-4d93-8dfa-28727da3ecba\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.966165 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-config-data\") pod \"821b338a-8a20-4d93-8dfa-28727da3ecba\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.966184 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-db-sync-config-data\") pod \"821b338a-8a20-4d93-8dfa-28727da3ecba\" (UID: \"821b338a-8a20-4d93-8dfa-28727da3ecba\") " Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.968247 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/821b338a-8a20-4d93-8dfa-28727da3ecba-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "821b338a-8a20-4d93-8dfa-28727da3ecba" (UID: "821b338a-8a20-4d93-8dfa-28727da3ecba"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.969971 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/821b338a-8a20-4d93-8dfa-28727da3ecba-kube-api-access-2pmml" (OuterVolumeSpecName: "kube-api-access-2pmml") pod "821b338a-8a20-4d93-8dfa-28727da3ecba" (UID: "821b338a-8a20-4d93-8dfa-28727da3ecba"). InnerVolumeSpecName "kube-api-access-2pmml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.970512 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "821b338a-8a20-4d93-8dfa-28727da3ecba" (UID: "821b338a-8a20-4d93-8dfa-28727da3ecba"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:40 crc kubenswrapper[4794]: I0310 10:06:40.973615 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-scripts" (OuterVolumeSpecName: "scripts") pod "821b338a-8a20-4d93-8dfa-28727da3ecba" (UID: "821b338a-8a20-4d93-8dfa-28727da3ecba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:40.999539 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "821b338a-8a20-4d93-8dfa-28727da3ecba" (UID: "821b338a-8a20-4d93-8dfa-28727da3ecba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.012478 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-65464978fd-hh68d"] Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.017150 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-config-data" (OuterVolumeSpecName: "config-data") pod "821b338a-8a20-4d93-8dfa-28727da3ecba" (UID: "821b338a-8a20-4d93-8dfa-28727da3ecba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:41 crc kubenswrapper[4794]: W0310 10:06:41.017415 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f12c556_db49_48d8_aad0_981c8d746bb6.slice/crio-d13c87aa3b94a2e511e9d21ec1737fb611f1ab5a2eb9cf4629403bd21ba919a5 WatchSource:0}: Error finding container d13c87aa3b94a2e511e9d21ec1737fb611f1ab5a2eb9cf4629403bd21ba919a5: Status 404 returned error can't find the container with id d13c87aa3b94a2e511e9d21ec1737fb611f1ab5a2eb9cf4629403bd21ba919a5 Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.068604 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.068652 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.068662 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/821b338a-8a20-4d93-8dfa-28727da3ecba-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.068670 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pmml\" (UniqueName: \"kubernetes.io/projected/821b338a-8a20-4d93-8dfa-28727da3ecba-kube-api-access-2pmml\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.068679 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.068687 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/821b338a-8a20-4d93-8dfa-28727da3ecba-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.461594 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" event={"ID":"efa96620-3d4b-4780-92a0-eeefbe9dcf9a","Type":"ContainerStarted","Data":"59bb9b9b2cfda494c5673c15b608c1720d58e7201b8382d7449948d619553b39"} Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.463099 4794 generic.go:334] "Generic (PLEG): container finished" podID="67fadcbf-6db6-494f-a3b7-da60cf39d1eb" containerID="586a36051058e7b69ffb2a8c841ba9275ba207294d2a5a76ac6964cf0f0322ff" exitCode=0 Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.463138 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" event={"ID":"67fadcbf-6db6-494f-a3b7-da60cf39d1eb","Type":"ContainerDied","Data":"586a36051058e7b69ffb2a8c841ba9275ba207294d2a5a76ac6964cf0f0322ff"} Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.463170 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" event={"ID":"67fadcbf-6db6-494f-a3b7-da60cf39d1eb","Type":"ContainerStarted","Data":"fda2df964d8cec73d607d09162b4350afd5859b44b4498b06612d1d4bad19e29"} Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.464471 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-b64684465-k4k4j" event={"ID":"98b35dea-060e-4b8d-9829-37357853a9c4","Type":"ContainerStarted","Data":"5466df36889b2f19d045b78a869f2f6c21971a503e33a7043fd1a0f26a98cfa3"} Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.469232 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65464978fd-hh68d" event={"ID":"7f12c556-db49-48d8-aad0-981c8d746bb6","Type":"ContainerStarted","Data":"2e1344b23f34c965294ce2a354b0152c4c935ae846652a17c50980e22b14974d"} Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.469556 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65464978fd-hh68d" event={"ID":"7f12c556-db49-48d8-aad0-981c8d746bb6","Type":"ContainerStarted","Data":"28bbc46ddd80c7238bde4d825e65659d1f1fa5c11c308841de1e1d86d3bd6f52"} Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.469570 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65464978fd-hh68d" event={"ID":"7f12c556-db49-48d8-aad0-981c8d746bb6","Type":"ContainerStarted","Data":"d13c87aa3b94a2e511e9d21ec1737fb611f1ab5a2eb9cf4629403bd21ba919a5"} Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.470460 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.470513 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.473182 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g7gmt" event={"ID":"821b338a-8a20-4d93-8dfa-28727da3ecba","Type":"ContainerDied","Data":"29d12262bda8eeedd7d36fd18f6f69ab2c366cde30f1a57401e1c6cb87a5fb04"} Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.473219 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29d12262bda8eeedd7d36fd18f6f69ab2c366cde30f1a57401e1c6cb87a5fb04" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.473274 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-g7gmt" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.607958 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-65464978fd-hh68d" podStartSLOduration=2.607935133 podStartE2EDuration="2.607935133s" podCreationTimestamp="2026-03-10 10:06:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:41.581140756 +0000 UTC m=+1350.337311584" watchObservedRunningTime="2026-03-10 10:06:41.607935133 +0000 UTC m=+1350.364105961" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.843729 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c649d8d5-9bdqc"] Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.860654 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 10:06:41 crc kubenswrapper[4794]: E0310 10:06:41.861309 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821b338a-8a20-4d93-8dfa-28727da3ecba" containerName="cinder-db-sync" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.861367 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="821b338a-8a20-4d93-8dfa-28727da3ecba" containerName="cinder-db-sync" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.861650 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="821b338a-8a20-4d93-8dfa-28727da3ecba" containerName="cinder-db-sync" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.862841 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.868728 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jv8qn" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.868968 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.868643 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.879534 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.888613 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.917805 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-765c5b6b49-4clf4"] Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.919377 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:41 crc kubenswrapper[4794]: I0310 10:06:41.937755 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-765c5b6b49-4clf4"] Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.009656 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.009918 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.009950 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-ovsdbserver-nb\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.009974 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdxsn\" (UniqueName: \"kubernetes.io/projected/eafe7ef4-11df-422c-8b13-8943429c4fa6-kube-api-access-cdxsn\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.009989 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.010116 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-scripts\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.010163 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-config\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.010303 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-dns-svc\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.010340 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-kqljl\" (UniqueName: \"kubernetes.io/projected/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-kube-api-access-kqljl\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.010411 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-config-data\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.010460 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-ovsdbserver-sb\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.010616 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-dns-swift-storage-0\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.112495 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-config-data\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.112552 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-ovsdbserver-sb\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.112595 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-dns-swift-storage-0\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.112627 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.112660 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.112690 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-ovsdbserver-nb\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.112716 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdxsn\" (UniqueName: \"kubernetes.io/projected/eafe7ef4-11df-422c-8b13-8943429c4fa6-kube-api-access-cdxsn\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.112731 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.112758 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-scripts\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.112771 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-config\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.112806 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-dns-svc\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.112821 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqljl\" (UniqueName: \"kubernetes.io/projected/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-kube-api-access-kqljl\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.114350 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-ovsdbserver-nb\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.115386 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.115748 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-ovsdbserver-sb\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " 
pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.122223 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-config\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.125085 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.127188 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-config-data\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.127773 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-dns-swift-storage-0\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.128637 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-dns-svc\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.130701 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-scripts\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.132834 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.132833 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqljl\" (UniqueName: \"kubernetes.io/projected/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-kube-api-access-kqljl\") pod \"cinder-scheduler-0\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.136632 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdxsn\" (UniqueName: \"kubernetes.io/projected/eafe7ef4-11df-422c-8b13-8943429c4fa6-kube-api-access-cdxsn\") pod \"dnsmasq-dns-765c5b6b49-4clf4\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") " pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.149465 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 10:06:42 crc kubenswrapper[4794]: 
I0310 10:06:42.151228 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.156502 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.204733 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.205158 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.214190 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c6afd20-3241-4d98-b20e-fa367dca1f50-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.214297 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c6afd20-3241-4d98-b20e-fa367dca1f50-logs\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.214382 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6gfk\" (UniqueName: \"kubernetes.io/projected/2c6afd20-3241-4d98-b20e-fa367dca1f50-kube-api-access-h6gfk\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.214443 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-config-data\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.214469 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-config-data-custom\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.214490 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.214518 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-scripts\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.281462 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.315956 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c6afd20-3241-4d98-b20e-fa367dca1f50-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.316050 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c6afd20-3241-4d98-b20e-fa367dca1f50-logs\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.316096 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6gfk\" (UniqueName: \"kubernetes.io/projected/2c6afd20-3241-4d98-b20e-fa367dca1f50-kube-api-access-h6gfk\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.316134 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-config-data\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.316152 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-config-data-custom\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.316177 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.316211 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-scripts\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.316625 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c6afd20-3241-4d98-b20e-fa367dca1f50-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.316946 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c6afd20-3241-4d98-b20e-fa367dca1f50-logs\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.328000 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " 
pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.328046 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-config-data-custom\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.328857 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-config-data\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.341536 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6gfk\" (UniqueName: \"kubernetes.io/projected/2c6afd20-3241-4d98-b20e-fa367dca1f50-kube-api-access-h6gfk\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.349445 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-scripts\") pod \"cinder-api-0\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") " pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.489554 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" event={"ID":"67fadcbf-6db6-494f-a3b7-da60cf39d1eb","Type":"ContainerStarted","Data":"7d1a39652c977a7d70942fe754d720e9284b63632237820db63082ed4a594659"} Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.489767 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.489717 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" podUID="67fadcbf-6db6-494f-a3b7-da60cf39d1eb" containerName="dnsmasq-dns" containerID="cri-o://7d1a39652c977a7d70942fe754d720e9284b63632237820db63082ed4a594659" gracePeriod=10 Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.534518 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" podStartSLOduration=3.534494594 podStartE2EDuration="3.534494594s" podCreationTimestamp="2026-03-10 10:06:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:42.526288695 +0000 UTC m=+1351.282459513" watchObservedRunningTime="2026-03-10 10:06:42.534494594 +0000 UTC m=+1351.290665442" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.611443 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.707528 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 10:06:42 crc kubenswrapper[4794]: I0310 10:06:42.889416 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-765c5b6b49-4clf4"] Mar 10 10:06:43 crc kubenswrapper[4794]: W0310 10:06:43.149696 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49b69a8a_f5fa_408e_a378_bb1e0d59a2cb.slice/crio-daf22ad9bae827cf094d3d88dd0183adf37718ed6d0bda64788a51ba3c856d48 WatchSource:0}: Error finding container daf22ad9bae827cf094d3d88dd0183adf37718ed6d0bda64788a51ba3c856d48: Status 404 returned error can't find the container with id daf22ad9bae827cf094d3d88dd0183adf37718ed6d0bda64788a51ba3c856d48 Mar 10 10:06:43 crc kubenswrapper[4794]: W0310 10:06:43.151862 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeafe7ef4_11df_422c_8b13_8943429c4fa6.slice/crio-a9364d85ddbcf92f51dcb2239a0d555c2b8562b6dc441eb89c0e3371c5756c34 WatchSource:0}: Error finding container a9364d85ddbcf92f51dcb2239a0d555c2b8562b6dc441eb89c0e3371c5756c34: Status 404 returned error can't find the container with id a9364d85ddbcf92f51dcb2239a0d555c2b8562b6dc441eb89c0e3371c5756c34 Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.193627 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.243772 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.340881 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-config\") pod \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.341189 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-dns-svc\") pod \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.341222 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-dns-swift-storage-0\") pod \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.341284 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-ovsdbserver-nb\") pod \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.341348 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-ovsdbserver-sb\") pod \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " Mar 10 
10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.341395 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9g4h\" (UniqueName: \"kubernetes.io/projected/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-kube-api-access-v9g4h\") pod \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\" (UID: \"67fadcbf-6db6-494f-a3b7-da60cf39d1eb\") " Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.370714 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-kube-api-access-v9g4h" (OuterVolumeSpecName: "kube-api-access-v9g4h") pod "67fadcbf-6db6-494f-a3b7-da60cf39d1eb" (UID: "67fadcbf-6db6-494f-a3b7-da60cf39d1eb"). InnerVolumeSpecName "kube-api-access-v9g4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.399406 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-config" (OuterVolumeSpecName: "config") pod "67fadcbf-6db6-494f-a3b7-da60cf39d1eb" (UID: "67fadcbf-6db6-494f-a3b7-da60cf39d1eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.409157 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "67fadcbf-6db6-494f-a3b7-da60cf39d1eb" (UID: "67fadcbf-6db6-494f-a3b7-da60cf39d1eb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.423640 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67fadcbf-6db6-494f-a3b7-da60cf39d1eb" (UID: "67fadcbf-6db6-494f-a3b7-da60cf39d1eb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.424115 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "67fadcbf-6db6-494f-a3b7-da60cf39d1eb" (UID: "67fadcbf-6db6-494f-a3b7-da60cf39d1eb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.431192 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67fadcbf-6db6-494f-a3b7-da60cf39d1eb" (UID: "67fadcbf-6db6-494f-a3b7-da60cf39d1eb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.443971 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.444013 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.444025 4794 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.444040 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.444051 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.444066 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9g4h\" (UniqueName: \"kubernetes.io/projected/67fadcbf-6db6-494f-a3b7-da60cf39d1eb-kube-api-access-v9g4h\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.504924 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" event={"ID":"eafe7ef4-11df-422c-8b13-8943429c4fa6","Type":"ContainerStarted","Data":"a9364d85ddbcf92f51dcb2239a0d555c2b8562b6dc441eb89c0e3371c5756c34"} Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.505966 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb","Type":"ContainerStarted","Data":"daf22ad9bae827cf094d3d88dd0183adf37718ed6d0bda64788a51ba3c856d48"} Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.510471 4794 generic.go:334] "Generic (PLEG): container finished" podID="67fadcbf-6db6-494f-a3b7-da60cf39d1eb" containerID="7d1a39652c977a7d70942fe754d720e9284b63632237820db63082ed4a594659" exitCode=0 Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.510568 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.510555 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" event={"ID":"67fadcbf-6db6-494f-a3b7-da60cf39d1eb","Type":"ContainerDied","Data":"7d1a39652c977a7d70942fe754d720e9284b63632237820db63082ed4a594659"} Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.510631 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c649d8d5-9bdqc" event={"ID":"67fadcbf-6db6-494f-a3b7-da60cf39d1eb","Type":"ContainerDied","Data":"fda2df964d8cec73d607d09162b4350afd5859b44b4498b06612d1d4bad19e29"} Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.510656 4794 scope.go:117] "RemoveContainer" containerID="7d1a39652c977a7d70942fe754d720e9284b63632237820db63082ed4a594659" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.569253 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c649d8d5-9bdqc"] Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.586111 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55c649d8d5-9bdqc"] Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.697560 4794 scope.go:117] "RemoveContainer" containerID="586a36051058e7b69ffb2a8c841ba9275ba207294d2a5a76ac6964cf0f0322ff" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.734792 4794 scope.go:117] "RemoveContainer" containerID="7d1a39652c977a7d70942fe754d720e9284b63632237820db63082ed4a594659" Mar 10 10:06:43 crc kubenswrapper[4794]: E0310 10:06:43.735373 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d1a39652c977a7d70942fe754d720e9284b63632237820db63082ed4a594659\": container with ID starting with 7d1a39652c977a7d70942fe754d720e9284b63632237820db63082ed4a594659 not found: ID does not exist" containerID="7d1a39652c977a7d70942fe754d720e9284b63632237820db63082ed4a594659" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.735450 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d1a39652c977a7d70942fe754d720e9284b63632237820db63082ed4a594659"} err="failed to get container status \"7d1a39652c977a7d70942fe754d720e9284b63632237820db63082ed4a594659\": rpc error: code = NotFound desc = could not find container \"7d1a39652c977a7d70942fe754d720e9284b63632237820db63082ed4a594659\": container with ID starting with 7d1a39652c977a7d70942fe754d720e9284b63632237820db63082ed4a594659 not found: ID does not exist" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.735484 4794 scope.go:117] "RemoveContainer" containerID="586a36051058e7b69ffb2a8c841ba9275ba207294d2a5a76ac6964cf0f0322ff" Mar 10 10:06:43 crc kubenswrapper[4794]: E0310 10:06:43.735901 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"586a36051058e7b69ffb2a8c841ba9275ba207294d2a5a76ac6964cf0f0322ff\": container with ID starting with 586a36051058e7b69ffb2a8c841ba9275ba207294d2a5a76ac6964cf0f0322ff not found: ID does not exist" containerID="586a36051058e7b69ffb2a8c841ba9275ba207294d2a5a76ac6964cf0f0322ff" Mar 10 10:06:43 crc kubenswrapper[4794]: I0310 10:06:43.735939 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"586a36051058e7b69ffb2a8c841ba9275ba207294d2a5a76ac6964cf0f0322ff"} err="failed to get container status 
\"586a36051058e7b69ffb2a8c841ba9275ba207294d2a5a76ac6964cf0f0322ff\": rpc error: code = NotFound desc = could not find container \"586a36051058e7b69ffb2a8c841ba9275ba207294d2a5a76ac6964cf0f0322ff\": container with ID starting with 586a36051058e7b69ffb2a8c841ba9275ba207294d2a5a76ac6964cf0f0322ff not found: ID does not exist" Mar 10 10:06:44 crc kubenswrapper[4794]: I0310 10:06:44.022582 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67fadcbf-6db6-494f-a3b7-da60cf39d1eb" path="/var/lib/kubelet/pods/67fadcbf-6db6-494f-a3b7-da60cf39d1eb/volumes" Mar 10 10:06:44 crc kubenswrapper[4794]: I0310 10:06:44.531691 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" event={"ID":"efa96620-3d4b-4780-92a0-eeefbe9dcf9a","Type":"ContainerStarted","Data":"c415d347d34421bcb42d60573a12da5f8047e3e7f977fb04d5829812ddc302a7"} Mar 10 10:06:44 crc kubenswrapper[4794]: I0310 10:06:44.536721 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b64684465-k4k4j" event={"ID":"98b35dea-060e-4b8d-9829-37357853a9c4","Type":"ContainerStarted","Data":"f89f95fa1764bb4ed8927a1c6b5d7ec0737f1ea37babc6ae8e3dc09577573205"} Mar 10 10:06:44 crc kubenswrapper[4794]: I0310 10:06:44.538047 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2c6afd20-3241-4d98-b20e-fa367dca1f50","Type":"ContainerStarted","Data":"8110fc6c52caf533901fbd920401a16f449bebfd7656e61e153590e8b41c3c39"} Mar 10 10:06:44 crc kubenswrapper[4794]: I0310 10:06:44.541969 4794 generic.go:334] "Generic (PLEG): container finished" podID="eafe7ef4-11df-422c-8b13-8943429c4fa6" containerID="f633620d50b34c7196de159934eb2d25f9a821c13de81842b89e78d64e2697a0" exitCode=0 Mar 10 10:06:44 crc kubenswrapper[4794]: I0310 10:06:44.542021 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" event={"ID":"eafe7ef4-11df-422c-8b13-8943429c4fa6","Type":"ContainerDied","Data":"f633620d50b34c7196de159934eb2d25f9a821c13de81842b89e78d64e2697a0"} Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.449028 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.553395 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b64684465-k4k4j" event={"ID":"98b35dea-060e-4b8d-9829-37357853a9c4","Type":"ContainerStarted","Data":"13f70234e665cee8f47182684e880f742b068d9ada3d1f0b83237e5efa99c1ee"} Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.558216 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2c6afd20-3241-4d98-b20e-fa367dca1f50","Type":"ContainerStarted","Data":"a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992"} Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.558262 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2c6afd20-3241-4d98-b20e-fa367dca1f50","Type":"ContainerStarted","Data":"b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6"} Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.558594 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.560927 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" 
event={"ID":"eafe7ef4-11df-422c-8b13-8943429c4fa6","Type":"ContainerStarted","Data":"66047b3812719d5a33aa83c2452b755e2d6ad4b3fe29d3498048cef766eecffd"} Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.561050 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.563157 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb","Type":"ContainerStarted","Data":"665db5329da9820beabdb8d21a743c9b7bcb95d7415ffd07880e65918c7cfcc4"} Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.563200 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb","Type":"ContainerStarted","Data":"aea6bf3ae630950c554e57a7e22e6a5aa2f4a4e15d3a2260d3fb2be319fc33da"} Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.565156 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" event={"ID":"efa96620-3d4b-4780-92a0-eeefbe9dcf9a","Type":"ContainerStarted","Data":"19687e523e2ddee57d1501fea9c4f3051cf91b15cc6073a1586ab9b0d2214569"} Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.587066 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-b64684465-k4k4j" podStartSLOduration=3.47811221 podStartE2EDuration="6.587049656s" podCreationTimestamp="2026-03-10 10:06:39 +0000 UTC" firstStartedPulling="2026-03-10 10:06:40.606434182 +0000 UTC m=+1349.362605000" lastFinishedPulling="2026-03-10 10:06:43.715371638 +0000 UTC m=+1352.471542446" observedRunningTime="2026-03-10 10:06:45.584544847 +0000 UTC m=+1354.340715665" watchObservedRunningTime="2026-03-10 10:06:45.587049656 +0000 UTC m=+1354.343220474" Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.626625 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" podStartSLOduration=3.76171712 podStartE2EDuration="6.626607117s" podCreationTimestamp="2026-03-10 10:06:39 +0000 UTC" firstStartedPulling="2026-03-10 10:06:40.849618323 +0000 UTC m=+1349.605789141" lastFinishedPulling="2026-03-10 10:06:43.71450832 +0000 UTC m=+1352.470679138" observedRunningTime="2026-03-10 10:06:45.616826538 +0000 UTC m=+1354.372997356" watchObservedRunningTime="2026-03-10 10:06:45.626607117 +0000 UTC m=+1354.382777935" Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.638124 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.638105601 podStartE2EDuration="3.638105601s" podCreationTimestamp="2026-03-10 10:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:45.636016065 +0000 UTC m=+1354.392186883" watchObservedRunningTime="2026-03-10 10:06:45.638105601 +0000 UTC m=+1354.394276409" Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.666696 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" podStartSLOduration=4.666677544 podStartE2EDuration="4.666677544s" podCreationTimestamp="2026-03-10 10:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:45.664609289 +0000 
UTC m=+1354.420780107" watchObservedRunningTime="2026-03-10 10:06:45.666677544 +0000 UTC m=+1354.422848362" Mar 10 10:06:45 crc kubenswrapper[4794]: I0310 10:06:45.688733 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.975550348 podStartE2EDuration="4.688715841s" podCreationTimestamp="2026-03-10 10:06:41 +0000 UTC" firstStartedPulling="2026-03-10 10:06:43.154932524 +0000 UTC m=+1351.911103342" lastFinishedPulling="2026-03-10 10:06:43.868097987 +0000 UTC m=+1352.624268835" observedRunningTime="2026-03-10 10:06:45.685171979 +0000 UTC m=+1354.441342797" watchObservedRunningTime="2026-03-10 10:06:45.688715841 +0000 UTC m=+1354.444886659" Mar 10 10:06:46 crc kubenswrapper[4794]: I0310 10:06:46.573639 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2c6afd20-3241-4d98-b20e-fa367dca1f50" containerName="cinder-api-log" containerID="cri-o://b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6" gracePeriod=30 Mar 10 10:06:46 crc kubenswrapper[4794]: I0310 10:06:46.574241 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2c6afd20-3241-4d98-b20e-fa367dca1f50" containerName="cinder-api" containerID="cri-o://a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992" gracePeriod=30 Mar 10 10:06:46 crc kubenswrapper[4794]: I0310 10:06:46.864574 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-74b7765548-sk248" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.095761 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5dd9f46c58-hkfks"] Mar 10 10:06:47 crc kubenswrapper[4794]: E0310 10:06:47.096280 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67fadcbf-6db6-494f-a3b7-da60cf39d1eb" containerName="dnsmasq-dns" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.096295 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="67fadcbf-6db6-494f-a3b7-da60cf39d1eb" containerName="dnsmasq-dns" Mar 10 10:06:47 crc kubenswrapper[4794]: E0310 10:06:47.096315 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67fadcbf-6db6-494f-a3b7-da60cf39d1eb" containerName="init" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.096324 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="67fadcbf-6db6-494f-a3b7-da60cf39d1eb" containerName="init" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.096583 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="67fadcbf-6db6-494f-a3b7-da60cf39d1eb" containerName="dnsmasq-dns" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.097709 4794 util.go:30] "No sandbox for pod can be found. 
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.097709 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.111731 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.112166 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.147974 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5dd9f46c58-hkfks"]
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.205801 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.245778 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-logs\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.245831 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-public-tls-certs\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.245857 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-config-data-custom\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.245963 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-combined-ca-bundle\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.245982 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7gxf\" (UniqueName: \"kubernetes.io/projected/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-kube-api-access-n7gxf\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.246038 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-internal-tls-certs\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.246081 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-config-data\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.284341 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66d985f6f7-q8rt6"]
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.285422 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66d985f6f7-q8rt6" podUID="b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" containerName="neutron-httpd" containerID="cri-o://2a165f3b4e22ad3b365cba1bf564b6800c46f607f88d27558dfb8a8ce501c7b2" gracePeriod=30
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.285984 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66d985f6f7-q8rt6" podUID="b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" containerName="neutron-api" containerID="cri-o://ac9789f1492e8d84ba992e18bc28a423157e539d3ae3b6e20dec1dc67aaa8e96" gracePeriod=30
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.328075 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bd7575545-w8qjp"]
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.335801 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bd7575545-w8qjp"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.354891 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-logs\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.354924 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-public-tls-certs\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.354947 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-config-data-custom\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.355015 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-combined-ca-bundle\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.355031 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7gxf\" (UniqueName: \"kubernetes.io/projected/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-kube-api-access-n7gxf\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.355065 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-internal-tls-certs\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.355103 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-config-data\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.358808 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-logs\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.364809 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.369661 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-public-tls-certs\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.376530 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-config-data\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.377975 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-internal-tls-certs\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.381694 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-config-data-custom\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.381943 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bd7575545-w8qjp"]
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.398072 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7gxf\" (UniqueName: \"kubernetes.io/projected/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-kube-api-access-n7gxf\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.398593 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-combined-ca-bundle\") pod \"barbican-api-5dd9f46c58-hkfks\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.454816 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.455730 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c6afd20-3241-4d98-b20e-fa367dca1f50-logs\") pod \"2c6afd20-3241-4d98-b20e-fa367dca1f50\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") "
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.455761 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-scripts\") pod \"2c6afd20-3241-4d98-b20e-fa367dca1f50\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") "
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.455832 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c6afd20-3241-4d98-b20e-fa367dca1f50-etc-machine-id\") pod \"2c6afd20-3241-4d98-b20e-fa367dca1f50\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") "
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.455877 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-config-data-custom\") pod \"2c6afd20-3241-4d98-b20e-fa367dca1f50\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") "
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.455929 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-config-data\") pod \"2c6afd20-3241-4d98-b20e-fa367dca1f50\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") "
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.455957 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6gfk\" (UniqueName: \"kubernetes.io/projected/2c6afd20-3241-4d98-b20e-fa367dca1f50-kube-api-access-h6gfk\") pod \"2c6afd20-3241-4d98-b20e-fa367dca1f50\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") "
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.456056 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-combined-ca-bundle\") pod \"2c6afd20-3241-4d98-b20e-fa367dca1f50\" (UID: \"2c6afd20-3241-4d98-b20e-fa367dca1f50\") "
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.456146 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c6afd20-3241-4d98-b20e-fa367dca1f50-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2c6afd20-3241-4d98-b20e-fa367dca1f50" (UID: "2c6afd20-3241-4d98-b20e-fa367dca1f50"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.456254 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-public-tls-certs\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.456285 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-combined-ca-bundle\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.456306 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-config\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.456355 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-httpd-config\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.456419 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-ovndb-tls-certs\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.456455 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdjtn\" (UniqueName: \"kubernetes.io/projected/55771788-f3c0-4cde-af2f-ca527c2e2965-kube-api-access-hdjtn\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.456498 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-internal-tls-certs\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.456543 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c6afd20-3241-4d98-b20e-fa367dca1f50-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.456605 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c6afd20-3241-4d98-b20e-fa367dca1f50-logs" (OuterVolumeSpecName: "logs") pod "2c6afd20-3241-4d98-b20e-fa367dca1f50" (UID: "2c6afd20-3241-4d98-b20e-fa367dca1f50"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.471569 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-scripts" (OuterVolumeSpecName: "scripts") pod "2c6afd20-3241-4d98-b20e-fa367dca1f50" (UID: "2c6afd20-3241-4d98-b20e-fa367dca1f50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.478470 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c6afd20-3241-4d98-b20e-fa367dca1f50-kube-api-access-h6gfk" (OuterVolumeSpecName: "kube-api-access-h6gfk") pod "2c6afd20-3241-4d98-b20e-fa367dca1f50" (UID: "2c6afd20-3241-4d98-b20e-fa367dca1f50"). InnerVolumeSpecName "kube-api-access-h6gfk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.478548 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2c6afd20-3241-4d98-b20e-fa367dca1f50" (UID: "2c6afd20-3241-4d98-b20e-fa367dca1f50"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.521676 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-66d985f6f7-q8rt6" podUID="b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": EOF"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.534558 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c6afd20-3241-4d98-b20e-fa367dca1f50" (UID: "2c6afd20-3241-4d98-b20e-fa367dca1f50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.561430 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-ovndb-tls-certs\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.561498 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdjtn\" (UniqueName: \"kubernetes.io/projected/55771788-f3c0-4cde-af2f-ca527c2e2965-kube-api-access-hdjtn\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.561550 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-internal-tls-certs\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.561567 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-public-tls-certs\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.561586 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-combined-ca-bundle\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.561603 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-config\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.561634 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-httpd-config\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.561705 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.561717 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6gfk\" (UniqueName: \"kubernetes.io/projected/2c6afd20-3241-4d98-b20e-fa367dca1f50-kube-api-access-h6gfk\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.561726 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:47 crc 
kubenswrapper[4794]: I0310 10:06:47.561735 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c6afd20-3241-4d98-b20e-fa367dca1f50-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.561743 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.561752 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c6afd20-3241-4d98-b20e-fa367dca1f50-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.568901 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-internal-tls-certs\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.573519 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-httpd-config\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.575236 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-ovndb-tls-certs\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.576174 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-public-tls-certs\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.577373 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-combined-ca-bundle\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.581106 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-config\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.606210 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdjtn\" (UniqueName: \"kubernetes.io/projected/55771788-f3c0-4cde-af2f-ca527c2e2965-kube-api-access-hdjtn\") pod \"neutron-bd7575545-w8qjp\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.630989 4794 generic.go:334] "Generic (PLEG): container finished" podID="2c6afd20-3241-4d98-b20e-fa367dca1f50" 
containerID="a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992" exitCode=0 Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.631284 4794 generic.go:334] "Generic (PLEG): container finished" podID="2c6afd20-3241-4d98-b20e-fa367dca1f50" containerID="b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6" exitCode=143 Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.631410 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2c6afd20-3241-4d98-b20e-fa367dca1f50","Type":"ContainerDied","Data":"a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992"} Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.631465 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2c6afd20-3241-4d98-b20e-fa367dca1f50","Type":"ContainerDied","Data":"b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6"} Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.631476 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2c6afd20-3241-4d98-b20e-fa367dca1f50","Type":"ContainerDied","Data":"8110fc6c52caf533901fbd920401a16f449bebfd7656e61e153590e8b41c3c39"} Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.631482 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.631492 4794 scope.go:117] "RemoveContainer" containerID="a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.712169 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.718701 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.726200 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 10:06:47 crc kubenswrapper[4794]: E0310 10:06:47.726554 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6afd20-3241-4d98-b20e-fa367dca1f50" containerName="cinder-api-log" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.726568 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6afd20-3241-4d98-b20e-fa367dca1f50" containerName="cinder-api-log" Mar 10 10:06:47 crc kubenswrapper[4794]: E0310 10:06:47.726595 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6afd20-3241-4d98-b20e-fa367dca1f50" containerName="cinder-api" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.726601 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6afd20-3241-4d98-b20e-fa367dca1f50" containerName="cinder-api" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.726795 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c6afd20-3241-4d98-b20e-fa367dca1f50" containerName="cinder-api" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.726815 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c6afd20-3241-4d98-b20e-fa367dca1f50" containerName="cinder-api-log" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.727777 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.730073 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.731734 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.731755 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.737360 4794 scope.go:117] "RemoveContainer" containerID="b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.741947 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.771468 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.804472 4794 scope.go:117] "RemoveContainer" containerID="a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992" Mar 10 10:06:47 crc kubenswrapper[4794]: E0310 10:06:47.806716 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992\": container with ID starting with a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992 not found: ID does not exist" containerID="a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.806763 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992"} err="failed to get container status \"a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992\": rpc error: code = NotFound desc = could not find container \"a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992\": container with ID starting with a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992 not found: ID does not exist" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.806790 4794 scope.go:117] "RemoveContainer" containerID="b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6" Mar 10 10:06:47 crc kubenswrapper[4794]: E0310 10:06:47.807020 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6\": container with ID starting with b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6 not found: ID does not exist" containerID="b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6" Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.807036 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6"} err="failed to get container status \"b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6\": rpc error: code = NotFound desc = could not find container \"b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6\": container with ID starting with b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6 not found: ID does not exist" Mar 10 10:06:47 
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.807049 4794 scope.go:117] "RemoveContainer" containerID="a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.807349 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992"} err="failed to get container status \"a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992\": rpc error: code = NotFound desc = could not find container \"a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992\": container with ID starting with a14be46a842bd33d94e42d4205eaf1195f617e8d1c5a1f5bb3b42b78ff7ec992 not found: ID does not exist"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.807371 4794 scope.go:117] "RemoveContainer" containerID="b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.807555 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6"} err="failed to get container status \"b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6\": rpc error: code = NotFound desc = could not find container \"b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6\": container with ID starting with b719cb354868e0590748e3014b6523a285a2941899860f1487137031b6ec8bd6 not found: ID does not exist"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.883825 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.883895 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-config-data\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.883981 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.884020 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxh88\" (UniqueName: \"kubernetes.io/projected/39114d89-8cf8-4563-bc50-e96e2113349d-kube-api-access-jxh88\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.884056 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-config-data-custom\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.884084 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39114d89-8cf8-4563-bc50-e96e2113349d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.884157 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-scripts\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.884190 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39114d89-8cf8-4563-bc50-e96e2113349d-logs\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.884208 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.985687 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.986038 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxh88\" (UniqueName: \"kubernetes.io/projected/39114d89-8cf8-4563-bc50-e96e2113349d-kube-api-access-jxh88\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.986092 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-config-data-custom\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.986121 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39114d89-8cf8-4563-bc50-e96e2113349d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.986141 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-scripts\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.986161 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39114d89-8cf8-4563-bc50-e96e2113349d-logs\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.986183 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.986242 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.986280 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-config-data\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.987380 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39114d89-8cf8-4563-bc50-e96e2113349d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.988612 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39114d89-8cf8-4563-bc50-e96e2113349d-logs\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.994116 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-config-data-custom\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.994151 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.994259 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.994836 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:47 crc kubenswrapper[4794]: I0310 10:06:47.998115 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-config-data\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.010406 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxh88\" (UniqueName: \"kubernetes.io/projected/39114d89-8cf8-4563-bc50-e96e2113349d-kube-api-access-jxh88\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.010820 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-scripts\") pod \"cinder-api-0\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " pod="openstack/cinder-api-0"
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.018803 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c6afd20-3241-4d98-b20e-fa367dca1f50" path="/var/lib/kubelet/pods/2c6afd20-3241-4d98-b20e-fa367dca1f50/volumes"
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.081216 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5dd9f46c58-hkfks"]
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.101898 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.421708 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bd7575545-w8qjp"]
Mar 10 10:06:48 crc kubenswrapper[4794]: W0310 10:06:48.439712 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55771788_f3c0_4cde_af2f_ca527c2e2965.slice/crio-6e93094adb86c32b925a12f87d735ca2d15349396683c2bb8f480ad6ff2646d3 WatchSource:0}: Error finding container 6e93094adb86c32b925a12f87d735ca2d15349396683c2bb8f480ad6ff2646d3: Status 404 returned error can't find the container with id 6e93094adb86c32b925a12f87d735ca2d15349396683c2bb8f480ad6ff2646d3
Mar 10 10:06:48 crc kubenswrapper[4794]: W0310 10:06:48.621880 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39114d89_8cf8_4563_bc50_e96e2113349d.slice/crio-05309f8870b7523828ec7fdf40578ab276ca5685ead9a1535e9e59e74c80aa2e WatchSource:0}: Error finding container 05309f8870b7523828ec7fdf40578ab276ca5685ead9a1535e9e59e74c80aa2e: Status 404 returned error can't find the container with id 05309f8870b7523828ec7fdf40578ab276ca5685ead9a1535e9e59e74c80aa2e
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.628937 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.642288 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dd9f46c58-hkfks" event={"ID":"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f","Type":"ContainerStarted","Data":"839d455cbd220b0b4cbb46ee40c9764f0de266ff250390018a7424fa8aa36507"}
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.642350 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dd9f46c58-hkfks" event={"ID":"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f","Type":"ContainerStarted","Data":"d7f0a62cb5cfb4fded049c3aefb7fe44d4d036d9c535f290cbc47f08da15b658"}
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.642366 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dd9f46c58-hkfks" event={"ID":"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f","Type":"ContainerStarted","Data":"00270a947c119d653e8057284300c51c48f65a49f3fce81fd665ac1ea41f87dd"}
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.642383 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.642410 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.644241 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd7575545-w8qjp" event={"ID":"55771788-f3c0-4cde-af2f-ca527c2e2965","Type":"ContainerStarted","Data":"3a1060bd42d7158c308820f09f849a414458bab447ccd7c995609acf055ac995"}
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.644277 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd7575545-w8qjp" event={"ID":"55771788-f3c0-4cde-af2f-ca527c2e2965","Type":"ContainerStarted","Data":"6e93094adb86c32b925a12f87d735ca2d15349396683c2bb8f480ad6ff2646d3"}
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.647643 4794 generic.go:334] "Generic (PLEG): container finished" podID="b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" containerID="2a165f3b4e22ad3b365cba1bf564b6800c46f607f88d27558dfb8a8ce501c7b2" exitCode=0
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.647697 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66d985f6f7-q8rt6" event={"ID":"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7","Type":"ContainerDied","Data":"2a165f3b4e22ad3b365cba1bf564b6800c46f607f88d27558dfb8a8ce501c7b2"}
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.648851 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"39114d89-8cf8-4563-bc50-e96e2113349d","Type":"ContainerStarted","Data":"05309f8870b7523828ec7fdf40578ab276ca5685ead9a1535e9e59e74c80aa2e"}
Mar 10 10:06:48 crc kubenswrapper[4794]: I0310 10:06:48.675915 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5dd9f46c58-hkfks" podStartSLOduration=1.675893836 podStartE2EDuration="1.675893836s" podCreationTimestamp="2026-03-10 10:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:48.667841141 +0000 UTC m=+1357.424011959" watchObservedRunningTime="2026-03-10 10:06:48.675893836 +0000 UTC m=+1357.432064654"
Mar 10 10:06:49 crc kubenswrapper[4794]: I0310 10:06:49.230397 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-66d985f6f7-q8rt6" podUID="b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": dial tcp 10.217.0.156:9696: connect: connection refused"
Mar 10 10:06:49 crc kubenswrapper[4794]: I0310 10:06:49.660407 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd7575545-w8qjp" event={"ID":"55771788-f3c0-4cde-af2f-ca527c2e2965","Type":"ContainerStarted","Data":"76dbe19e4daa257d305db13b3f34518e614449759a4f59af217356e156607317"}
Mar 10 10:06:49 crc kubenswrapper[4794]: I0310 10:06:49.660485 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-bd7575545-w8qjp"
Mar 10 10:06:49 crc kubenswrapper[4794]: I0310 10:06:49.663926 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"39114d89-8cf8-4563-bc50-e96e2113349d","Type":"ContainerStarted","Data":"841f2c7007209f71d0fcab9d21091a5638e79f7695a27cbb0864a109529f7bb5"}
Mar 10 10:06:49 crc kubenswrapper[4794]: I0310 10:06:49.687088 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-bd7575545-w8qjp" podStartSLOduration=2.687071433 podStartE2EDuration="2.687071433s" podCreationTimestamp="2026-03-10 10:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:49.681578899 +0000 UTC m=+1358.437749717" watchObservedRunningTime="2026-03-10 10:06:49.687071433 +0000 UTC m=+1358.443242241"
Mar 10 10:06:50 crc kubenswrapper[4794]: I0310 10:06:50.673582 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"39114d89-8cf8-4563-bc50-e96e2113349d","Type":"ContainerStarted","Data":"1860e4a742cfe98abb0f65295ec8d6e591d40ce5a255968cfb355b03216be258"}
Mar 10 10:06:50 crc kubenswrapper[4794]: I0310 10:06:50.711667 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.7116497429999997 podStartE2EDuration="3.711649743s" podCreationTimestamp="2026-03-10 10:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:06:50.705405286 +0000 UTC m=+1359.461576104" watchObservedRunningTime="2026-03-10 10:06:50.711649743 +0000 UTC m=+1359.467820561"
Mar 10 10:06:51 crc kubenswrapper[4794]: I0310 10:06:51.682487 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.252375 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-65464978fd-hh68d"
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.271132 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-65464978fd-hh68d"
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.288539 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-765c5b6b49-4clf4"
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.395910 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cbd95f657-9dw7b"]
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.396131 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" podUID="f89f48c1-02ee-42ca-8101-7b646136d21d" containerName="dnsmasq-dns" containerID="cri-o://63e0788ae87d6fcca50d60481e254936b7034a5d8794646feff3dcf7869e3b87" gracePeriod=10
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.635309 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.688091 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.723614 4794 generic.go:334] "Generic (PLEG): container finished" podID="f89f48c1-02ee-42ca-8101-7b646136d21d" containerID="63e0788ae87d6fcca50d60481e254936b7034a5d8794646feff3dcf7869e3b87" exitCode=0
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.723644 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" event={"ID":"f89f48c1-02ee-42ca-8101-7b646136d21d","Type":"ContainerDied","Data":"63e0788ae87d6fcca50d60481e254936b7034a5d8794646feff3dcf7869e3b87"}
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.723875 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" containerName="cinder-scheduler" containerID="cri-o://aea6bf3ae630950c554e57a7e22e6a5aa2f4a4e15d3a2260d3fb2be319fc33da" gracePeriod=30
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.723981 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" containerName="probe" containerID="cri-o://665db5329da9820beabdb8d21a743c9b7bcb95d7415ffd07880e65918c7cfcc4" gracePeriod=30
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.973624 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5fc5898d7d-hsl9p"
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.975102 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.975147 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.975185 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278"
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.975713 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4fbbb2d33125ccb00592b9b895dbb76529b93f7f4dbc98756e86d7dc556b940a"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 10:06:52 crc kubenswrapper[4794]: I0310 10:06:52.975767 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://4fbbb2d33125ccb00592b9b895dbb76529b93f7f4dbc98756e86d7dc556b940a" gracePeriod=600
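Above, a failed liveness probe (connection refused on 127.0.0.1:8798/health) gets machine-config-daemon killed with gracePeriod=600 and restarted (ContainerDied then ContainerStarted below), whereas the failed readiness probes elsewhere in this log only mark pods unready and pull them out of service endpoints. A minimal sketch of the endpoint this probe expects, with the address and path taken from the probe URL (illustrative only, not the daemon's code):

```go
package main

import (
	"fmt"
	"net/http"
)

func main() {
	http.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
		// Any 2xx/3xx response passes the probe; an error status fails it.
		w.WriteHeader(http.StatusOK)
		fmt.Fprintln(w, "ok")
	})
	// If this listener ever stops accepting, the kubelet reports
	// "connect: connection refused" exactly as in the entries above.
	http.ListenAndServe("127.0.0.1:8798", nil)
}
```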
Need to start a new one" pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.301849 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.302363 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-ovsdbserver-nb\") pod \"f89f48c1-02ee-42ca-8101-7b646136d21d\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.302423 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-ovsdbserver-sb\") pod \"f89f48c1-02ee-42ca-8101-7b646136d21d\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.302490 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-dns-swift-storage-0\") pod \"f89f48c1-02ee-42ca-8101-7b646136d21d\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.304655 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-config\") pod \"f89f48c1-02ee-42ca-8101-7b646136d21d\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.304790 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwttg\" (UniqueName: \"kubernetes.io/projected/f89f48c1-02ee-42ca-8101-7b646136d21d-kube-api-access-zwttg\") pod \"f89f48c1-02ee-42ca-8101-7b646136d21d\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.304906 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-dns-svc\") pod \"f89f48c1-02ee-42ca-8101-7b646136d21d\" (UID: \"f89f48c1-02ee-42ca-8101-7b646136d21d\") " Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.311236 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89f48c1-02ee-42ca-8101-7b646136d21d-kube-api-access-zwttg" (OuterVolumeSpecName: "kube-api-access-zwttg") pod "f89f48c1-02ee-42ca-8101-7b646136d21d" (UID: "f89f48c1-02ee-42ca-8101-7b646136d21d"). InnerVolumeSpecName "kube-api-access-zwttg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.374869 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f89f48c1-02ee-42ca-8101-7b646136d21d" (UID: "f89f48c1-02ee-42ca-8101-7b646136d21d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.378438 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f89f48c1-02ee-42ca-8101-7b646136d21d" (UID: "f89f48c1-02ee-42ca-8101-7b646136d21d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.391236 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f89f48c1-02ee-42ca-8101-7b646136d21d" (UID: "f89f48c1-02ee-42ca-8101-7b646136d21d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.392297 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f89f48c1-02ee-42ca-8101-7b646136d21d" (UID: "f89f48c1-02ee-42ca-8101-7b646136d21d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.399975 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-config" (OuterVolumeSpecName: "config") pod "f89f48c1-02ee-42ca-8101-7b646136d21d" (UID: "f89f48c1-02ee-42ca-8101-7b646136d21d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.402867 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.410366 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwttg\" (UniqueName: \"kubernetes.io/projected/f89f48c1-02ee-42ca-8101-7b646136d21d-kube-api-access-zwttg\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.410401 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.410412 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.410421 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.410429 4794 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.410438 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f89f48c1-02ee-42ca-8101-7b646136d21d-config\") on node 
\"crc\" DevicePath \"\"" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.787048 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="4fbbb2d33125ccb00592b9b895dbb76529b93f7f4dbc98756e86d7dc556b940a" exitCode=0 Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.787133 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"4fbbb2d33125ccb00592b9b895dbb76529b93f7f4dbc98756e86d7dc556b940a"} Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.787470 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"b8035e63199f09775c99e31be585fd326d355f434e689db9d33f71ae65c45f9f"} Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.787492 4794 scope.go:117] "RemoveContainer" containerID="961f68351db8d19b5c0a0d1359e0a0bfe3a6d383630ab326fdce756a36734d0e" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.794790 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.795392 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbd95f657-9dw7b" event={"ID":"f89f48c1-02ee-42ca-8101-7b646136d21d","Type":"ContainerDied","Data":"a6a5519e42059f7273e00990315ebdbe2d6747c76dee3dcbc5b570330032211e"} Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.859477 4794 scope.go:117] "RemoveContainer" containerID="63e0788ae87d6fcca50d60481e254936b7034a5d8794646feff3dcf7869e3b87" Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.890389 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cbd95f657-9dw7b"] Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.892432 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cbd95f657-9dw7b"] Mar 10 10:06:53 crc kubenswrapper[4794]: I0310 10:06:53.918588 4794 scope.go:117] "RemoveContainer" containerID="2e0c219d86c961651a397743750de2afecc05a22f55337936f8aacb3fa39b919" Mar 10 10:06:54 crc kubenswrapper[4794]: I0310 10:06:54.014078 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f89f48c1-02ee-42ca-8101-7b646136d21d" path="/var/lib/kubelet/pods/f89f48c1-02ee-42ca-8101-7b646136d21d/volumes" Mar 10 10:06:54 crc kubenswrapper[4794]: I0310 10:06:54.803915 4794 generic.go:334] "Generic (PLEG): container finished" podID="49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" containerID="665db5329da9820beabdb8d21a743c9b7bcb95d7415ffd07880e65918c7cfcc4" exitCode=0 Mar 10 10:06:54 crc kubenswrapper[4794]: I0310 10:06:54.803961 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb","Type":"ContainerDied","Data":"665db5329da9820beabdb8d21a743c9b7bcb95d7415ffd07880e65918c7cfcc4"} Mar 10 10:06:55 crc kubenswrapper[4794]: I0310 10:06:55.134854 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:55 crc kubenswrapper[4794]: I0310 10:06:55.283248 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68cfd4d846-drn7b" Mar 10 10:06:55 crc kubenswrapper[4794]: I0310 10:06:55.339028 
4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:06:55 crc kubenswrapper[4794]: I0310 10:06:55.349362 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5fc5898d7d-hsl9p"] Mar 10 10:06:55 crc kubenswrapper[4794]: I0310 10:06:55.349635 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5fc5898d7d-hsl9p" podUID="26721832-e716-44b8-ab75-60ba0be9e511" containerName="placement-log" containerID="cri-o://0ab88aa23c611ec45f1aa310e542bf00e0fbb47047d97131ad66ee91504c23d1" gracePeriod=30 Mar 10 10:06:55 crc kubenswrapper[4794]: I0310 10:06:55.349802 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5fc5898d7d-hsl9p" podUID="26721832-e716-44b8-ab75-60ba0be9e511" containerName="placement-api" containerID="cri-o://bfceb8ee56c6036817c0ce50263764b2e16edc3f0e44869dddf9c663e04937b2" gracePeriod=30 Mar 10 10:06:55 crc kubenswrapper[4794]: I0310 10:06:55.818494 4794 generic.go:334] "Generic (PLEG): container finished" podID="26721832-e716-44b8-ab75-60ba0be9e511" containerID="0ab88aa23c611ec45f1aa310e542bf00e0fbb47047d97131ad66ee91504c23d1" exitCode=143 Mar 10 10:06:55 crc kubenswrapper[4794]: I0310 10:06:55.818639 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fc5898d7d-hsl9p" event={"ID":"26721832-e716-44b8-ab75-60ba0be9e511","Type":"ContainerDied","Data":"0ab88aa23c611ec45f1aa310e542bf00e0fbb47047d97131ad66ee91504c23d1"} Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.669287 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.794836 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-combined-ca-bundle\") pod \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.794900 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-config-data\") pod \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.794956 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-config-data-custom\") pod \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.795140 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqljl\" (UniqueName: \"kubernetes.io/projected/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-kube-api-access-kqljl\") pod \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.795160 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-scripts\") pod \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " Mar 10 10:06:56 crc kubenswrapper[4794]: 
I0310 10:06:56.795191 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-etc-machine-id\") pod \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\" (UID: \"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb\") " Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.795574 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" (UID: "49b69a8a-f5fa-408e-a378-bb1e0d59a2cb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.800730 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" (UID: "49b69a8a-f5fa-408e-a378-bb1e0d59a2cb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.804430 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-scripts" (OuterVolumeSpecName: "scripts") pod "49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" (UID: "49b69a8a-f5fa-408e-a378-bb1e0d59a2cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.817487 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-kube-api-access-kqljl" (OuterVolumeSpecName: "kube-api-access-kqljl") pod "49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" (UID: "49b69a8a-f5fa-408e-a378-bb1e0d59a2cb"). InnerVolumeSpecName "kube-api-access-kqljl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.884025 4794 generic.go:334] "Generic (PLEG): container finished" podID="49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" containerID="aea6bf3ae630950c554e57a7e22e6a5aa2f4a4e15d3a2260d3fb2be319fc33da" exitCode=0 Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.884072 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb","Type":"ContainerDied","Data":"aea6bf3ae630950c554e57a7e22e6a5aa2f4a4e15d3a2260d3fb2be319fc33da"} Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.884098 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49b69a8a-f5fa-408e-a378-bb1e0d59a2cb","Type":"ContainerDied","Data":"daf22ad9bae827cf094d3d88dd0183adf37718ed6d0bda64788a51ba3c856d48"} Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.884114 4794 scope.go:117] "RemoveContainer" containerID="665db5329da9820beabdb8d21a743c9b7bcb95d7415ffd07880e65918c7cfcc4" Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.884263 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.898369 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqljl\" (UniqueName: \"kubernetes.io/projected/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-kube-api-access-kqljl\") on node \"crc\" DevicePath \"\""
Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.898391 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.898400 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.898409 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.898826 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" (UID: "49b69a8a-f5fa-408e-a378-bb1e0d59a2cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.934485 4794 scope.go:117] "RemoveContainer" containerID="aea6bf3ae630950c554e57a7e22e6a5aa2f4a4e15d3a2260d3fb2be319fc33da"
Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.986729 4794 scope.go:117] "RemoveContainer" containerID="665db5329da9820beabdb8d21a743c9b7bcb95d7415ffd07880e65918c7cfcc4"
Mar 10 10:06:56 crc kubenswrapper[4794]: E0310 10:06:56.995519 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"665db5329da9820beabdb8d21a743c9b7bcb95d7415ffd07880e65918c7cfcc4\": container with ID starting with 665db5329da9820beabdb8d21a743c9b7bcb95d7415ffd07880e65918c7cfcc4 not found: ID does not exist" containerID="665db5329da9820beabdb8d21a743c9b7bcb95d7415ffd07880e65918c7cfcc4"
Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.995784 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"665db5329da9820beabdb8d21a743c9b7bcb95d7415ffd07880e65918c7cfcc4"} err="failed to get container status \"665db5329da9820beabdb8d21a743c9b7bcb95d7415ffd07880e65918c7cfcc4\": rpc error: code = NotFound desc = could not find container \"665db5329da9820beabdb8d21a743c9b7bcb95d7415ffd07880e65918c7cfcc4\": container with ID starting with 665db5329da9820beabdb8d21a743c9b7bcb95d7415ffd07880e65918c7cfcc4 not found: ID does not exist"
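
The NotFound pair above is benign: scope.go's RemoveContainer already deleted the container from cri-o, so the follow-up ContainerStatus lookup fails with rpc code NotFound and pod_container_deletor logs "DeleteContainer returned error", but the container being gone is exactly the desired end state, and cleanup continues. A sketch of that idempotent-delete pattern, assuming a CRI-style API that surfaces gRPC status codes; deleteFn is a hypothetical stand-in for the runtime call, not the kubelet's actual interface:

    package main

    import (
        "errors"
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeContainer treats NotFound from the runtime as success: the
    // container is already gone, which matches the "could not find
    // container ... ID does not exist" records above.
    func removeContainer(id string, deleteFn func(string) error) error {
        if err := deleteFn(id); err != nil {
            if status.Code(err) == codes.NotFound {
                fmt.Printf("container %s already removed, nothing to do\n", id)
                return nil
            }
            return err // any other runtime error is a real failure
        }
        return nil
    }

    func main() {
        notFound := status.Error(codes.NotFound, "could not find container")
        _ = removeContainer("665db5329da9", func(string) error { return notFound })
        _ = removeContainer("aea6bf3ae630", func(string) error { return errors.New("runtime unavailable") })
    }
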
Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.995928 4794 scope.go:117] "RemoveContainer" containerID="aea6bf3ae630950c554e57a7e22e6a5aa2f4a4e15d3a2260d3fb2be319fc33da"
Mar 10 10:06:56 crc kubenswrapper[4794]: E0310 10:06:56.996723 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea6bf3ae630950c554e57a7e22e6a5aa2f4a4e15d3a2260d3fb2be319fc33da\": container with ID starting with aea6bf3ae630950c554e57a7e22e6a5aa2f4a4e15d3a2260d3fb2be319fc33da not found: ID does not exist" containerID="aea6bf3ae630950c554e57a7e22e6a5aa2f4a4e15d3a2260d3fb2be319fc33da"
Mar 10 10:06:56 crc kubenswrapper[4794]: I0310 10:06:56.996845 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea6bf3ae630950c554e57a7e22e6a5aa2f4a4e15d3a2260d3fb2be319fc33da"} err="failed to get container status \"aea6bf3ae630950c554e57a7e22e6a5aa2f4a4e15d3a2260d3fb2be319fc33da\": rpc error: code = NotFound desc = could not find container \"aea6bf3ae630950c554e57a7e22e6a5aa2f4a4e15d3a2260d3fb2be319fc33da\": container with ID starting with aea6bf3ae630950c554e57a7e22e6a5aa2f4a4e15d3a2260d3fb2be319fc33da not found: ID does not exist"
Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.000843 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.027573 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-config-data" (OuterVolumeSpecName: "config-data") pod "49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" (UID: "49b69a8a-f5fa-408e-a378-bb1e0d59a2cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:07:57.103098 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.223502 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.234792 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.303602 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 10:06:57 crc kubenswrapper[4794]: E0310 10:06:57.304086 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89f48c1-02ee-42ca-8101-7b646136d21d" containerName="dnsmasq-dns"
Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.304109 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89f48c1-02ee-42ca-8101-7b646136d21d" containerName="dnsmasq-dns"
Mar 10 10:06:57 crc kubenswrapper[4794]: E0310 10:06:57.304138 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89f48c1-02ee-42ca-8101-7b646136d21d" containerName="init"
Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.304146 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89f48c1-02ee-42ca-8101-7b646136d21d" containerName="init"
Mar 10 10:06:57 crc kubenswrapper[4794]: E0310 10:06:57.304172 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" containerName="probe"
Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.304179 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" containerName="probe"
Mar 10 10:06:57 crc kubenswrapper[4794]: E0310 10:06:57.304189 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" containerName="cinder-scheduler"
Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.304197 4794 state_mem.go:107] "Deleted CPUSet
assignment" podUID="49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" containerName="cinder-scheduler" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.304453 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" containerName="probe" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.304476 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f89f48c1-02ee-42ca-8101-7b646136d21d" containerName="dnsmasq-dns" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.304501 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" containerName="cinder-scheduler" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.306830 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.310993 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.318947 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.409169 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.409256 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-scripts\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.409284 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.409349 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjtmv\" (UniqueName: \"kubernetes.io/projected/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-kube-api-access-sjtmv\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.409414 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.409443 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: 
I0310 10:06:57.510498 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.510823 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-scripts\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.510846 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.510887 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjtmv\" (UniqueName: \"kubernetes.io/projected/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-kube-api-access-sjtmv\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.510935 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.510955 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.511384 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.520782 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.522865 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.524928 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.525131 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.543368 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjtmv\" (UniqueName: \"kubernetes.io/projected/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-kube-api-access-sjtmv\") pod \"cinder-scheduler-0\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " pod="openstack/cinder-scheduler-0" Mar 10 10:06:57 crc kubenswrapper[4794]: I0310 10:06:57.630030 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.036281 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49b69a8a-f5fa-408e-a378-bb1e0d59a2cb" path="/var/lib/kubelet/pods/49b69a8a-f5fa-408e-a378-bb1e0d59a2cb/volumes" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.144797 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.330447 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.332144 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.341381 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.341626 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.342442 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-s4l5q" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.369263 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.431800 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/88e57f3a-171a-4359-94e8-32861401846a-openstack-config-secret\") pod \"openstackclient\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.432014 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e57f3a-171a-4359-94e8-32861401846a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.432206 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/88e57f3a-171a-4359-94e8-32861401846a-openstack-config\") pod \"openstackclient\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " pod="openstack/openstackclient" Mar 10 
10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.432274 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlznm\" (UniqueName: \"kubernetes.io/projected/88e57f3a-171a-4359-94e8-32861401846a-kube-api-access-qlznm\") pod \"openstackclient\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.533926 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/88e57f3a-171a-4359-94e8-32861401846a-openstack-config\") pod \"openstackclient\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.533979 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlznm\" (UniqueName: \"kubernetes.io/projected/88e57f3a-171a-4359-94e8-32861401846a-kube-api-access-qlznm\") pod \"openstackclient\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.534002 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/88e57f3a-171a-4359-94e8-32861401846a-openstack-config-secret\") pod \"openstackclient\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.534088 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e57f3a-171a-4359-94e8-32861401846a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.535233 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/88e57f3a-171a-4359-94e8-32861401846a-openstack-config\") pod \"openstackclient\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.538017 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e57f3a-171a-4359-94e8-32861401846a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.538678 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/88e57f3a-171a-4359-94e8-32861401846a-openstack-config-secret\") pod \"openstackclient\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.554304 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlznm\" (UniqueName: \"kubernetes.io/projected/88e57f3a-171a-4359-94e8-32861401846a-kube-api-access-qlznm\") pod \"openstackclient\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.631646 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.632504 
4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.642735 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.687022 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.688400 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.739018 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d235a0a5-57c9-4938-b742-5788ade30a12-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.739113 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d235a0a5-57c9-4938-b742-5788ade30a12-openstack-config\") pod \"openstackclient\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.739166 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d235a0a5-57c9-4938-b742-5788ade30a12-openstack-config-secret\") pod \"openstackclient\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.739188 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jclrs\" (UniqueName: \"kubernetes.io/projected/d235a0a5-57c9-4938-b742-5788ade30a12-kube-api-access-jclrs\") pod \"openstackclient\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " pod="openstack/openstackclient" Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.739423 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 10:06:58 crc kubenswrapper[4794]: E0310 10:06:58.815963 4794 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 10:06:58 crc kubenswrapper[4794]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_88e57f3a-171a-4359-94e8-32861401846a_0(f629d98f3c226dafb893d42bb41ce5ef903a1e211d8c9866172d015847642057): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f629d98f3c226dafb893d42bb41ce5ef903a1e211d8c9866172d015847642057" Netns:"/var/run/netns/c0916283-ca08-4ab8-9534-9bbf8f5689b6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=f629d98f3c226dafb893d42bb41ce5ef903a1e211d8c9866172d015847642057;K8S_POD_UID=88e57f3a-171a-4359-94e8-32861401846a" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/88e57f3a-171a-4359-94e8-32861401846a]: expected pod UID "88e57f3a-171a-4359-94e8-32861401846a" but got "d235a0a5-57c9-4938-b742-5788ade30a12" from Kube API Mar 10 10:06:58 crc kubenswrapper[4794]: 
': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 10 10:06:58 crc kubenswrapper[4794]: >
Mar 10 10:06:58 crc kubenswrapper[4794]: E0310 10:06:58.816027 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Mar 10 10:06:58 crc kubenswrapper[4794]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_88e57f3a-171a-4359-94e8-32861401846a_0(f629d98f3c226dafb893d42bb41ce5ef903a1e211d8c9866172d015847642057): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f629d98f3c226dafb893d42bb41ce5ef903a1e211d8c9866172d015847642057" Netns:"/var/run/netns/c0916283-ca08-4ab8-9534-9bbf8f5689b6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=f629d98f3c226dafb893d42bb41ce5ef903a1e211d8c9866172d015847642057;K8S_POD_UID=88e57f3a-171a-4359-94e8-32861401846a" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/88e57f3a-171a-4359-94e8-32861401846a]: expected pod UID "88e57f3a-171a-4359-94e8-32861401846a" but got "d235a0a5-57c9-4938-b742-5788ade30a12" from Kube API
Mar 10 10:06:58 crc kubenswrapper[4794]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 10 10:06:58 crc kubenswrapper[4794]: > pod="openstack/openstackclient"
Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.845110 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d235a0a5-57c9-4938-b742-5788ade30a12-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " pod="openstack/openstackclient"
Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.845169 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d235a0a5-57c9-4938-b742-5788ade30a12-openstack-config\") pod \"openstackclient\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " pod="openstack/openstackclient"
Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.845201 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d235a0a5-57c9-4938-b742-5788ade30a12-openstack-config-secret\") pod \"openstackclient\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " pod="openstack/openstackclient"
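
The two sandbox errors above are one race reported twice (once by the CRI client in log.go, once by kuberuntime_sandbox): openstackclient was deleted and immediately recreated under the same namespace/name (the SyncLoop DELETE/REMOVE/ADD sequence just before), so the in-flight CNI ADD still carried K8S_POD_UID=88e57f3a-171a-4359-94e8-32861401846a while the API server already answered with the new UID d235a0a5-57c9-4938-b742-5788ade30a12. Multus validates the UID before wiring the interface and rejects the stale request with status 400; the kubelet then abandons that sandbox and starts a fresh one for the new UID, which is why "Pod was deleted and then recreated, skipping status update" appears below. A sketch of that guard follows; validatePodUID and its signature are illustrative, not Multus's actual API:

    package main

    import "fmt"

    // validatePodUID: the UID captured in the CNI args when the sandbox was
    // created must match the UID the API server currently reports for the
    // same namespace/name. A mismatch means the sandbox belongs to a deleted
    // incarnation of the pod, so the ADD must fail rather than attach the
    // old sandbox to the new pod.
    func validatePodUID(cniArgsUID, apiServerUID, namespace, name string) error {
        if cniArgsUID != "" && cniArgsUID != apiServerUID {
            return fmt.Errorf("Multus: [%s/%s/%s]: expected pod UID %q but got %q from Kube API",
                namespace, name, cniArgsUID, cniArgsUID, apiServerUID)
        }
        return nil
    }

    func main() {
        err := validatePodUID(
            "88e57f3a-171a-4359-94e8-32861401846a", // UID baked into the CNI ADD
            "d235a0a5-57c9-4938-b742-5788ade30a12", // UID of the recreated pod
            "openstack", "openstackclient",
        )
        fmt.Println(err) // kubelet tears down the stale sandbox and retries
    }
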
Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.845218 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jclrs\" (UniqueName: \"kubernetes.io/projected/d235a0a5-57c9-4938-b742-5788ade30a12-kube-api-access-jclrs\") pod \"openstackclient\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " pod="openstack/openstackclient"
Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.846676 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d235a0a5-57c9-4938-b742-5788ade30a12-openstack-config\") pod \"openstackclient\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " pod="openstack/openstackclient"
Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.855190 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d235a0a5-57c9-4938-b742-5788ade30a12-openstack-config-secret\") pod \"openstackclient\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " pod="openstack/openstackclient"
Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.858970 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d235a0a5-57c9-4938-b742-5788ade30a12-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " pod="openstack/openstackclient"
Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.878295 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jclrs\" (UniqueName: \"kubernetes.io/projected/d235a0a5-57c9-4938-b742-5788ade30a12-kube-api-access-jclrs\") pod \"openstackclient\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " pod="openstack/openstackclient"
Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.929470 4794 generic.go:334] "Generic (PLEG): container finished" podID="26721832-e716-44b8-ab75-60ba0be9e511" containerID="bfceb8ee56c6036817c0ce50263764b2e16edc3f0e44869dddf9c663e04937b2" exitCode=0
Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.929539 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fc5898d7d-hsl9p" event={"ID":"26721832-e716-44b8-ab75-60ba0be9e511","Type":"ContainerDied","Data":"bfceb8ee56c6036817c0ce50263764b2e16edc3f0e44869dddf9c663e04937b2"}
Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.932613 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.933313 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e","Type":"ContainerStarted","Data":"d1dca3e4366d2a714eda333d52b2463f788c4bb17ba1264be9387aa062133e42"}
Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.936311 4794 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="88e57f3a-171a-4359-94e8-32861401846a" podUID="d235a0a5-57c9-4938-b742-5788ade30a12"
Mar 10 10:06:58 crc kubenswrapper[4794]: I0310 10:06:58.941788 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.015881 4794 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.049349 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/88e57f3a-171a-4359-94e8-32861401846a-openstack-config-secret\") pod \"88e57f3a-171a-4359-94e8-32861401846a\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.049424 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-config-data\") pod \"26721832-e716-44b8-ab75-60ba0be9e511\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.049450 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/88e57f3a-171a-4359-94e8-32861401846a-openstack-config\") pod \"88e57f3a-171a-4359-94e8-32861401846a\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.049472 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-internal-tls-certs\") pod \"26721832-e716-44b8-ab75-60ba0be9e511\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.049768 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlznm\" (UniqueName: \"kubernetes.io/projected/88e57f3a-171a-4359-94e8-32861401846a-kube-api-access-qlznm\") pod \"88e57f3a-171a-4359-94e8-32861401846a\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.049804 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26721832-e716-44b8-ab75-60ba0be9e511-logs\") pod \"26721832-e716-44b8-ab75-60ba0be9e511\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.049843 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e57f3a-171a-4359-94e8-32861401846a-combined-ca-bundle\") pod \"88e57f3a-171a-4359-94e8-32861401846a\" (UID: \"88e57f3a-171a-4359-94e8-32861401846a\") " Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.049861 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrxx4\" (UniqueName: \"kubernetes.io/projected/26721832-e716-44b8-ab75-60ba0be9e511-kube-api-access-wrxx4\") pod \"26721832-e716-44b8-ab75-60ba0be9e511\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.049877 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-combined-ca-bundle\") pod \"26721832-e716-44b8-ab75-60ba0be9e511\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.049926 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-public-tls-certs\") pod 
\"26721832-e716-44b8-ab75-60ba0be9e511\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.049961 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-scripts\") pod \"26721832-e716-44b8-ab75-60ba0be9e511\" (UID: \"26721832-e716-44b8-ab75-60ba0be9e511\") " Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.051778 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26721832-e716-44b8-ab75-60ba0be9e511-logs" (OuterVolumeSpecName: "logs") pod "26721832-e716-44b8-ab75-60ba0be9e511" (UID: "26721832-e716-44b8-ab75-60ba0be9e511"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.059059 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e57f3a-171a-4359-94e8-32861401846a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "88e57f3a-171a-4359-94e8-32861401846a" (UID: "88e57f3a-171a-4359-94e8-32861401846a"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.060568 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e57f3a-171a-4359-94e8-32861401846a-kube-api-access-qlznm" (OuterVolumeSpecName: "kube-api-access-qlznm") pod "88e57f3a-171a-4359-94e8-32861401846a" (UID: "88e57f3a-171a-4359-94e8-32861401846a"). InnerVolumeSpecName "kube-api-access-qlznm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.060636 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e57f3a-171a-4359-94e8-32861401846a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "88e57f3a-171a-4359-94e8-32861401846a" (UID: "88e57f3a-171a-4359-94e8-32861401846a"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.062245 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e57f3a-171a-4359-94e8-32861401846a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88e57f3a-171a-4359-94e8-32861401846a" (UID: "88e57f3a-171a-4359-94e8-32861401846a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.093509 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-scripts" (OuterVolumeSpecName: "scripts") pod "26721832-e716-44b8-ab75-60ba0be9e511" (UID: "26721832-e716-44b8-ab75-60ba0be9e511"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.093506 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26721832-e716-44b8-ab75-60ba0be9e511-kube-api-access-wrxx4" (OuterVolumeSpecName: "kube-api-access-wrxx4") pod "26721832-e716-44b8-ab75-60ba0be9e511" (UID: "26721832-e716-44b8-ab75-60ba0be9e511"). InnerVolumeSpecName "kube-api-access-wrxx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.122231 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26721832-e716-44b8-ab75-60ba0be9e511" (UID: "26721832-e716-44b8-ab75-60ba0be9e511"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.139500 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-config-data" (OuterVolumeSpecName: "config-data") pod "26721832-e716-44b8-ab75-60ba0be9e511" (UID: "26721832-e716-44b8-ab75-60ba0be9e511"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.152992 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlznm\" (UniqueName: \"kubernetes.io/projected/88e57f3a-171a-4359-94e8-32861401846a-kube-api-access-qlznm\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.153020 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26721832-e716-44b8-ab75-60ba0be9e511-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.153032 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e57f3a-171a-4359-94e8-32861401846a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.153044 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrxx4\" (UniqueName: \"kubernetes.io/projected/26721832-e716-44b8-ab75-60ba0be9e511-kube-api-access-wrxx4\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.153056 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.153066 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.153074 4794 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/88e57f3a-171a-4359-94e8-32861401846a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.153082 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.153090 4794 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/88e57f3a-171a-4359-94e8-32861401846a-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.163621 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.189469 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "26721832-e716-44b8-ab75-60ba0be9e511" (UID: "26721832-e716-44b8-ab75-60ba0be9e511"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.200572 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "26721832-e716-44b8-ab75-60ba0be9e511" (UID: "26721832-e716-44b8-ab75-60ba0be9e511"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.254759 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.254978 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26721832-e716-44b8-ab75-60ba0be9e511-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.651190 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 10:06:59 crc kubenswrapper[4794]: W0310 10:06:59.658517 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd235a0a5_57c9_4938_b742_5788ade30a12.slice/crio-e362e973fd1ddbdde704946272f4e6b5130e556c24e23a93143cf72208cad466 WatchSource:0}: Error finding container e362e973fd1ddbdde704946272f4e6b5130e556c24e23a93143cf72208cad466: Status 404 returned error can't find the container with id e362e973fd1ddbdde704946272f4e6b5130e556c24e23a93143cf72208cad466 Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.939819 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5dd9f46c58-hkfks" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.942650 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e","Type":"ContainerStarted","Data":"5a0b460bd15a1cc1517c39d79a934d02b21b241c99d789df7e899e079f6a12eb"} Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.943822 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d235a0a5-57c9-4938-b742-5788ade30a12","Type":"ContainerStarted","Data":"e362e973fd1ddbdde704946272f4e6b5130e556c24e23a93143cf72208cad466"} Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.946250 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fc5898d7d-hsl9p" event={"ID":"26721832-e716-44b8-ab75-60ba0be9e511","Type":"ContainerDied","Data":"68a005ad997ff45c1f61c2a95324919e9b6b7079279883530aac59cabd6615dd"} Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.946274 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.946296 4794 scope.go:117] "RemoveContainer" containerID="bfceb8ee56c6036817c0ce50263764b2e16edc3f0e44869dddf9c663e04937b2" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.946306 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5fc5898d7d-hsl9p" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.965896 4794 scope.go:117] "RemoveContainer" containerID="0ab88aa23c611ec45f1aa310e542bf00e0fbb47047d97131ad66ee91504c23d1" Mar 10 10:06:59 crc kubenswrapper[4794]: I0310 10:06:59.987896 4794 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="88e57f3a-171a-4359-94e8-32861401846a" podUID="d235a0a5-57c9-4938-b742-5788ade30a12" Mar 10 10:07:00 crc kubenswrapper[4794]: I0310 10:07:00.050236 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e57f3a-171a-4359-94e8-32861401846a" path="/var/lib/kubelet/pods/88e57f3a-171a-4359-94e8-32861401846a/volumes" Mar 10 10:07:00 crc kubenswrapper[4794]: I0310 10:07:00.050850 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5fc5898d7d-hsl9p"] Mar 10 10:07:00 crc kubenswrapper[4794]: I0310 10:07:00.074618 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5dd9f46c58-hkfks" Mar 10 10:07:00 crc kubenswrapper[4794]: I0310 10:07:00.077220 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5fc5898d7d-hsl9p"] Mar 10 10:07:00 crc kubenswrapper[4794]: I0310 10:07:00.165319 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-65464978fd-hh68d"] Mar 10 10:07:00 crc kubenswrapper[4794]: I0310 10:07:00.165535 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-65464978fd-hh68d" podUID="7f12c556-db49-48d8-aad0-981c8d746bb6" containerName="barbican-api-log" containerID="cri-o://28bbc46ddd80c7238bde4d825e65659d1f1fa5c11c308841de1e1d86d3bd6f52" gracePeriod=30 Mar 10 10:07:00 crc kubenswrapper[4794]: I0310 10:07:00.165874 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-65464978fd-hh68d" podUID="7f12c556-db49-48d8-aad0-981c8d746bb6" containerName="barbican-api" containerID="cri-o://2e1344b23f34c965294ce2a354b0152c4c935ae846652a17c50980e22b14974d" gracePeriod=30 Mar 10 10:07:00 crc kubenswrapper[4794]: I0310 10:07:00.961862 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e","Type":"ContainerStarted","Data":"849ee9cf4f0b2ba0c3906171081ce847f0534ff16ad62c26ba765cd71b509f45"} Mar 10 10:07:00 crc kubenswrapper[4794]: I0310 10:07:00.974400 4794 generic.go:334] "Generic (PLEG): container finished" podID="7f12c556-db49-48d8-aad0-981c8d746bb6" containerID="28bbc46ddd80c7238bde4d825e65659d1f1fa5c11c308841de1e1d86d3bd6f52" exitCode=143 Mar 10 10:07:00 crc kubenswrapper[4794]: I0310 10:07:00.974446 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65464978fd-hh68d" event={"ID":"7f12c556-db49-48d8-aad0-981c8d746bb6","Type":"ContainerDied","Data":"28bbc46ddd80c7238bde4d825e65659d1f1fa5c11c308841de1e1d86d3bd6f52"} Mar 10 10:07:00 crc kubenswrapper[4794]: I0310 10:07:00.984546 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-scheduler-0" podStartSLOduration=3.98452391 podStartE2EDuration="3.98452391s" podCreationTimestamp="2026-03-10 10:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:07:00.981906183 +0000 UTC m=+1369.738077001" watchObservedRunningTime="2026-03-10 10:07:00.98452391 +0000 UTC m=+1369.740694728" Mar 10 10:07:01 crc kubenswrapper[4794]: I0310 10:07:01.229321 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 10 10:07:02 crc kubenswrapper[4794]: I0310 10:07:02.017734 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26721832-e716-44b8-ab75-60ba0be9e511" path="/var/lib/kubelet/pods/26721832-e716-44b8-ab75-60ba0be9e511/volumes" Mar 10 10:07:02 crc kubenswrapper[4794]: I0310 10:07:02.630988 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 10:07:03 crc kubenswrapper[4794]: I0310 10:07:03.664615 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-65464978fd-hh68d" podUID="7f12c556-db49-48d8-aad0-981c8d746bb6" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:60920->10.217.0.163:9311: read: connection reset by peer" Mar 10 10:07:03 crc kubenswrapper[4794]: I0310 10:07:03.664638 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-65464978fd-hh68d" podUID="7f12c556-db49-48d8-aad0-981c8d746bb6" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:60904->10.217.0.163:9311: read: connection reset by peer" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.026633 4794 generic.go:334] "Generic (PLEG): container finished" podID="7f12c556-db49-48d8-aad0-981c8d746bb6" containerID="2e1344b23f34c965294ce2a354b0152c4c935ae846652a17c50980e22b14974d" exitCode=0 Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.026892 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65464978fd-hh68d" event={"ID":"7f12c556-db49-48d8-aad0-981c8d746bb6","Type":"ContainerDied","Data":"2e1344b23f34c965294ce2a354b0152c4c935ae846652a17c50980e22b14974d"} Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.026926 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65464978fd-hh68d" event={"ID":"7f12c556-db49-48d8-aad0-981c8d746bb6","Type":"ContainerDied","Data":"d13c87aa3b94a2e511e9d21ec1737fb611f1ab5a2eb9cf4629403bd21ba919a5"} Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.026937 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d13c87aa3b94a2e511e9d21ec1737fb611f1ab5a2eb9cf4629403bd21ba919a5" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.086139 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.170627 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f12c556-db49-48d8-aad0-981c8d746bb6-logs\") pod \"7f12c556-db49-48d8-aad0-981c8d746bb6\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.170794 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-combined-ca-bundle\") pod \"7f12c556-db49-48d8-aad0-981c8d746bb6\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.170903 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-config-data-custom\") pod \"7f12c556-db49-48d8-aad0-981c8d746bb6\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.171060 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-config-data\") pod \"7f12c556-db49-48d8-aad0-981c8d746bb6\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.171127 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b25jx\" (UniqueName: \"kubernetes.io/projected/7f12c556-db49-48d8-aad0-981c8d746bb6-kube-api-access-b25jx\") pod \"7f12c556-db49-48d8-aad0-981c8d746bb6\" (UID: \"7f12c556-db49-48d8-aad0-981c8d746bb6\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.171296 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f12c556-db49-48d8-aad0-981c8d746bb6-logs" (OuterVolumeSpecName: "logs") pod "7f12c556-db49-48d8-aad0-981c8d746bb6" (UID: "7f12c556-db49-48d8-aad0-981c8d746bb6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.171693 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f12c556-db49-48d8-aad0-981c8d746bb6-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.183591 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7f12c556-db49-48d8-aad0-981c8d746bb6" (UID: "7f12c556-db49-48d8-aad0-981c8d746bb6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.189965 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f12c556-db49-48d8-aad0-981c8d746bb6-kube-api-access-b25jx" (OuterVolumeSpecName: "kube-api-access-b25jx") pod "7f12c556-db49-48d8-aad0-981c8d746bb6" (UID: "7f12c556-db49-48d8-aad0-981c8d746bb6"). InnerVolumeSpecName "kube-api-access-b25jx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.197510 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f12c556-db49-48d8-aad0-981c8d746bb6" (UID: "7f12c556-db49-48d8-aad0-981c8d746bb6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.229523 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-config-data" (OuterVolumeSpecName: "config-data") pod "7f12c556-db49-48d8-aad0-981c8d746bb6" (UID: "7f12c556-db49-48d8-aad0-981c8d746bb6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.273750 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.273786 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.273799 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f12c556-db49-48d8-aad0-981c8d746bb6-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.273813 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b25jx\" (UniqueName: \"kubernetes.io/projected/7f12c556-db49-48d8-aad0-981c8d746bb6-kube-api-access-b25jx\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.876154 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.888868 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.985231 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt7ks\" (UniqueName: \"kubernetes.io/projected/39f62c40-641b-409f-ab29-ba30c14de2d8-kube-api-access-nt7ks\") pod \"39f62c40-641b-409f-ab29-ba30c14de2d8\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.985411 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-combined-ca-bundle\") pod \"39f62c40-641b-409f-ab29-ba30c14de2d8\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.986195 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f62c40-641b-409f-ab29-ba30c14de2d8-run-httpd\") pod \"39f62c40-641b-409f-ab29-ba30c14de2d8\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.986268 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-public-tls-certs\") pod \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.986461 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f62c40-641b-409f-ab29-ba30c14de2d8-log-httpd\") pod \"39f62c40-641b-409f-ab29-ba30c14de2d8\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.986571 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-combined-ca-bundle\") pod \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.986632 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-internal-tls-certs\") pod \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.986661 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-config\") pod \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.986686 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-ovndb-tls-certs\") pod \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.986722 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-scripts\") pod \"39f62c40-641b-409f-ab29-ba30c14de2d8\" (UID: 
\"39f62c40-641b-409f-ab29-ba30c14de2d8\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.987086 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-httpd-config\") pod \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.987147 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-config-data\") pod \"39f62c40-641b-409f-ab29-ba30c14de2d8\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.987194 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zb47\" (UniqueName: \"kubernetes.io/projected/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-kube-api-access-7zb47\") pod \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\" (UID: \"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.987291 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-sg-core-conf-yaml\") pod \"39f62c40-641b-409f-ab29-ba30c14de2d8\" (UID: \"39f62c40-641b-409f-ab29-ba30c14de2d8\") " Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.986768 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39f62c40-641b-409f-ab29-ba30c14de2d8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "39f62c40-641b-409f-ab29-ba30c14de2d8" (UID: "39f62c40-641b-409f-ab29-ba30c14de2d8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.986989 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39f62c40-641b-409f-ab29-ba30c14de2d8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "39f62c40-641b-409f-ab29-ba30c14de2d8" (UID: "39f62c40-641b-409f-ab29-ba30c14de2d8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.989786 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f62c40-641b-409f-ab29-ba30c14de2d8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.989805 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f62c40-641b-409f-ab29-ba30c14de2d8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.994466 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-scripts" (OuterVolumeSpecName: "scripts") pod "39f62c40-641b-409f-ab29-ba30c14de2d8" (UID: "39f62c40-641b-409f-ab29-ba30c14de2d8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.994475 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39f62c40-641b-409f-ab29-ba30c14de2d8-kube-api-access-nt7ks" (OuterVolumeSpecName: "kube-api-access-nt7ks") pod "39f62c40-641b-409f-ab29-ba30c14de2d8" (UID: "39f62c40-641b-409f-ab29-ba30c14de2d8"). InnerVolumeSpecName "kube-api-access-nt7ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:07:04 crc kubenswrapper[4794]: I0310 10:07:04.994493 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" (UID: "b66eadd9-dea4-4c3c-aa14-494f04e0e8c7"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.008497 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-kube-api-access-7zb47" (OuterVolumeSpecName: "kube-api-access-7zb47") pod "b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" (UID: "b66eadd9-dea4-4c3c-aa14-494f04e0e8c7"). InnerVolumeSpecName "kube-api-access-7zb47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.044048 4794 generic.go:334] "Generic (PLEG): container finished" podID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerID="85c4519c5fd753452de74cca4d80c98f995ac40e8e9d071b9258e6a667b6e320" exitCode=137 Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.044104 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f62c40-641b-409f-ab29-ba30c14de2d8","Type":"ContainerDied","Data":"85c4519c5fd753452de74cca4d80c98f995ac40e8e9d071b9258e6a667b6e320"} Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.044138 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.044145 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f62c40-641b-409f-ab29-ba30c14de2d8","Type":"ContainerDied","Data":"6d96c35d6454f521b3bcdd00592a057cad340a5e17bbf0b15a4ecc50c0381e17"} Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.044162 4794 scope.go:117] "RemoveContainer" containerID="85c4519c5fd753452de74cca4d80c98f995ac40e8e9d071b9258e6a667b6e320" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.046786 4794 generic.go:334] "Generic (PLEG): container finished" podID="b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" containerID="ac9789f1492e8d84ba992e18bc28a423157e539d3ae3b6e20dec1dc67aaa8e96" exitCode=0 Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.046839 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-65464978fd-hh68d" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.048764 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" (UID: "b66eadd9-dea4-4c3c-aa14-494f04e0e8c7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.049710 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66d985f6f7-q8rt6" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.049945 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66d985f6f7-q8rt6" event={"ID":"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7","Type":"ContainerDied","Data":"ac9789f1492e8d84ba992e18bc28a423157e539d3ae3b6e20dec1dc67aaa8e96"} Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.050242 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66d985f6f7-q8rt6" event={"ID":"b66eadd9-dea4-4c3c-aa14-494f04e0e8c7","Type":"ContainerDied","Data":"bc0ffbfa7e16f1c8c83c431b4d4c9d19414ccd076bab65ecbf1699a767bb2a37"} Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.068435 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" (UID: "b66eadd9-dea4-4c3c-aa14-494f04e0e8c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.089432 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "39f62c40-641b-409f-ab29-ba30c14de2d8" (UID: "39f62c40-641b-409f-ab29-ba30c14de2d8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.091715 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.091740 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.091749 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.091757 4794 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.091766 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zb47\" (UniqueName: \"kubernetes.io/projected/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-kube-api-access-7zb47\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.091774 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.091782 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt7ks\" (UniqueName: 
\"kubernetes.io/projected/39f62c40-641b-409f-ab29-ba30c14de2d8-kube-api-access-nt7ks\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.093784 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" (UID: "b66eadd9-dea4-4c3c-aa14-494f04e0e8c7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.124939 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39f62c40-641b-409f-ab29-ba30c14de2d8" (UID: "39f62c40-641b-409f-ab29-ba30c14de2d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.129847 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" (UID: "b66eadd9-dea4-4c3c-aa14-494f04e0e8c7"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.131121 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-config" (OuterVolumeSpecName: "config") pod "b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" (UID: "b66eadd9-dea4-4c3c-aa14-494f04e0e8c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.159465 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-config-data" (OuterVolumeSpecName: "config-data") pod "39f62c40-641b-409f-ab29-ba30c14de2d8" (UID: "39f62c40-641b-409f-ab29-ba30c14de2d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.178837 4794 scope.go:117] "RemoveContainer" containerID="f61478060205d8fcc6cec95994753f9d66a103e3c2e6f3bb9a7e8201a71bbfe8" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.186129 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-65464978fd-hh68d"] Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.193649 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.193684 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.193698 4794 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.193712 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.193723 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f62c40-641b-409f-ab29-ba30c14de2d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.195516 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-65464978fd-hh68d"] Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.205159 4794 scope.go:117] "RemoveContainer" containerID="3a9d616e4d6b008bb7cbf6d2bbc20e8de0074f1639e7627b35604fe2fd3d69a4" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.240659 4794 scope.go:117] "RemoveContainer" containerID="fbb0da4f7706c5131082ecadb64151d23fdfad08580f0036fc285ac4f92a3ab6" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.264264 4794 scope.go:117] "RemoveContainer" containerID="85c4519c5fd753452de74cca4d80c98f995ac40e8e9d071b9258e6a667b6e320" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.264831 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85c4519c5fd753452de74cca4d80c98f995ac40e8e9d071b9258e6a667b6e320\": container with ID starting with 85c4519c5fd753452de74cca4d80c98f995ac40e8e9d071b9258e6a667b6e320 not found: ID does not exist" containerID="85c4519c5fd753452de74cca4d80c98f995ac40e8e9d071b9258e6a667b6e320" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.264862 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85c4519c5fd753452de74cca4d80c98f995ac40e8e9d071b9258e6a667b6e320"} err="failed to get container status \"85c4519c5fd753452de74cca4d80c98f995ac40e8e9d071b9258e6a667b6e320\": rpc error: code = NotFound desc = could not find container \"85c4519c5fd753452de74cca4d80c98f995ac40e8e9d071b9258e6a667b6e320\": container with ID starting with 85c4519c5fd753452de74cca4d80c98f995ac40e8e9d071b9258e6a667b6e320 not found: ID does not exist" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.264887 4794 
scope.go:117] "RemoveContainer" containerID="f61478060205d8fcc6cec95994753f9d66a103e3c2e6f3bb9a7e8201a71bbfe8" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.265085 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61478060205d8fcc6cec95994753f9d66a103e3c2e6f3bb9a7e8201a71bbfe8\": container with ID starting with f61478060205d8fcc6cec95994753f9d66a103e3c2e6f3bb9a7e8201a71bbfe8 not found: ID does not exist" containerID="f61478060205d8fcc6cec95994753f9d66a103e3c2e6f3bb9a7e8201a71bbfe8" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.265106 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61478060205d8fcc6cec95994753f9d66a103e3c2e6f3bb9a7e8201a71bbfe8"} err="failed to get container status \"f61478060205d8fcc6cec95994753f9d66a103e3c2e6f3bb9a7e8201a71bbfe8\": rpc error: code = NotFound desc = could not find container \"f61478060205d8fcc6cec95994753f9d66a103e3c2e6f3bb9a7e8201a71bbfe8\": container with ID starting with f61478060205d8fcc6cec95994753f9d66a103e3c2e6f3bb9a7e8201a71bbfe8 not found: ID does not exist" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.265118 4794 scope.go:117] "RemoveContainer" containerID="3a9d616e4d6b008bb7cbf6d2bbc20e8de0074f1639e7627b35604fe2fd3d69a4" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.265381 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a9d616e4d6b008bb7cbf6d2bbc20e8de0074f1639e7627b35604fe2fd3d69a4\": container with ID starting with 3a9d616e4d6b008bb7cbf6d2bbc20e8de0074f1639e7627b35604fe2fd3d69a4 not found: ID does not exist" containerID="3a9d616e4d6b008bb7cbf6d2bbc20e8de0074f1639e7627b35604fe2fd3d69a4" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.265405 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a9d616e4d6b008bb7cbf6d2bbc20e8de0074f1639e7627b35604fe2fd3d69a4"} err="failed to get container status \"3a9d616e4d6b008bb7cbf6d2bbc20e8de0074f1639e7627b35604fe2fd3d69a4\": rpc error: code = NotFound desc = could not find container \"3a9d616e4d6b008bb7cbf6d2bbc20e8de0074f1639e7627b35604fe2fd3d69a4\": container with ID starting with 3a9d616e4d6b008bb7cbf6d2bbc20e8de0074f1639e7627b35604fe2fd3d69a4 not found: ID does not exist" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.265424 4794 scope.go:117] "RemoveContainer" containerID="fbb0da4f7706c5131082ecadb64151d23fdfad08580f0036fc285ac4f92a3ab6" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.267125 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbb0da4f7706c5131082ecadb64151d23fdfad08580f0036fc285ac4f92a3ab6\": container with ID starting with fbb0da4f7706c5131082ecadb64151d23fdfad08580f0036fc285ac4f92a3ab6 not found: ID does not exist" containerID="fbb0da4f7706c5131082ecadb64151d23fdfad08580f0036fc285ac4f92a3ab6" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.267160 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb0da4f7706c5131082ecadb64151d23fdfad08580f0036fc285ac4f92a3ab6"} err="failed to get container status \"fbb0da4f7706c5131082ecadb64151d23fdfad08580f0036fc285ac4f92a3ab6\": rpc error: code = NotFound desc = could not find container \"fbb0da4f7706c5131082ecadb64151d23fdfad08580f0036fc285ac4f92a3ab6\": container with ID starting with 
fbb0da4f7706c5131082ecadb64151d23fdfad08580f0036fc285ac4f92a3ab6 not found: ID does not exist" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.267177 4794 scope.go:117] "RemoveContainer" containerID="2a165f3b4e22ad3b365cba1bf564b6800c46f607f88d27558dfb8a8ce501c7b2" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.289780 4794 scope.go:117] "RemoveContainer" containerID="ac9789f1492e8d84ba992e18bc28a423157e539d3ae3b6e20dec1dc67aaa8e96" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.314173 4794 scope.go:117] "RemoveContainer" containerID="2a165f3b4e22ad3b365cba1bf564b6800c46f607f88d27558dfb8a8ce501c7b2" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.314746 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a165f3b4e22ad3b365cba1bf564b6800c46f607f88d27558dfb8a8ce501c7b2\": container with ID starting with 2a165f3b4e22ad3b365cba1bf564b6800c46f607f88d27558dfb8a8ce501c7b2 not found: ID does not exist" containerID="2a165f3b4e22ad3b365cba1bf564b6800c46f607f88d27558dfb8a8ce501c7b2" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.314774 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a165f3b4e22ad3b365cba1bf564b6800c46f607f88d27558dfb8a8ce501c7b2"} err="failed to get container status \"2a165f3b4e22ad3b365cba1bf564b6800c46f607f88d27558dfb8a8ce501c7b2\": rpc error: code = NotFound desc = could not find container \"2a165f3b4e22ad3b365cba1bf564b6800c46f607f88d27558dfb8a8ce501c7b2\": container with ID starting with 2a165f3b4e22ad3b365cba1bf564b6800c46f607f88d27558dfb8a8ce501c7b2 not found: ID does not exist" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.314794 4794 scope.go:117] "RemoveContainer" containerID="ac9789f1492e8d84ba992e18bc28a423157e539d3ae3b6e20dec1dc67aaa8e96" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.315423 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac9789f1492e8d84ba992e18bc28a423157e539d3ae3b6e20dec1dc67aaa8e96\": container with ID starting with ac9789f1492e8d84ba992e18bc28a423157e539d3ae3b6e20dec1dc67aaa8e96 not found: ID does not exist" containerID="ac9789f1492e8d84ba992e18bc28a423157e539d3ae3b6e20dec1dc67aaa8e96" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.315451 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac9789f1492e8d84ba992e18bc28a423157e539d3ae3b6e20dec1dc67aaa8e96"} err="failed to get container status \"ac9789f1492e8d84ba992e18bc28a423157e539d3ae3b6e20dec1dc67aaa8e96\": rpc error: code = NotFound desc = could not find container \"ac9789f1492e8d84ba992e18bc28a423157e539d3ae3b6e20dec1dc67aaa8e96\": container with ID starting with ac9789f1492e8d84ba992e18bc28a423157e539d3ae3b6e20dec1dc67aaa8e96 not found: ID does not exist" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.397942 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.415605 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.425281 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66d985f6f7-q8rt6"] Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.437631 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-66d985f6f7-q8rt6"] Mar 10 
10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.465507 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.465898 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="proxy-httpd" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.465917 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="proxy-httpd" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.465934 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f12c556-db49-48d8-aad0-981c8d746bb6" containerName="barbican-api" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.465943 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f12c556-db49-48d8-aad0-981c8d746bb6" containerName="barbican-api" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.465961 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f12c556-db49-48d8-aad0-981c8d746bb6" containerName="barbican-api-log" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.465969 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f12c556-db49-48d8-aad0-981c8d746bb6" containerName="barbican-api-log" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.465985 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" containerName="neutron-api" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.465993 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" containerName="neutron-api" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.466007 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" containerName="neutron-httpd" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466013 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" containerName="neutron-httpd" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.466023 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26721832-e716-44b8-ab75-60ba0be9e511" containerName="placement-log" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466028 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="26721832-e716-44b8-ab75-60ba0be9e511" containerName="placement-log" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.466043 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="sg-core" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466049 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="sg-core" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.466057 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="ceilometer-central-agent" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466063 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="ceilometer-central-agent" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.466070 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26721832-e716-44b8-ab75-60ba0be9e511" containerName="placement-api" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466076 
4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="26721832-e716-44b8-ab75-60ba0be9e511" containerName="placement-api" Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.466097 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="ceilometer-notification-agent" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466103 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="ceilometer-notification-agent" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466265 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="ceilometer-notification-agent" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466280 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="ceilometer-central-agent" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466290 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" containerName="neutron-httpd" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466301 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" containerName="neutron-api" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466312 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="proxy-httpd" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466323 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" containerName="sg-core" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466401 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f12c556-db49-48d8-aad0-981c8d746bb6" containerName="barbican-api" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466421 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="26721832-e716-44b8-ab75-60ba0be9e511" containerName="placement-api" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466428 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="26721832-e716-44b8-ab75-60ba0be9e511" containerName="placement-log" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.466437 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f12c556-db49-48d8-aad0-981c8d746bb6" containerName="barbican-api-log" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.468301 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.470896 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.470901 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.479691 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.606176 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-config-data\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.606241 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.606270 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-scripts\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.606326 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38977633-0094-470c-bd46-76d58e261f4c-log-httpd\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.606370 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38977633-0094-470c-bd46-76d58e261f4c-run-httpd\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.606508 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nft5\" (UniqueName: \"kubernetes.io/projected/38977633-0094-470c-bd46-76d58e261f4c-kube-api-access-5nft5\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.606585 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.614919 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:05 crc kubenswrapper[4794]: E0310 10:07:05.615868 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-5nft5 log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], 
failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="38977633-0094-470c-bd46-76d58e261f4c" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.707915 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38977633-0094-470c-bd46-76d58e261f4c-log-httpd\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.707976 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38977633-0094-470c-bd46-76d58e261f4c-run-httpd\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.708041 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nft5\" (UniqueName: \"kubernetes.io/projected/38977633-0094-470c-bd46-76d58e261f4c-kube-api-access-5nft5\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.708105 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.708153 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-config-data\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.708183 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.708208 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-scripts\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.708626 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38977633-0094-470c-bd46-76d58e261f4c-log-httpd\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.710458 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38977633-0094-470c-bd46-76d58e261f4c-run-httpd\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.713278 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-scripts\") pod \"ceilometer-0\" (UID: 
\"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.713894 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.717587 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.717901 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-config-data\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.727444 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nft5\" (UniqueName: \"kubernetes.io/projected/38977633-0094-470c-bd46-76d58e261f4c-kube-api-access-5nft5\") pod \"ceilometer-0\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " pod="openstack/ceilometer-0" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.858377 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7575fbf969-gq2mq"] Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.866004 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.867800 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.867897 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.868358 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.885695 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7575fbf969-gq2mq"] Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.913390 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fg5z\" (UniqueName: \"kubernetes.io/projected/ed705a10-5bb5-4170-8536-57c6be1cb816-kube-api-access-5fg5z\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.913451 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-config-data\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.913507 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ed705a10-5bb5-4170-8536-57c6be1cb816-log-httpd\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.913545 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-combined-ca-bundle\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.913565 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed705a10-5bb5-4170-8536-57c6be1cb816-run-httpd\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.913619 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-public-tls-certs\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.913751 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ed705a10-5bb5-4170-8536-57c6be1cb816-etc-swift\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:05 crc kubenswrapper[4794]: I0310 10:07:05.913877 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-internal-tls-certs\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.012851 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39f62c40-641b-409f-ab29-ba30c14de2d8" path="/var/lib/kubelet/pods/39f62c40-641b-409f-ab29-ba30c14de2d8/volumes" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.014054 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f12c556-db49-48d8-aad0-981c8d746bb6" path="/var/lib/kubelet/pods/7f12c556-db49-48d8-aad0-981c8d746bb6/volumes" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.015045 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b66eadd9-dea4-4c3c-aa14-494f04e0e8c7" path="/var/lib/kubelet/pods/b66eadd9-dea4-4c3c-aa14-494f04e0e8c7/volumes" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.017279 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-combined-ca-bundle\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.017410 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ed705a10-5bb5-4170-8536-57c6be1cb816-run-httpd\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.017478 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-public-tls-certs\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.017523 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ed705a10-5bb5-4170-8536-57c6be1cb816-etc-swift\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.017572 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-internal-tls-certs\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.017679 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fg5z\" (UniqueName: \"kubernetes.io/projected/ed705a10-5bb5-4170-8536-57c6be1cb816-kube-api-access-5fg5z\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.017723 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-config-data\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.017794 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed705a10-5bb5-4170-8536-57c6be1cb816-log-httpd\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.018012 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed705a10-5bb5-4170-8536-57c6be1cb816-run-httpd\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.018154 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed705a10-5bb5-4170-8536-57c6be1cb816-log-httpd\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.022887 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-internal-tls-certs\") pod 
\"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.023515 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-combined-ca-bundle\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.026086 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-public-tls-certs\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.028047 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ed705a10-5bb5-4170-8536-57c6be1cb816-etc-swift\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.038167 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-config-data\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.039371 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fg5z\" (UniqueName: \"kubernetes.io/projected/ed705a10-5bb5-4170-8536-57c6be1cb816-kube-api-access-5fg5z\") pod \"swift-proxy-7575fbf969-gq2mq\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") " pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.057139 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.119695 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.181983 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.221783 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-combined-ca-bundle\") pod \"38977633-0094-470c-bd46-76d58e261f4c\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.221840 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38977633-0094-470c-bd46-76d58e261f4c-log-httpd\") pod \"38977633-0094-470c-bd46-76d58e261f4c\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.221892 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nft5\" (UniqueName: \"kubernetes.io/projected/38977633-0094-470c-bd46-76d58e261f4c-kube-api-access-5nft5\") pod \"38977633-0094-470c-bd46-76d58e261f4c\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.222038 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-config-data\") pod \"38977633-0094-470c-bd46-76d58e261f4c\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.222069 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-sg-core-conf-yaml\") pod \"38977633-0094-470c-bd46-76d58e261f4c\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.222128 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-scripts\") pod \"38977633-0094-470c-bd46-76d58e261f4c\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.222157 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38977633-0094-470c-bd46-76d58e261f4c-run-httpd\") pod \"38977633-0094-470c-bd46-76d58e261f4c\" (UID: \"38977633-0094-470c-bd46-76d58e261f4c\") " Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.223089 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38977633-0094-470c-bd46-76d58e261f4c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "38977633-0094-470c-bd46-76d58e261f4c" (UID: "38977633-0094-470c-bd46-76d58e261f4c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.224260 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38977633-0094-470c-bd46-76d58e261f4c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "38977633-0094-470c-bd46-76d58e261f4c" (UID: "38977633-0094-470c-bd46-76d58e261f4c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.227709 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "38977633-0094-470c-bd46-76d58e261f4c" (UID: "38977633-0094-470c-bd46-76d58e261f4c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.243794 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38977633-0094-470c-bd46-76d58e261f4c" (UID: "38977633-0094-470c-bd46-76d58e261f4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.243904 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-config-data" (OuterVolumeSpecName: "config-data") pod "38977633-0094-470c-bd46-76d58e261f4c" (UID: "38977633-0094-470c-bd46-76d58e261f4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.245898 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38977633-0094-470c-bd46-76d58e261f4c-kube-api-access-5nft5" (OuterVolumeSpecName: "kube-api-access-5nft5") pod "38977633-0094-470c-bd46-76d58e261f4c" (UID: "38977633-0094-470c-bd46-76d58e261f4c"). InnerVolumeSpecName "kube-api-access-5nft5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.249751 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-scripts" (OuterVolumeSpecName: "scripts") pod "38977633-0094-470c-bd46-76d58e261f4c" (UID: "38977633-0094-470c-bd46-76d58e261f4c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.323971 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nft5\" (UniqueName: \"kubernetes.io/projected/38977633-0094-470c-bd46-76d58e261f4c-kube-api-access-5nft5\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.324028 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.324041 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.324051 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.324065 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38977633-0094-470c-bd46-76d58e261f4c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.324074 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38977633-0094-470c-bd46-76d58e261f4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:06 crc kubenswrapper[4794]: I0310 10:07:06.324087 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38977633-0094-470c-bd46-76d58e261f4c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.067829 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.126517 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.148758 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.170406 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.172778 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.175058 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.175451 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.181275 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.238970 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-scripts\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.239052 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-config-data\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.239108 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.239280 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.239376 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59e99408-027e-4f50-bfa8-02482e877cc8-run-httpd\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.239442 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59e99408-027e-4f50-bfa8-02482e877cc8-log-httpd\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.239478 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plz4z\" (UniqueName: \"kubernetes.io/projected/59e99408-027e-4f50-bfa8-02482e877cc8-kube-api-access-plz4z\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.342005 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-config-data\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.342070 
4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.342143 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.342171 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59e99408-027e-4f50-bfa8-02482e877cc8-run-httpd\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.342203 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59e99408-027e-4f50-bfa8-02482e877cc8-log-httpd\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.342238 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plz4z\" (UniqueName: \"kubernetes.io/projected/59e99408-027e-4f50-bfa8-02482e877cc8-kube-api-access-plz4z\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.342273 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-scripts\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.343933 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59e99408-027e-4f50-bfa8-02482e877cc8-run-httpd\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.344558 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59e99408-027e-4f50-bfa8-02482e877cc8-log-httpd\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.348822 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-scripts\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.349894 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.350199 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-config-data\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.350495 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.371424 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plz4z\" (UniqueName: \"kubernetes.io/projected/59e99408-027e-4f50-bfa8-02482e877cc8-kube-api-access-plz4z\") pod \"ceilometer-0\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.502046 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:07 crc kubenswrapper[4794]: I0310 10:07:07.882213 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 10 10:07:08 crc kubenswrapper[4794]: I0310 10:07:08.008669 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38977633-0094-470c-bd46-76d58e261f4c" path="/var/lib/kubelet/pods/38977633-0094-470c-bd46-76d58e261f4c/volumes" Mar 10 10:07:08 crc kubenswrapper[4794]: I0310 10:07:08.880447 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:10 crc kubenswrapper[4794]: I0310 10:07:10.456220 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 10:07:10 crc kubenswrapper[4794]: I0310 10:07:10.456472 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="34993523-76a5-426f-a8bb-14466731fd21" containerName="glance-log" containerID="cri-o://dfe796b758d9be52c77e22c0da16ab79aedbbd54085aff9d8b415d2185a274e8" gracePeriod=30 Mar 10 10:07:10 crc kubenswrapper[4794]: I0310 10:07:10.456562 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="34993523-76a5-426f-a8bb-14466731fd21" containerName="glance-httpd" containerID="cri-o://6f74627ee5225f35dce01ac902c6a0fb721d52abe0c763bef1037f66809b4d95" gracePeriod=30 Mar 10 10:07:11 crc kubenswrapper[4794]: I0310 10:07:11.111562 4794 generic.go:334] "Generic (PLEG): container finished" podID="34993523-76a5-426f-a8bb-14466731fd21" containerID="dfe796b758d9be52c77e22c0da16ab79aedbbd54085aff9d8b415d2185a274e8" exitCode=143 Mar 10 10:07:11 crc kubenswrapper[4794]: I0310 10:07:11.111637 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34993523-76a5-426f-a8bb-14466731fd21","Type":"ContainerDied","Data":"dfe796b758d9be52c77e22c0da16ab79aedbbd54085aff9d8b415d2185a274e8"} Mar 10 10:07:11 crc kubenswrapper[4794]: I0310 10:07:11.203861 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 10:07:11 crc kubenswrapper[4794]: I0310 10:07:11.204131 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="325754ce-6381-4bb4-9102-04933c1a928b" containerName="glance-log" 
containerID="cri-o://0383025111f0e6982d3ca9019cd29a06bc4484fc4b9cbeafc472f00480493c35" gracePeriod=30 Mar 10 10:07:11 crc kubenswrapper[4794]: I0310 10:07:11.204699 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="325754ce-6381-4bb4-9102-04933c1a928b" containerName="glance-httpd" containerID="cri-o://b35130c90243c66a164c26598446bab5071d3dcbebe9e12292cadd9f5f9536f4" gracePeriod=30 Mar 10 10:07:12 crc kubenswrapper[4794]: I0310 10:07:12.120982 4794 generic.go:334] "Generic (PLEG): container finished" podID="325754ce-6381-4bb4-9102-04933c1a928b" containerID="0383025111f0e6982d3ca9019cd29a06bc4484fc4b9cbeafc472f00480493c35" exitCode=143 Mar 10 10:07:12 crc kubenswrapper[4794]: I0310 10:07:12.121284 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"325754ce-6381-4bb4-9102-04933c1a928b","Type":"ContainerDied","Data":"0383025111f0e6982d3ca9019cd29a06bc4484fc4b9cbeafc472f00480493c35"} Mar 10 10:07:12 crc kubenswrapper[4794]: I0310 10:07:12.887167 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.075824 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7575fbf969-gq2mq"] Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.142749 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59e99408-027e-4f50-bfa8-02482e877cc8","Type":"ContainerStarted","Data":"8fba9ab5f0e0c8372949a44bdd45c19d96e519f7322d89e6fd6285e3c6351d90"} Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.143819 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7575fbf969-gq2mq" event={"ID":"ed705a10-5bb5-4170-8536-57c6be1cb816","Type":"ContainerStarted","Data":"36653d4cabcbc5261ec1c6493c2eb93d199872239ce11c67f90424bfd49113b9"} Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.145829 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d235a0a5-57c9-4938-b742-5788ade30a12","Type":"ContainerStarted","Data":"657c6e566fe3c3ec4a4a199f42966fd39347a8a6f061fe653f5274f12bb76445"} Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.165679 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.420128907 podStartE2EDuration="15.165660956s" podCreationTimestamp="2026-03-10 10:06:58 +0000 UTC" firstStartedPulling="2026-03-10 10:06:59.66066406 +0000 UTC m=+1368.416834878" lastFinishedPulling="2026-03-10 10:07:12.406196109 +0000 UTC m=+1381.162366927" observedRunningTime="2026-03-10 10:07:13.163389979 +0000 UTC m=+1381.919560797" watchObservedRunningTime="2026-03-10 10:07:13.165660956 +0000 UTC m=+1381.921831774" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.545102 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hsjqq"] Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.546348 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hsjqq" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.555620 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hsjqq"] Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.657005 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-pkvrz"] Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.658519 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pkvrz" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.664678 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7addf26-5bbf-4b13-aa27-070bab62a929-operator-scripts\") pod \"nova-api-db-create-hsjqq\" (UID: \"d7addf26-5bbf-4b13-aa27-070bab62a929\") " pod="openstack/nova-api-db-create-hsjqq" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.664861 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqd6z\" (UniqueName: \"kubernetes.io/projected/d7addf26-5bbf-4b13-aa27-070bab62a929-kube-api-access-tqd6z\") pod \"nova-api-db-create-hsjqq\" (UID: \"d7addf26-5bbf-4b13-aa27-070bab62a929\") " pod="openstack/nova-api-db-create-hsjqq" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.693662 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pkvrz"] Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.787302 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-e13e-account-create-update-nqb9b"] Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.797104 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e13e-account-create-update-nqb9b" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.797117 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7addf26-5bbf-4b13-aa27-070bab62a929-operator-scripts\") pod \"nova-api-db-create-hsjqq\" (UID: \"d7addf26-5bbf-4b13-aa27-070bab62a929\") " pod="openstack/nova-api-db-create-hsjqq" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.797789 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6pkl\" (UniqueName: \"kubernetes.io/projected/598f9a9c-0b67-44b8-83a9-428f55be33a9-kube-api-access-v6pkl\") pod \"nova-cell0-db-create-pkvrz\" (UID: \"598f9a9c-0b67-44b8-83a9-428f55be33a9\") " pod="openstack/nova-cell0-db-create-pkvrz" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.797972 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqd6z\" (UniqueName: \"kubernetes.io/projected/d7addf26-5bbf-4b13-aa27-070bab62a929-kube-api-access-tqd6z\") pod \"nova-api-db-create-hsjqq\" (UID: \"d7addf26-5bbf-4b13-aa27-070bab62a929\") " pod="openstack/nova-api-db-create-hsjqq" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.798153 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598f9a9c-0b67-44b8-83a9-428f55be33a9-operator-scripts\") pod \"nova-cell0-db-create-pkvrz\" (UID: \"598f9a9c-0b67-44b8-83a9-428f55be33a9\") " pod="openstack/nova-cell0-db-create-pkvrz" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.798029 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7addf26-5bbf-4b13-aa27-070bab62a929-operator-scripts\") pod \"nova-api-db-create-hsjqq\" (UID: \"d7addf26-5bbf-4b13-aa27-070bab62a929\") " pod="openstack/nova-api-db-create-hsjqq" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.801700 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.810404 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e13e-account-create-update-nqb9b"] Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.823016 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-ngww6"] Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.824034 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqd6z\" (UniqueName: \"kubernetes.io/projected/d7addf26-5bbf-4b13-aa27-070bab62a929-kube-api-access-tqd6z\") pod \"nova-api-db-create-hsjqq\" (UID: \"d7addf26-5bbf-4b13-aa27-070bab62a929\") " pod="openstack/nova-api-db-create-hsjqq" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.825195 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ngww6" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.848437 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ngww6"] Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.861804 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hsjqq" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.877630 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-a513-account-create-update-l8ctf"] Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.888050 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a513-account-create-update-l8ctf" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.892280 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.903082 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6pkl\" (UniqueName: \"kubernetes.io/projected/598f9a9c-0b67-44b8-83a9-428f55be33a9-kube-api-access-v6pkl\") pod \"nova-cell0-db-create-pkvrz\" (UID: \"598f9a9c-0b67-44b8-83a9-428f55be33a9\") " pod="openstack/nova-cell0-db-create-pkvrz" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.903529 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598f9a9c-0b67-44b8-83a9-428f55be33a9-operator-scripts\") pod \"nova-cell0-db-create-pkvrz\" (UID: \"598f9a9c-0b67-44b8-83a9-428f55be33a9\") " pod="openstack/nova-cell0-db-create-pkvrz" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.903621 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7bcz\" (UniqueName: \"kubernetes.io/projected/75e280fc-27ef-4cd8-b46d-3913a229ba81-kube-api-access-f7bcz\") pod \"nova-cell0-a513-account-create-update-l8ctf\" (UID: \"75e280fc-27ef-4cd8-b46d-3913a229ba81\") " pod="openstack/nova-cell0-a513-account-create-update-l8ctf" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.903650 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/269fc3af-0d8a-4d19-8753-8c4d07670864-operator-scripts\") pod \"nova-api-e13e-account-create-update-nqb9b\" (UID: \"269fc3af-0d8a-4d19-8753-8c4d07670864\") " pod="openstack/nova-api-e13e-account-create-update-nqb9b" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.903762 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffpdg\" (UniqueName: \"kubernetes.io/projected/269fc3af-0d8a-4d19-8753-8c4d07670864-kube-api-access-ffpdg\") pod \"nova-api-e13e-account-create-update-nqb9b\" (UID: \"269fc3af-0d8a-4d19-8753-8c4d07670864\") " pod="openstack/nova-api-e13e-account-create-update-nqb9b" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.903840 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a989a70f-ba12-4aa6-b96d-397cde6a5d48-operator-scripts\") pod \"nova-cell1-db-create-ngww6\" (UID: \"a989a70f-ba12-4aa6-b96d-397cde6a5d48\") " pod="openstack/nova-cell1-db-create-ngww6" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.903898 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75e280fc-27ef-4cd8-b46d-3913a229ba81-operator-scripts\") pod \"nova-cell0-a513-account-create-update-l8ctf\" (UID: \"75e280fc-27ef-4cd8-b46d-3913a229ba81\") " 
pod="openstack/nova-cell0-a513-account-create-update-l8ctf" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.903962 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8vkw\" (UniqueName: \"kubernetes.io/projected/a989a70f-ba12-4aa6-b96d-397cde6a5d48-kube-api-access-l8vkw\") pod \"nova-cell1-db-create-ngww6\" (UID: \"a989a70f-ba12-4aa6-b96d-397cde6a5d48\") " pod="openstack/nova-cell1-db-create-ngww6" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.904902 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598f9a9c-0b67-44b8-83a9-428f55be33a9-operator-scripts\") pod \"nova-cell0-db-create-pkvrz\" (UID: \"598f9a9c-0b67-44b8-83a9-428f55be33a9\") " pod="openstack/nova-cell0-db-create-pkvrz" Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.912488 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a513-account-create-update-l8ctf"] Mar 10 10:07:13 crc kubenswrapper[4794]: I0310 10:07:13.926863 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6pkl\" (UniqueName: \"kubernetes.io/projected/598f9a9c-0b67-44b8-83a9-428f55be33a9-kube-api-access-v6pkl\") pod \"nova-cell0-db-create-pkvrz\" (UID: \"598f9a9c-0b67-44b8-83a9-428f55be33a9\") " pod="openstack/nova-cell0-db-create-pkvrz" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.005070 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8vkw\" (UniqueName: \"kubernetes.io/projected/a989a70f-ba12-4aa6-b96d-397cde6a5d48-kube-api-access-l8vkw\") pod \"nova-cell1-db-create-ngww6\" (UID: \"a989a70f-ba12-4aa6-b96d-397cde6a5d48\") " pod="openstack/nova-cell1-db-create-ngww6" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.005176 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7bcz\" (UniqueName: \"kubernetes.io/projected/75e280fc-27ef-4cd8-b46d-3913a229ba81-kube-api-access-f7bcz\") pod \"nova-cell0-a513-account-create-update-l8ctf\" (UID: \"75e280fc-27ef-4cd8-b46d-3913a229ba81\") " pod="openstack/nova-cell0-a513-account-create-update-l8ctf" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.005198 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/269fc3af-0d8a-4d19-8753-8c4d07670864-operator-scripts\") pod \"nova-api-e13e-account-create-update-nqb9b\" (UID: \"269fc3af-0d8a-4d19-8753-8c4d07670864\") " pod="openstack/nova-api-e13e-account-create-update-nqb9b" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.005231 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffpdg\" (UniqueName: \"kubernetes.io/projected/269fc3af-0d8a-4d19-8753-8c4d07670864-kube-api-access-ffpdg\") pod \"nova-api-e13e-account-create-update-nqb9b\" (UID: \"269fc3af-0d8a-4d19-8753-8c4d07670864\") " pod="openstack/nova-api-e13e-account-create-update-nqb9b" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.005264 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a989a70f-ba12-4aa6-b96d-397cde6a5d48-operator-scripts\") pod \"nova-cell1-db-create-ngww6\" (UID: \"a989a70f-ba12-4aa6-b96d-397cde6a5d48\") " pod="openstack/nova-cell1-db-create-ngww6" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 
10:07:14.005287 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75e280fc-27ef-4cd8-b46d-3913a229ba81-operator-scripts\") pod \"nova-cell0-a513-account-create-update-l8ctf\" (UID: \"75e280fc-27ef-4cd8-b46d-3913a229ba81\") " pod="openstack/nova-cell0-a513-account-create-update-l8ctf" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.006052 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75e280fc-27ef-4cd8-b46d-3913a229ba81-operator-scripts\") pod \"nova-cell0-a513-account-create-update-l8ctf\" (UID: \"75e280fc-27ef-4cd8-b46d-3913a229ba81\") " pod="openstack/nova-cell0-a513-account-create-update-l8ctf" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.006885 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/269fc3af-0d8a-4d19-8753-8c4d07670864-operator-scripts\") pod \"nova-api-e13e-account-create-update-nqb9b\" (UID: \"269fc3af-0d8a-4d19-8753-8c4d07670864\") " pod="openstack/nova-api-e13e-account-create-update-nqb9b" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.007506 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a989a70f-ba12-4aa6-b96d-397cde6a5d48-operator-scripts\") pod \"nova-cell1-db-create-ngww6\" (UID: \"a989a70f-ba12-4aa6-b96d-397cde6a5d48\") " pod="openstack/nova-cell1-db-create-ngww6" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.018847 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pkvrz" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.028932 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8vkw\" (UniqueName: \"kubernetes.io/projected/a989a70f-ba12-4aa6-b96d-397cde6a5d48-kube-api-access-l8vkw\") pod \"nova-cell1-db-create-ngww6\" (UID: \"a989a70f-ba12-4aa6-b96d-397cde6a5d48\") " pod="openstack/nova-cell1-db-create-ngww6" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.034377 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7bcz\" (UniqueName: \"kubernetes.io/projected/75e280fc-27ef-4cd8-b46d-3913a229ba81-kube-api-access-f7bcz\") pod \"nova-cell0-a513-account-create-update-l8ctf\" (UID: \"75e280fc-27ef-4cd8-b46d-3913a229ba81\") " pod="openstack/nova-cell0-a513-account-create-update-l8ctf" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.038897 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffpdg\" (UniqueName: \"kubernetes.io/projected/269fc3af-0d8a-4d19-8753-8c4d07670864-kube-api-access-ffpdg\") pod \"nova-api-e13e-account-create-update-nqb9b\" (UID: \"269fc3af-0d8a-4d19-8753-8c4d07670864\") " pod="openstack/nova-api-e13e-account-create-update-nqb9b" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.062377 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-61a4-account-create-update-t8jxm"] Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.063499 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-61a4-account-create-update-t8jxm" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.066619 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.075668 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-61a4-account-create-update-t8jxm"] Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.109207 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc31804f-5a50-4c0a-80e8-42d0752ee5b5-operator-scripts\") pod \"nova-cell1-61a4-account-create-update-t8jxm\" (UID: \"bc31804f-5a50-4c0a-80e8-42d0752ee5b5\") " pod="openstack/nova-cell1-61a4-account-create-update-t8jxm" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.109307 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6wnc\" (UniqueName: \"kubernetes.io/projected/bc31804f-5a50-4c0a-80e8-42d0752ee5b5-kube-api-access-q6wnc\") pod \"nova-cell1-61a4-account-create-update-t8jxm\" (UID: \"bc31804f-5a50-4c0a-80e8-42d0752ee5b5\") " pod="openstack/nova-cell1-61a4-account-create-update-t8jxm" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.131494 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e13e-account-create-update-nqb9b" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.179963 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ngww6" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.191412 4794 generic.go:334] "Generic (PLEG): container finished" podID="34993523-76a5-426f-a8bb-14466731fd21" containerID="6f74627ee5225f35dce01ac902c6a0fb721d52abe0c763bef1037f66809b4d95" exitCode=0 Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.191476 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34993523-76a5-426f-a8bb-14466731fd21","Type":"ContainerDied","Data":"6f74627ee5225f35dce01ac902c6a0fb721d52abe0c763bef1037f66809b4d95"} Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.210618 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc31804f-5a50-4c0a-80e8-42d0752ee5b5-operator-scripts\") pod \"nova-cell1-61a4-account-create-update-t8jxm\" (UID: \"bc31804f-5a50-4c0a-80e8-42d0752ee5b5\") " pod="openstack/nova-cell1-61a4-account-create-update-t8jxm" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.210666 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6wnc\" (UniqueName: \"kubernetes.io/projected/bc31804f-5a50-4c0a-80e8-42d0752ee5b5-kube-api-access-q6wnc\") pod \"nova-cell1-61a4-account-create-update-t8jxm\" (UID: \"bc31804f-5a50-4c0a-80e8-42d0752ee5b5\") " pod="openstack/nova-cell1-61a4-account-create-update-t8jxm" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.211579 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc31804f-5a50-4c0a-80e8-42d0752ee5b5-operator-scripts\") pod \"nova-cell1-61a4-account-create-update-t8jxm\" (UID: \"bc31804f-5a50-4c0a-80e8-42d0752ee5b5\") " pod="openstack/nova-cell1-61a4-account-create-update-t8jxm" Mar 10 
10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.211755 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59e99408-027e-4f50-bfa8-02482e877cc8","Type":"ContainerStarted","Data":"32c70546d85a39bc9911849dd97d4c3679b338fa26fee4f77da0b94068604ff1"} Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.224758 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a513-account-create-update-l8ctf" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.240931 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6wnc\" (UniqueName: \"kubernetes.io/projected/bc31804f-5a50-4c0a-80e8-42d0752ee5b5-kube-api-access-q6wnc\") pod \"nova-cell1-61a4-account-create-update-t8jxm\" (UID: \"bc31804f-5a50-4c0a-80e8-42d0752ee5b5\") " pod="openstack/nova-cell1-61a4-account-create-update-t8jxm" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.244678 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7575fbf969-gq2mq" event={"ID":"ed705a10-5bb5-4170-8536-57c6be1cb816","Type":"ContainerStarted","Data":"a5ce358b520a173198341e8d239869c88b76f27ef305f00661c9c0926b396a4c"} Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.244719 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7575fbf969-gq2mq" event={"ID":"ed705a10-5bb5-4170-8536-57c6be1cb816","Type":"ContainerStarted","Data":"f4dda11441d70fc4769e94a81ee22fc71ec078c6070b2dd0749628f78f387314"} Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.244756 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.244779 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.275833 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7575fbf969-gq2mq" podStartSLOduration=9.27581456 podStartE2EDuration="9.27581456s" podCreationTimestamp="2026-03-10 10:07:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:07:14.269490653 +0000 UTC m=+1383.025661461" watchObservedRunningTime="2026-03-10 10:07:14.27581456 +0000 UTC m=+1383.031985378" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.291687 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.316110 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"34993523-76a5-426f-a8bb-14466731fd21\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.316222 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-config-data\") pod \"34993523-76a5-426f-a8bb-14466731fd21\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.316260 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-combined-ca-bundle\") pod \"34993523-76a5-426f-a8bb-14466731fd21\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.316287 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-scripts\") pod \"34993523-76a5-426f-a8bb-14466731fd21\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.316323 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34993523-76a5-426f-a8bb-14466731fd21-httpd-run\") pod \"34993523-76a5-426f-a8bb-14466731fd21\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.316381 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34993523-76a5-426f-a8bb-14466731fd21-logs\") pod \"34993523-76a5-426f-a8bb-14466731fd21\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.316427 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-internal-tls-certs\") pod \"34993523-76a5-426f-a8bb-14466731fd21\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.316457 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsbtw\" (UniqueName: \"kubernetes.io/projected/34993523-76a5-426f-a8bb-14466731fd21-kube-api-access-bsbtw\") pod \"34993523-76a5-426f-a8bb-14466731fd21\" (UID: \"34993523-76a5-426f-a8bb-14466731fd21\") " Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.318464 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34993523-76a5-426f-a8bb-14466731fd21-logs" (OuterVolumeSpecName: "logs") pod "34993523-76a5-426f-a8bb-14466731fd21" (UID: "34993523-76a5-426f-a8bb-14466731fd21"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.319179 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34993523-76a5-426f-a8bb-14466731fd21-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "34993523-76a5-426f-a8bb-14466731fd21" (UID: "34993523-76a5-426f-a8bb-14466731fd21"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.344107 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "34993523-76a5-426f-a8bb-14466731fd21" (UID: "34993523-76a5-426f-a8bb-14466731fd21"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.377636 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-scripts" (OuterVolumeSpecName: "scripts") pod "34993523-76a5-426f-a8bb-14466731fd21" (UID: "34993523-76a5-426f-a8bb-14466731fd21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.377759 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34993523-76a5-426f-a8bb-14466731fd21-kube-api-access-bsbtw" (OuterVolumeSpecName: "kube-api-access-bsbtw") pod "34993523-76a5-426f-a8bb-14466731fd21" (UID: "34993523-76a5-426f-a8bb-14466731fd21"). InnerVolumeSpecName "kube-api-access-bsbtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.407054 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-61a4-account-create-update-t8jxm" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.420023 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.420051 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.420061 4794 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34993523-76a5-426f-a8bb-14466731fd21-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.420069 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34993523-76a5-426f-a8bb-14466731fd21-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.420077 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsbtw\" (UniqueName: \"kubernetes.io/projected/34993523-76a5-426f-a8bb-14466731fd21-kube-api-access-bsbtw\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.431914 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34993523-76a5-426f-a8bb-14466731fd21" (UID: "34993523-76a5-426f-a8bb-14466731fd21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.459472 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-config-data" (OuterVolumeSpecName: "config-data") pod "34993523-76a5-426f-a8bb-14466731fd21" (UID: "34993523-76a5-426f-a8bb-14466731fd21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.459749 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "34993523-76a5-426f-a8bb-14466731fd21" (UID: "34993523-76a5-426f-a8bb-14466731fd21"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.485764 4794 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.498048 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hsjqq"] Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.527693 4794 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.527725 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.527843 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.527875 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34993523-76a5-426f-a8bb-14466731fd21-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:14 crc kubenswrapper[4794]: W0310 10:07:14.871770 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod598f9a9c_0b67_44b8_83a9_428f55be33a9.slice/crio-bf266c1f8f114d437dd4b643c72b26ab5ca8ea9777958c403c15359aeca6725b WatchSource:0}: Error finding container bf266c1f8f114d437dd4b643c72b26ab5ca8ea9777958c403c15359aeca6725b: Status 404 returned error can't find the container with id bf266c1f8f114d437dd4b643c72b26ab5ca8ea9777958c403c15359aeca6725b Mar 10 10:07:14 crc kubenswrapper[4794]: I0310 10:07:14.877075 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pkvrz"] Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.288656 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ngww6"] Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.292383 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pkvrz" event={"ID":"598f9a9c-0b67-44b8-83a9-428f55be33a9","Type":"ContainerStarted","Data":"533e00e14c8524408bf878f2e40fcf2843d0d66d7ba048ed96beac141f3440ab"} Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.292418 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pkvrz" event={"ID":"598f9a9c-0b67-44b8-83a9-428f55be33a9","Type":"ContainerStarted","Data":"bf266c1f8f114d437dd4b643c72b26ab5ca8ea9777958c403c15359aeca6725b"} Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.292994 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.296010 4794 generic.go:334] "Generic (PLEG): container finished" podID="325754ce-6381-4bb4-9102-04933c1a928b" containerID="b35130c90243c66a164c26598446bab5071d3dcbebe9e12292cadd9f5f9536f4" exitCode=0 Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.296063 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"325754ce-6381-4bb4-9102-04933c1a928b","Type":"ContainerDied","Data":"b35130c90243c66a164c26598446bab5071d3dcbebe9e12292cadd9f5f9536f4"} Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.296092 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"325754ce-6381-4bb4-9102-04933c1a928b","Type":"ContainerDied","Data":"e5f1ca83860db413e4dbafdb64456ad83868bbe4106fa6d57fe111a3132e1ed1"} Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.296113 4794 scope.go:117] "RemoveContainer" containerID="b35130c90243c66a164c26598446bab5071d3dcbebe9e12292cadd9f5f9536f4" Mar 10 10:07:15 crc kubenswrapper[4794]: W0310 10:07:15.296990 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda989a70f_ba12_4aa6_b96d_397cde6a5d48.slice/crio-83f75f9dbd92c1f81599dac8a1df99df0e5884743438edc4aec56c4e5f623db4 WatchSource:0}: Error finding container 83f75f9dbd92c1f81599dac8a1df99df0e5884743438edc4aec56c4e5f623db4: Status 404 returned error can't find the container with id 83f75f9dbd92c1f81599dac8a1df99df0e5884743438edc4aec56c4e5f623db4 Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.323387 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a513-account-create-update-l8ctf"] Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.372778 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-scripts\") pod \"325754ce-6381-4bb4-9102-04933c1a928b\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.372887 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-combined-ca-bundle\") pod \"325754ce-6381-4bb4-9102-04933c1a928b\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.372929 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/325754ce-6381-4bb4-9102-04933c1a928b-httpd-run\") pod \"325754ce-6381-4bb4-9102-04933c1a928b\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.372986 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"325754ce-6381-4bb4-9102-04933c1a928b\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.373017 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-config-data\") pod \"325754ce-6381-4bb4-9102-04933c1a928b\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") " 
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.373131 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-public-tls-certs\") pod \"325754ce-6381-4bb4-9102-04933c1a928b\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") "
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.373157 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325754ce-6381-4bb4-9102-04933c1a928b-logs\") pod \"325754ce-6381-4bb4-9102-04933c1a928b\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") "
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.373183 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq7dd\" (UniqueName: \"kubernetes.io/projected/325754ce-6381-4bb4-9102-04933c1a928b-kube-api-access-qq7dd\") pod \"325754ce-6381-4bb4-9102-04933c1a928b\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") "
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.382817 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34993523-76a5-426f-a8bb-14466731fd21","Type":"ContainerDied","Data":"81de560fbdf656511469793bbc0079e4b031e9a1eaf3c0073a8d3b425a268416"}
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.382929 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.385294 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "325754ce-6381-4bb4-9102-04933c1a928b" (UID: "325754ce-6381-4bb4-9102-04933c1a928b"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.385724 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-scripts" (OuterVolumeSpecName: "scripts") pod "325754ce-6381-4bb4-9102-04933c1a928b" (UID: "325754ce-6381-4bb4-9102-04933c1a928b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.386056 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/325754ce-6381-4bb4-9102-04933c1a928b-logs" (OuterVolumeSpecName: "logs") pod "325754ce-6381-4bb4-9102-04933c1a928b" (UID: "325754ce-6381-4bb4-9102-04933c1a928b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.386761 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/325754ce-6381-4bb4-9102-04933c1a928b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "325754ce-6381-4bb4-9102-04933c1a928b" (UID: "325754ce-6381-4bb4-9102-04933c1a928b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.405131 4794 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/325754ce-6381-4bb4-9102-04933c1a928b-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.405169 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.405179 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325754ce-6381-4bb4-9102-04933c1a928b-logs\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.405189 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.440003 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/325754ce-6381-4bb4-9102-04933c1a928b-kube-api-access-qq7dd" (OuterVolumeSpecName: "kube-api-access-qq7dd") pod "325754ce-6381-4bb4-9102-04933c1a928b" (UID: "325754ce-6381-4bb4-9102-04933c1a928b"). InnerVolumeSpecName "kube-api-access-qq7dd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.469398 4794 generic.go:334] "Generic (PLEG): container finished" podID="d7addf26-5bbf-4b13-aa27-070bab62a929" containerID="c71befb5940658f0f62fdcfde2739c45a38416fb2d1194d3aeeb9466f078e8ad" exitCode=0
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.469499 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hsjqq" event={"ID":"d7addf26-5bbf-4b13-aa27-070bab62a929","Type":"ContainerDied","Data":"c71befb5940658f0f62fdcfde2739c45a38416fb2d1194d3aeeb9466f078e8ad"}
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.469539 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hsjqq" event={"ID":"d7addf26-5bbf-4b13-aa27-070bab62a929","Type":"ContainerStarted","Data":"9c3300f22b86e3369084c4649108e950a8142c34a1a47ad01a8e3ab49cf4ee17"}
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.496510 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e13e-account-create-update-nqb9b"]
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.568531 4794 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.574265 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59e99408-027e-4f50-bfa8-02482e877cc8","Type":"ContainerStarted","Data":"a7002fe4bfa5464e36af4a82b820f45e5a86f1792f9647edde24d5323671acda"}
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.592607 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-config-data" (OuterVolumeSpecName: "config-data") pod "325754ce-6381-4bb4-9102-04933c1a928b" (UID: "325754ce-6381-4bb4-9102-04933c1a928b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.593210 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-61a4-account-create-update-t8jxm"]
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.620407 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-config-data\") pod \"325754ce-6381-4bb4-9102-04933c1a928b\" (UID: \"325754ce-6381-4bb4-9102-04933c1a928b\") "
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.621794 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "325754ce-6381-4bb4-9102-04933c1a928b" (UID: "325754ce-6381-4bb4-9102-04933c1a928b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:07:15 crc kubenswrapper[4794]: W0310 10:07:15.621907 4794 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/325754ce-6381-4bb4-9102-04933c1a928b/volumes/kubernetes.io~secret/config-data
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.621988 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-config-data" (OuterVolumeSpecName: "config-data") pod "325754ce-6381-4bb4-9102-04933c1a928b" (UID: "325754ce-6381-4bb4-9102-04933c1a928b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.645919 4794 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.645956 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.645967 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.645978 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq7dd\" (UniqueName: \"kubernetes.io/projected/325754ce-6381-4bb4-9102-04933c1a928b-kube-api-access-qq7dd\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.656738 4794 scope.go:117] "RemoveContainer" containerID="0383025111f0e6982d3ca9019cd29a06bc4484fc4b9cbeafc472f00480493c35"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.658451 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-pkvrz" podStartSLOduration=2.658436487 podStartE2EDuration="2.658436487s" podCreationTimestamp="2026-03-10 10:07:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:07:15.353956658 +0000 UTC m=+1384.110127486" watchObservedRunningTime="2026-03-10 10:07:15.658436487 +0000 UTC m=+1384.414607315"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.672780 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "325754ce-6381-4bb4-9102-04933c1a928b" (UID: "325754ce-6381-4bb4-9102-04933c1a928b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.734465 4794 scope.go:117] "RemoveContainer" containerID="b35130c90243c66a164c26598446bab5071d3dcbebe9e12292cadd9f5f9536f4"
Mar 10 10:07:15 crc kubenswrapper[4794]: E0310 10:07:15.736473 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b35130c90243c66a164c26598446bab5071d3dcbebe9e12292cadd9f5f9536f4\": container with ID starting with b35130c90243c66a164c26598446bab5071d3dcbebe9e12292cadd9f5f9536f4 not found: ID does not exist" containerID="b35130c90243c66a164c26598446bab5071d3dcbebe9e12292cadd9f5f9536f4"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.736523 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b35130c90243c66a164c26598446bab5071d3dcbebe9e12292cadd9f5f9536f4"} err="failed to get container status \"b35130c90243c66a164c26598446bab5071d3dcbebe9e12292cadd9f5f9536f4\": rpc error: code = NotFound desc = could not find container \"b35130c90243c66a164c26598446bab5071d3dcbebe9e12292cadd9f5f9536f4\": container with ID starting with b35130c90243c66a164c26598446bab5071d3dcbebe9e12292cadd9f5f9536f4 not found: ID does not exist"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.736547 4794 scope.go:117] "RemoveContainer" containerID="0383025111f0e6982d3ca9019cd29a06bc4484fc4b9cbeafc472f00480493c35"
Mar 10 10:07:15 crc kubenswrapper[4794]: E0310 10:07:15.744505 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0383025111f0e6982d3ca9019cd29a06bc4484fc4b9cbeafc472f00480493c35\": container with ID starting with 0383025111f0e6982d3ca9019cd29a06bc4484fc4b9cbeafc472f00480493c35 not found: ID does not exist" containerID="0383025111f0e6982d3ca9019cd29a06bc4484fc4b9cbeafc472f00480493c35"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.744559 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0383025111f0e6982d3ca9019cd29a06bc4484fc4b9cbeafc472f00480493c35"} err="failed to get container status \"0383025111f0e6982d3ca9019cd29a06bc4484fc4b9cbeafc472f00480493c35\": rpc error: code = NotFound desc = could not find container \"0383025111f0e6982d3ca9019cd29a06bc4484fc4b9cbeafc472f00480493c35\": container with ID starting with 0383025111f0e6982d3ca9019cd29a06bc4484fc4b9cbeafc472f00480493c35 not found: ID does not exist"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.744583 4794 scope.go:117] "RemoveContainer" containerID="6f74627ee5225f35dce01ac902c6a0fb721d52abe0c763bef1037f66809b4d95"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.749652 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325754ce-6381-4bb4-9102-04933c1a928b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.772725 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.813694 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.848723 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 10:07:15 crc kubenswrapper[4794]: E0310 10:07:15.851012 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="325754ce-6381-4bb4-9102-04933c1a928b" containerName="glance-httpd"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.851194 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="325754ce-6381-4bb4-9102-04933c1a928b" containerName="glance-httpd"
Mar 10 10:07:15 crc kubenswrapper[4794]: E0310 10:07:15.851310 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="325754ce-6381-4bb4-9102-04933c1a928b" containerName="glance-log"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.851452 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="325754ce-6381-4bb4-9102-04933c1a928b" containerName="glance-log"
Mar 10 10:07:15 crc kubenswrapper[4794]: E0310 10:07:15.851539 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34993523-76a5-426f-a8bb-14466731fd21" containerName="glance-httpd"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.851609 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="34993523-76a5-426f-a8bb-14466731fd21" containerName="glance-httpd"
Mar 10 10:07:15 crc kubenswrapper[4794]: E0310 10:07:15.851818 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34993523-76a5-426f-a8bb-14466731fd21" containerName="glance-log"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.851895 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="34993523-76a5-426f-a8bb-14466731fd21" containerName="glance-log"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.852211 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="34993523-76a5-426f-a8bb-14466731fd21" containerName="glance-log"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.852294 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="34993523-76a5-426f-a8bb-14466731fd21" containerName="glance-httpd"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.852392 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="325754ce-6381-4bb4-9102-04933c1a928b" containerName="glance-httpd"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.852475 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="325754ce-6381-4bb4-9102-04933c1a928b" containerName="glance-log"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.853781 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.861258 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.866144 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.866381 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.866476 4794 scope.go:117] "RemoveContainer" containerID="dfe796b758d9be52c77e22c0da16ab79aedbbd54085aff9d8b415d2185a274e8"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.957602 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.957681 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.957726 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.957775 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-logs\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.957827 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwst6\" (UniqueName: \"kubernetes.io/projected/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-kube-api-access-lwst6\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.957852 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.957900 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:15 crc kubenswrapper[4794]: I0310 10:07:15.957949 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.023304 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34993523-76a5-426f-a8bb-14466731fd21" path="/var/lib/kubelet/pods/34993523-76a5-426f-a8bb-14466731fd21/volumes"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.059186 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.059240 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.059493 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-logs\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.059553 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwst6\" (UniqueName: \"kubernetes.io/projected/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-kube-api-access-lwst6\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.059578 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.059626 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.059664 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.059719 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.061673 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.061864 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-logs\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.062117 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.069005 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.072077 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.079415 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.090132 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwst6\" (UniqueName: \"kubernetes.io/projected/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-kube-api-access-lwst6\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.090629 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.109350 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.253270 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.582553 4794 generic.go:334] "Generic (PLEG): container finished" podID="a989a70f-ba12-4aa6-b96d-397cde6a5d48" containerID="e70254b5af8bdd5f0d931f675058f876b6f118dd848b3a87ff25ec203ef4bf4d" exitCode=0
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.582602 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ngww6" event={"ID":"a989a70f-ba12-4aa6-b96d-397cde6a5d48","Type":"ContainerDied","Data":"e70254b5af8bdd5f0d931f675058f876b6f118dd848b3a87ff25ec203ef4bf4d"}
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.582882 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ngww6" event={"ID":"a989a70f-ba12-4aa6-b96d-397cde6a5d48","Type":"ContainerStarted","Data":"83f75f9dbd92c1f81599dac8a1df99df0e5884743438edc4aec56c4e5f623db4"}
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.584979 4794 generic.go:334] "Generic (PLEG): container finished" podID="269fc3af-0d8a-4d19-8753-8c4d07670864" containerID="540c3c25fe0dd253f107d25fb50dacca81557b8d7b51277e3dafbce6effea553" exitCode=0
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.585071 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e13e-account-create-update-nqb9b" event={"ID":"269fc3af-0d8a-4d19-8753-8c4d07670864","Type":"ContainerDied","Data":"540c3c25fe0dd253f107d25fb50dacca81557b8d7b51277e3dafbce6effea553"}
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.585098 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e13e-account-create-update-nqb9b" event={"ID":"269fc3af-0d8a-4d19-8753-8c4d07670864","Type":"ContainerStarted","Data":"d6644e1b42c8be758c06deefc11fa8a53b2f402fdba8f807c6fa978114cf607c"}
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.587159 4794 generic.go:334] "Generic (PLEG): container finished" podID="75e280fc-27ef-4cd8-b46d-3913a229ba81" containerID="4fc4d6cd8ca31d8694a05ea7825a17b8db43c3dc50594cab99e3bba48437e073" exitCode=0
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.587203 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a513-account-create-update-l8ctf" event={"ID":"75e280fc-27ef-4cd8-b46d-3913a229ba81","Type":"ContainerDied","Data":"4fc4d6cd8ca31d8694a05ea7825a17b8db43c3dc50594cab99e3bba48437e073"}
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.587220 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a513-account-create-update-l8ctf" event={"ID":"75e280fc-27ef-4cd8-b46d-3913a229ba81","Type":"ContainerStarted","Data":"7b834712020ea0bc0cb8a74fdd3922ecbb58d7a51d7c9e877bf440fff1063a07"}
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.591374 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59e99408-027e-4f50-bfa8-02482e877cc8","Type":"ContainerStarted","Data":"8696fb652e731569d2b69215de0830c0019020b0ba18a1194e4eb4c859380849"}
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.595399 4794 generic.go:334] "Generic (PLEG): container finished" podID="598f9a9c-0b67-44b8-83a9-428f55be33a9" containerID="533e00e14c8524408bf878f2e40fcf2843d0d66d7ba048ed96beac141f3440ab" exitCode=0
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.595467 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pkvrz" event={"ID":"598f9a9c-0b67-44b8-83a9-428f55be33a9","Type":"ContainerDied","Data":"533e00e14c8524408bf878f2e40fcf2843d0d66d7ba048ed96beac141f3440ab"}
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.598269 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.599855 4794 generic.go:334] "Generic (PLEG): container finished" podID="bc31804f-5a50-4c0a-80e8-42d0752ee5b5" containerID="7049bf67f9f820cd9e74cbe5c408865922f9627182ed1b04dc96fc9839d624ee" exitCode=0
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.600037 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-61a4-account-create-update-t8jxm" event={"ID":"bc31804f-5a50-4c0a-80e8-42d0752ee5b5","Type":"ContainerDied","Data":"7049bf67f9f820cd9e74cbe5c408865922f9627182ed1b04dc96fc9839d624ee"}
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.600075 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-61a4-account-create-update-t8jxm" event={"ID":"bc31804f-5a50-4c0a-80e8-42d0752ee5b5","Type":"ContainerStarted","Data":"2643476cb7fd9c72b2616f6e22e68beeb127f5457cc90127d26282024da72aec"}
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.660388 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.673801 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.722966 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.725309 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.728251 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.728294 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.750808 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.838599 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.885043 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.885106 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53686d91-dc01-4a36-99c3-e6c84052e15e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.885671 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.885796 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53686d91-dc01-4a36-99c3-e6c84052e15e-logs\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.885833 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-config-data\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.885887 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-scripts\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.886024 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.886059 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4d5h\" (UniqueName: \"kubernetes.io/projected/53686d91-dc01-4a36-99c3-e6c84052e15e-kube-api-access-m4d5h\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.987302 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.987672 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53686d91-dc01-4a36-99c3-e6c84052e15e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.987702 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.987733 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53686d91-dc01-4a36-99c3-e6c84052e15e-logs\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.987756 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-config-data\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.987822 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-scripts\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.987877 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.987900 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4d5h\" (UniqueName: \"kubernetes.io/projected/53686d91-dc01-4a36-99c3-e6c84052e15e-kube-api-access-m4d5h\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.993622 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:16 crc kubenswrapper[4794]: I0310 10:07:16.995703 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0"
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.001703 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-config-data\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.002818 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4d5h\" (UniqueName: \"kubernetes.io/projected/53686d91-dc01-4a36-99c3-e6c84052e15e-kube-api-access-m4d5h\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.002908 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-scripts\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.007844 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53686d91-dc01-4a36-99c3-e6c84052e15e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.007903 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53686d91-dc01-4a36-99c3-e6c84052e15e-logs\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.011607 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.047604 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " pod="openstack/glance-default-external-api-0"
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.065238 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.100187 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hsjqq"
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.192167 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqd6z\" (UniqueName: \"kubernetes.io/projected/d7addf26-5bbf-4b13-aa27-070bab62a929-kube-api-access-tqd6z\") pod \"d7addf26-5bbf-4b13-aa27-070bab62a929\" (UID: \"d7addf26-5bbf-4b13-aa27-070bab62a929\") "
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.192371 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7addf26-5bbf-4b13-aa27-070bab62a929-operator-scripts\") pod \"d7addf26-5bbf-4b13-aa27-070bab62a929\" (UID: \"d7addf26-5bbf-4b13-aa27-070bab62a929\") "
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.193064 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7addf26-5bbf-4b13-aa27-070bab62a929-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7addf26-5bbf-4b13-aa27-070bab62a929" (UID: "d7addf26-5bbf-4b13-aa27-070bab62a929"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.196546 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7addf26-5bbf-4b13-aa27-070bab62a929-kube-api-access-tqd6z" (OuterVolumeSpecName: "kube-api-access-tqd6z") pod "d7addf26-5bbf-4b13-aa27-070bab62a929" (UID: "d7addf26-5bbf-4b13-aa27-070bab62a929"). InnerVolumeSpecName "kube-api-access-tqd6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.311553 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7addf26-5bbf-4b13-aa27-070bab62a929-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.311845 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqd6z\" (UniqueName: \"kubernetes.io/projected/d7addf26-5bbf-4b13-aa27-070bab62a929-kube-api-access-tqd6z\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.610744 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb384e40-1917-4b9c-bcfa-440a3a10fd1d","Type":"ContainerStarted","Data":"ef5f845a9297d9baef67ae43283960d69bd559077bfce60e39f562cbd5f935df"}
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.610788 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb384e40-1917-4b9c-bcfa-440a3a10fd1d","Type":"ContainerStarted","Data":"aae6a2afa3749d02039ce43aa61c5aacff2989b05d29eb1a47e756aa2162d339"}
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.614729 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hsjqq"
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.622506 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hsjqq" event={"ID":"d7addf26-5bbf-4b13-aa27-070bab62a929","Type":"ContainerDied","Data":"9c3300f22b86e3369084c4649108e950a8142c34a1a47ad01a8e3ab49cf4ee17"}
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.622560 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c3300f22b86e3369084c4649108e950a8142c34a1a47ad01a8e3ab49cf4ee17"
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.782798 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.813003 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-bd7575545-w8qjp"
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.931729 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-74b7765548-sk248"]
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.932221 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-74b7765548-sk248" podUID="652a5188-47b5-4235-8385-f9b9b1e3db2d" containerName="neutron-api" containerID="cri-o://a0b9dcaf358ff946d205c9bc8e9a9040d43717b794c1e68a54a11e2a563b14f0" gracePeriod=30
Mar 10 10:07:17 crc kubenswrapper[4794]: I0310 10:07:17.932364 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-74b7765548-sk248" podUID="652a5188-47b5-4235-8385-f9b9b1e3db2d" containerName="neutron-httpd" containerID="cri-o://c90cc14f7dbebc115d43b4687a53f79c2412e4e1ff879557986772034e76699c" gracePeriod=30
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.025227 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a513-account-create-update-l8ctf"
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.034407 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="325754ce-6381-4bb4-9102-04933c1a928b" path="/var/lib/kubelet/pods/325754ce-6381-4bb4-9102-04933c1a928b/volumes"
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.143706 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7bcz\" (UniqueName: \"kubernetes.io/projected/75e280fc-27ef-4cd8-b46d-3913a229ba81-kube-api-access-f7bcz\") pod \"75e280fc-27ef-4cd8-b46d-3913a229ba81\" (UID: \"75e280fc-27ef-4cd8-b46d-3913a229ba81\") "
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.143748 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75e280fc-27ef-4cd8-b46d-3913a229ba81-operator-scripts\") pod \"75e280fc-27ef-4cd8-b46d-3913a229ba81\" (UID: \"75e280fc-27ef-4cd8-b46d-3913a229ba81\") "
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.148533 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e280fc-27ef-4cd8-b46d-3913a229ba81-kube-api-access-f7bcz" (OuterVolumeSpecName: "kube-api-access-f7bcz") pod "75e280fc-27ef-4cd8-b46d-3913a229ba81" (UID: "75e280fc-27ef-4cd8-b46d-3913a229ba81"). InnerVolumeSpecName "kube-api-access-f7bcz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.148580 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e280fc-27ef-4cd8-b46d-3913a229ba81-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75e280fc-27ef-4cd8-b46d-3913a229ba81" (UID: "75e280fc-27ef-4cd8-b46d-3913a229ba81"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.249090 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7bcz\" (UniqueName: \"kubernetes.io/projected/75e280fc-27ef-4cd8-b46d-3913a229ba81-kube-api-access-f7bcz\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.249402 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75e280fc-27ef-4cd8-b46d-3913a229ba81-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.255693 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ngww6"
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.271921 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pkvrz"
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.291764 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e13e-account-create-update-nqb9b"
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.293764 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-61a4-account-create-update-t8jxm"
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.350953 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8vkw\" (UniqueName: \"kubernetes.io/projected/a989a70f-ba12-4aa6-b96d-397cde6a5d48-kube-api-access-l8vkw\") pod \"a989a70f-ba12-4aa6-b96d-397cde6a5d48\" (UID: \"a989a70f-ba12-4aa6-b96d-397cde6a5d48\") "
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.351098 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598f9a9c-0b67-44b8-83a9-428f55be33a9-operator-scripts\") pod \"598f9a9c-0b67-44b8-83a9-428f55be33a9\" (UID: \"598f9a9c-0b67-44b8-83a9-428f55be33a9\") "
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.351184 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6pkl\" (UniqueName: \"kubernetes.io/projected/598f9a9c-0b67-44b8-83a9-428f55be33a9-kube-api-access-v6pkl\") pod \"598f9a9c-0b67-44b8-83a9-428f55be33a9\" (UID: \"598f9a9c-0b67-44b8-83a9-428f55be33a9\") "
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.351214 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a989a70f-ba12-4aa6-b96d-397cde6a5d48-operator-scripts\") pod \"a989a70f-ba12-4aa6-b96d-397cde6a5d48\" (UID: \"a989a70f-ba12-4aa6-b96d-397cde6a5d48\") "
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.352617 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/598f9a9c-0b67-44b8-83a9-428f55be33a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "598f9a9c-0b67-44b8-83a9-428f55be33a9" (UID: "598f9a9c-0b67-44b8-83a9-428f55be33a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.352766 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a989a70f-ba12-4aa6-b96d-397cde6a5d48-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a989a70f-ba12-4aa6-b96d-397cde6a5d48" (UID: "a989a70f-ba12-4aa6-b96d-397cde6a5d48"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.356430 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598f9a9c-0b67-44b8-83a9-428f55be33a9-kube-api-access-v6pkl" (OuterVolumeSpecName: "kube-api-access-v6pkl") pod "598f9a9c-0b67-44b8-83a9-428f55be33a9" (UID: "598f9a9c-0b67-44b8-83a9-428f55be33a9"). InnerVolumeSpecName "kube-api-access-v6pkl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.373312 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a989a70f-ba12-4aa6-b96d-397cde6a5d48-kube-api-access-l8vkw" (OuterVolumeSpecName: "kube-api-access-l8vkw") pod "a989a70f-ba12-4aa6-b96d-397cde6a5d48" (UID: "a989a70f-ba12-4aa6-b96d-397cde6a5d48"). InnerVolumeSpecName "kube-api-access-l8vkw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.458364 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc31804f-5a50-4c0a-80e8-42d0752ee5b5-operator-scripts\") pod \"bc31804f-5a50-4c0a-80e8-42d0752ee5b5\" (UID: \"bc31804f-5a50-4c0a-80e8-42d0752ee5b5\") "
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.458701 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffpdg\" (UniqueName: \"kubernetes.io/projected/269fc3af-0d8a-4d19-8753-8c4d07670864-kube-api-access-ffpdg\") pod \"269fc3af-0d8a-4d19-8753-8c4d07670864\" (UID: \"269fc3af-0d8a-4d19-8753-8c4d07670864\") "
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.458790 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6wnc\" (UniqueName: \"kubernetes.io/projected/bc31804f-5a50-4c0a-80e8-42d0752ee5b5-kube-api-access-q6wnc\") pod \"bc31804f-5a50-4c0a-80e8-42d0752ee5b5\" (UID: \"bc31804f-5a50-4c0a-80e8-42d0752ee5b5\") "
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.459791 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc31804f-5a50-4c0a-80e8-42d0752ee5b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc31804f-5a50-4c0a-80e8-42d0752ee5b5" (UID: "bc31804f-5a50-4c0a-80e8-42d0752ee5b5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.466313 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/269fc3af-0d8a-4d19-8753-8c4d07670864-operator-scripts\") pod \"269fc3af-0d8a-4d19-8753-8c4d07670864\" (UID: \"269fc3af-0d8a-4d19-8753-8c4d07670864\") "
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.467088 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6pkl\" (UniqueName: \"kubernetes.io/projected/598f9a9c-0b67-44b8-83a9-428f55be33a9-kube-api-access-v6pkl\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.467111 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a989a70f-ba12-4aa6-b96d-397cde6a5d48-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.467126 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8vkw\" (UniqueName: \"kubernetes.io/projected/a989a70f-ba12-4aa6-b96d-397cde6a5d48-kube-api-access-l8vkw\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.467138 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598f9a9c-0b67-44b8-83a9-428f55be33a9-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.467149 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc31804f-5a50-4c0a-80e8-42d0752ee5b5-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.467679 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269fc3af-0d8a-4d19-8753-8c4d07670864-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "269fc3af-0d8a-4d19-8753-8c4d07670864" (UID: "269fc3af-0d8a-4d19-8753-8c4d07670864"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.467834 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269fc3af-0d8a-4d19-8753-8c4d07670864-kube-api-access-ffpdg" (OuterVolumeSpecName: "kube-api-access-ffpdg") pod "269fc3af-0d8a-4d19-8753-8c4d07670864" (UID: "269fc3af-0d8a-4d19-8753-8c4d07670864"). InnerVolumeSpecName "kube-api-access-ffpdg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.487199 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc31804f-5a50-4c0a-80e8-42d0752ee5b5-kube-api-access-q6wnc" (OuterVolumeSpecName: "kube-api-access-q6wnc") pod "bc31804f-5a50-4c0a-80e8-42d0752ee5b5" (UID: "bc31804f-5a50-4c0a-80e8-42d0752ee5b5"). InnerVolumeSpecName "kube-api-access-q6wnc".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.571671 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffpdg\" (UniqueName: \"kubernetes.io/projected/269fc3af-0d8a-4d19-8753-8c4d07670864-kube-api-access-ffpdg\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.571705 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6wnc\" (UniqueName: \"kubernetes.io/projected/bc31804f-5a50-4c0a-80e8-42d0752ee5b5-kube-api-access-q6wnc\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.571717 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/269fc3af-0d8a-4d19-8753-8c4d07670864-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.656162 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e13e-account-create-update-nqb9b" event={"ID":"269fc3af-0d8a-4d19-8753-8c4d07670864","Type":"ContainerDied","Data":"d6644e1b42c8be758c06deefc11fa8a53b2f402fdba8f807c6fa978114cf607c"} Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.656204 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6644e1b42c8be758c06deefc11fa8a53b2f402fdba8f807c6fa978114cf607c" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.656187 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e13e-account-create-update-nqb9b" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.661412 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a513-account-create-update-l8ctf" event={"ID":"75e280fc-27ef-4cd8-b46d-3913a229ba81","Type":"ContainerDied","Data":"7b834712020ea0bc0cb8a74fdd3922ecbb58d7a51d7c9e877bf440fff1063a07"} Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.661445 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b834712020ea0bc0cb8a74fdd3922ecbb58d7a51d7c9e877bf440fff1063a07" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.661503 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a513-account-create-update-l8ctf" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.671395 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pkvrz" event={"ID":"598f9a9c-0b67-44b8-83a9-428f55be33a9","Type":"ContainerDied","Data":"bf266c1f8f114d437dd4b643c72b26ab5ca8ea9777958c403c15359aeca6725b"} Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.671436 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf266c1f8f114d437dd4b643c72b26ab5ca8ea9777958c403c15359aeca6725b" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.671473 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pkvrz" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.679820 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-61a4-account-create-update-t8jxm" event={"ID":"bc31804f-5a50-4c0a-80e8-42d0752ee5b5","Type":"ContainerDied","Data":"2643476cb7fd9c72b2616f6e22e68beeb127f5457cc90127d26282024da72aec"} Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.679860 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2643476cb7fd9c72b2616f6e22e68beeb127f5457cc90127d26282024da72aec" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.679914 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-61a4-account-create-update-t8jxm" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.696018 4794 generic.go:334] "Generic (PLEG): container finished" podID="652a5188-47b5-4235-8385-f9b9b1e3db2d" containerID="c90cc14f7dbebc115d43b4687a53f79c2412e4e1ff879557986772034e76699c" exitCode=0 Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.696086 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74b7765548-sk248" event={"ID":"652a5188-47b5-4235-8385-f9b9b1e3db2d","Type":"ContainerDied","Data":"c90cc14f7dbebc115d43b4687a53f79c2412e4e1ff879557986772034e76699c"} Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.698733 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ngww6" event={"ID":"a989a70f-ba12-4aa6-b96d-397cde6a5d48","Type":"ContainerDied","Data":"83f75f9dbd92c1f81599dac8a1df99df0e5884743438edc4aec56c4e5f623db4"} Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.698778 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83f75f9dbd92c1f81599dac8a1df99df0e5884743438edc4aec56c4e5f623db4" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.698824 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-ngww6" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.702696 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb384e40-1917-4b9c-bcfa-440a3a10fd1d","Type":"ContainerStarted","Data":"d75c771de3d291bbbb95bf0f193cefd57708bcdc53c0f2f718b3d8e320f642c8"} Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.711836 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59e99408-027e-4f50-bfa8-02482e877cc8","Type":"ContainerStarted","Data":"b67cbbb007b8117677bb130e293d9fa95339f3811b4bb2d7a28ce7ce30cc78d3"} Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.711957 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="ceilometer-central-agent" containerID="cri-o://32c70546d85a39bc9911849dd97d4c3679b338fa26fee4f77da0b94068604ff1" gracePeriod=30 Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.712010 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="sg-core" containerID="cri-o://8696fb652e731569d2b69215de0830c0019020b0ba18a1194e4eb4c859380849" gracePeriod=30 Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.712038 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="proxy-httpd" containerID="cri-o://b67cbbb007b8117677bb130e293d9fa95339f3811b4bb2d7a28ce7ce30cc78d3" gracePeriod=30 Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.712079 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="ceilometer-notification-agent" containerID="cri-o://a7002fe4bfa5464e36af4a82b820f45e5a86f1792f9647edde24d5323671acda" gracePeriod=30 Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.711984 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.713653 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"53686d91-dc01-4a36-99c3-e6c84052e15e","Type":"ContainerStarted","Data":"a93a1cdce0c6102672e520db125516311ff3a3151f376512b31940fae9eb6766"} Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.722889 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.722870114 podStartE2EDuration="3.722870114s" podCreationTimestamp="2026-03-10 10:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:07:18.719065183 +0000 UTC m=+1387.475236001" watchObservedRunningTime="2026-03-10 10:07:18.722870114 +0000 UTC m=+1387.479040932" Mar 10 10:07:18 crc kubenswrapper[4794]: I0310 10:07:18.742132 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.587190265 podStartE2EDuration="11.742112303s" podCreationTimestamp="2026-03-10 10:07:07 +0000 UTC" firstStartedPulling="2026-03-10 10:07:12.889477823 +0000 UTC m=+1381.645648641" lastFinishedPulling="2026-03-10 10:07:18.044399861 +0000 
UTC m=+1386.800570679" observedRunningTime="2026-03-10 10:07:18.741704572 +0000 UTC m=+1387.497875410" watchObservedRunningTime="2026-03-10 10:07:18.742112303 +0000 UTC m=+1387.498283121" Mar 10 10:07:19 crc kubenswrapper[4794]: I0310 10:07:19.726786 4794 generic.go:334] "Generic (PLEG): container finished" podID="59e99408-027e-4f50-bfa8-02482e877cc8" containerID="b67cbbb007b8117677bb130e293d9fa95339f3811b4bb2d7a28ce7ce30cc78d3" exitCode=0 Mar 10 10:07:19 crc kubenswrapper[4794]: I0310 10:07:19.727086 4794 generic.go:334] "Generic (PLEG): container finished" podID="59e99408-027e-4f50-bfa8-02482e877cc8" containerID="8696fb652e731569d2b69215de0830c0019020b0ba18a1194e4eb4c859380849" exitCode=2 Mar 10 10:07:19 crc kubenswrapper[4794]: I0310 10:07:19.726864 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59e99408-027e-4f50-bfa8-02482e877cc8","Type":"ContainerDied","Data":"b67cbbb007b8117677bb130e293d9fa95339f3811b4bb2d7a28ce7ce30cc78d3"} Mar 10 10:07:19 crc kubenswrapper[4794]: I0310 10:07:19.727130 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59e99408-027e-4f50-bfa8-02482e877cc8","Type":"ContainerDied","Data":"8696fb652e731569d2b69215de0830c0019020b0ba18a1194e4eb4c859380849"} Mar 10 10:07:19 crc kubenswrapper[4794]: I0310 10:07:19.727142 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59e99408-027e-4f50-bfa8-02482e877cc8","Type":"ContainerDied","Data":"a7002fe4bfa5464e36af4a82b820f45e5a86f1792f9647edde24d5323671acda"} Mar 10 10:07:19 crc kubenswrapper[4794]: I0310 10:07:19.727098 4794 generic.go:334] "Generic (PLEG): container finished" podID="59e99408-027e-4f50-bfa8-02482e877cc8" containerID="a7002fe4bfa5464e36af4a82b820f45e5a86f1792f9647edde24d5323671acda" exitCode=0 Mar 10 10:07:19 crc kubenswrapper[4794]: I0310 10:07:19.729736 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"53686d91-dc01-4a36-99c3-e6c84052e15e","Type":"ContainerStarted","Data":"d36356b5c770ecad29603a57d1346e81bb0210ad811dc767118c368012779874"} Mar 10 10:07:19 crc kubenswrapper[4794]: I0310 10:07:19.729772 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"53686d91-dc01-4a36-99c3-e6c84052e15e","Type":"ContainerStarted","Data":"d74fbbdb86c3cdb65171312cf2c9c803c458f47ad7b0f5c525579801ae96ec9d"} Mar 10 10:07:19 crc kubenswrapper[4794]: I0310 10:07:19.758138 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.7581127739999998 podStartE2EDuration="3.758112774s" podCreationTimestamp="2026-03-10 10:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:07:19.750561191 +0000 UTC m=+1388.506732009" watchObservedRunningTime="2026-03-10 10:07:19.758112774 +0000 UTC m=+1388.514283592" Mar 10 10:07:21 crc kubenswrapper[4794]: I0310 10:07:21.189242 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:21 crc kubenswrapper[4794]: I0310 10:07:21.190354 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.217598 4794 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-wd2vl"] Mar 10 10:07:24 crc kubenswrapper[4794]: E0310 10:07:24.218434 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e280fc-27ef-4cd8-b46d-3913a229ba81" containerName="mariadb-account-create-update" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.218452 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e280fc-27ef-4cd8-b46d-3913a229ba81" containerName="mariadb-account-create-update" Mar 10 10:07:24 crc kubenswrapper[4794]: E0310 10:07:24.218469 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7addf26-5bbf-4b13-aa27-070bab62a929" containerName="mariadb-database-create" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.218478 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7addf26-5bbf-4b13-aa27-070bab62a929" containerName="mariadb-database-create" Mar 10 10:07:24 crc kubenswrapper[4794]: E0310 10:07:24.218492 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="269fc3af-0d8a-4d19-8753-8c4d07670864" containerName="mariadb-account-create-update" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.218501 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="269fc3af-0d8a-4d19-8753-8c4d07670864" containerName="mariadb-account-create-update" Mar 10 10:07:24 crc kubenswrapper[4794]: E0310 10:07:24.218517 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc31804f-5a50-4c0a-80e8-42d0752ee5b5" containerName="mariadb-account-create-update" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.218526 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc31804f-5a50-4c0a-80e8-42d0752ee5b5" containerName="mariadb-account-create-update" Mar 10 10:07:24 crc kubenswrapper[4794]: E0310 10:07:24.218549 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598f9a9c-0b67-44b8-83a9-428f55be33a9" containerName="mariadb-database-create" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.218558 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="598f9a9c-0b67-44b8-83a9-428f55be33a9" containerName="mariadb-database-create" Mar 10 10:07:24 crc kubenswrapper[4794]: E0310 10:07:24.218577 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a989a70f-ba12-4aa6-b96d-397cde6a5d48" containerName="mariadb-database-create" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.218586 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a989a70f-ba12-4aa6-b96d-397cde6a5d48" containerName="mariadb-database-create" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.218817 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7addf26-5bbf-4b13-aa27-070bab62a929" containerName="mariadb-database-create" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.218836 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e280fc-27ef-4cd8-b46d-3913a229ba81" containerName="mariadb-account-create-update" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.218852 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc31804f-5a50-4c0a-80e8-42d0752ee5b5" containerName="mariadb-account-create-update" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.218866 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="598f9a9c-0b67-44b8-83a9-428f55be33a9" containerName="mariadb-database-create" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.218887 4794 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="269fc3af-0d8a-4d19-8753-8c4d07670864" containerName="mariadb-account-create-update" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.218898 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a989a70f-ba12-4aa6-b96d-397cde6a5d48" containerName="mariadb-database-create" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.219657 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.223537 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-57ldq" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.223903 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.224009 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.225202 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wd2vl"] Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.272690 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-config-data\") pod \"nova-cell0-conductor-db-sync-wd2vl\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.273009 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjtgd\" (UniqueName: \"kubernetes.io/projected/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-kube-api-access-hjtgd\") pod \"nova-cell0-conductor-db-sync-wd2vl\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.273208 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-scripts\") pod \"nova-cell0-conductor-db-sync-wd2vl\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.273311 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wd2vl\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.374897 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-scripts\") pod \"nova-cell0-conductor-db-sync-wd2vl\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.375204 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wd2vl\" (UID: 
\"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.375404 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-config-data\") pod \"nova-cell0-conductor-db-sync-wd2vl\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.375608 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjtgd\" (UniqueName: \"kubernetes.io/projected/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-kube-api-access-hjtgd\") pod \"nova-cell0-conductor-db-sync-wd2vl\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.382820 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-scripts\") pod \"nova-cell0-conductor-db-sync-wd2vl\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.394958 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-config-data\") pod \"nova-cell0-conductor-db-sync-wd2vl\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.395042 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wd2vl\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.399049 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjtgd\" (UniqueName: \"kubernetes.io/projected/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-kube-api-access-hjtgd\") pod \"nova-cell0-conductor-db-sync-wd2vl\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.544906 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:24 crc kubenswrapper[4794]: I0310 10:07:24.988696 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wd2vl"] Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.765678 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.800060 4794 generic.go:334] "Generic (PLEG): container finished" podID="59e99408-027e-4f50-bfa8-02482e877cc8" containerID="32c70546d85a39bc9911849dd97d4c3679b338fa26fee4f77da0b94068604ff1" exitCode=0 Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.800128 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59e99408-027e-4f50-bfa8-02482e877cc8","Type":"ContainerDied","Data":"32c70546d85a39bc9911849dd97d4c3679b338fa26fee4f77da0b94068604ff1"} Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.800159 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59e99408-027e-4f50-bfa8-02482e877cc8","Type":"ContainerDied","Data":"8fba9ab5f0e0c8372949a44bdd45c19d96e519f7322d89e6fd6285e3c6351d90"} Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.800179 4794 scope.go:117] "RemoveContainer" containerID="b67cbbb007b8117677bb130e293d9fa95339f3811b4bb2d7a28ce7ce30cc78d3" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.800324 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.802572 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wd2vl" event={"ID":"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55","Type":"ContainerStarted","Data":"9b27e5887a9b3ff188dc13c409bd684b06944065876f358a3fa1c70c440e2515"} Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.805036 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plz4z\" (UniqueName: \"kubernetes.io/projected/59e99408-027e-4f50-bfa8-02482e877cc8-kube-api-access-plz4z\") pod \"59e99408-027e-4f50-bfa8-02482e877cc8\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.805118 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-sg-core-conf-yaml\") pod \"59e99408-027e-4f50-bfa8-02482e877cc8\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.805210 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-combined-ca-bundle\") pod \"59e99408-027e-4f50-bfa8-02482e877cc8\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.805308 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-config-data\") pod \"59e99408-027e-4f50-bfa8-02482e877cc8\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.805432 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-scripts\") pod \"59e99408-027e-4f50-bfa8-02482e877cc8\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.805530 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/59e99408-027e-4f50-bfa8-02482e877cc8-run-httpd\") pod \"59e99408-027e-4f50-bfa8-02482e877cc8\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.805584 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59e99408-027e-4f50-bfa8-02482e877cc8-log-httpd\") pod \"59e99408-027e-4f50-bfa8-02482e877cc8\" (UID: \"59e99408-027e-4f50-bfa8-02482e877cc8\") " Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.806128 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e99408-027e-4f50-bfa8-02482e877cc8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "59e99408-027e-4f50-bfa8-02482e877cc8" (UID: "59e99408-027e-4f50-bfa8-02482e877cc8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.806228 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59e99408-027e-4f50-bfa8-02482e877cc8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.806494 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e99408-027e-4f50-bfa8-02482e877cc8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "59e99408-027e-4f50-bfa8-02482e877cc8" (UID: "59e99408-027e-4f50-bfa8-02482e877cc8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.810159 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e99408-027e-4f50-bfa8-02482e877cc8-kube-api-access-plz4z" (OuterVolumeSpecName: "kube-api-access-plz4z") pod "59e99408-027e-4f50-bfa8-02482e877cc8" (UID: "59e99408-027e-4f50-bfa8-02482e877cc8"). InnerVolumeSpecName "kube-api-access-plz4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.812368 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-scripts" (OuterVolumeSpecName: "scripts") pod "59e99408-027e-4f50-bfa8-02482e877cc8" (UID: "59e99408-027e-4f50-bfa8-02482e877cc8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.842535 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "59e99408-027e-4f50-bfa8-02482e877cc8" (UID: "59e99408-027e-4f50-bfa8-02482e877cc8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.847912 4794 scope.go:117] "RemoveContainer" containerID="8696fb652e731569d2b69215de0830c0019020b0ba18a1194e4eb4c859380849" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.900954 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59e99408-027e-4f50-bfa8-02482e877cc8" (UID: "59e99408-027e-4f50-bfa8-02482e877cc8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.904803 4794 scope.go:117] "RemoveContainer" containerID="a7002fe4bfa5464e36af4a82b820f45e5a86f1792f9647edde24d5323671acda" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.907868 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59e99408-027e-4f50-bfa8-02482e877cc8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.907902 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plz4z\" (UniqueName: \"kubernetes.io/projected/59e99408-027e-4f50-bfa8-02482e877cc8-kube-api-access-plz4z\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.907911 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.907920 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.907928 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.921834 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-config-data" (OuterVolumeSpecName: "config-data") pod "59e99408-027e-4f50-bfa8-02482e877cc8" (UID: "59e99408-027e-4f50-bfa8-02482e877cc8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.926084 4794 scope.go:117] "RemoveContainer" containerID="32c70546d85a39bc9911849dd97d4c3679b338fa26fee4f77da0b94068604ff1" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.958668 4794 scope.go:117] "RemoveContainer" containerID="b67cbbb007b8117677bb130e293d9fa95339f3811b4bb2d7a28ce7ce30cc78d3" Mar 10 10:07:25 crc kubenswrapper[4794]: E0310 10:07:25.959245 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b67cbbb007b8117677bb130e293d9fa95339f3811b4bb2d7a28ce7ce30cc78d3\": container with ID starting with b67cbbb007b8117677bb130e293d9fa95339f3811b4bb2d7a28ce7ce30cc78d3 not found: ID does not exist" containerID="b67cbbb007b8117677bb130e293d9fa95339f3811b4bb2d7a28ce7ce30cc78d3" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.959300 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b67cbbb007b8117677bb130e293d9fa95339f3811b4bb2d7a28ce7ce30cc78d3"} err="failed to get container status \"b67cbbb007b8117677bb130e293d9fa95339f3811b4bb2d7a28ce7ce30cc78d3\": rpc error: code = NotFound desc = could not find container \"b67cbbb007b8117677bb130e293d9fa95339f3811b4bb2d7a28ce7ce30cc78d3\": container with ID starting with b67cbbb007b8117677bb130e293d9fa95339f3811b4bb2d7a28ce7ce30cc78d3 not found: ID does not exist" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.959326 4794 scope.go:117] "RemoveContainer" containerID="8696fb652e731569d2b69215de0830c0019020b0ba18a1194e4eb4c859380849" Mar 10 10:07:25 crc kubenswrapper[4794]: E0310 10:07:25.959754 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8696fb652e731569d2b69215de0830c0019020b0ba18a1194e4eb4c859380849\": container with ID starting with 8696fb652e731569d2b69215de0830c0019020b0ba18a1194e4eb4c859380849 not found: ID does not exist" containerID="8696fb652e731569d2b69215de0830c0019020b0ba18a1194e4eb4c859380849" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.959798 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8696fb652e731569d2b69215de0830c0019020b0ba18a1194e4eb4c859380849"} err="failed to get container status \"8696fb652e731569d2b69215de0830c0019020b0ba18a1194e4eb4c859380849\": rpc error: code = NotFound desc = could not find container \"8696fb652e731569d2b69215de0830c0019020b0ba18a1194e4eb4c859380849\": container with ID starting with 8696fb652e731569d2b69215de0830c0019020b0ba18a1194e4eb4c859380849 not found: ID does not exist" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.959814 4794 scope.go:117] "RemoveContainer" containerID="a7002fe4bfa5464e36af4a82b820f45e5a86f1792f9647edde24d5323671acda" Mar 10 10:07:25 crc kubenswrapper[4794]: E0310 10:07:25.960137 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7002fe4bfa5464e36af4a82b820f45e5a86f1792f9647edde24d5323671acda\": container with ID starting with a7002fe4bfa5464e36af4a82b820f45e5a86f1792f9647edde24d5323671acda not found: ID does not exist" containerID="a7002fe4bfa5464e36af4a82b820f45e5a86f1792f9647edde24d5323671acda" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.960157 4794 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a7002fe4bfa5464e36af4a82b820f45e5a86f1792f9647edde24d5323671acda"} err="failed to get container status \"a7002fe4bfa5464e36af4a82b820f45e5a86f1792f9647edde24d5323671acda\": rpc error: code = NotFound desc = could not find container \"a7002fe4bfa5464e36af4a82b820f45e5a86f1792f9647edde24d5323671acda\": container with ID starting with a7002fe4bfa5464e36af4a82b820f45e5a86f1792f9647edde24d5323671acda not found: ID does not exist" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.960195 4794 scope.go:117] "RemoveContainer" containerID="32c70546d85a39bc9911849dd97d4c3679b338fa26fee4f77da0b94068604ff1" Mar 10 10:07:25 crc kubenswrapper[4794]: E0310 10:07:25.960579 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32c70546d85a39bc9911849dd97d4c3679b338fa26fee4f77da0b94068604ff1\": container with ID starting with 32c70546d85a39bc9911849dd97d4c3679b338fa26fee4f77da0b94068604ff1 not found: ID does not exist" containerID="32c70546d85a39bc9911849dd97d4c3679b338fa26fee4f77da0b94068604ff1" Mar 10 10:07:25 crc kubenswrapper[4794]: I0310 10:07:25.960626 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32c70546d85a39bc9911849dd97d4c3679b338fa26fee4f77da0b94068604ff1"} err="failed to get container status \"32c70546d85a39bc9911849dd97d4c3679b338fa26fee4f77da0b94068604ff1\": rpc error: code = NotFound desc = could not find container \"32c70546d85a39bc9911849dd97d4c3679b338fa26fee4f77da0b94068604ff1\": container with ID starting with 32c70546d85a39bc9911849dd97d4c3679b338fa26fee4f77da0b94068604ff1 not found: ID does not exist" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.009672 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e99408-027e-4f50-bfa8-02482e877cc8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.128160 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.139473 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.156006 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:26 crc kubenswrapper[4794]: E0310 10:07:26.156479 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="sg-core" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.156500 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="sg-core" Mar 10 10:07:26 crc kubenswrapper[4794]: E0310 10:07:26.156524 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="ceilometer-central-agent" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.156532 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="ceilometer-central-agent" Mar 10 10:07:26 crc kubenswrapper[4794]: E0310 10:07:26.156558 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="proxy-httpd" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.156566 4794 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="proxy-httpd" Mar 10 10:07:26 crc kubenswrapper[4794]: E0310 10:07:26.156580 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="ceilometer-notification-agent" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.156589 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="ceilometer-notification-agent" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.156797 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="ceilometer-central-agent" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.156829 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="sg-core" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.156849 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="ceilometer-notification-agent" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.156862 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" containerName="proxy-httpd" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.161198 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.164060 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.164309 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.176353 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.213430 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkd6v\" (UniqueName: \"kubernetes.io/projected/b159d937-7865-4d6d-ac2d-65deeb0e9161-kube-api-access-jkd6v\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.213615 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b159d937-7865-4d6d-ac2d-65deeb0e9161-run-httpd\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.213738 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.213770 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b159d937-7865-4d6d-ac2d-65deeb0e9161-log-httpd\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.213805 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.213827 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-scripts\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.213850 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-config-data\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.253685 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.253734 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.302172 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.307872 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.315563 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b159d937-7865-4d6d-ac2d-65deeb0e9161-run-httpd\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.315627 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.315650 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b159d937-7865-4d6d-ac2d-65deeb0e9161-log-httpd\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.315695 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.315718 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-scripts\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc 
kubenswrapper[4794]: I0310 10:07:26.316137 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-config-data\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.316168 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b159d937-7865-4d6d-ac2d-65deeb0e9161-run-httpd\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.316447 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkd6v\" (UniqueName: \"kubernetes.io/projected/b159d937-7865-4d6d-ac2d-65deeb0e9161-kube-api-access-jkd6v\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.316544 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b159d937-7865-4d6d-ac2d-65deeb0e9161-log-httpd\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.327054 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.327630 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-config-data\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.327924 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-scripts\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.333963 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkd6v\" (UniqueName: \"kubernetes.io/projected/b159d937-7865-4d6d-ac2d-65deeb0e9161-kube-api-access-jkd6v\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.339159 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.479836 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.814661 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.815055 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 10:07:26 crc kubenswrapper[4794]: I0310 10:07:26.966502 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:27 crc kubenswrapper[4794]: I0310 10:07:27.067601 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 10:07:27 crc kubenswrapper[4794]: I0310 10:07:27.067653 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 10:07:27 crc kubenswrapper[4794]: I0310 10:07:27.104886 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 10:07:27 crc kubenswrapper[4794]: I0310 10:07:27.131912 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 10:07:27 crc kubenswrapper[4794]: I0310 10:07:27.829600 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b159d937-7865-4d6d-ac2d-65deeb0e9161","Type":"ContainerStarted","Data":"a84d31b83f776fba7d862ab537cfe7af87924e9f09a6ebcb3ca146692d6171a7"} Mar 10 10:07:27 crc kubenswrapper[4794]: I0310 10:07:27.830301 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 10:07:27 crc kubenswrapper[4794]: I0310 10:07:27.830367 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 10:07:28 crc kubenswrapper[4794]: I0310 10:07:28.012251 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e99408-027e-4f50-bfa8-02482e877cc8" path="/var/lib/kubelet/pods/59e99408-027e-4f50-bfa8-02482e877cc8/volumes" Mar 10 10:07:28 crc kubenswrapper[4794]: I0310 10:07:28.703296 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:28 crc kubenswrapper[4794]: I0310 10:07:28.841083 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b159d937-7865-4d6d-ac2d-65deeb0e9161","Type":"ContainerStarted","Data":"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc"} Mar 10 10:07:28 crc kubenswrapper[4794]: I0310 10:07:28.841145 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 10:07:28 crc kubenswrapper[4794]: I0310 10:07:28.841169 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 10:07:29 crc kubenswrapper[4794]: I0310 10:07:29.055119 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 10:07:29 crc kubenswrapper[4794]: I0310 10:07:29.081711 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 10:07:29 crc kubenswrapper[4794]: I0310 10:07:29.996800 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 10:07:29 crc kubenswrapper[4794]: I0310 10:07:29.997149 4794 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 10:07:30 crc kubenswrapper[4794]: I0310 10:07:30.012292 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 10:07:34 crc kubenswrapper[4794]: I0310 10:07:34.904170 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b159d937-7865-4d6d-ac2d-65deeb0e9161","Type":"ContainerStarted","Data":"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8"} Mar 10 10:07:34 crc kubenswrapper[4794]: I0310 10:07:34.904798 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b159d937-7865-4d6d-ac2d-65deeb0e9161","Type":"ContainerStarted","Data":"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405"} Mar 10 10:07:34 crc kubenswrapper[4794]: I0310 10:07:34.907049 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wd2vl" event={"ID":"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55","Type":"ContainerStarted","Data":"e2edb92bbd38d8a9c294b9b3ad89c0346bb305954930f87be72067c7c60f2e7f"} Mar 10 10:07:34 crc kubenswrapper[4794]: I0310 10:07:34.932718 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-wd2vl" podStartSLOduration=2.031654625 podStartE2EDuration="10.932692328s" podCreationTimestamp="2026-03-10 10:07:24 +0000 UTC" firstStartedPulling="2026-03-10 10:07:24.998037524 +0000 UTC m=+1393.754208342" lastFinishedPulling="2026-03-10 10:07:33.899075217 +0000 UTC m=+1402.655246045" observedRunningTime="2026-03-10 10:07:34.924604249 +0000 UTC m=+1403.680775077" watchObservedRunningTime="2026-03-10 10:07:34.932692328 +0000 UTC m=+1403.688863156" Mar 10 10:07:37 crc kubenswrapper[4794]: I0310 10:07:37.941286 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b159d937-7865-4d6d-ac2d-65deeb0e9161","Type":"ContainerStarted","Data":"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867"} Mar 10 10:07:37 crc kubenswrapper[4794]: I0310 10:07:37.941782 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 10:07:37 crc kubenswrapper[4794]: I0310 10:07:37.941505 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="sg-core" containerID="cri-o://3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8" gracePeriod=30 Mar 10 10:07:37 crc kubenswrapper[4794]: I0310 10:07:37.941480 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="ceilometer-central-agent" containerID="cri-o://5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc" gracePeriod=30 Mar 10 10:07:37 crc kubenswrapper[4794]: I0310 10:07:37.941527 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="proxy-httpd" containerID="cri-o://54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867" gracePeriod=30 Mar 10 10:07:37 crc kubenswrapper[4794]: I0310 10:07:37.941516 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" 
containerName="ceilometer-notification-agent" containerID="cri-o://547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405" gracePeriod=30 Mar 10 10:07:37 crc kubenswrapper[4794]: I0310 10:07:37.985322 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.248101967 podStartE2EDuration="11.985284376s" podCreationTimestamp="2026-03-10 10:07:26 +0000 UTC" firstStartedPulling="2026-03-10 10:07:26.980189221 +0000 UTC m=+1395.736360029" lastFinishedPulling="2026-03-10 10:07:36.71737161 +0000 UTC m=+1405.473542438" observedRunningTime="2026-03-10 10:07:37.980095373 +0000 UTC m=+1406.736266201" watchObservedRunningTime="2026-03-10 10:07:37.985284376 +0000 UTC m=+1406.741455194" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.683996 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.774998 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-sg-core-conf-yaml\") pod \"b159d937-7865-4d6d-ac2d-65deeb0e9161\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.775058 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-scripts\") pod \"b159d937-7865-4d6d-ac2d-65deeb0e9161\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.775152 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b159d937-7865-4d6d-ac2d-65deeb0e9161-log-httpd\") pod \"b159d937-7865-4d6d-ac2d-65deeb0e9161\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.775187 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-config-data\") pod \"b159d937-7865-4d6d-ac2d-65deeb0e9161\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.775238 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-combined-ca-bundle\") pod \"b159d937-7865-4d6d-ac2d-65deeb0e9161\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.775420 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b159d937-7865-4d6d-ac2d-65deeb0e9161-run-httpd\") pod \"b159d937-7865-4d6d-ac2d-65deeb0e9161\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.775482 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkd6v\" (UniqueName: \"kubernetes.io/projected/b159d937-7865-4d6d-ac2d-65deeb0e9161-kube-api-access-jkd6v\") pod \"b159d937-7865-4d6d-ac2d-65deeb0e9161\" (UID: \"b159d937-7865-4d6d-ac2d-65deeb0e9161\") " Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.777077 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b159d937-7865-4d6d-ac2d-65deeb0e9161-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b159d937-7865-4d6d-ac2d-65deeb0e9161" (UID: "b159d937-7865-4d6d-ac2d-65deeb0e9161"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.777759 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b159d937-7865-4d6d-ac2d-65deeb0e9161-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b159d937-7865-4d6d-ac2d-65deeb0e9161" (UID: "b159d937-7865-4d6d-ac2d-65deeb0e9161"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.781837 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b159d937-7865-4d6d-ac2d-65deeb0e9161-kube-api-access-jkd6v" (OuterVolumeSpecName: "kube-api-access-jkd6v") pod "b159d937-7865-4d6d-ac2d-65deeb0e9161" (UID: "b159d937-7865-4d6d-ac2d-65deeb0e9161"). InnerVolumeSpecName "kube-api-access-jkd6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.782546 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-scripts" (OuterVolumeSpecName: "scripts") pod "b159d937-7865-4d6d-ac2d-65deeb0e9161" (UID: "b159d937-7865-4d6d-ac2d-65deeb0e9161"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.808540 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b159d937-7865-4d6d-ac2d-65deeb0e9161" (UID: "b159d937-7865-4d6d-ac2d-65deeb0e9161"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.877650 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b159d937-7865-4d6d-ac2d-65deeb0e9161-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.877704 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b159d937-7865-4d6d-ac2d-65deeb0e9161-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.877719 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkd6v\" (UniqueName: \"kubernetes.io/projected/b159d937-7865-4d6d-ac2d-65deeb0e9161-kube-api-access-jkd6v\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.877733 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.877745 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.902470 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b159d937-7865-4d6d-ac2d-65deeb0e9161" (UID: "b159d937-7865-4d6d-ac2d-65deeb0e9161"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.924490 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-config-data" (OuterVolumeSpecName: "config-data") pod "b159d937-7865-4d6d-ac2d-65deeb0e9161" (UID: "b159d937-7865-4d6d-ac2d-65deeb0e9161"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.957893 4794 generic.go:334] "Generic (PLEG): container finished" podID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerID="54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867" exitCode=0 Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.957932 4794 generic.go:334] "Generic (PLEG): container finished" podID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerID="3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8" exitCode=2 Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.957944 4794 generic.go:334] "Generic (PLEG): container finished" podID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerID="547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405" exitCode=0 Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.957952 4794 generic.go:334] "Generic (PLEG): container finished" podID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerID="5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc" exitCode=0 Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.957974 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b159d937-7865-4d6d-ac2d-65deeb0e9161","Type":"ContainerDied","Data":"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867"} Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.958005 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b159d937-7865-4d6d-ac2d-65deeb0e9161","Type":"ContainerDied","Data":"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8"} Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.958017 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b159d937-7865-4d6d-ac2d-65deeb0e9161","Type":"ContainerDied","Data":"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405"} Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.958030 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b159d937-7865-4d6d-ac2d-65deeb0e9161","Type":"ContainerDied","Data":"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc"} Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.958042 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b159d937-7865-4d6d-ac2d-65deeb0e9161","Type":"ContainerDied","Data":"a84d31b83f776fba7d862ab537cfe7af87924e9f09a6ebcb3ca146692d6171a7"} Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.958060 4794 scope.go:117] "RemoveContainer" containerID="54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.958213 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.980003 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:38 crc kubenswrapper[4794]: I0310 10:07:38.980041 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b159d937-7865-4d6d-ac2d-65deeb0e9161-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.046958 4794 scope.go:117] "RemoveContainer" containerID="3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.056058 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.073767 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.081943 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:39 crc kubenswrapper[4794]: E0310 10:07:39.082309 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="ceilometer-notification-agent" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.082325 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="ceilometer-notification-agent" Mar 10 10:07:39 crc kubenswrapper[4794]: E0310 10:07:39.082363 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="ceilometer-central-agent" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.082369 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="ceilometer-central-agent" Mar 10 10:07:39 crc kubenswrapper[4794]: E0310 10:07:39.082379 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="proxy-httpd" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.082385 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="proxy-httpd" Mar 10 10:07:39 crc kubenswrapper[4794]: E0310 10:07:39.082401 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="sg-core" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.082407 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="sg-core" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.082583 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="ceilometer-notification-agent" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.082600 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="sg-core" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.082619 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="ceilometer-central-agent" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.082629 4794 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" containerName="proxy-httpd" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.084044 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.088301 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.088577 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.114712 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.135266 4794 scope.go:117] "RemoveContainer" containerID="547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.175066 4794 scope.go:117] "RemoveContainer" containerID="5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.182888 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75a0fec5-73d1-4c03-9157-8ad2393e14f6-run-httpd\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.182925 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.182984 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.183008 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-scripts\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.183026 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75a0fec5-73d1-4c03-9157-8ad2393e14f6-log-httpd\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.183047 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2hwx\" (UniqueName: \"kubernetes.io/projected/75a0fec5-73d1-4c03-9157-8ad2393e14f6-kube-api-access-k2hwx\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.183089 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-config-data\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.196458 4794 scope.go:117] "RemoveContainer" containerID="54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867" Mar 10 10:07:39 crc kubenswrapper[4794]: E0310 10:07:39.196828 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867\": container with ID starting with 54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867 not found: ID does not exist" containerID="54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.196877 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867"} err="failed to get container status \"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867\": rpc error: code = NotFound desc = could not find container \"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867\": container with ID starting with 54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867 not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.196895 4794 scope.go:117] "RemoveContainer" containerID="3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8" Mar 10 10:07:39 crc kubenswrapper[4794]: E0310 10:07:39.197287 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8\": container with ID starting with 3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8 not found: ID does not exist" containerID="3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.197309 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8"} err="failed to get container status \"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8\": rpc error: code = NotFound desc = could not find container \"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8\": container with ID starting with 3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8 not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.197322 4794 scope.go:117] "RemoveContainer" containerID="547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405" Mar 10 10:07:39 crc kubenswrapper[4794]: E0310 10:07:39.197616 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405\": container with ID starting with 547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405 not found: ID does not exist" containerID="547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.197742 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405"} err="failed to get container 
status \"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405\": rpc error: code = NotFound desc = could not find container \"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405\": container with ID starting with 547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405 not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.197921 4794 scope.go:117] "RemoveContainer" containerID="5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc" Mar 10 10:07:39 crc kubenswrapper[4794]: E0310 10:07:39.198206 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc\": container with ID starting with 5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc not found: ID does not exist" containerID="5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.198243 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc"} err="failed to get container status \"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc\": rpc error: code = NotFound desc = could not find container \"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc\": container with ID starting with 5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.198258 4794 scope.go:117] "RemoveContainer" containerID="54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.198600 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867"} err="failed to get container status \"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867\": rpc error: code = NotFound desc = could not find container \"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867\": container with ID starting with 54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867 not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.198694 4794 scope.go:117] "RemoveContainer" containerID="3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.198936 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8"} err="failed to get container status \"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8\": rpc error: code = NotFound desc = could not find container \"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8\": container with ID starting with 3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8 not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.198971 4794 scope.go:117] "RemoveContainer" containerID="547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.199267 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405"} err="failed to get container 
status \"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405\": rpc error: code = NotFound desc = could not find container \"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405\": container with ID starting with 547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405 not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.199282 4794 scope.go:117] "RemoveContainer" containerID="5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.199623 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc"} err="failed to get container status \"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc\": rpc error: code = NotFound desc = could not find container \"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc\": container with ID starting with 5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.199644 4794 scope.go:117] "RemoveContainer" containerID="54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.200168 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867"} err="failed to get container status \"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867\": rpc error: code = NotFound desc = could not find container \"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867\": container with ID starting with 54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867 not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.200185 4794 scope.go:117] "RemoveContainer" containerID="3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.200633 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8"} err="failed to get container status \"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8\": rpc error: code = NotFound desc = could not find container \"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8\": container with ID starting with 3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8 not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.200652 4794 scope.go:117] "RemoveContainer" containerID="547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.201004 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405"} err="failed to get container status \"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405\": rpc error: code = NotFound desc = could not find container \"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405\": container with ID starting with 547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405 not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.201044 4794 scope.go:117] "RemoveContainer" 
containerID="5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.201354 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc"} err="failed to get container status \"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc\": rpc error: code = NotFound desc = could not find container \"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc\": container with ID starting with 5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.201457 4794 scope.go:117] "RemoveContainer" containerID="54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.201821 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867"} err="failed to get container status \"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867\": rpc error: code = NotFound desc = could not find container \"54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867\": container with ID starting with 54a175da1ab61f5c34de173226d67054ee45620708e8430ac532d01ab1add867 not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.201857 4794 scope.go:117] "RemoveContainer" containerID="3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.202165 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8"} err="failed to get container status \"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8\": rpc error: code = NotFound desc = could not find container \"3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8\": container with ID starting with 3d9448ba9246b3efae07b7d37a8c471fa6937487c20455a457df20f1957ba5b8 not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.202277 4794 scope.go:117] "RemoveContainer" containerID="547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.202632 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405"} err="failed to get container status \"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405\": rpc error: code = NotFound desc = could not find container \"547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405\": container with ID starting with 547b3f81df0ea73dc74896043fbdee177611dcff3b12b2f5c185bbfaaec75405 not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.202677 4794 scope.go:117] "RemoveContainer" containerID="5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.202988 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc"} err="failed to get container status \"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc\": rpc error: code = NotFound desc = could not find 
container \"5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc\": container with ID starting with 5f829c37170e508875404923f8f7fce403c6b6cb5440a7e06342663d5067d0bc not found: ID does not exist" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.284378 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75a0fec5-73d1-4c03-9157-8ad2393e14f6-run-httpd\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.284415 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.284474 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.284500 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-scripts\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.284520 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75a0fec5-73d1-4c03-9157-8ad2393e14f6-log-httpd\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.284542 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2hwx\" (UniqueName: \"kubernetes.io/projected/75a0fec5-73d1-4c03-9157-8ad2393e14f6-kube-api-access-k2hwx\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.284585 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-config-data\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.285285 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75a0fec5-73d1-4c03-9157-8ad2393e14f6-log-httpd\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.285695 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75a0fec5-73d1-4c03-9157-8ad2393e14f6-run-httpd\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.290200 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.290945 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-scripts\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.296889 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.297860 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-config-data\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.302520 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2hwx\" (UniqueName: \"kubernetes.io/projected/75a0fec5-73d1-4c03-9157-8ad2393e14f6-kube-api-access-k2hwx\") pod \"ceilometer-0\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.433897 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.883556 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:39 crc kubenswrapper[4794]: I0310 10:07:39.967346 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75a0fec5-73d1-4c03-9157-8ad2393e14f6","Type":"ContainerStarted","Data":"3acc489455ae0365f5c255102c983daba7706228ef78d3edd40ce50265f10b0a"} Mar 10 10:07:40 crc kubenswrapper[4794]: I0310 10:07:40.009938 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b159d937-7865-4d6d-ac2d-65deeb0e9161" path="/var/lib/kubelet/pods/b159d937-7865-4d6d-ac2d-65deeb0e9161/volumes" Mar 10 10:07:40 crc kubenswrapper[4794]: I0310 10:07:40.717285 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:40 crc kubenswrapper[4794]: I0310 10:07:40.976651 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75a0fec5-73d1-4c03-9157-8ad2393e14f6","Type":"ContainerStarted","Data":"a1f56f9ed8fa18da6b342233881b8a5578b85e45e42628a24ddd4f28fa9c83f5"} Mar 10 10:07:41 crc kubenswrapper[4794]: I0310 10:07:41.997288 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75a0fec5-73d1-4c03-9157-8ad2393e14f6","Type":"ContainerStarted","Data":"70c1bd2be207cfb21689544858b1be21a9520ab1246d4252164fa52e382039dd"} Mar 10 10:07:43 crc kubenswrapper[4794]: I0310 10:07:43.009190 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75a0fec5-73d1-4c03-9157-8ad2393e14f6","Type":"ContainerStarted","Data":"7c3292e96c1842af17a02a7d13dacacdda02471998461d29da3d4bb05662501b"} Mar 10 10:07:45 crc kubenswrapper[4794]: I0310 10:07:45.024881 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75a0fec5-73d1-4c03-9157-8ad2393e14f6","Type":"ContainerStarted","Data":"ecaa9a0746d74918af0d1f0cbca513d3ddfcefc2e5e44146c5e8b80b5eb1db58"} Mar 10 10:07:45 crc kubenswrapper[4794]: I0310 10:07:45.025293 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="ceilometer-central-agent" containerID="cri-o://a1f56f9ed8fa18da6b342233881b8a5578b85e45e42628a24ddd4f28fa9c83f5" gracePeriod=30 Mar 10 10:07:45 crc kubenswrapper[4794]: I0310 10:07:45.026548 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 10:07:45 crc kubenswrapper[4794]: I0310 10:07:45.027025 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="sg-core" containerID="cri-o://7c3292e96c1842af17a02a7d13dacacdda02471998461d29da3d4bb05662501b" gracePeriod=30 Mar 10 10:07:45 crc kubenswrapper[4794]: I0310 10:07:45.027076 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="proxy-httpd" containerID="cri-o://ecaa9a0746d74918af0d1f0cbca513d3ddfcefc2e5e44146c5e8b80b5eb1db58" gracePeriod=30 Mar 10 10:07:45 crc kubenswrapper[4794]: I0310 10:07:45.027135 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="ceilometer-notification-agent" containerID="cri-o://70c1bd2be207cfb21689544858b1be21a9520ab1246d4252164fa52e382039dd" gracePeriod=30 Mar 10 10:07:45 crc kubenswrapper[4794]: I0310 10:07:45.027294 4794 generic.go:334] "Generic (PLEG): container finished" podID="a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55" containerID="e2edb92bbd38d8a9c294b9b3ad89c0346bb305954930f87be72067c7c60f2e7f" exitCode=0 Mar 10 10:07:45 crc kubenswrapper[4794]: I0310 10:07:45.027312 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wd2vl" event={"ID":"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55","Type":"ContainerDied","Data":"e2edb92bbd38d8a9c294b9b3ad89c0346bb305954930f87be72067c7c60f2e7f"} Mar 10 10:07:45 crc kubenswrapper[4794]: I0310 10:07:45.048490 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.967945045 podStartE2EDuration="6.048473417s" podCreationTimestamp="2026-03-10 10:07:39 +0000 UTC" firstStartedPulling="2026-03-10 10:07:39.884609075 +0000 UTC m=+1408.640779893" lastFinishedPulling="2026-03-10 10:07:43.965137427 +0000 UTC m=+1412.721308265" observedRunningTime="2026-03-10 10:07:45.048394025 +0000 UTC m=+1413.804564863" watchObservedRunningTime="2026-03-10 10:07:45.048473417 +0000 UTC m=+1413.804644235" Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.038403 4794 generic.go:334] "Generic (PLEG): container finished" podID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerID="ecaa9a0746d74918af0d1f0cbca513d3ddfcefc2e5e44146c5e8b80b5eb1db58" exitCode=0 Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.038609 4794 generic.go:334] "Generic (PLEG): container finished" podID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerID="7c3292e96c1842af17a02a7d13dacacdda02471998461d29da3d4bb05662501b" exitCode=2 Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.038620 4794 
generic.go:334] "Generic (PLEG): container finished" podID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerID="70c1bd2be207cfb21689544858b1be21a9520ab1246d4252164fa52e382039dd" exitCode=0 Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.038473 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75a0fec5-73d1-4c03-9157-8ad2393e14f6","Type":"ContainerDied","Data":"ecaa9a0746d74918af0d1f0cbca513d3ddfcefc2e5e44146c5e8b80b5eb1db58"} Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.038796 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75a0fec5-73d1-4c03-9157-8ad2393e14f6","Type":"ContainerDied","Data":"7c3292e96c1842af17a02a7d13dacacdda02471998461d29da3d4bb05662501b"} Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.038829 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75a0fec5-73d1-4c03-9157-8ad2393e14f6","Type":"ContainerDied","Data":"70c1bd2be207cfb21689544858b1be21a9520ab1246d4252164fa52e382039dd"} Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.401072 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.543685 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-combined-ca-bundle\") pod \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.543812 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjtgd\" (UniqueName: \"kubernetes.io/projected/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-kube-api-access-hjtgd\") pod \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.543857 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-scripts\") pod \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.543874 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-config-data\") pod \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\" (UID: \"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55\") " Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.551584 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-scripts" (OuterVolumeSpecName: "scripts") pod "a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55" (UID: "a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.572157 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-kube-api-access-hjtgd" (OuterVolumeSpecName: "kube-api-access-hjtgd") pod "a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55" (UID: "a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55"). InnerVolumeSpecName "kube-api-access-hjtgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.573766 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-config-data" (OuterVolumeSpecName: "config-data") pod "a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55" (UID: "a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.574328 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55" (UID: "a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.647123 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.647162 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjtgd\" (UniqueName: \"kubernetes.io/projected/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-kube-api-access-hjtgd\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.647174 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.647183 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:46 crc kubenswrapper[4794]: I0310 10:07:46.857316 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-74b7765548-sk248" podUID="652a5188-47b5-4235-8385-f9b9b1e3db2d" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.155:9696/\": dial tcp 10.217.0.155:9696: connect: connection refused" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.055802 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wd2vl" event={"ID":"a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55","Type":"ContainerDied","Data":"9b27e5887a9b3ff188dc13c409bd684b06944065876f358a3fa1c70c440e2515"} Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.055859 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b27e5887a9b3ff188dc13c409bd684b06944065876f358a3fa1c70c440e2515" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.055939 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wd2vl" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.238919 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 10:07:47 crc kubenswrapper[4794]: E0310 10:07:47.239284 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55" containerName="nova-cell0-conductor-db-sync" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.239300 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55" containerName="nova-cell0-conductor-db-sync" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.239481 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55" containerName="nova-cell0-conductor-db-sync" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.239980 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.242239 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-57ldq" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.242486 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.252840 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.360058 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bd64ab2-dcd8-4404-973e-551182005da1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6bd64ab2-dcd8-4404-973e-551182005da1\") " pod="openstack/nova-cell0-conductor-0" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.360132 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gcck\" (UniqueName: \"kubernetes.io/projected/6bd64ab2-dcd8-4404-973e-551182005da1-kube-api-access-2gcck\") pod \"nova-cell0-conductor-0\" (UID: \"6bd64ab2-dcd8-4404-973e-551182005da1\") " pod="openstack/nova-cell0-conductor-0" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.360396 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd64ab2-dcd8-4404-973e-551182005da1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6bd64ab2-dcd8-4404-973e-551182005da1\") " pod="openstack/nova-cell0-conductor-0" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.461493 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gcck\" (UniqueName: \"kubernetes.io/projected/6bd64ab2-dcd8-4404-973e-551182005da1-kube-api-access-2gcck\") pod \"nova-cell0-conductor-0\" (UID: \"6bd64ab2-dcd8-4404-973e-551182005da1\") " pod="openstack/nova-cell0-conductor-0" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.461604 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd64ab2-dcd8-4404-973e-551182005da1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6bd64ab2-dcd8-4404-973e-551182005da1\") " pod="openstack/nova-cell0-conductor-0" Mar 10 10:07:47 crc 
kubenswrapper[4794]: I0310 10:07:47.461699 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bd64ab2-dcd8-4404-973e-551182005da1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6bd64ab2-dcd8-4404-973e-551182005da1\") " pod="openstack/nova-cell0-conductor-0" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.467092 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bd64ab2-dcd8-4404-973e-551182005da1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6bd64ab2-dcd8-4404-973e-551182005da1\") " pod="openstack/nova-cell0-conductor-0" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.473957 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd64ab2-dcd8-4404-973e-551182005da1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6bd64ab2-dcd8-4404-973e-551182005da1\") " pod="openstack/nova-cell0-conductor-0" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.477842 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gcck\" (UniqueName: \"kubernetes.io/projected/6bd64ab2-dcd8-4404-973e-551182005da1-kube-api-access-2gcck\") pod \"nova-cell0-conductor-0\" (UID: \"6bd64ab2-dcd8-4404-973e-551182005da1\") " pod="openstack/nova-cell0-conductor-0" Mar 10 10:07:47 crc kubenswrapper[4794]: I0310 10:07:47.566314 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.069406 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-74b7765548-sk248_652a5188-47b5-4235-8385-f9b9b1e3db2d/neutron-api/0.log" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.069450 4794 generic.go:334] "Generic (PLEG): container finished" podID="652a5188-47b5-4235-8385-f9b9b1e3db2d" containerID="a0b9dcaf358ff946d205c9bc8e9a9040d43717b794c1e68a54a11e2a563b14f0" exitCode=137 Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.069510 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74b7765548-sk248" event={"ID":"652a5188-47b5-4235-8385-f9b9b1e3db2d","Type":"ContainerDied","Data":"a0b9dcaf358ff946d205c9bc8e9a9040d43717b794c1e68a54a11e2a563b14f0"} Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.073162 4794 generic.go:334] "Generic (PLEG): container finished" podID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerID="a1f56f9ed8fa18da6b342233881b8a5578b85e45e42628a24ddd4f28fa9c83f5" exitCode=0 Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.073198 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75a0fec5-73d1-4c03-9157-8ad2393e14f6","Type":"ContainerDied","Data":"a1f56f9ed8fa18da6b342233881b8a5578b85e45e42628a24ddd4f28fa9c83f5"} Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.214030 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.214680 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 10:07:48 crc kubenswrapper[4794]: W0310 10:07:48.216986 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bd64ab2_dcd8_4404_973e_551182005da1.slice/crio-b17db50fcecf9f756739db2c43fd7495686b6dc284d13e468b782301f2adb2d4 WatchSource:0}: Error finding container b17db50fcecf9f756739db2c43fd7495686b6dc284d13e468b782301f2adb2d4: Status 404 returned error can't find the container with id b17db50fcecf9f756739db2c43fd7495686b6dc284d13e468b782301f2adb2d4 Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.383694 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-config-data\") pod \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.383761 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-sg-core-conf-yaml\") pod \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.383788 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75a0fec5-73d1-4c03-9157-8ad2393e14f6-log-httpd\") pod \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.383907 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-scripts\") pod \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.384011 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-combined-ca-bundle\") pod \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.384063 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2hwx\" (UniqueName: \"kubernetes.io/projected/75a0fec5-73d1-4c03-9157-8ad2393e14f6-kube-api-access-k2hwx\") pod \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.384097 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75a0fec5-73d1-4c03-9157-8ad2393e14f6-run-httpd\") pod \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\" (UID: \"75a0fec5-73d1-4c03-9157-8ad2393e14f6\") " Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.384991 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75a0fec5-73d1-4c03-9157-8ad2393e14f6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "75a0fec5-73d1-4c03-9157-8ad2393e14f6" (UID: "75a0fec5-73d1-4c03-9157-8ad2393e14f6"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.385829 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75a0fec5-73d1-4c03-9157-8ad2393e14f6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "75a0fec5-73d1-4c03-9157-8ad2393e14f6" (UID: "75a0fec5-73d1-4c03-9157-8ad2393e14f6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.388524 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75a0fec5-73d1-4c03-9157-8ad2393e14f6-kube-api-access-k2hwx" (OuterVolumeSpecName: "kube-api-access-k2hwx") pod "75a0fec5-73d1-4c03-9157-8ad2393e14f6" (UID: "75a0fec5-73d1-4c03-9157-8ad2393e14f6"). InnerVolumeSpecName "kube-api-access-k2hwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.389064 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-scripts" (OuterVolumeSpecName: "scripts") pod "75a0fec5-73d1-4c03-9157-8ad2393e14f6" (UID: "75a0fec5-73d1-4c03-9157-8ad2393e14f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.420013 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "75a0fec5-73d1-4c03-9157-8ad2393e14f6" (UID: "75a0fec5-73d1-4c03-9157-8ad2393e14f6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.432926 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-74b7765548-sk248_652a5188-47b5-4235-8385-f9b9b1e3db2d/neutron-api/0.log" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.433033 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74b7765548-sk248" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.479886 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75a0fec5-73d1-4c03-9157-8ad2393e14f6" (UID: "75a0fec5-73d1-4c03-9157-8ad2393e14f6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.486522 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2hwx\" (UniqueName: \"kubernetes.io/projected/75a0fec5-73d1-4c03-9157-8ad2393e14f6-kube-api-access-k2hwx\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.486550 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75a0fec5-73d1-4c03-9157-8ad2393e14f6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.486560 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.486569 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75a0fec5-73d1-4c03-9157-8ad2393e14f6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.486576 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.486584 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.519052 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-config-data" (OuterVolumeSpecName: "config-data") pod "75a0fec5-73d1-4c03-9157-8ad2393e14f6" (UID: "75a0fec5-73d1-4c03-9157-8ad2393e14f6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.587560 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-combined-ca-bundle\") pod \"652a5188-47b5-4235-8385-f9b9b1e3db2d\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.587618 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-httpd-config\") pod \"652a5188-47b5-4235-8385-f9b9b1e3db2d\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.587747 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pnzh\" (UniqueName: \"kubernetes.io/projected/652a5188-47b5-4235-8385-f9b9b1e3db2d-kube-api-access-4pnzh\") pod \"652a5188-47b5-4235-8385-f9b9b1e3db2d\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.587816 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-ovndb-tls-certs\") pod \"652a5188-47b5-4235-8385-f9b9b1e3db2d\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.587843 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-config\") pod \"652a5188-47b5-4235-8385-f9b9b1e3db2d\" (UID: \"652a5188-47b5-4235-8385-f9b9b1e3db2d\") " Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.588383 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75a0fec5-73d1-4c03-9157-8ad2393e14f6-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.594530 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/652a5188-47b5-4235-8385-f9b9b1e3db2d-kube-api-access-4pnzh" (OuterVolumeSpecName: "kube-api-access-4pnzh") pod "652a5188-47b5-4235-8385-f9b9b1e3db2d" (UID: "652a5188-47b5-4235-8385-f9b9b1e3db2d"). InnerVolumeSpecName "kube-api-access-4pnzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.600950 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "652a5188-47b5-4235-8385-f9b9b1e3db2d" (UID: "652a5188-47b5-4235-8385-f9b9b1e3db2d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.650343 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "652a5188-47b5-4235-8385-f9b9b1e3db2d" (UID: "652a5188-47b5-4235-8385-f9b9b1e3db2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.652545 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-config" (OuterVolumeSpecName: "config") pod "652a5188-47b5-4235-8385-f9b9b1e3db2d" (UID: "652a5188-47b5-4235-8385-f9b9b1e3db2d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.673668 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "652a5188-47b5-4235-8385-f9b9b1e3db2d" (UID: "652a5188-47b5-4235-8385-f9b9b1e3db2d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.691657 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pnzh\" (UniqueName: \"kubernetes.io/projected/652a5188-47b5-4235-8385-f9b9b1e3db2d-kube-api-access-4pnzh\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.691704 4794 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.691718 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.691733 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:48 crc kubenswrapper[4794]: I0310 10:07:48.691748 4794 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/652a5188-47b5-4235-8385-f9b9b1e3db2d-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.083993 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6bd64ab2-dcd8-4404-973e-551182005da1","Type":"ContainerStarted","Data":"eeacb7a4237f606fda5183e09ef7cdbc5464bd0f9e121d184887a224b87c2de7"} Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.084272 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6bd64ab2-dcd8-4404-973e-551182005da1","Type":"ContainerStarted","Data":"b17db50fcecf9f756739db2c43fd7495686b6dc284d13e468b782301f2adb2d4"} Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.084303 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.088451 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-74b7765548-sk248_652a5188-47b5-4235-8385-f9b9b1e3db2d/neutron-api/0.log" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.088532 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74b7765548-sk248" event={"ID":"652a5188-47b5-4235-8385-f9b9b1e3db2d","Type":"ContainerDied","Data":"08d58ce08fe9b44e0e6151d616ba08d669daab57fe94928bc53d816d8f894a67"} Mar 
10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.088540 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74b7765548-sk248" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.088567 4794 scope.go:117] "RemoveContainer" containerID="c90cc14f7dbebc115d43b4687a53f79c2412e4e1ff879557986772034e76699c" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.092961 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75a0fec5-73d1-4c03-9157-8ad2393e14f6","Type":"ContainerDied","Data":"3acc489455ae0365f5c255102c983daba7706228ef78d3edd40ce50265f10b0a"} Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.093101 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.108904 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.108885793 podStartE2EDuration="2.108885793s" podCreationTimestamp="2026-03-10 10:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:07:49.10370088 +0000 UTC m=+1417.859871728" watchObservedRunningTime="2026-03-10 10:07:49.108885793 +0000 UTC m=+1417.865056611" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.121590 4794 scope.go:117] "RemoveContainer" containerID="a0b9dcaf358ff946d205c9bc8e9a9040d43717b794c1e68a54a11e2a563b14f0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.171483 4794 scope.go:117] "RemoveContainer" containerID="ecaa9a0746d74918af0d1f0cbca513d3ddfcefc2e5e44146c5e8b80b5eb1db58" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.171666 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.188497 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.200647 4794 scope.go:117] "RemoveContainer" containerID="7c3292e96c1842af17a02a7d13dacacdda02471998461d29da3d4bb05662501b" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.208812 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:49 crc kubenswrapper[4794]: E0310 10:07:49.209650 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="sg-core" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.209741 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="sg-core" Mar 10 10:07:49 crc kubenswrapper[4794]: E0310 10:07:49.209841 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="ceilometer-central-agent" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.210117 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="ceilometer-central-agent" Mar 10 10:07:49 crc kubenswrapper[4794]: E0310 10:07:49.210213 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="652a5188-47b5-4235-8385-f9b9b1e3db2d" containerName="neutron-api" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.210285 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="652a5188-47b5-4235-8385-f9b9b1e3db2d" 
containerName="neutron-api" Mar 10 10:07:49 crc kubenswrapper[4794]: E0310 10:07:49.210399 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="proxy-httpd" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.210475 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="proxy-httpd" Mar 10 10:07:49 crc kubenswrapper[4794]: E0310 10:07:49.210579 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="ceilometer-notification-agent" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.210654 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="ceilometer-notification-agent" Mar 10 10:07:49 crc kubenswrapper[4794]: E0310 10:07:49.210730 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="652a5188-47b5-4235-8385-f9b9b1e3db2d" containerName="neutron-httpd" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.210789 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="652a5188-47b5-4235-8385-f9b9b1e3db2d" containerName="neutron-httpd" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.211194 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="proxy-httpd" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.211283 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="ceilometer-central-agent" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.211368 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="652a5188-47b5-4235-8385-f9b9b1e3db2d" containerName="neutron-api" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.211468 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="ceilometer-notification-agent" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.211549 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" containerName="sg-core" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.211638 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="652a5188-47b5-4235-8385-f9b9b1e3db2d" containerName="neutron-httpd" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.213364 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.215660 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-74b7765548-sk248"] Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.216254 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.216712 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.224267 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-74b7765548-sk248"] Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.233748 4794 scope.go:117] "RemoveContainer" containerID="70c1bd2be207cfb21689544858b1be21a9520ab1246d4252164fa52e382039dd" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.236108 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.259624 4794 scope.go:117] "RemoveContainer" containerID="a1f56f9ed8fa18da6b342233881b8a5578b85e45e42628a24ddd4f28fa9c83f5" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.316473 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-config-data\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.316552 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-scripts\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.316618 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8c51b99-93aa-4f8d-a27c-d074fba6c088-log-httpd\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.316642 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.316665 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8c51b99-93aa-4f8d-a27c-d074fba6c088-run-httpd\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.316691 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.316715 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkd4w\" (UniqueName: \"kubernetes.io/projected/b8c51b99-93aa-4f8d-a27c-d074fba6c088-kube-api-access-qkd4w\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.417801 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-config-data\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.417881 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-scripts\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.417935 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8c51b99-93aa-4f8d-a27c-d074fba6c088-log-httpd\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.417952 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.417971 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8c51b99-93aa-4f8d-a27c-d074fba6c088-run-httpd\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.417995 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.418034 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkd4w\" (UniqueName: \"kubernetes.io/projected/b8c51b99-93aa-4f8d-a27c-d074fba6c088-kube-api-access-qkd4w\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.418767 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8c51b99-93aa-4f8d-a27c-d074fba6c088-log-httpd\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.418867 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8c51b99-93aa-4f8d-a27c-d074fba6c088-run-httpd\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.423396 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.423513 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-config-data\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.424739 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.427928 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-scripts\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.439965 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkd4w\" (UniqueName: \"kubernetes.io/projected/b8c51b99-93aa-4f8d-a27c-d074fba6c088-kube-api-access-qkd4w\") pod \"ceilometer-0\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.536500 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:07:49 crc kubenswrapper[4794]: I0310 10:07:49.993872 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:07:50 crc kubenswrapper[4794]: I0310 10:07:50.013055 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="652a5188-47b5-4235-8385-f9b9b1e3db2d" path="/var/lib/kubelet/pods/652a5188-47b5-4235-8385-f9b9b1e3db2d/volumes" Mar 10 10:07:50 crc kubenswrapper[4794]: I0310 10:07:50.014068 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75a0fec5-73d1-4c03-9157-8ad2393e14f6" path="/var/lib/kubelet/pods/75a0fec5-73d1-4c03-9157-8ad2393e14f6/volumes" Mar 10 10:07:50 crc kubenswrapper[4794]: I0310 10:07:50.105238 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8c51b99-93aa-4f8d-a27c-d074fba6c088","Type":"ContainerStarted","Data":"6bd17e255196691cb49e7a6e71c66b3c31388673fe3f135e65510842d222cc86"} Mar 10 10:07:51 crc kubenswrapper[4794]: I0310 10:07:51.124976 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8c51b99-93aa-4f8d-a27c-d074fba6c088","Type":"ContainerStarted","Data":"289c72a035524d3aa2d5de9768558a7437aaedc5914a97e7e9ecf6ff6eeccdce"} Mar 10 10:07:52 crc kubenswrapper[4794]: I0310 10:07:52.137775 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8c51b99-93aa-4f8d-a27c-d074fba6c088","Type":"ContainerStarted","Data":"9dc19e1bf68288ae1411b1a8de862fd3ebbb95f8ce54b07c9ceb502662ba61d1"} Mar 10 10:07:53 crc kubenswrapper[4794]: I0310 10:07:53.148511 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b8c51b99-93aa-4f8d-a27c-d074fba6c088","Type":"ContainerStarted","Data":"21069cd1e2fdce0936b66d3c3e620ae34405dd97c82d27a0096c58cc684bf07f"} Mar 10 10:07:55 crc kubenswrapper[4794]: I0310 10:07:55.170500 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8c51b99-93aa-4f8d-a27c-d074fba6c088","Type":"ContainerStarted","Data":"26e182b3712e4bfb475fe4f60312bb3098934981450ef39df84c8266f3a7d01d"} Mar 10 10:07:55 crc kubenswrapper[4794]: I0310 10:07:55.171155 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 10:07:55 crc kubenswrapper[4794]: I0310 10:07:55.200118 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.456058354 podStartE2EDuration="6.200099056s" podCreationTimestamp="2026-03-10 10:07:49 +0000 UTC" firstStartedPulling="2026-03-10 10:07:50.006592368 +0000 UTC m=+1418.762763226" lastFinishedPulling="2026-03-10 10:07:54.7506331 +0000 UTC m=+1423.506803928" observedRunningTime="2026-03-10 10:07:55.196966143 +0000 UTC m=+1423.953136981" watchObservedRunningTime="2026-03-10 10:07:55.200099056 +0000 UTC m=+1423.956269874" Mar 10 10:07:57 crc kubenswrapper[4794]: I0310 10:07:57.594503 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.101227 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-czl5f"] Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.104005 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-czl5f" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.118122 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-czl5f"] Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.119467 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.119662 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.201638 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-config-data\") pod \"nova-cell0-cell-mapping-czl5f\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") " pod="openstack/nova-cell0-cell-mapping-czl5f" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.201709 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-czl5f\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") " pod="openstack/nova-cell0-cell-mapping-czl5f" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.201789 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-scripts\") pod \"nova-cell0-cell-mapping-czl5f\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") " pod="openstack/nova-cell0-cell-mapping-czl5f" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.201817 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrnj4\" (UniqueName: \"kubernetes.io/projected/4290ea58-8af5-478d-a452-421fe656fe01-kube-api-access-zrnj4\") pod \"nova-cell0-cell-mapping-czl5f\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") " pod="openstack/nova-cell0-cell-mapping-czl5f" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.267599 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.269045 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.271814 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.292852 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.303150 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-config-data\") pod \"nova-cell0-cell-mapping-czl5f\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") " pod="openstack/nova-cell0-cell-mapping-czl5f" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.303225 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-czl5f\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") " pod="openstack/nova-cell0-cell-mapping-czl5f" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.303297 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-scripts\") pod \"nova-cell0-cell-mapping-czl5f\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") " pod="openstack/nova-cell0-cell-mapping-czl5f" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.303344 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrnj4\" (UniqueName: \"kubernetes.io/projected/4290ea58-8af5-478d-a452-421fe656fe01-kube-api-access-zrnj4\") pod \"nova-cell0-cell-mapping-czl5f\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") " pod="openstack/nova-cell0-cell-mapping-czl5f" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.330171 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-config-data\") pod \"nova-cell0-cell-mapping-czl5f\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") " pod="openstack/nova-cell0-cell-mapping-czl5f" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.353593 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-scripts\") pod \"nova-cell0-cell-mapping-czl5f\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") " pod="openstack/nova-cell0-cell-mapping-czl5f" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.353889 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrnj4\" (UniqueName: \"kubernetes.io/projected/4290ea58-8af5-478d-a452-421fe656fe01-kube-api-access-zrnj4\") pod \"nova-cell0-cell-mapping-czl5f\" (UID: 
\"4290ea58-8af5-478d-a452-421fe656fe01\") " pod="openstack/nova-cell0-cell-mapping-czl5f" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.357904 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-czl5f\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") " pod="openstack/nova-cell0-cell-mapping-czl5f" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.357961 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.359405 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.364643 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.378167 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.413067 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5707404-5eca-49e1-b11a-25e9e46ecf57-logs\") pod \"nova-api-0\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") " pod="openstack/nova-api-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.413131 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fv6c\" (UniqueName: \"kubernetes.io/projected/d5707404-5eca-49e1-b11a-25e9e46ecf57-kube-api-access-9fv6c\") pod \"nova-api-0\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") " pod="openstack/nova-api-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.413197 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5707404-5eca-49e1-b11a-25e9e46ecf57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") " pod="openstack/nova-api-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.413228 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5707404-5eca-49e1-b11a-25e9e46ecf57-config-data\") pod \"nova-api-0\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") " pod="openstack/nova-api-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.440435 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.441526 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.447874 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.448713 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-czl5f" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.500499 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.514771 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5707404-5eca-49e1-b11a-25e9e46ecf57-logs\") pod \"nova-api-0\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") " pod="openstack/nova-api-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.514982 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fv6c\" (UniqueName: \"kubernetes.io/projected/d5707404-5eca-49e1-b11a-25e9e46ecf57-kube-api-access-9fv6c\") pod \"nova-api-0\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") " pod="openstack/nova-api-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.515094 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30640b0-144a-4928-a05e-e42a296a3214-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " pod="openstack/nova-metadata-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.521536 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e30640b0-144a-4928-a05e-e42a296a3214-logs\") pod \"nova-metadata-0\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " pod="openstack/nova-metadata-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.523600 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5707404-5eca-49e1-b11a-25e9e46ecf57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") " pod="openstack/nova-api-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.523744 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5707404-5eca-49e1-b11a-25e9e46ecf57-config-data\") pod \"nova-api-0\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") " pod="openstack/nova-api-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.523826 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb9v4\" (UniqueName: \"kubernetes.io/projected/e30640b0-144a-4928-a05e-e42a296a3214-kube-api-access-sb9v4\") pod \"nova-metadata-0\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " pod="openstack/nova-metadata-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.523946 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e30640b0-144a-4928-a05e-e42a296a3214-config-data\") pod \"nova-metadata-0\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " pod="openstack/nova-metadata-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.517815 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5707404-5eca-49e1-b11a-25e9e46ecf57-logs\") pod \"nova-api-0\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") " pod="openstack/nova-api-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.536889 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5707404-5eca-49e1-b11a-25e9e46ecf57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") " pod="openstack/nova-api-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.544937 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fv6c\" (UniqueName: \"kubernetes.io/projected/d5707404-5eca-49e1-b11a-25e9e46ecf57-kube-api-access-9fv6c\") pod \"nova-api-0\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") " pod="openstack/nova-api-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.571927 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5707404-5eca-49e1-b11a-25e9e46ecf57-config-data\") pod \"nova-api-0\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") " pod="openstack/nova-api-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.609293 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7559df67df-6sc9d"] Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.611525 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.623416 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.625247 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm4qx\" (UniqueName: \"kubernetes.io/projected/cd02df1e-27d1-40fe-943e-e7fddce1449c-kube-api-access-mm4qx\") pod \"nova-scheduler-0\" (UID: \"cd02df1e-27d1-40fe-943e-e7fddce1449c\") " pod="openstack/nova-scheduler-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.625378 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb9v4\" (UniqueName: \"kubernetes.io/projected/e30640b0-144a-4928-a05e-e42a296a3214-kube-api-access-sb9v4\") pod \"nova-metadata-0\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " pod="openstack/nova-metadata-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.625492 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e30640b0-144a-4928-a05e-e42a296a3214-config-data\") pod \"nova-metadata-0\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " pod="openstack/nova-metadata-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.625650 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30640b0-144a-4928-a05e-e42a296a3214-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " pod="openstack/nova-metadata-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.625730 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e30640b0-144a-4928-a05e-e42a296a3214-logs\") pod \"nova-metadata-0\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " pod="openstack/nova-metadata-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.625815 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cd02df1e-27d1-40fe-943e-e7fddce1449c-config-data\") pod \"nova-scheduler-0\" (UID: \"cd02df1e-27d1-40fe-943e-e7fddce1449c\") " pod="openstack/nova-scheduler-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.625917 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd02df1e-27d1-40fe-943e-e7fddce1449c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd02df1e-27d1-40fe-943e-e7fddce1449c\") " pod="openstack/nova-scheduler-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.626726 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e30640b0-144a-4928-a05e-e42a296a3214-logs\") pod \"nova-metadata-0\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " pod="openstack/nova-metadata-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.633512 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30640b0-144a-4928-a05e-e42a296a3214-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " pod="openstack/nova-metadata-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.645081 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7559df67df-6sc9d"] Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.648222 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e30640b0-144a-4928-a05e-e42a296a3214-config-data\") pod \"nova-metadata-0\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " pod="openstack/nova-metadata-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.655262 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb9v4\" (UniqueName: \"kubernetes.io/projected/e30640b0-144a-4928-a05e-e42a296a3214-kube-api-access-sb9v4\") pod \"nova-metadata-0\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " pod="openstack/nova-metadata-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.655290 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.656359 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.658996 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.671899 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.728031 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-dns-swift-storage-0\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.728318 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4qx\" (UniqueName: \"kubernetes.io/projected/cd02df1e-27d1-40fe-943e-e7fddce1449c-kube-api-access-mm4qx\") pod \"nova-scheduler-0\" (UID: \"cd02df1e-27d1-40fe-943e-e7fddce1449c\") " pod="openstack/nova-scheduler-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.728395 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs787\" (UniqueName: \"kubernetes.io/projected/bc2846c8-b6db-4b5f-8bb8-998b50e64970-kube-api-access-xs787\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.728452 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-ovsdbserver-sb\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.728472 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-dns-svc\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.728546 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-config\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.728607 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-ovsdbserver-nb\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.728700 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd02df1e-27d1-40fe-943e-e7fddce1449c-config-data\") pod \"nova-scheduler-0\" (UID: \"cd02df1e-27d1-40fe-943e-e7fddce1449c\") " 
pod="openstack/nova-scheduler-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.728723 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd02df1e-27d1-40fe-943e-e7fddce1449c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd02df1e-27d1-40fe-943e-e7fddce1449c\") " pod="openstack/nova-scheduler-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.735593 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd02df1e-27d1-40fe-943e-e7fddce1449c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd02df1e-27d1-40fe-943e-e7fddce1449c\") " pod="openstack/nova-scheduler-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.739093 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd02df1e-27d1-40fe-943e-e7fddce1449c-config-data\") pod \"nova-scheduler-0\" (UID: \"cd02df1e-27d1-40fe-943e-e7fddce1449c\") " pod="openstack/nova-scheduler-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.749807 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4qx\" (UniqueName: \"kubernetes.io/projected/cd02df1e-27d1-40fe-943e-e7fddce1449c-kube-api-access-mm4qx\") pod \"nova-scheduler-0\" (UID: \"cd02df1e-27d1-40fe-943e-e7fddce1449c\") " pod="openstack/nova-scheduler-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.762572 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.829996 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs787\" (UniqueName: \"kubernetes.io/projected/bc2846c8-b6db-4b5f-8bb8-998b50e64970-kube-api-access-xs787\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.830047 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-ovsdbserver-sb\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.830070 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-dns-svc\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.830093 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10f408e-5c27-4931-873b-85396e7442e8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f10f408e-5c27-4931-873b-85396e7442e8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.830138 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-config\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " 
pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.830176 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-ovsdbserver-nb\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.830225 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twz89\" (UniqueName: \"kubernetes.io/projected/f10f408e-5c27-4931-873b-85396e7442e8-kube-api-access-twz89\") pod \"nova-cell1-novncproxy-0\" (UID: \"f10f408e-5c27-4931-873b-85396e7442e8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.830249 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10f408e-5c27-4931-873b-85396e7442e8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f10f408e-5c27-4931-873b-85396e7442e8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.830277 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-dns-swift-storage-0\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.831092 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-dns-swift-storage-0\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.831832 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-ovsdbserver-sb\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.832834 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-dns-svc\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.833376 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-config\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.833911 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-ovsdbserver-nb\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc 
kubenswrapper[4794]: I0310 10:07:58.858460 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs787\" (UniqueName: \"kubernetes.io/projected/bc2846c8-b6db-4b5f-8bb8-998b50e64970-kube-api-access-xs787\") pod \"dnsmasq-dns-7559df67df-6sc9d\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.931588 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10f408e-5c27-4931-873b-85396e7442e8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f10f408e-5c27-4931-873b-85396e7442e8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.931720 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twz89\" (UniqueName: \"kubernetes.io/projected/f10f408e-5c27-4931-873b-85396e7442e8-kube-api-access-twz89\") pod \"nova-cell1-novncproxy-0\" (UID: \"f10f408e-5c27-4931-873b-85396e7442e8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.931763 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10f408e-5c27-4931-873b-85396e7442e8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f10f408e-5c27-4931-873b-85396e7442e8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.934969 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.935578 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10f408e-5c27-4931-873b-85396e7442e8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f10f408e-5c27-4931-873b-85396e7442e8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.938844 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10f408e-5c27-4931-873b-85396e7442e8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f10f408e-5c27-4931-873b-85396e7442e8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.959093 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twz89\" (UniqueName: \"kubernetes.io/projected/f10f408e-5c27-4931-873b-85396e7442e8-kube-api-access-twz89\") pod \"nova-cell1-novncproxy-0\" (UID: \"f10f408e-5c27-4931-873b-85396e7442e8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.978353 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:07:58 crc kubenswrapper[4794]: I0310 10:07:58.996509 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.179320 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-czl5f"] Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.199972 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.243225 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d5707404-5eca-49e1-b11a-25e9e46ecf57","Type":"ContainerStarted","Data":"2eb0712a92ec56430ec83031700a92710898dda90366c5ce18d236e76dac6aba"} Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.248304 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-czl5f" event={"ID":"4290ea58-8af5-478d-a452-421fe656fe01","Type":"ContainerStarted","Data":"2d21c3c1609b0e27af174d9efe50c2b1df4becaa82baf3a3aa4e0721a9cb7aef"} Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.399895 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.434598 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qzv7j"] Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.435714 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qzv7j" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.438834 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.438837 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.451991 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qzv7j"] Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.534427 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.548221 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzzsb\" (UniqueName: \"kubernetes.io/projected/5f967270-f06c-40e0-9a53-00d2bc42d930-kube-api-access-gzzsb\") pod \"nova-cell1-conductor-db-sync-qzv7j\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") " pod="openstack/nova-cell1-conductor-db-sync-qzv7j" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.548279 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qzv7j\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") " pod="openstack/nova-cell1-conductor-db-sync-qzv7j" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.548428 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-scripts\") pod \"nova-cell1-conductor-db-sync-qzv7j\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") " pod="openstack/nova-cell1-conductor-db-sync-qzv7j" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.549004 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-config-data\") pod \"nova-cell1-conductor-db-sync-qzv7j\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") " pod="openstack/nova-cell1-conductor-db-sync-qzv7j" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.638095 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7559df67df-6sc9d"] Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.696630 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-config-data\") pod \"nova-cell1-conductor-db-sync-qzv7j\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") " pod="openstack/nova-cell1-conductor-db-sync-qzv7j" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.696699 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzzsb\" (UniqueName: \"kubernetes.io/projected/5f967270-f06c-40e0-9a53-00d2bc42d930-kube-api-access-gzzsb\") pod \"nova-cell1-conductor-db-sync-qzv7j\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") " pod="openstack/nova-cell1-conductor-db-sync-qzv7j" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.696742 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qzv7j\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") " pod="openstack/nova-cell1-conductor-db-sync-qzv7j" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.696863 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-scripts\") pod \"nova-cell1-conductor-db-sync-qzv7j\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") " pod="openstack/nova-cell1-conductor-db-sync-qzv7j" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.701470 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-config-data\") pod \"nova-cell1-conductor-db-sync-qzv7j\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") " pod="openstack/nova-cell1-conductor-db-sync-qzv7j" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.703183 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-scripts\") pod \"nova-cell1-conductor-db-sync-qzv7j\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") " pod="openstack/nova-cell1-conductor-db-sync-qzv7j" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.707050 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qzv7j\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") " pod="openstack/nova-cell1-conductor-db-sync-qzv7j" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.734940 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzzsb\" (UniqueName: \"kubernetes.io/projected/5f967270-f06c-40e0-9a53-00d2bc42d930-kube-api-access-gzzsb\") pod \"nova-cell1-conductor-db-sync-qzv7j\" (UID: 
\"5f967270-f06c-40e0-9a53-00d2bc42d930\") " pod="openstack/nova-cell1-conductor-db-sync-qzv7j" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.821095 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qzv7j" Mar 10 10:07:59 crc kubenswrapper[4794]: I0310 10:07:59.826156 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 10:07:59 crc kubenswrapper[4794]: W0310 10:07:59.857279 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf10f408e_5c27_4931_873b_85396e7442e8.slice/crio-a3edc0ec6c8931acc60b8e4d83a89c64f0042fc0873aedfafe66858254ce1219 WatchSource:0}: Error finding container a3edc0ec6c8931acc60b8e4d83a89c64f0042fc0873aedfafe66858254ce1219: Status 404 returned error can't find the container with id a3edc0ec6c8931acc60b8e4d83a89c64f0042fc0873aedfafe66858254ce1219 Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.157690 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552288-fck7s"] Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.161800 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552288-fck7s"] Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.162097 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552288-fck7s" Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.166834 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.166898 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.167001 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.204880 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94qnk\" (UniqueName: \"kubernetes.io/projected/e8db1d2e-9690-4d13-bce2-b8602ffb7583-kube-api-access-94qnk\") pod \"auto-csr-approver-29552288-fck7s\" (UID: \"e8db1d2e-9690-4d13-bce2-b8602ffb7583\") " pod="openshift-infra/auto-csr-approver-29552288-fck7s" Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.261845 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f10f408e-5c27-4931-873b-85396e7442e8","Type":"ContainerStarted","Data":"a3edc0ec6c8931acc60b8e4d83a89c64f0042fc0873aedfafe66858254ce1219"} Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.263243 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e30640b0-144a-4928-a05e-e42a296a3214","Type":"ContainerStarted","Data":"675de561fd7c13bcd9b912d512987bda82e80d28e0dacfcf05ec2d25dc8cae54"} Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.265006 4794 generic.go:334] "Generic (PLEG): container finished" podID="bc2846c8-b6db-4b5f-8bb8-998b50e64970" containerID="cc8e47c20a1668aed38a81367397f1ee82817b86543da1bd05f2dd64f31ed821" exitCode=0 Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.265082 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7559df67df-6sc9d" 
event={"ID":"bc2846c8-b6db-4b5f-8bb8-998b50e64970","Type":"ContainerDied","Data":"cc8e47c20a1668aed38a81367397f1ee82817b86543da1bd05f2dd64f31ed821"} Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.265111 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7559df67df-6sc9d" event={"ID":"bc2846c8-b6db-4b5f-8bb8-998b50e64970","Type":"ContainerStarted","Data":"eceb2c4e331804ce49d828d9b9e8b23b418352076ff423d503b5a8aa40f5bb8c"} Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.267001 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cd02df1e-27d1-40fe-943e-e7fddce1449c","Type":"ContainerStarted","Data":"963e370895085f1dd74d78799b1fdb3483c72c8626ca9b2bca2a088bc131593d"} Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.270003 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-czl5f" event={"ID":"4290ea58-8af5-478d-a452-421fe656fe01","Type":"ContainerStarted","Data":"1edfc188e8a48dc22fef269f386093bd77ed504ab9f840271be29afeff769334"} Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.311057 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94qnk\" (UniqueName: \"kubernetes.io/projected/e8db1d2e-9690-4d13-bce2-b8602ffb7583-kube-api-access-94qnk\") pod \"auto-csr-approver-29552288-fck7s\" (UID: \"e8db1d2e-9690-4d13-bce2-b8602ffb7583\") " pod="openshift-infra/auto-csr-approver-29552288-fck7s" Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.327494 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-czl5f" podStartSLOduration=2.327467409 podStartE2EDuration="2.327467409s" podCreationTimestamp="2026-03-10 10:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:08:00.310085995 +0000 UTC m=+1429.066256833" watchObservedRunningTime="2026-03-10 10:08:00.327467409 +0000 UTC m=+1429.083638227" Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.336719 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94qnk\" (UniqueName: \"kubernetes.io/projected/e8db1d2e-9690-4d13-bce2-b8602ffb7583-kube-api-access-94qnk\") pod \"auto-csr-approver-29552288-fck7s\" (UID: \"e8db1d2e-9690-4d13-bce2-b8602ffb7583\") " pod="openshift-infra/auto-csr-approver-29552288-fck7s" Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.390205 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qzv7j"] Mar 10 10:08:00 crc kubenswrapper[4794]: I0310 10:08:00.503176 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552288-fck7s" Mar 10 10:08:01 crc kubenswrapper[4794]: I0310 10:08:01.025615 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552288-fck7s"] Mar 10 10:08:01 crc kubenswrapper[4794]: W0310 10:08:01.053803 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8db1d2e_9690_4d13_bce2_b8602ffb7583.slice/crio-0723a6c4cbbe6747cebd0c38efa9420f17083934edd83bb0498a1a6b2a71c449 WatchSource:0}: Error finding container 0723a6c4cbbe6747cebd0c38efa9420f17083934edd83bb0498a1a6b2a71c449: Status 404 returned error can't find the container with id 0723a6c4cbbe6747cebd0c38efa9420f17083934edd83bb0498a1a6b2a71c449 Mar 10 10:08:01 crc kubenswrapper[4794]: I0310 10:08:01.283124 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qzv7j" event={"ID":"5f967270-f06c-40e0-9a53-00d2bc42d930","Type":"ContainerStarted","Data":"f014916484fd1fca04a820b617c80d6df43e5e2743a3252f69ec6b699226813e"} Mar 10 10:08:01 crc kubenswrapper[4794]: I0310 10:08:01.283168 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qzv7j" event={"ID":"5f967270-f06c-40e0-9a53-00d2bc42d930","Type":"ContainerStarted","Data":"d7b6a104c4ad2dcc4a97dcd533265424d07651d1b5b0069f543e277a14a2c1e4"} Mar 10 10:08:01 crc kubenswrapper[4794]: I0310 10:08:01.285093 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7559df67df-6sc9d" event={"ID":"bc2846c8-b6db-4b5f-8bb8-998b50e64970","Type":"ContainerStarted","Data":"c1a6f8e537285056480257d619eb73fbf78910635268ce5df54e252c9727c3b7"} Mar 10 10:08:01 crc kubenswrapper[4794]: I0310 10:08:01.285252 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:08:01 crc kubenswrapper[4794]: I0310 10:08:01.286101 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552288-fck7s" event={"ID":"e8db1d2e-9690-4d13-bce2-b8602ffb7583","Type":"ContainerStarted","Data":"0723a6c4cbbe6747cebd0c38efa9420f17083934edd83bb0498a1a6b2a71c449"} Mar 10 10:08:01 crc kubenswrapper[4794]: I0310 10:08:01.299236 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-qzv7j" podStartSLOduration=2.299215611 podStartE2EDuration="2.299215611s" podCreationTimestamp="2026-03-10 10:07:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:08:01.298035657 +0000 UTC m=+1430.054206475" watchObservedRunningTime="2026-03-10 10:08:01.299215611 +0000 UTC m=+1430.055386419" Mar 10 10:08:01 crc kubenswrapper[4794]: I0310 10:08:01.331201 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7559df67df-6sc9d" podStartSLOduration=3.331179407 podStartE2EDuration="3.331179407s" podCreationTimestamp="2026-03-10 10:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:08:01.315997417 +0000 UTC m=+1430.072168245" watchObservedRunningTime="2026-03-10 10:08:01.331179407 +0000 UTC m=+1430.087350225" Mar 10 10:08:02 crc kubenswrapper[4794]: I0310 10:08:02.040318 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 
10 10:08:02 crc kubenswrapper[4794]: I0310 10:08:02.117187 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.317747 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d5707404-5eca-49e1-b11a-25e9e46ecf57","Type":"ContainerStarted","Data":"e8a1ed949ae80d580360cadba16748744763ab0aabdfbf25c0c8f8f031a813a5"} Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.318182 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d5707404-5eca-49e1-b11a-25e9e46ecf57","Type":"ContainerStarted","Data":"e58b8ece246538a5d35c361568da48a1fe4dfd6ade472acb22a81b8fc265fb3c"} Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.321099 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cd02df1e-27d1-40fe-943e-e7fddce1449c","Type":"ContainerStarted","Data":"acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9"} Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.323834 4794 generic.go:334] "Generic (PLEG): container finished" podID="e8db1d2e-9690-4d13-bce2-b8602ffb7583" containerID="ad95ef6605bbb6d66ad66594f4383cbe222c3fc9ee15ad33d45b92de3f7d5de8" exitCode=0 Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.323907 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552288-fck7s" event={"ID":"e8db1d2e-9690-4d13-bce2-b8602ffb7583","Type":"ContainerDied","Data":"ad95ef6605bbb6d66ad66594f4383cbe222c3fc9ee15ad33d45b92de3f7d5de8"} Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.325925 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e30640b0-144a-4928-a05e-e42a296a3214","Type":"ContainerStarted","Data":"762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7"} Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.325964 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e30640b0-144a-4928-a05e-e42a296a3214","Type":"ContainerStarted","Data":"d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae"} Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.325956 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e30640b0-144a-4928-a05e-e42a296a3214" containerName="nova-metadata-log" containerID="cri-o://d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae" gracePeriod=30 Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.326064 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e30640b0-144a-4928-a05e-e42a296a3214" containerName="nova-metadata-metadata" containerID="cri-o://762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7" gracePeriod=30 Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.330941 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f10f408e-5c27-4931-873b-85396e7442e8","Type":"ContainerStarted","Data":"a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b"} Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.331135 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f10f408e-5c27-4931-873b-85396e7442e8" containerName="nova-cell1-novncproxy-novncproxy" 
containerID="cri-o://a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b" gracePeriod=30 Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.347611 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.206707469 podStartE2EDuration="6.347594294s" podCreationTimestamp="2026-03-10 10:07:58 +0000 UTC" firstStartedPulling="2026-03-10 10:07:59.212676918 +0000 UTC m=+1427.968847736" lastFinishedPulling="2026-03-10 10:08:03.353563723 +0000 UTC m=+1432.109734561" observedRunningTime="2026-03-10 10:08:04.334383154 +0000 UTC m=+1433.090553982" watchObservedRunningTime="2026-03-10 10:08:04.347594294 +0000 UTC m=+1433.103765112" Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.367680 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.930244966 podStartE2EDuration="6.367664798s" podCreationTimestamp="2026-03-10 10:07:58 +0000 UTC" firstStartedPulling="2026-03-10 10:07:59.897636544 +0000 UTC m=+1428.653807362" lastFinishedPulling="2026-03-10 10:08:03.335056376 +0000 UTC m=+1432.091227194" observedRunningTime="2026-03-10 10:08:04.36571602 +0000 UTC m=+1433.121886838" watchObservedRunningTime="2026-03-10 10:08:04.367664798 +0000 UTC m=+1433.123835616" Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.403022 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.487335814 podStartE2EDuration="6.402996212s" podCreationTimestamp="2026-03-10 10:07:58 +0000 UTC" firstStartedPulling="2026-03-10 10:07:59.429777115 +0000 UTC m=+1428.185947933" lastFinishedPulling="2026-03-10 10:08:03.345437513 +0000 UTC m=+1432.101608331" observedRunningTime="2026-03-10 10:08:04.388880584 +0000 UTC m=+1433.145051402" watchObservedRunningTime="2026-03-10 10:08:04.402996212 +0000 UTC m=+1433.159167030" Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.426130 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.630157806 podStartE2EDuration="6.426108835s" podCreationTimestamp="2026-03-10 10:07:58 +0000 UTC" firstStartedPulling="2026-03-10 10:07:59.559458579 +0000 UTC m=+1428.315629397" lastFinishedPulling="2026-03-10 10:08:03.355409588 +0000 UTC m=+1432.111580426" observedRunningTime="2026-03-10 10:08:04.414674967 +0000 UTC m=+1433.170845805" watchObservedRunningTime="2026-03-10 10:08:04.426108835 +0000 UTC m=+1433.182279653" Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.924431 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.938384 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e30640b0-144a-4928-a05e-e42a296a3214-config-data\") pod \"e30640b0-144a-4928-a05e-e42a296a3214\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.938433 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e30640b0-144a-4928-a05e-e42a296a3214-logs\") pod \"e30640b0-144a-4928-a05e-e42a296a3214\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.938471 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30640b0-144a-4928-a05e-e42a296a3214-combined-ca-bundle\") pod \"e30640b0-144a-4928-a05e-e42a296a3214\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.938528 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb9v4\" (UniqueName: \"kubernetes.io/projected/e30640b0-144a-4928-a05e-e42a296a3214-kube-api-access-sb9v4\") pod \"e30640b0-144a-4928-a05e-e42a296a3214\" (UID: \"e30640b0-144a-4928-a05e-e42a296a3214\") " Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.938981 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e30640b0-144a-4928-a05e-e42a296a3214-logs" (OuterVolumeSpecName: "logs") pod "e30640b0-144a-4928-a05e-e42a296a3214" (UID: "e30640b0-144a-4928-a05e-e42a296a3214"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.956723 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e30640b0-144a-4928-a05e-e42a296a3214-kube-api-access-sb9v4" (OuterVolumeSpecName: "kube-api-access-sb9v4") pod "e30640b0-144a-4928-a05e-e42a296a3214" (UID: "e30640b0-144a-4928-a05e-e42a296a3214"). InnerVolumeSpecName "kube-api-access-sb9v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.975166 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e30640b0-144a-4928-a05e-e42a296a3214-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e30640b0-144a-4928-a05e-e42a296a3214" (UID: "e30640b0-144a-4928-a05e-e42a296a3214"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:04 crc kubenswrapper[4794]: I0310 10:08:04.980604 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e30640b0-144a-4928-a05e-e42a296a3214-config-data" (OuterVolumeSpecName: "config-data") pod "e30640b0-144a-4928-a05e-e42a296a3214" (UID: "e30640b0-144a-4928-a05e-e42a296a3214"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.039883 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e30640b0-144a-4928-a05e-e42a296a3214-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.039921 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30640b0-144a-4928-a05e-e42a296a3214-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.039937 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e30640b0-144a-4928-a05e-e42a296a3214-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.039972 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb9v4\" (UniqueName: \"kubernetes.io/projected/e30640b0-144a-4928-a05e-e42a296a3214-kube-api-access-sb9v4\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.353084 4794 generic.go:334] "Generic (PLEG): container finished" podID="e30640b0-144a-4928-a05e-e42a296a3214" containerID="762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7" exitCode=0 Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.353115 4794 generic.go:334] "Generic (PLEG): container finished" podID="e30640b0-144a-4928-a05e-e42a296a3214" containerID="d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae" exitCode=143 Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.353809 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.355169 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e30640b0-144a-4928-a05e-e42a296a3214","Type":"ContainerDied","Data":"762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7"} Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.355245 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e30640b0-144a-4928-a05e-e42a296a3214","Type":"ContainerDied","Data":"d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae"} Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.355268 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e30640b0-144a-4928-a05e-e42a296a3214","Type":"ContainerDied","Data":"675de561fd7c13bcd9b912d512987bda82e80d28e0dacfcf05ec2d25dc8cae54"} Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.355295 4794 scope.go:117] "RemoveContainer" containerID="762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.449600 4794 scope.go:117] "RemoveContainer" containerID="d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.449748 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.464869 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.474788 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:08:05 crc kubenswrapper[4794]: 
E0310 10:08:05.475195 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e30640b0-144a-4928-a05e-e42a296a3214" containerName="nova-metadata-metadata" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.475214 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e30640b0-144a-4928-a05e-e42a296a3214" containerName="nova-metadata-metadata" Mar 10 10:08:05 crc kubenswrapper[4794]: E0310 10:08:05.475256 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e30640b0-144a-4928-a05e-e42a296a3214" containerName="nova-metadata-log" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.475264 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e30640b0-144a-4928-a05e-e42a296a3214" containerName="nova-metadata-log" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.475461 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e30640b0-144a-4928-a05e-e42a296a3214" containerName="nova-metadata-log" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.475486 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e30640b0-144a-4928-a05e-e42a296a3214" containerName="nova-metadata-metadata" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.478879 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.485397 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.485576 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.490372 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.514516 4794 scope.go:117] "RemoveContainer" containerID="762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7" Mar 10 10:08:05 crc kubenswrapper[4794]: E0310 10:08:05.517836 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7\": container with ID starting with 762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7 not found: ID does not exist" containerID="762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.517884 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7"} err="failed to get container status \"762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7\": rpc error: code = NotFound desc = could not find container \"762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7\": container with ID starting with 762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7 not found: ID does not exist" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.517914 4794 scope.go:117] "RemoveContainer" containerID="d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae" Mar 10 10:08:05 crc kubenswrapper[4794]: E0310 10:08:05.521427 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae\": container with ID 
starting with d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae not found: ID does not exist" containerID="d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.521460 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae"} err="failed to get container status \"d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae\": rpc error: code = NotFound desc = could not find container \"d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae\": container with ID starting with d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae not found: ID does not exist" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.521479 4794 scope.go:117] "RemoveContainer" containerID="762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.531556 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7"} err="failed to get container status \"762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7\": rpc error: code = NotFound desc = could not find container \"762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7\": container with ID starting with 762ad40585a4eb6bce3691196c9277567e660a2a4269117a85fc6a3f10c9ffa7 not found: ID does not exist" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.531592 4794 scope.go:117] "RemoveContainer" containerID="d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.532855 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae"} err="failed to get container status \"d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae\": rpc error: code = NotFound desc = could not find container \"d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae\": container with ID starting with d22595944b57acf068a314a2386cfb551ea3eca990a82f6ca6f32ed1f24d3aae not found: ID does not exist" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.561683 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.561754 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.561805 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2298f21e-2ee9-4446-a5b8-e125fb631861-logs\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.561878 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmnkq\" (UniqueName: \"kubernetes.io/projected/2298f21e-2ee9-4446-a5b8-e125fb631861-kube-api-access-dmnkq\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.561943 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-config-data\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.664521 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-config-data\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.664939 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.664980 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.665042 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2298f21e-2ee9-4446-a5b8-e125fb631861-logs\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.665123 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmnkq\" (UniqueName: \"kubernetes.io/projected/2298f21e-2ee9-4446-a5b8-e125fb631861-kube-api-access-dmnkq\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.665545 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2298f21e-2ee9-4446-a5b8-e125fb631861-logs\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.668418 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-config-data\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.670121 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " 
pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.682013 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmnkq\" (UniqueName: \"kubernetes.io/projected/2298f21e-2ee9-4446-a5b8-e125fb631861-kube-api-access-dmnkq\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.686159 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") " pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.807715 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.810815 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552288-fck7s" Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.969070 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94qnk\" (UniqueName: \"kubernetes.io/projected/e8db1d2e-9690-4d13-bce2-b8602ffb7583-kube-api-access-94qnk\") pod \"e8db1d2e-9690-4d13-bce2-b8602ffb7583\" (UID: \"e8db1d2e-9690-4d13-bce2-b8602ffb7583\") " Mar 10 10:08:05 crc kubenswrapper[4794]: I0310 10:08:05.976172 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8db1d2e-9690-4d13-bce2-b8602ffb7583-kube-api-access-94qnk" (OuterVolumeSpecName: "kube-api-access-94qnk") pod "e8db1d2e-9690-4d13-bce2-b8602ffb7583" (UID: "e8db1d2e-9690-4d13-bce2-b8602ffb7583"). InnerVolumeSpecName "kube-api-access-94qnk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:08:06 crc kubenswrapper[4794]: I0310 10:08:06.010122 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e30640b0-144a-4928-a05e-e42a296a3214" path="/var/lib/kubelet/pods/e30640b0-144a-4928-a05e-e42a296a3214/volumes" Mar 10 10:08:06 crc kubenswrapper[4794]: I0310 10:08:06.072467 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94qnk\" (UniqueName: \"kubernetes.io/projected/e8db1d2e-9690-4d13-bce2-b8602ffb7583-kube-api-access-94qnk\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:06 crc kubenswrapper[4794]: W0310 10:08:06.265612 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2298f21e_2ee9_4446_a5b8_e125fb631861.slice/crio-30aaa2e23110b5c09c9608444a069ec9be9b436a38ed8edf5f5a828714175044 WatchSource:0}: Error finding container 30aaa2e23110b5c09c9608444a069ec9be9b436a38ed8edf5f5a828714175044: Status 404 returned error can't find the container with id 30aaa2e23110b5c09c9608444a069ec9be9b436a38ed8edf5f5a828714175044 Mar 10 10:08:06 crc kubenswrapper[4794]: I0310 10:08:06.266363 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:08:06 crc kubenswrapper[4794]: I0310 10:08:06.367805 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552288-fck7s" event={"ID":"e8db1d2e-9690-4d13-bce2-b8602ffb7583","Type":"ContainerDied","Data":"0723a6c4cbbe6747cebd0c38efa9420f17083934edd83bb0498a1a6b2a71c449"} Mar 10 10:08:06 crc kubenswrapper[4794]: I0310 10:08:06.367848 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0723a6c4cbbe6747cebd0c38efa9420f17083934edd83bb0498a1a6b2a71c449" Mar 10 10:08:06 crc kubenswrapper[4794]: I0310 10:08:06.367899 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552288-fck7s"
Mar 10 10:08:06 crc kubenswrapper[4794]: I0310 10:08:06.372152 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2298f21e-2ee9-4446-a5b8-e125fb631861","Type":"ContainerStarted","Data":"30aaa2e23110b5c09c9608444a069ec9be9b436a38ed8edf5f5a828714175044"}
Mar 10 10:08:06 crc kubenswrapper[4794]: I0310 10:08:06.882243 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552282-5r6p7"]
Mar 10 10:08:06 crc kubenswrapper[4794]: I0310 10:08:06.894761 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552282-5r6p7"]
Mar 10 10:08:07 crc kubenswrapper[4794]: I0310 10:08:07.384506 4794 generic.go:334] "Generic (PLEG): container finished" podID="4290ea58-8af5-478d-a452-421fe656fe01" containerID="1edfc188e8a48dc22fef269f386093bd77ed504ab9f840271be29afeff769334" exitCode=0
Mar 10 10:08:07 crc kubenswrapper[4794]: I0310 10:08:07.384606 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-czl5f" event={"ID":"4290ea58-8af5-478d-a452-421fe656fe01","Type":"ContainerDied","Data":"1edfc188e8a48dc22fef269f386093bd77ed504ab9f840271be29afeff769334"}
Mar 10 10:08:07 crc kubenswrapper[4794]: I0310 10:08:07.388409 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2298f21e-2ee9-4446-a5b8-e125fb631861","Type":"ContainerStarted","Data":"74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2"}
Mar 10 10:08:07 crc kubenswrapper[4794]: I0310 10:08:07.388971 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2298f21e-2ee9-4446-a5b8-e125fb631861","Type":"ContainerStarted","Data":"4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796"}
Mar 10 10:08:07 crc kubenswrapper[4794]: I0310 10:08:07.424470 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.424450869 podStartE2EDuration="2.424450869s" podCreationTimestamp="2026-03-10 10:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:08:07.423231683 +0000 UTC m=+1436.179402511" watchObservedRunningTime="2026-03-10 10:08:07.424450869 +0000 UTC m=+1436.180621687"
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.012961 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef0c15f0-8d9a-41da-a93f-3799c7e84d28" path="/var/lib/kubelet/pods/ef0c15f0-8d9a-41da-a93f-3799c7e84d28/volumes"
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.403012 4794 generic.go:334] "Generic (PLEG): container finished" podID="5f967270-f06c-40e0-9a53-00d2bc42d930" containerID="f014916484fd1fca04a820b617c80d6df43e5e2743a3252f69ec6b699226813e" exitCode=0
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.403103 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qzv7j" event={"ID":"5f967270-f06c-40e0-9a53-00d2bc42d930","Type":"ContainerDied","Data":"f014916484fd1fca04a820b617c80d6df43e5e2743a3252f69ec6b699226813e"}
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.624371 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.624410 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.834548 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-czl5f"
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.936101 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.936164 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.946470 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-combined-ca-bundle\") pod \"4290ea58-8af5-478d-a452-421fe656fe01\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") "
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.946724 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-scripts\") pod \"4290ea58-8af5-478d-a452-421fe656fe01\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") "
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.946767 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-config-data\") pod \"4290ea58-8af5-478d-a452-421fe656fe01\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") "
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.947516 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrnj4\" (UniqueName: \"kubernetes.io/projected/4290ea58-8af5-478d-a452-421fe656fe01-kube-api-access-zrnj4\") pod \"4290ea58-8af5-478d-a452-421fe656fe01\" (UID: \"4290ea58-8af5-478d-a452-421fe656fe01\") "
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.952323 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4290ea58-8af5-478d-a452-421fe656fe01-kube-api-access-zrnj4" (OuterVolumeSpecName: "kube-api-access-zrnj4") pod "4290ea58-8af5-478d-a452-421fe656fe01" (UID: "4290ea58-8af5-478d-a452-421fe656fe01"). InnerVolumeSpecName "kube-api-access-zrnj4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.954072 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-scripts" (OuterVolumeSpecName: "scripts") pod "4290ea58-8af5-478d-a452-421fe656fe01" (UID: "4290ea58-8af5-478d-a452-421fe656fe01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.962889 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.980551 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7559df67df-6sc9d"
Mar 10 10:08:08 crc kubenswrapper[4794]: I0310 10:08:08.985458 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4290ea58-8af5-478d-a452-421fe656fe01" (UID: "4290ea58-8af5-478d-a452-421fe656fe01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:08.997566 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.012575 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-config-data" (OuterVolumeSpecName: "config-data") pod "4290ea58-8af5-478d-a452-421fe656fe01" (UID: "4290ea58-8af5-478d-a452-421fe656fe01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.051904 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrnj4\" (UniqueName: \"kubernetes.io/projected/4290ea58-8af5-478d-a452-421fe656fe01-kube-api-access-zrnj4\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.051983 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.052004 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.052056 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4290ea58-8af5-478d-a452-421fe656fe01-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.058531 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-765c5b6b49-4clf4"]
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.058751 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" podUID="eafe7ef4-11df-422c-8b13-8943429c4fa6" containerName="dnsmasq-dns" containerID="cri-o://66047b3812719d5a33aa83c2452b755e2d6ad4b3fe29d3498048cef766eecffd" gracePeriod=10
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.416131 4794 generic.go:334] "Generic (PLEG): container finished" podID="eafe7ef4-11df-422c-8b13-8943429c4fa6" containerID="66047b3812719d5a33aa83c2452b755e2d6ad4b3fe29d3498048cef766eecffd" exitCode=0
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.416210 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" event={"ID":"eafe7ef4-11df-422c-8b13-8943429c4fa6","Type":"ContainerDied","Data":"66047b3812719d5a33aa83c2452b755e2d6ad4b3fe29d3498048cef766eecffd"}
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.420138 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-czl5f"
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.420004 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-czl5f" event={"ID":"4290ea58-8af5-478d-a452-421fe656fe01","Type":"ContainerDied","Data":"2d21c3c1609b0e27af174d9efe50c2b1df4becaa82baf3a3aa4e0721a9cb7aef"}
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.420828 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d21c3c1609b0e27af174d9efe50c2b1df4becaa82baf3a3aa4e0721a9cb7aef"
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.458508 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.484989 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-765c5b6b49-4clf4"
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.662719 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-config\") pod \"eafe7ef4-11df-422c-8b13-8943429c4fa6\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") "
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.662767 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-ovsdbserver-sb\") pod \"eafe7ef4-11df-422c-8b13-8943429c4fa6\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") "
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.662832 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-dns-svc\") pod \"eafe7ef4-11df-422c-8b13-8943429c4fa6\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") "
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.662852 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-dns-swift-storage-0\") pod \"eafe7ef4-11df-422c-8b13-8943429c4fa6\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") "
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.662876 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-ovsdbserver-nb\") pod \"eafe7ef4-11df-422c-8b13-8943429c4fa6\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") "
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.662911 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdxsn\" (UniqueName: \"kubernetes.io/projected/eafe7ef4-11df-422c-8b13-8943429c4fa6-kube-api-access-cdxsn\") pod \"eafe7ef4-11df-422c-8b13-8943429c4fa6\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") "
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.668587 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d5707404-5eca-49e1-b11a-25e9e46ecf57" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.668915 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d5707404-5eca-49e1-b11a-25e9e46ecf57" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.676492 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eafe7ef4-11df-422c-8b13-8943429c4fa6-kube-api-access-cdxsn" (OuterVolumeSpecName: "kube-api-access-cdxsn") pod "eafe7ef4-11df-422c-8b13-8943429c4fa6" (UID: "eafe7ef4-11df-422c-8b13-8943429c4fa6"). InnerVolumeSpecName "kube-api-access-cdxsn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.731176 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eafe7ef4-11df-422c-8b13-8943429c4fa6" (UID: "eafe7ef4-11df-422c-8b13-8943429c4fa6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.731241 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.731650 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d5707404-5eca-49e1-b11a-25e9e46ecf57" containerName="nova-api-log" containerID="cri-o://e58b8ece246538a5d35c361568da48a1fe4dfd6ade472acb22a81b8fc265fb3c" gracePeriod=30
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.732036 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d5707404-5eca-49e1-b11a-25e9e46ecf57" containerName="nova-api-api" containerID="cri-o://e8a1ed949ae80d580360cadba16748744763ab0aabdfbf25c0c8f8f031a813a5" gracePeriod=30
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.738899 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eafe7ef4-11df-422c-8b13-8943429c4fa6" (UID: "eafe7ef4-11df-422c-8b13-8943429c4fa6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.760456 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.760737 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2298f21e-2ee9-4446-a5b8-e125fb631861" containerName="nova-metadata-log" containerID="cri-o://4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796" gracePeriod=30
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.761269 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2298f21e-2ee9-4446-a5b8-e125fb631861" containerName="nova-metadata-metadata" containerID="cri-o://74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2" gracePeriod=30
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.769454 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eafe7ef4-11df-422c-8b13-8943429c4fa6" (UID: "eafe7ef4-11df-422c-8b13-8943429c4fa6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.769905 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-dns-swift-storage-0\") pod \"eafe7ef4-11df-422c-8b13-8943429c4fa6\" (UID: \"eafe7ef4-11df-422c-8b13-8943429c4fa6\") "
Mar 10 10:08:09 crc kubenswrapper[4794]: W0310 10:08:09.770582 4794 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/eafe7ef4-11df-422c-8b13-8943429c4fa6/volumes/kubernetes.io~configmap/dns-swift-storage-0
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.770612 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eafe7ef4-11df-422c-8b13-8943429c4fa6" (UID: "eafe7ef4-11df-422c-8b13-8943429c4fa6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.770797 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.770816 4794 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.770825 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.770834 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdxsn\" (UniqueName: \"kubernetes.io/projected/eafe7ef4-11df-422c-8b13-8943429c4fa6-kube-api-access-cdxsn\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.771902 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eafe7ef4-11df-422c-8b13-8943429c4fa6" (UID: "eafe7ef4-11df-422c-8b13-8943429c4fa6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.800974 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-config" (OuterVolumeSpecName: "config") pod "eafe7ef4-11df-422c-8b13-8943429c4fa6" (UID: "eafe7ef4-11df-422c-8b13-8943429c4fa6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.872822 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-config\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.872853 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eafe7ef4-11df-422c-8b13-8943429c4fa6-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:09 crc kubenswrapper[4794]: I0310 10:08:09.969573 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.040818 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qzv7j"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.090562 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-scripts\") pod \"5f967270-f06c-40e0-9a53-00d2bc42d930\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") "
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.090634 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-combined-ca-bundle\") pod \"5f967270-f06c-40e0-9a53-00d2bc42d930\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") "
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.090685 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzzsb\" (UniqueName: \"kubernetes.io/projected/5f967270-f06c-40e0-9a53-00d2bc42d930-kube-api-access-gzzsb\") pod \"5f967270-f06c-40e0-9a53-00d2bc42d930\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") "
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.090711 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-config-data\") pod \"5f967270-f06c-40e0-9a53-00d2bc42d930\" (UID: \"5f967270-f06c-40e0-9a53-00d2bc42d930\") "
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.094564 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-scripts" (OuterVolumeSpecName: "scripts") pod "5f967270-f06c-40e0-9a53-00d2bc42d930" (UID: "5f967270-f06c-40e0-9a53-00d2bc42d930"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.100685 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f967270-f06c-40e0-9a53-00d2bc42d930-kube-api-access-gzzsb" (OuterVolumeSpecName: "kube-api-access-gzzsb") pod "5f967270-f06c-40e0-9a53-00d2bc42d930" (UID: "5f967270-f06c-40e0-9a53-00d2bc42d930"). InnerVolumeSpecName "kube-api-access-gzzsb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.115291 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f967270-f06c-40e0-9a53-00d2bc42d930" (UID: "5f967270-f06c-40e0-9a53-00d2bc42d930"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.117667 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-config-data" (OuterVolumeSpecName: "config-data") pod "5f967270-f06c-40e0-9a53-00d2bc42d930" (UID: "5f967270-f06c-40e0-9a53-00d2bc42d930"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.193444 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzzsb\" (UniqueName: \"kubernetes.io/projected/5f967270-f06c-40e0-9a53-00d2bc42d930-kube-api-access-gzzsb\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.193479 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.193491 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.193502 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f967270-f06c-40e0-9a53-00d2bc42d930-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.274323 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.295027 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2298f21e-2ee9-4446-a5b8-e125fb631861-logs\") pod \"2298f21e-2ee9-4446-a5b8-e125fb631861\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") "
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.295132 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmnkq\" (UniqueName: \"kubernetes.io/projected/2298f21e-2ee9-4446-a5b8-e125fb631861-kube-api-access-dmnkq\") pod \"2298f21e-2ee9-4446-a5b8-e125fb631861\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") "
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.295201 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-config-data\") pod \"2298f21e-2ee9-4446-a5b8-e125fb631861\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") "
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.295286 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-combined-ca-bundle\") pod \"2298f21e-2ee9-4446-a5b8-e125fb631861\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") "
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.295316 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-nova-metadata-tls-certs\") pod \"2298f21e-2ee9-4446-a5b8-e125fb631861\" (UID: \"2298f21e-2ee9-4446-a5b8-e125fb631861\") "
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.296754 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2298f21e-2ee9-4446-a5b8-e125fb631861-logs" (OuterVolumeSpecName: "logs") pod "2298f21e-2ee9-4446-a5b8-e125fb631861" (UID: "2298f21e-2ee9-4446-a5b8-e125fb631861"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.299956 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2298f21e-2ee9-4446-a5b8-e125fb631861-kube-api-access-dmnkq" (OuterVolumeSpecName: "kube-api-access-dmnkq") pod "2298f21e-2ee9-4446-a5b8-e125fb631861" (UID: "2298f21e-2ee9-4446-a5b8-e125fb631861"). InnerVolumeSpecName "kube-api-access-dmnkq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.322724 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-config-data" (OuterVolumeSpecName: "config-data") pod "2298f21e-2ee9-4446-a5b8-e125fb631861" (UID: "2298f21e-2ee9-4446-a5b8-e125fb631861"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.338189 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2298f21e-2ee9-4446-a5b8-e125fb631861" (UID: "2298f21e-2ee9-4446-a5b8-e125fb631861"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.376494 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2298f21e-2ee9-4446-a5b8-e125fb631861" (UID: "2298f21e-2ee9-4446-a5b8-e125fb631861"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.397507 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2298f21e-2ee9-4446-a5b8-e125fb631861-logs\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.397544 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmnkq\" (UniqueName: \"kubernetes.io/projected/2298f21e-2ee9-4446-a5b8-e125fb631861-kube-api-access-dmnkq\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.397555 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.397564 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.397572 4794 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2298f21e-2ee9-4446-a5b8-e125fb631861-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.429088 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qzv7j" event={"ID":"5f967270-f06c-40e0-9a53-00d2bc42d930","Type":"ContainerDied","Data":"d7b6a104c4ad2dcc4a97dcd533265424d07651d1b5b0069f543e277a14a2c1e4"}
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.429131 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qzv7j"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.429137 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7b6a104c4ad2dcc4a97dcd533265424d07651d1b5b0069f543e277a14a2c1e4"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.430286 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765c5b6b49-4clf4" event={"ID":"eafe7ef4-11df-422c-8b13-8943429c4fa6","Type":"ContainerDied","Data":"a9364d85ddbcf92f51dcb2239a0d555c2b8562b6dc441eb89c0e3371c5756c34"}
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.430311 4794 scope.go:117] "RemoveContainer" containerID="66047b3812719d5a33aa83c2452b755e2d6ad4b3fe29d3498048cef766eecffd"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.430473 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-765c5b6b49-4clf4"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.436229 4794 generic.go:334] "Generic (PLEG): container finished" podID="d5707404-5eca-49e1-b11a-25e9e46ecf57" containerID="e58b8ece246538a5d35c361568da48a1fe4dfd6ade472acb22a81b8fc265fb3c" exitCode=143
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.436294 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d5707404-5eca-49e1-b11a-25e9e46ecf57","Type":"ContainerDied","Data":"e58b8ece246538a5d35c361568da48a1fe4dfd6ade472acb22a81b8fc265fb3c"}
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.437587 4794 generic.go:334] "Generic (PLEG): container finished" podID="2298f21e-2ee9-4446-a5b8-e125fb631861" containerID="74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2" exitCode=0
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.437609 4794 generic.go:334] "Generic (PLEG): container finished" podID="2298f21e-2ee9-4446-a5b8-e125fb631861" containerID="4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796" exitCode=143
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.437618 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.437639 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2298f21e-2ee9-4446-a5b8-e125fb631861","Type":"ContainerDied","Data":"74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2"}
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.437655 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2298f21e-2ee9-4446-a5b8-e125fb631861","Type":"ContainerDied","Data":"4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796"}
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.437665 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2298f21e-2ee9-4446-a5b8-e125fb631861","Type":"ContainerDied","Data":"30aaa2e23110b5c09c9608444a069ec9be9b436a38ed8edf5f5a828714175044"}
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.463680 4794 scope.go:117] "RemoveContainer" containerID="f633620d50b34c7196de159934eb2d25f9a821c13de81842b89e78d64e2697a0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.472614 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-765c5b6b49-4clf4"]
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.489667 4794 scope.go:117] "RemoveContainer" containerID="74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.508635 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-765c5b6b49-4clf4"]
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.536700 4794 scope.go:117] "RemoveContainer" containerID="4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.545934 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.559851 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.568298 4794 scope.go:117] "RemoveContainer" containerID="74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.568437 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 10 10:08:10 crc kubenswrapper[4794]: E0310 10:08:10.568787 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2\": container with ID starting with 74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2 not found: ID does not exist" containerID="74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.568890 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2"} err="failed to get container status \"74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2\": rpc error: code = NotFound desc = could not find container \"74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2\": container with ID starting with 74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2 not found: ID does not exist"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.568971 4794 scope.go:117] "RemoveContainer" containerID="4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796"
Mar 10 10:08:10 crc kubenswrapper[4794]: E0310 10:08:10.568805 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f967270-f06c-40e0-9a53-00d2bc42d930" containerName="nova-cell1-conductor-db-sync"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.569155 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f967270-f06c-40e0-9a53-00d2bc42d930" containerName="nova-cell1-conductor-db-sync"
Mar 10 10:08:10 crc kubenswrapper[4794]: E0310 10:08:10.569221 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8db1d2e-9690-4d13-bce2-b8602ffb7583" containerName="oc"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.569297 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8db1d2e-9690-4d13-bce2-b8602ffb7583" containerName="oc"
Mar 10 10:08:10 crc kubenswrapper[4794]: E0310 10:08:10.569579 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4290ea58-8af5-478d-a452-421fe656fe01" containerName="nova-manage"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.569681 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4290ea58-8af5-478d-a452-421fe656fe01" containerName="nova-manage"
Mar 10 10:08:10 crc kubenswrapper[4794]: E0310 10:08:10.569632 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796\": container with ID starting with 4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796 not found: ID does not exist" containerID="4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.569780 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796"} err="failed to get container status \"4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796\": rpc error: code = NotFound desc = could not find container \"4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796\": container with ID starting with 4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796 not found: ID does not exist"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.569808 4794 scope.go:117] "RemoveContainer" containerID="74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.570045 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2"} err="failed to get container status \"74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2\": rpc error: code = NotFound desc = could not find container \"74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2\": container with ID starting with 74087c5f756913633b978bcec2b884b1e1bc5339bb43f04c91c1d96a44b78bb2 not found: ID does not exist"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.570152 4794 scope.go:117] "RemoveContainer" containerID="4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796"
Mar 10 10:08:10 crc kubenswrapper[4794]: E0310 10:08:10.570462 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2298f21e-2ee9-4446-a5b8-e125fb631861" containerName="nova-metadata-log"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.570757 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2298f21e-2ee9-4446-a5b8-e125fb631861" containerName="nova-metadata-log"
Mar 10 10:08:10 crc kubenswrapper[4794]: E0310 10:08:10.570854 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eafe7ef4-11df-422c-8b13-8943429c4fa6" containerName="init"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.570934 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="eafe7ef4-11df-422c-8b13-8943429c4fa6" containerName="init"
Mar 10 10:08:10 crc kubenswrapper[4794]: E0310 10:08:10.571010 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2298f21e-2ee9-4446-a5b8-e125fb631861" containerName="nova-metadata-metadata"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.571081 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2298f21e-2ee9-4446-a5b8-e125fb631861" containerName="nova-metadata-metadata"
Mar 10 10:08:10 crc kubenswrapper[4794]: E0310 10:08:10.571159 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eafe7ef4-11df-422c-8b13-8943429c4fa6" containerName="dnsmasq-dns"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.571231 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="eafe7ef4-11df-422c-8b13-8943429c4fa6" containerName="dnsmasq-dns"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.570487 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796"} err="failed to get container status \"4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796\": rpc error: code = NotFound desc = could not find container \"4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796\": container with ID starting with 4dc02cfa57214a1b8c6bf8265715c8083c8b0b11cc5188c3df2d3a62bdafc796 not found: ID does not exist"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.571689 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8db1d2e-9690-4d13-bce2-b8602ffb7583" containerName="oc"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.571772 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2298f21e-2ee9-4446-a5b8-e125fb631861" containerName="nova-metadata-log"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.571832 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4290ea58-8af5-478d-a452-421fe656fe01" containerName="nova-manage"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.571900 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="eafe7ef4-11df-422c-8b13-8943429c4fa6" containerName="dnsmasq-dns"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.571976 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2298f21e-2ee9-4446-a5b8-e125fb631861" containerName="nova-metadata-metadata"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.572036 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f967270-f06c-40e0-9a53-00d2bc42d930" containerName="nova-cell1-conductor-db-sync"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.572711 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.574943 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.577655 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.590463 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.592226 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.594984 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.595273 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.600738 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.600809 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.600874 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr2zs\" (UniqueName: \"kubernetes.io/projected/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-kube-api-access-cr2zs\") pod \"nova-cell1-conductor-0\" (UID: \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.601002 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.702535 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.702587 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.702651 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr2zs\" (UniqueName: \"kubernetes.io/projected/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-kube-api-access-cr2zs\") pod \"nova-cell1-conductor-0\" (UID: \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.702675 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcjfb\" (UniqueName: \"kubernetes.io/projected/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-kube-api-access-wcjfb\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.702714 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-config-data\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.702740 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.702772 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-logs\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.702790 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.707016 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.707489 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.720424 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr2zs\" (UniqueName: \"kubernetes.io/projected/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-kube-api-access-cr2zs\") pod \"nova-cell1-conductor-0\" (UID: \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.804925 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-config-data\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.804986 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.805025 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-logs\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.805085 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.805161 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcjfb\" (UniqueName: \"kubernetes.io/projected/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-kube-api-access-wcjfb\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.805909 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-logs\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.808854 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.809080 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-config-data\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.820756 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.822032 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcjfb\" (UniqueName: \"kubernetes.io/projected/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-kube-api-access-wcjfb\") pod \"nova-metadata-0\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " pod="openstack/nova-metadata-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.891005 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 10 10:08:10 crc kubenswrapper[4794]: I0310 10:08:10.919119 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 10:08:11 crc kubenswrapper[4794]: I0310 10:08:11.411009 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 10 10:08:11 crc kubenswrapper[4794]: W0310 10:08:11.415879 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e14b1f3_e9a2_41ae_96ee_88dc84f69921.slice/crio-9672b520e7da191c60f7e43b404a8c3e35d1e2b994bee5f95ba0d8a7406f3b83 WatchSource:0}: Error finding container 9672b520e7da191c60f7e43b404a8c3e35d1e2b994bee5f95ba0d8a7406f3b83: Status 404 returned error can't find the container with id 9672b520e7da191c60f7e43b404a8c3e35d1e2b994bee5f95ba0d8a7406f3b83
Mar 10 10:08:11 crc kubenswrapper[4794]: I0310 10:08:11.425401 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 10:08:11 crc kubenswrapper[4794]: W0310 10:08:11.428694 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4886c6bf_be4d_4b43_8df2_90ce26e40bf1.slice/crio-d75a78d46d1db4d984400e32993c2b4f6e5d0b2e7918b14dca38cb401d3d0df2 WatchSource:0}: Error finding container d75a78d46d1db4d984400e32993c2b4f6e5d0b2e7918b14dca38cb401d3d0df2: Status 404 returned error can't find the container with id d75a78d46d1db4d984400e32993c2b4f6e5d0b2e7918b14dca38cb401d3d0df2
Mar 10 10:08:11 crc kubenswrapper[4794]: I0310 10:08:11.460857 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2e14b1f3-e9a2-41ae-96ee-88dc84f69921","Type":"ContainerStarted","Data":"9672b520e7da191c60f7e43b404a8c3e35d1e2b994bee5f95ba0d8a7406f3b83"}
Mar 10 10:08:11 crc kubenswrapper[4794]: I0310 10:08:11.464715 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4886c6bf-be4d-4b43-8df2-90ce26e40bf1","Type":"ContainerStarted","Data":"d75a78d46d1db4d984400e32993c2b4f6e5d0b2e7918b14dca38cb401d3d0df2"}
Mar 10 10:08:11 crc kubenswrapper[4794]: I0310 10:08:11.466091 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="cd02df1e-27d1-40fe-943e-e7fddce1449c" containerName="nova-scheduler-scheduler" containerID="cri-o://acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9" gracePeriod=30
Mar 10 10:08:12 crc kubenswrapper[4794]: I0310 10:08:12.043192 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2298f21e-2ee9-4446-a5b8-e125fb631861" path="/var/lib/kubelet/pods/2298f21e-2ee9-4446-a5b8-e125fb631861/volumes"
Mar 10 10:08:12 crc kubenswrapper[4794]: I0310 10:08:12.057919 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eafe7ef4-11df-422c-8b13-8943429c4fa6" path="/var/lib/kubelet/pods/eafe7ef4-11df-422c-8b13-8943429c4fa6/volumes"
Mar 10 10:08:12 crc kubenswrapper[4794]: I0310 10:08:12.479840 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4886c6bf-be4d-4b43-8df2-90ce26e40bf1","Type":"ContainerStarted","Data":"090867c4656e57401d661ab880641f95ac9d80266e520315826cc074018acfd5"}
Mar 10 10:08:12 crc kubenswrapper[4794]: I0310 10:08:12.479887 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4886c6bf-be4d-4b43-8df2-90ce26e40bf1","Type":"ContainerStarted","Data":"2f44a40ebd4b833eb003259a0a3572ebf0ae0a6431d4f908701d1fadfc35dc2b"}
Mar 10 10:08:12 crc kubenswrapper[4794]: I0310 10:08:12.496544 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2e14b1f3-e9a2-41ae-96ee-88dc84f69921","Type":"ContainerStarted","Data":"c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08"}
Mar 10 10:08:12 crc kubenswrapper[4794]: I0310 10:08:12.496741 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 10 10:08:12 crc kubenswrapper[4794]: I0310 10:08:12.523066 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.523043602 podStartE2EDuration="2.523043602s" podCreationTimestamp="2026-03-10 10:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:08:12.509005057 +0000 UTC m=+1441.265175915" watchObservedRunningTime="2026-03-10 10:08:12.523043602 +0000 UTC m=+1441.279214420"
Mar 10 10:08:12 crc kubenswrapper[4794]: I0310 10:08:12.536111 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.536088987 podStartE2EDuration="2.536088987s" podCreationTimestamp="2026-03-10 10:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:08:12.524475874 +0000 UTC m=+1441.280646702" watchObservedRunningTime="2026-03-10 10:08:12.536088987 +0000 UTC m=+1441.292259815"
Mar 10 10:08:13 crc kubenswrapper[4794]: E0310 10:08:13.938312 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 10:08:13 crc kubenswrapper[4794]: E0310 10:08:13.940603 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 10:08:13 crc kubenswrapper[4794]: E0310 10:08:13.942855 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 10:08:13 crc kubenswrapper[4794]: E0310 10:08:13.942924 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="cd02df1e-27d1-40fe-943e-e7fddce1449c" containerName="nova-scheduler-scheduler"
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.491386 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.542218 4794 generic.go:334] "Generic (PLEG): container finished" podID="d5707404-5eca-49e1-b11a-25e9e46ecf57" containerID="e8a1ed949ae80d580360cadba16748744763ab0aabdfbf25c0c8f8f031a813a5" exitCode=0
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.542292 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d5707404-5eca-49e1-b11a-25e9e46ecf57","Type":"ContainerDied","Data":"e8a1ed949ae80d580360cadba16748744763ab0aabdfbf25c0c8f8f031a813a5"}
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.545188 4794 generic.go:334] "Generic (PLEG): container finished" podID="cd02df1e-27d1-40fe-943e-e7fddce1449c" containerID="acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9" exitCode=0
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.545219 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cd02df1e-27d1-40fe-943e-e7fddce1449c","Type":"ContainerDied","Data":"acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9"}
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.545235 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cd02df1e-27d1-40fe-943e-e7fddce1449c","Type":"ContainerDied","Data":"963e370895085f1dd74d78799b1fdb3483c72c8626ca9b2bca2a088bc131593d"}
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.545257 4794 scope.go:117] "RemoveContainer" containerID="acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9"
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.545419 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.610145 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd02df1e-27d1-40fe-943e-e7fddce1449c-combined-ca-bundle\") pod \"cd02df1e-27d1-40fe-943e-e7fddce1449c\" (UID: \"cd02df1e-27d1-40fe-943e-e7fddce1449c\") "
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.610226 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm4qx\" (UniqueName: \"kubernetes.io/projected/cd02df1e-27d1-40fe-943e-e7fddce1449c-kube-api-access-mm4qx\") pod \"cd02df1e-27d1-40fe-943e-e7fddce1449c\" (UID: \"cd02df1e-27d1-40fe-943e-e7fddce1449c\") "
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.610477 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd02df1e-27d1-40fe-943e-e7fddce1449c-config-data\") pod \"cd02df1e-27d1-40fe-943e-e7fddce1449c\" (UID: \"cd02df1e-27d1-40fe-943e-e7fddce1449c\") "
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.612860 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.617583 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd02df1e-27d1-40fe-943e-e7fddce1449c-kube-api-access-mm4qx" (OuterVolumeSpecName: "kube-api-access-mm4qx") pod "cd02df1e-27d1-40fe-943e-e7fddce1449c" (UID: "cd02df1e-27d1-40fe-943e-e7fddce1449c"). InnerVolumeSpecName "kube-api-access-mm4qx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.619512 4794 scope.go:117] "RemoveContainer" containerID="acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9"
Mar 10 10:08:15 crc kubenswrapper[4794]: E0310 10:08:15.619892 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9\": container with ID starting with acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9 not found: ID does not exist" containerID="acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9"
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.619930 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9"} err="failed to get container status \"acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9\": rpc error: code = NotFound desc = could not find container \"acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9\": container with ID starting with acefac07e37e15e65d3a749fdb78baa62a2bf2dee62ae623b0547bea339c72e9 not found: ID does not exist"
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.648473 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd02df1e-27d1-40fe-943e-e7fddce1449c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd02df1e-27d1-40fe-943e-e7fddce1449c" (UID: "cd02df1e-27d1-40fe-943e-e7fddce1449c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.650642 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd02df1e-27d1-40fe-943e-e7fddce1449c-config-data" (OuterVolumeSpecName: "config-data") pod "cd02df1e-27d1-40fe-943e-e7fddce1449c" (UID: "cd02df1e-27d1-40fe-943e-e7fddce1449c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.713685 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd02df1e-27d1-40fe-943e-e7fddce1449c-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.713720 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd02df1e-27d1-40fe-943e-e7fddce1449c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.713731 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm4qx\" (UniqueName: \"kubernetes.io/projected/cd02df1e-27d1-40fe-943e-e7fddce1449c-kube-api-access-mm4qx\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.814661 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5707404-5eca-49e1-b11a-25e9e46ecf57-combined-ca-bundle\") pod \"d5707404-5eca-49e1-b11a-25e9e46ecf57\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") "
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.814810 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5707404-5eca-49e1-b11a-25e9e46ecf57-config-data\") pod \"d5707404-5eca-49e1-b11a-25e9e46ecf57\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") "
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.815106 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5707404-5eca-49e1-b11a-25e9e46ecf57-logs\") pod \"d5707404-5eca-49e1-b11a-25e9e46ecf57\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") "
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.815189 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fv6c\" (UniqueName: \"kubernetes.io/projected/d5707404-5eca-49e1-b11a-25e9e46ecf57-kube-api-access-9fv6c\") pod \"d5707404-5eca-49e1-b11a-25e9e46ecf57\" (UID: \"d5707404-5eca-49e1-b11a-25e9e46ecf57\") "
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.815637 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5707404-5eca-49e1-b11a-25e9e46ecf57-logs" (OuterVolumeSpecName: "logs") pod "d5707404-5eca-49e1-b11a-25e9e46ecf57" (UID: "d5707404-5eca-49e1-b11a-25e9e46ecf57"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.817701 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5707404-5eca-49e1-b11a-25e9e46ecf57-logs\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.819403 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5707404-5eca-49e1-b11a-25e9e46ecf57-kube-api-access-9fv6c" (OuterVolumeSpecName: "kube-api-access-9fv6c") pod "d5707404-5eca-49e1-b11a-25e9e46ecf57" (UID: "d5707404-5eca-49e1-b11a-25e9e46ecf57"). InnerVolumeSpecName "kube-api-access-9fv6c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.841217 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5707404-5eca-49e1-b11a-25e9e46ecf57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5707404-5eca-49e1-b11a-25e9e46ecf57" (UID: "d5707404-5eca-49e1-b11a-25e9e46ecf57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.842822 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5707404-5eca-49e1-b11a-25e9e46ecf57-config-data" (OuterVolumeSpecName: "config-data") pod "d5707404-5eca-49e1-b11a-25e9e46ecf57" (UID: "d5707404-5eca-49e1-b11a-25e9e46ecf57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.894847 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.919753 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5707404-5eca-49e1-b11a-25e9e46ecf57-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.919789 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.919797 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5707404-5eca-49e1-b11a-25e9e46ecf57-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.919822 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.919826 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fv6c\" (UniqueName: \"kubernetes.io/projected/d5707404-5eca-49e1-b11a-25e9e46ecf57-kube-api-access-9fv6c\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.921174 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.930052 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 10:08:15 crc kubenswrapper[4794]: E0310 10:08:15.930481 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5707404-5eca-49e1-b11a-25e9e46ecf57" containerName="nova-api-log"
Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.930501 4794 state_mem.go:107] "Deleted CPUSet assignment"
podUID="d5707404-5eca-49e1-b11a-25e9e46ecf57" containerName="nova-api-log" Mar 10 10:08:15 crc kubenswrapper[4794]: E0310 10:08:15.930514 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5707404-5eca-49e1-b11a-25e9e46ecf57" containerName="nova-api-api" Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.930521 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5707404-5eca-49e1-b11a-25e9e46ecf57" containerName="nova-api-api" Mar 10 10:08:15 crc kubenswrapper[4794]: E0310 10:08:15.930564 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd02df1e-27d1-40fe-943e-e7fddce1449c" containerName="nova-scheduler-scheduler" Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.930570 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd02df1e-27d1-40fe-943e-e7fddce1449c" containerName="nova-scheduler-scheduler" Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.930746 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5707404-5eca-49e1-b11a-25e9e46ecf57" containerName="nova-api-log" Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.930776 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5707404-5eca-49e1-b11a-25e9e46ecf57" containerName="nova-api-api" Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.930788 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd02df1e-27d1-40fe-943e-e7fddce1449c" containerName="nova-scheduler-scheduler" Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.931481 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.933484 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 10:08:15 crc kubenswrapper[4794]: I0310 10:08:15.939014 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.010225 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd02df1e-27d1-40fe-943e-e7fddce1449c" path="/var/lib/kubelet/pods/cd02df1e-27d1-40fe-943e-e7fddce1449c/volumes" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.020518 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43594c25-d6b5-4e31-b751-3649526b219b-config-data\") pod \"nova-scheduler-0\" (UID: \"43594c25-d6b5-4e31-b751-3649526b219b\") " pod="openstack/nova-scheduler-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.020593 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45s6m\" (UniqueName: \"kubernetes.io/projected/43594c25-d6b5-4e31-b751-3649526b219b-kube-api-access-45s6m\") pod \"nova-scheduler-0\" (UID: \"43594c25-d6b5-4e31-b751-3649526b219b\") " pod="openstack/nova-scheduler-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.020643 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43594c25-d6b5-4e31-b751-3649526b219b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43594c25-d6b5-4e31-b751-3649526b219b\") " pod="openstack/nova-scheduler-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.122097 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/43594c25-d6b5-4e31-b751-3649526b219b-config-data\") pod \"nova-scheduler-0\" (UID: \"43594c25-d6b5-4e31-b751-3649526b219b\") " pod="openstack/nova-scheduler-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.122158 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45s6m\" (UniqueName: \"kubernetes.io/projected/43594c25-d6b5-4e31-b751-3649526b219b-kube-api-access-45s6m\") pod \"nova-scheduler-0\" (UID: \"43594c25-d6b5-4e31-b751-3649526b219b\") " pod="openstack/nova-scheduler-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.122192 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43594c25-d6b5-4e31-b751-3649526b219b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43594c25-d6b5-4e31-b751-3649526b219b\") " pod="openstack/nova-scheduler-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.126686 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43594c25-d6b5-4e31-b751-3649526b219b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43594c25-d6b5-4e31-b751-3649526b219b\") " pod="openstack/nova-scheduler-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.127419 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43594c25-d6b5-4e31-b751-3649526b219b-config-data\") pod \"nova-scheduler-0\" (UID: \"43594c25-d6b5-4e31-b751-3649526b219b\") " pod="openstack/nova-scheduler-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.145979 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45s6m\" (UniqueName: \"kubernetes.io/projected/43594c25-d6b5-4e31-b751-3649526b219b-kube-api-access-45s6m\") pod \"nova-scheduler-0\" (UID: \"43594c25-d6b5-4e31-b751-3649526b219b\") " pod="openstack/nova-scheduler-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.249228 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.558364 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d5707404-5eca-49e1-b11a-25e9e46ecf57","Type":"ContainerDied","Data":"2eb0712a92ec56430ec83031700a92710898dda90366c5ce18d236e76dac6aba"} Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.558378 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.558447 4794 scope.go:117] "RemoveContainer" containerID="e8a1ed949ae80d580360cadba16748744763ab0aabdfbf25c0c8f8f031a813a5" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.585231 4794 scope.go:117] "RemoveContainer" containerID="e58b8ece246538a5d35c361568da48a1fe4dfd6ade472acb22a81b8fc265fb3c" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.591947 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.606682 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.616642 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.618323 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.629015 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.631648 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcbx7\" (UniqueName: \"kubernetes.io/projected/35c66301-907d-40ed-afb3-bc354acece6c-kube-api-access-jcbx7\") pod \"nova-api-0\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " pod="openstack/nova-api-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.631751 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35c66301-907d-40ed-afb3-bc354acece6c-logs\") pod \"nova-api-0\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " pod="openstack/nova-api-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.631873 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c66301-907d-40ed-afb3-bc354acece6c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " pod="openstack/nova-api-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.631942 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c66301-907d-40ed-afb3-bc354acece6c-config-data\") pod \"nova-api-0\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " pod="openstack/nova-api-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.638587 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.726052 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.734469 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcbx7\" (UniqueName: \"kubernetes.io/projected/35c66301-907d-40ed-afb3-bc354acece6c-kube-api-access-jcbx7\") pod \"nova-api-0\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " pod="openstack/nova-api-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.734555 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/35c66301-907d-40ed-afb3-bc354acece6c-logs\") pod \"nova-api-0\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " pod="openstack/nova-api-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.734615 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c66301-907d-40ed-afb3-bc354acece6c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " pod="openstack/nova-api-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.734651 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c66301-907d-40ed-afb3-bc354acece6c-config-data\") pod \"nova-api-0\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " pod="openstack/nova-api-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.736164 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35c66301-907d-40ed-afb3-bc354acece6c-logs\") pod \"nova-api-0\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " pod="openstack/nova-api-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.740910 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c66301-907d-40ed-afb3-bc354acece6c-config-data\") pod \"nova-api-0\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " pod="openstack/nova-api-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.742572 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c66301-907d-40ed-afb3-bc354acece6c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " pod="openstack/nova-api-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.756847 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcbx7\" (UniqueName: \"kubernetes.io/projected/35c66301-907d-40ed-afb3-bc354acece6c-kube-api-access-jcbx7\") pod \"nova-api-0\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " pod="openstack/nova-api-0" Mar 10 10:08:16 crc kubenswrapper[4794]: I0310 10:08:16.935315 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 10:08:17 crc kubenswrapper[4794]: I0310 10:08:17.412208 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:08:17 crc kubenswrapper[4794]: I0310 10:08:17.578404 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43594c25-d6b5-4e31-b751-3649526b219b","Type":"ContainerStarted","Data":"af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013"} Mar 10 10:08:17 crc kubenswrapper[4794]: I0310 10:08:17.578734 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43594c25-d6b5-4e31-b751-3649526b219b","Type":"ContainerStarted","Data":"f7b46753c26e2d65b8d2a6ace22e377c796e1085f9b723e0dcab48ab1b33210b"} Mar 10 10:08:17 crc kubenswrapper[4794]: I0310 10:08:17.582576 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35c66301-907d-40ed-afb3-bc354acece6c","Type":"ContainerStarted","Data":"ba2ce354562303a436d5a85cd3efdddac77398279ea32b798e7045886e2012a3"} Mar 10 10:08:17 crc kubenswrapper[4794]: I0310 10:08:17.604575 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.60455936 podStartE2EDuration="2.60455936s" podCreationTimestamp="2026-03-10 10:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:08:17.601141258 +0000 UTC m=+1446.357312096" watchObservedRunningTime="2026-03-10 10:08:17.60455936 +0000 UTC m=+1446.360730168" Mar 10 10:08:18 crc kubenswrapper[4794]: I0310 10:08:18.015746 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5707404-5eca-49e1-b11a-25e9e46ecf57" path="/var/lib/kubelet/pods/d5707404-5eca-49e1-b11a-25e9e46ecf57/volumes" Mar 10 10:08:18 crc kubenswrapper[4794]: I0310 10:08:18.594384 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35c66301-907d-40ed-afb3-bc354acece6c","Type":"ContainerStarted","Data":"9acc5f862dab66c8a5e24eebcf3c82c037a59d49bfb52b3d73f73a14e9e9958a"} Mar 10 10:08:18 crc kubenswrapper[4794]: I0310 10:08:18.594687 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35c66301-907d-40ed-afb3-bc354acece6c","Type":"ContainerStarted","Data":"568f66f4a8b27c2454e806e1a7d04f1e8eae2e56111d1087a639f0edfdf2a055"} Mar 10 10:08:18 crc kubenswrapper[4794]: I0310 10:08:18.620398 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.620331313 podStartE2EDuration="2.620331313s" podCreationTimestamp="2026-03-10 10:08:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:08:18.610551805 +0000 UTC m=+1447.366722653" watchObservedRunningTime="2026-03-10 10:08:18.620331313 +0000 UTC m=+1447.376502131" Mar 10 10:08:19 crc kubenswrapper[4794]: I0310 10:08:19.550868 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 10:08:20 crc kubenswrapper[4794]: I0310 10:08:20.917529 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 10 10:08:20 crc kubenswrapper[4794]: I0310 10:08:20.919843 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Mar 10 10:08:20 crc kubenswrapper[4794]: I0310 10:08:20.919889 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 10:08:21 crc kubenswrapper[4794]: I0310 10:08:21.249996 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 10:08:21 crc kubenswrapper[4794]: I0310 10:08:21.930471 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 10:08:21 crc kubenswrapper[4794]: I0310 10:08:21.930769 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 10:08:23 crc kubenswrapper[4794]: I0310 10:08:23.400429 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 10:08:23 crc kubenswrapper[4794]: I0310 10:08:23.400964 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="57b6ddc9-2a57-41f1-a6c8-d0be37b88252" containerName="kube-state-metrics" containerID="cri-o://b7b6d56bae7bead8aea36dc9b247f1241a2b6e20d19a3ee1d788129133d85a48" gracePeriod=30 Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:23.669467 4794 generic.go:334] "Generic (PLEG): container finished" podID="57b6ddc9-2a57-41f1-a6c8-d0be37b88252" containerID="b7b6d56bae7bead8aea36dc9b247f1241a2b6e20d19a3ee1d788129133d85a48" exitCode=2 Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:23.669579 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"57b6ddc9-2a57-41f1-a6c8-d0be37b88252","Type":"ContainerDied","Data":"b7b6d56bae7bead8aea36dc9b247f1241a2b6e20d19a3ee1d788129133d85a48"} Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:23.886362 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:23.973442 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zjqk\" (UniqueName: \"kubernetes.io/projected/57b6ddc9-2a57-41f1-a6c8-d0be37b88252-kube-api-access-5zjqk\") pod \"57b6ddc9-2a57-41f1-a6c8-d0be37b88252\" (UID: \"57b6ddc9-2a57-41f1-a6c8-d0be37b88252\") " Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:23.979743 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b6ddc9-2a57-41f1-a6c8-d0be37b88252-kube-api-access-5zjqk" (OuterVolumeSpecName: "kube-api-access-5zjqk") pod "57b6ddc9-2a57-41f1-a6c8-d0be37b88252" (UID: "57b6ddc9-2a57-41f1-a6c8-d0be37b88252"). InnerVolumeSpecName "kube-api-access-5zjqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.075834 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zjqk\" (UniqueName: \"kubernetes.io/projected/57b6ddc9-2a57-41f1-a6c8-d0be37b88252-kube-api-access-5zjqk\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.679592 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"57b6ddc9-2a57-41f1-a6c8-d0be37b88252","Type":"ContainerDied","Data":"3b36d3081398e6b67192ac531cd0a42f8ac5118b45044f79104e02d527f5122b"} Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.679648 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.679944 4794 scope.go:117] "RemoveContainer" containerID="b7b6d56bae7bead8aea36dc9b247f1241a2b6e20d19a3ee1d788129133d85a48" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.704249 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.713013 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.724138 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 10:08:24 crc kubenswrapper[4794]: E0310 10:08:24.724519 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b6ddc9-2a57-41f1-a6c8-d0be37b88252" containerName="kube-state-metrics" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.724539 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b6ddc9-2a57-41f1-a6c8-d0be37b88252" containerName="kube-state-metrics" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.724722 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b6ddc9-2a57-41f1-a6c8-d0be37b88252" containerName="kube-state-metrics" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.725259 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.728419 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.730152 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.737985 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.791139 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") " pod="openstack/kube-state-metrics-0" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.791477 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2tbq\" (UniqueName: \"kubernetes.io/projected/0a15b784-796e-4834-97e1-978b1f0d9690-kube-api-access-n2tbq\") pod \"kube-state-metrics-0\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") " pod="openstack/kube-state-metrics-0" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.791650 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") " pod="openstack/kube-state-metrics-0" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.791872 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") " pod="openstack/kube-state-metrics-0" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.893369 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") " pod="openstack/kube-state-metrics-0" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.893466 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2tbq\" (UniqueName: \"kubernetes.io/projected/0a15b784-796e-4834-97e1-978b1f0d9690-kube-api-access-n2tbq\") pod \"kube-state-metrics-0\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") " pod="openstack/kube-state-metrics-0" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.893533 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") " pod="openstack/kube-state-metrics-0" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.893599 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" 
(UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") " pod="openstack/kube-state-metrics-0" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.899423 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") " pod="openstack/kube-state-metrics-0" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.899801 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") " pod="openstack/kube-state-metrics-0" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.900606 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") " pod="openstack/kube-state-metrics-0" Mar 10 10:08:24 crc kubenswrapper[4794]: I0310 10:08:24.913917 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2tbq\" (UniqueName: \"kubernetes.io/projected/0a15b784-796e-4834-97e1-978b1f0d9690-kube-api-access-n2tbq\") pod \"kube-state-metrics-0\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") " pod="openstack/kube-state-metrics-0" Mar 10 10:08:25 crc kubenswrapper[4794]: I0310 10:08:25.051830 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 10:08:25 crc kubenswrapper[4794]: I0310 10:08:25.156412 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:25 crc kubenswrapper[4794]: I0310 10:08:25.157015 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="ceilometer-central-agent" containerID="cri-o://289c72a035524d3aa2d5de9768558a7437aaedc5914a97e7e9ecf6ff6eeccdce" gracePeriod=30 Mar 10 10:08:25 crc kubenswrapper[4794]: I0310 10:08:25.157053 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="ceilometer-notification-agent" containerID="cri-o://9dc19e1bf68288ae1411b1a8de862fd3ebbb95f8ce54b07c9ceb502662ba61d1" gracePeriod=30 Mar 10 10:08:25 crc kubenswrapper[4794]: I0310 10:08:25.157089 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="sg-core" containerID="cri-o://21069cd1e2fdce0936b66d3c3e620ae34405dd97c82d27a0096c58cc684bf07f" gracePeriod=30 Mar 10 10:08:25 crc kubenswrapper[4794]: I0310 10:08:25.157015 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="proxy-httpd" containerID="cri-o://26e182b3712e4bfb475fe4f60312bb3098934981450ef39df84c8266f3a7d01d" gracePeriod=30 Mar 10 10:08:25 crc kubenswrapper[4794]: I0310 10:08:25.590629 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 10:08:25 crc kubenswrapper[4794]: I0310 10:08:25.693255 4794 generic.go:334] "Generic (PLEG): container finished" podID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerID="26e182b3712e4bfb475fe4f60312bb3098934981450ef39df84c8266f3a7d01d" exitCode=0 Mar 10 10:08:25 crc kubenswrapper[4794]: I0310 10:08:25.693287 4794 generic.go:334] "Generic (PLEG): container finished" podID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerID="21069cd1e2fdce0936b66d3c3e620ae34405dd97c82d27a0096c58cc684bf07f" exitCode=2 Mar 10 10:08:25 crc kubenswrapper[4794]: I0310 10:08:25.693298 4794 generic.go:334] "Generic (PLEG): container finished" podID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerID="289c72a035524d3aa2d5de9768558a7437aaedc5914a97e7e9ecf6ff6eeccdce" exitCode=0 Mar 10 10:08:25 crc kubenswrapper[4794]: I0310 10:08:25.693311 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8c51b99-93aa-4f8d-a27c-d074fba6c088","Type":"ContainerDied","Data":"26e182b3712e4bfb475fe4f60312bb3098934981450ef39df84c8266f3a7d01d"} Mar 10 10:08:25 crc kubenswrapper[4794]: I0310 10:08:25.693381 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8c51b99-93aa-4f8d-a27c-d074fba6c088","Type":"ContainerDied","Data":"21069cd1e2fdce0936b66d3c3e620ae34405dd97c82d27a0096c58cc684bf07f"} Mar 10 10:08:25 crc kubenswrapper[4794]: I0310 10:08:25.693395 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8c51b99-93aa-4f8d-a27c-d074fba6c088","Type":"ContainerDied","Data":"289c72a035524d3aa2d5de9768558a7437aaedc5914a97e7e9ecf6ff6eeccdce"} Mar 10 10:08:25 crc kubenswrapper[4794]: I0310 10:08:25.694512 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"0a15b784-796e-4834-97e1-978b1f0d9690","Type":"ContainerStarted","Data":"cfaa5e833f20da11cb041d945a3128f76cb8fe90d425062179af059b0fe4f9bc"} Mar 10 10:08:26 crc kubenswrapper[4794]: I0310 10:08:26.013112 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57b6ddc9-2a57-41f1-a6c8-d0be37b88252" path="/var/lib/kubelet/pods/57b6ddc9-2a57-41f1-a6c8-d0be37b88252/volumes" Mar 10 10:08:26 crc kubenswrapper[4794]: I0310 10:08:26.250170 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 10:08:26 crc kubenswrapper[4794]: I0310 10:08:26.287301 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 10:08:26 crc kubenswrapper[4794]: I0310 10:08:26.708841 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0a15b784-796e-4834-97e1-978b1f0d9690","Type":"ContainerStarted","Data":"1a9f488919341a6cbd7a3e003cf402c07ea892b34d5616b62a24305d40e123b4"} Mar 10 10:08:26 crc kubenswrapper[4794]: I0310 10:08:26.729887 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.266867036 podStartE2EDuration="2.729860672s" podCreationTimestamp="2026-03-10 10:08:24 +0000 UTC" firstStartedPulling="2026-03-10 10:08:25.59438503 +0000 UTC m=+1454.350555848" lastFinishedPulling="2026-03-10 10:08:26.057378666 +0000 UTC m=+1454.813549484" observedRunningTime="2026-03-10 10:08:26.728170822 +0000 UTC m=+1455.484341670" watchObservedRunningTime="2026-03-10 10:08:26.729860672 +0000 UTC m=+1455.486031500" Mar 10 10:08:26 crc kubenswrapper[4794]: I0310 10:08:26.739844 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 10:08:26 crc kubenswrapper[4794]: I0310 10:08:26.936910 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 10:08:26 crc kubenswrapper[4794]: I0310 10:08:26.936978 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 10:08:27 crc kubenswrapper[4794]: I0310 10:08:27.102490 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hz6dc"] Mar 10 10:08:27 crc kubenswrapper[4794]: I0310 10:08:27.104113 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:27 crc kubenswrapper[4794]: I0310 10:08:27.118958 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hz6dc"] Mar 10 10:08:27 crc kubenswrapper[4794]: I0310 10:08:27.254374 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fghwl\" (UniqueName: \"kubernetes.io/projected/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-kube-api-access-fghwl\") pod \"community-operators-hz6dc\" (UID: \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\") " pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:27 crc kubenswrapper[4794]: I0310 10:08:27.254438 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-catalog-content\") pod \"community-operators-hz6dc\" (UID: \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\") " pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:27 crc kubenswrapper[4794]: I0310 10:08:27.254756 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-utilities\") pod \"community-operators-hz6dc\" (UID: \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\") " pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:27 crc kubenswrapper[4794]: I0310 10:08:27.356653 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-utilities\") pod \"community-operators-hz6dc\" (UID: \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\") " pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:27 crc kubenswrapper[4794]: I0310 10:08:27.357021 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fghwl\" (UniqueName: \"kubernetes.io/projected/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-kube-api-access-fghwl\") pod \"community-operators-hz6dc\" (UID: \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\") " pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:27 crc kubenswrapper[4794]: I0310 10:08:27.357071 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-catalog-content\") pod \"community-operators-hz6dc\" (UID: \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\") " pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:27 crc kubenswrapper[4794]: I0310 10:08:27.357154 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-utilities\") pod \"community-operators-hz6dc\" (UID: \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\") " pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:27 crc kubenswrapper[4794]: I0310 10:08:27.357532 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-catalog-content\") pod \"community-operators-hz6dc\" (UID: \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\") " pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:27 crc kubenswrapper[4794]: I0310 10:08:27.379728 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fghwl\" (UniqueName: \"kubernetes.io/projected/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-kube-api-access-fghwl\") pod \"community-operators-hz6dc\" (UID: \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\") " pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:27 crc kubenswrapper[4794]: I0310 10:08:27.422246 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:27 crc kubenswrapper[4794]: I0310 10:08:27.721134 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 10:08:28 crc kubenswrapper[4794]: I0310 10:08:28.019492 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="35c66301-907d-40ed-afb3-bc354acece6c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 10:08:28 crc kubenswrapper[4794]: I0310 10:08:28.019489 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="35c66301-907d-40ed-afb3-bc354acece6c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 10:08:28 crc kubenswrapper[4794]: I0310 10:08:28.039618 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hz6dc"] Mar 10 10:08:28 crc kubenswrapper[4794]: I0310 10:08:28.730863 4794 generic.go:334] "Generic (PLEG): container finished" podID="6d7ccc31-5316-4f75-999a-45cd1f36a9f1" containerID="1909ca74782e471a986b8d7bbc83a5c2f0401ac34268d5ad41b636d8631c8704" exitCode=0 Mar 10 10:08:28 crc kubenswrapper[4794]: I0310 10:08:28.731231 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hz6dc" event={"ID":"6d7ccc31-5316-4f75-999a-45cd1f36a9f1","Type":"ContainerDied","Data":"1909ca74782e471a986b8d7bbc83a5c2f0401ac34268d5ad41b636d8631c8704"} Mar 10 10:08:28 crc kubenswrapper[4794]: I0310 10:08:28.731386 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hz6dc" event={"ID":"6d7ccc31-5316-4f75-999a-45cd1f36a9f1","Type":"ContainerStarted","Data":"e387f2db73044d1eb1371bf820417ce99825760618a0cb4c3033355183c6daea"} Mar 10 10:08:28 crc kubenswrapper[4794]: I0310 10:08:28.735526 4794 generic.go:334] "Generic (PLEG): container finished" podID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerID="9dc19e1bf68288ae1411b1a8de862fd3ebbb95f8ce54b07c9ceb502662ba61d1" exitCode=0 Mar 10 10:08:28 crc kubenswrapper[4794]: I0310 10:08:28.736275 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8c51b99-93aa-4f8d-a27c-d074fba6c088","Type":"ContainerDied","Data":"9dc19e1bf68288ae1411b1a8de862fd3ebbb95f8ce54b07c9ceb502662ba61d1"} Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.176971 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.324395 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-config-data\") pod \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.324794 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8c51b99-93aa-4f8d-a27c-d074fba6c088-run-httpd\") pod \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.325008 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-sg-core-conf-yaml\") pod \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.325037 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-combined-ca-bundle\") pod \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.325090 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkd4w\" (UniqueName: \"kubernetes.io/projected/b8c51b99-93aa-4f8d-a27c-d074fba6c088-kube-api-access-qkd4w\") pod \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.325136 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-scripts\") pod \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.325162 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8c51b99-93aa-4f8d-a27c-d074fba6c088-log-httpd\") pod \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\" (UID: \"b8c51b99-93aa-4f8d-a27c-d074fba6c088\") " Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.325305 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8c51b99-93aa-4f8d-a27c-d074fba6c088-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b8c51b99-93aa-4f8d-a27c-d074fba6c088" (UID: "b8c51b99-93aa-4f8d-a27c-d074fba6c088"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.328634 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8c51b99-93aa-4f8d-a27c-d074fba6c088-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.329284 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8c51b99-93aa-4f8d-a27c-d074fba6c088-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b8c51b99-93aa-4f8d-a27c-d074fba6c088" (UID: "b8c51b99-93aa-4f8d-a27c-d074fba6c088"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.357072 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8c51b99-93aa-4f8d-a27c-d074fba6c088-kube-api-access-qkd4w" (OuterVolumeSpecName: "kube-api-access-qkd4w") pod "b8c51b99-93aa-4f8d-a27c-d074fba6c088" (UID: "b8c51b99-93aa-4f8d-a27c-d074fba6c088"). InnerVolumeSpecName "kube-api-access-qkd4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.370500 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-scripts" (OuterVolumeSpecName: "scripts") pod "b8c51b99-93aa-4f8d-a27c-d074fba6c088" (UID: "b8c51b99-93aa-4f8d-a27c-d074fba6c088"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.392541 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b8c51b99-93aa-4f8d-a27c-d074fba6c088" (UID: "b8c51b99-93aa-4f8d-a27c-d074fba6c088"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.427590 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8c51b99-93aa-4f8d-a27c-d074fba6c088" (UID: "b8c51b99-93aa-4f8d-a27c-d074fba6c088"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.430695 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.430717 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8c51b99-93aa-4f8d-a27c-d074fba6c088-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.430729 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.430739 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.430749 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkd4w\" (UniqueName: \"kubernetes.io/projected/b8c51b99-93aa-4f8d-a27c-d074fba6c088-kube-api-access-qkd4w\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.447554 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-config-data" (OuterVolumeSpecName: "config-data") pod "b8c51b99-93aa-4f8d-a27c-d074fba6c088" (UID: "b8c51b99-93aa-4f8d-a27c-d074fba6c088"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.532586 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c51b99-93aa-4f8d-a27c-d074fba6c088-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.745553 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hz6dc" event={"ID":"6d7ccc31-5316-4f75-999a-45cd1f36a9f1","Type":"ContainerStarted","Data":"9f19ad9a25b8952d88b5b8df1a31437498f9669ecb56f4e5c9f282290fb28df8"} Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.748088 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8c51b99-93aa-4f8d-a27c-d074fba6c088","Type":"ContainerDied","Data":"6bd17e255196691cb49e7a6e71c66b3c31388673fe3f135e65510842d222cc86"} Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.748221 4794 scope.go:117] "RemoveContainer" containerID="26e182b3712e4bfb475fe4f60312bb3098934981450ef39df84c8266f3a7d01d" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.748245 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.775207 4794 scope.go:117] "RemoveContainer" containerID="21069cd1e2fdce0936b66d3c3e620ae34405dd97c82d27a0096c58cc684bf07f" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.796351 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.816565 4794 scope.go:117] "RemoveContainer" containerID="9dc19e1bf68288ae1411b1a8de862fd3ebbb95f8ce54b07c9ceb502662ba61d1" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.831770 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.835944 4794 scope.go:117] "RemoveContainer" containerID="289c72a035524d3aa2d5de9768558a7437aaedc5914a97e7e9ecf6ff6eeccdce" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.844846 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:29 crc kubenswrapper[4794]: E0310 10:08:29.845278 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="proxy-httpd" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.845317 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="proxy-httpd" Mar 10 10:08:29 crc kubenswrapper[4794]: E0310 10:08:29.845357 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="ceilometer-notification-agent" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.845366 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="ceilometer-notification-agent" Mar 10 10:08:29 crc kubenswrapper[4794]: E0310 10:08:29.845393 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="ceilometer-central-agent" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.845402 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="ceilometer-central-agent" Mar 10 10:08:29 crc 
kubenswrapper[4794]: E0310 10:08:29.845425 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="sg-core" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.845433 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="sg-core" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.845636 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="ceilometer-central-agent" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.845672 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="ceilometer-notification-agent" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.845690 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="sg-core" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.845704 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" containerName="proxy-httpd" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.847827 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.851421 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.851917 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.852699 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.863397 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.939111 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/230982f7-1c4d-40b6-9233-b035f68e7209-run-httpd\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.939189 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.939223 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/230982f7-1c4d-40b6-9233-b035f68e7209-log-httpd\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.939272 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-scripts\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.939292 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-config-data\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.939362 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.939610 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbgrk\" (UniqueName: \"kubernetes.io/projected/230982f7-1c4d-40b6-9233-b035f68e7209-kube-api-access-qbgrk\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:29 crc kubenswrapper[4794]: I0310 10:08:29.939741 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.026998 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8c51b99-93aa-4f8d-a27c-d074fba6c088" path="/var/lib/kubelet/pods/b8c51b99-93aa-4f8d-a27c-d074fba6c088/volumes" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.041394 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/230982f7-1c4d-40b6-9233-b035f68e7209-log-httpd\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.042423 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/230982f7-1c4d-40b6-9233-b035f68e7209-log-httpd\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.045055 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-scripts\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.045135 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-config-data\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.045173 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.045326 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qbgrk\" (UniqueName: \"kubernetes.io/projected/230982f7-1c4d-40b6-9233-b035f68e7209-kube-api-access-qbgrk\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.045394 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.045554 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/230982f7-1c4d-40b6-9233-b035f68e7209-run-httpd\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.045583 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.046489 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/230982f7-1c4d-40b6-9233-b035f68e7209-run-httpd\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.050586 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.051351 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-scripts\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.052972 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-config-data\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.059021 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.066251 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.068498 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-qbgrk\" (UniqueName: \"kubernetes.io/projected/230982f7-1c4d-40b6-9233-b035f68e7209-kube-api-access-qbgrk\") pod \"ceilometer-0\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.164499 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.588745 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.759778 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"230982f7-1c4d-40b6-9233-b035f68e7209","Type":"ContainerStarted","Data":"e0a966804fdc7d16db2f7239acc8ee2cd13d439925077525a326382dc93bf59d"} Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.763284 4794 generic.go:334] "Generic (PLEG): container finished" podID="6d7ccc31-5316-4f75-999a-45cd1f36a9f1" containerID="9f19ad9a25b8952d88b5b8df1a31437498f9669ecb56f4e5c9f282290fb28df8" exitCode=0 Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.763369 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hz6dc" event={"ID":"6d7ccc31-5316-4f75-999a-45cd1f36a9f1","Type":"ContainerDied","Data":"9f19ad9a25b8952d88b5b8df1a31437498f9669ecb56f4e5c9f282290fb28df8"} Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.928518 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.929099 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 10:08:30 crc kubenswrapper[4794]: I0310 10:08:30.935964 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 10:08:31 crc kubenswrapper[4794]: I0310 10:08:31.778674 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"230982f7-1c4d-40b6-9233-b035f68e7209","Type":"ContainerStarted","Data":"6182af082a3d40db9a59620e6e5cb812d179e2631dba1ec96c81e72af402c52c"} Mar 10 10:08:31 crc kubenswrapper[4794]: I0310 10:08:31.783082 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hz6dc" event={"ID":"6d7ccc31-5316-4f75-999a-45cd1f36a9f1","Type":"ContainerStarted","Data":"27ff0203e6f08f5795cc198aaf2f05f8cca13fa2dcbf78def477d0d4dabdf1f4"} Mar 10 10:08:31 crc kubenswrapper[4794]: I0310 10:08:31.793104 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 10:08:31 crc kubenswrapper[4794]: I0310 10:08:31.801235 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hz6dc" podStartSLOduration=2.26890646 podStartE2EDuration="4.80121816s" podCreationTimestamp="2026-03-10 10:08:27 +0000 UTC" firstStartedPulling="2026-03-10 10:08:28.733910077 +0000 UTC m=+1457.490080915" lastFinishedPulling="2026-03-10 10:08:31.266221797 +0000 UTC m=+1460.022392615" observedRunningTime="2026-03-10 10:08:31.797220292 +0000 UTC m=+1460.553391120" watchObservedRunningTime="2026-03-10 10:08:31.80121816 +0000 UTC m=+1460.557388978" Mar 10 10:08:32 crc kubenswrapper[4794]: I0310 10:08:32.793979 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"230982f7-1c4d-40b6-9233-b035f68e7209","Type":"ContainerStarted","Data":"282cec3f4ac5f0ec80cd1749c21b5faac6df3aedd361be42ad1f6f285a8c2c41"} Mar 10 10:08:32 crc kubenswrapper[4794]: I0310 10:08:32.794297 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"230982f7-1c4d-40b6-9233-b035f68e7209","Type":"ContainerStarted","Data":"0d0784e834cbcbc15c6490e8cf81c08b40559ec8d3194f48f0be5ccf0f7a6bdc"} Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.288999 4794 scope.go:117] "RemoveContainer" containerID="be31c254c980a9b29478470f979589994d488f44e0eb6a6f25567fd7bda621ef" Mar 10 10:08:34 crc kubenswrapper[4794]: E0310 10:08:34.629191 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf10f408e_5c27_4931_873b_85396e7442e8.slice/crio-conmon-a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf10f408e_5c27_4931_873b_85396e7442e8.slice/crio-a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b.scope\": RecentStats: unable to find data in memory cache]" Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.727382 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.821012 4794 generic.go:334] "Generic (PLEG): container finished" podID="f10f408e-5c27-4931-873b-85396e7442e8" containerID="a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b" exitCode=137 Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.821062 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f10f408e-5c27-4931-873b-85396e7442e8","Type":"ContainerDied","Data":"a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b"} Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.821092 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f10f408e-5c27-4931-873b-85396e7442e8","Type":"ContainerDied","Data":"a3edc0ec6c8931acc60b8e4d83a89c64f0042fc0873aedfafe66858254ce1219"} Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.821114 4794 scope.go:117] "RemoveContainer" containerID="a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b" Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.821205 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.826897 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10f408e-5c27-4931-873b-85396e7442e8-config-data\") pod \"f10f408e-5c27-4931-873b-85396e7442e8\" (UID: \"f10f408e-5c27-4931-873b-85396e7442e8\") " Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.826980 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twz89\" (UniqueName: \"kubernetes.io/projected/f10f408e-5c27-4931-873b-85396e7442e8-kube-api-access-twz89\") pod \"f10f408e-5c27-4931-873b-85396e7442e8\" (UID: \"f10f408e-5c27-4931-873b-85396e7442e8\") " Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.827073 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10f408e-5c27-4931-873b-85396e7442e8-combined-ca-bundle\") pod \"f10f408e-5c27-4931-873b-85396e7442e8\" (UID: \"f10f408e-5c27-4931-873b-85396e7442e8\") " Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.836799 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10f408e-5c27-4931-873b-85396e7442e8-kube-api-access-twz89" (OuterVolumeSpecName: "kube-api-access-twz89") pod "f10f408e-5c27-4931-873b-85396e7442e8" (UID: "f10f408e-5c27-4931-873b-85396e7442e8"). InnerVolumeSpecName "kube-api-access-twz89". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.844880 4794 scope.go:117] "RemoveContainer" containerID="a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b" Mar 10 10:08:34 crc kubenswrapper[4794]: E0310 10:08:34.845809 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b\": container with ID starting with a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b not found: ID does not exist" containerID="a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b" Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.845854 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b"} err="failed to get container status \"a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b\": rpc error: code = NotFound desc = could not find container \"a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b\": container with ID starting with a64296135e8402c3dbcfc0bd0539d38da58c4b92991827f927bd0ad98f6e598b not found: ID does not exist" Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.857575 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10f408e-5c27-4931-873b-85396e7442e8-config-data" (OuterVolumeSpecName: "config-data") pod "f10f408e-5c27-4931-873b-85396e7442e8" (UID: "f10f408e-5c27-4931-873b-85396e7442e8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.867729 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10f408e-5c27-4931-873b-85396e7442e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f10f408e-5c27-4931-873b-85396e7442e8" (UID: "f10f408e-5c27-4931-873b-85396e7442e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.930170 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twz89\" (UniqueName: \"kubernetes.io/projected/f10f408e-5c27-4931-873b-85396e7442e8-kube-api-access-twz89\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.930217 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10f408e-5c27-4931-873b-85396e7442e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:34 crc kubenswrapper[4794]: I0310 10:08:34.930236 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10f408e-5c27-4931-873b-85396e7442e8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.077157 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.170349 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.201530 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.213511 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 10:08:35 crc kubenswrapper[4794]: E0310 10:08:35.214012 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10f408e-5c27-4931-873b-85396e7442e8" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.214035 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10f408e-5c27-4931-873b-85396e7442e8" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.214324 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10f408e-5c27-4931-873b-85396e7442e8" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.215091 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.217443 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.218830 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.220526 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.223839 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.339130 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.339530 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmnv9\" (UniqueName: \"kubernetes.io/projected/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-kube-api-access-tmnv9\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.339555 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.339586 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.339820 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.441921 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.442012 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.442043 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmnv9\" (UniqueName: \"kubernetes.io/projected/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-kube-api-access-tmnv9\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.442067 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.442108 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.449989 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.450044 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.450193 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.450424 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.469754 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmnv9\" (UniqueName: \"kubernetes.io/projected/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-kube-api-access-tmnv9\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.532183 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.833169 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"230982f7-1c4d-40b6-9233-b035f68e7209","Type":"ContainerStarted","Data":"c1b9f784504bacbb54ea792f2d4ca3996de2408d457b238ce8bcc3d1eb8dd25d"} Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.833578 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 10:08:35 crc kubenswrapper[4794]: I0310 10:08:35.859678 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.889718412 podStartE2EDuration="6.859657353s" podCreationTimestamp="2026-03-10 10:08:29 +0000 UTC" firstStartedPulling="2026-03-10 10:08:30.59681963 +0000 UTC m=+1459.352990448" lastFinishedPulling="2026-03-10 10:08:34.566758561 +0000 UTC m=+1463.322929389" observedRunningTime="2026-03-10 10:08:35.852360775 +0000 UTC m=+1464.608531593" watchObservedRunningTime="2026-03-10 10:08:35.859657353 +0000 UTC m=+1464.615828171" Mar 10 10:08:36 crc kubenswrapper[4794]: I0310 10:08:36.011965 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10f408e-5c27-4931-873b-85396e7442e8" path="/var/lib/kubelet/pods/f10f408e-5c27-4931-873b-85396e7442e8/volumes" Mar 10 10:08:36 crc kubenswrapper[4794]: I0310 10:08:36.026888 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 10:08:36 crc kubenswrapper[4794]: W0310 10:08:36.033574 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ceb2c7c_89c8_4fce_aa02_aac3840e3f5a.slice/crio-eb147e24025bb532d58dd9ffc23a6ad01518c78d77d677f6b647e18c0b16474e WatchSource:0}: Error finding container eb147e24025bb532d58dd9ffc23a6ad01518c78d77d677f6b647e18c0b16474e: Status 404 returned error can't find the container with id eb147e24025bb532d58dd9ffc23a6ad01518c78d77d677f6b647e18c0b16474e Mar 10 10:08:36 crc kubenswrapper[4794]: I0310 10:08:36.847411 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a","Type":"ContainerStarted","Data":"9e44cda9b00f14d7a50fe91ef2bfbfd700f1b8ac60755da2c7898a76949812f8"} Mar 10 10:08:36 crc kubenswrapper[4794]: I0310 10:08:36.847694 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a","Type":"ContainerStarted","Data":"eb147e24025bb532d58dd9ffc23a6ad01518c78d77d677f6b647e18c0b16474e"} Mar 10 10:08:36 crc kubenswrapper[4794]: I0310 10:08:36.868897 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.868880827 podStartE2EDuration="1.868880827s" podCreationTimestamp="2026-03-10 10:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:08:36.859857075 +0000 UTC m=+1465.616027903" watchObservedRunningTime="2026-03-10 10:08:36.868880827 +0000 UTC m=+1465.625051645" Mar 10 10:08:36 crc kubenswrapper[4794]: I0310 10:08:36.940939 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 10:08:36 crc kubenswrapper[4794]: I0310 10:08:36.944243 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-api-0" Mar 10 10:08:36 crc kubenswrapper[4794]: I0310 10:08:36.945825 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 10:08:36 crc kubenswrapper[4794]: I0310 10:08:36.947715 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 10:08:37 crc kubenswrapper[4794]: I0310 10:08:37.423123 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:37 crc kubenswrapper[4794]: I0310 10:08:37.423411 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:37 crc kubenswrapper[4794]: I0310 10:08:37.855515 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 10:08:37 crc kubenswrapper[4794]: I0310 10:08:37.858765 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.031365 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c8964d89c-gfknm"] Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.034390 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.065383 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c8964d89c-gfknm"] Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.098502 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-ovsdbserver-nb\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.098685 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-dns-swift-storage-0\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.098708 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-config\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.098751 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-dns-svc\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.098843 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-ovsdbserver-sb\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " 
pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.098875 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q625v\" (UniqueName: \"kubernetes.io/projected/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-kube-api-access-q625v\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.201080 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-config\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.201168 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-dns-svc\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.201257 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-ovsdbserver-sb\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.201295 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q625v\" (UniqueName: \"kubernetes.io/projected/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-kube-api-access-q625v\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.201384 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-ovsdbserver-nb\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.201453 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-dns-swift-storage-0\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.202019 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-config\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.202200 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-dns-swift-storage-0\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:38 crc 
kubenswrapper[4794]: I0310 10:08:38.202672 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-ovsdbserver-sb\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm"
Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.202679 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-ovsdbserver-nb\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm"
Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.202684 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-dns-svc\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm"
Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.224343 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q625v\" (UniqueName: \"kubernetes.io/projected/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-kube-api-access-q625v\") pod \"dnsmasq-dns-c8964d89c-gfknm\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " pod="openstack/dnsmasq-dns-c8964d89c-gfknm"
Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.363786 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8964d89c-gfknm"
Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.489739 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hz6dc" podUID="6d7ccc31-5316-4f75-999a-45cd1f36a9f1" containerName="registry-server" probeResult="failure" output=<
Mar 10 10:08:38 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s
Mar 10 10:08:38 crc kubenswrapper[4794]: >
Mar 10 10:08:38 crc kubenswrapper[4794]: I0310 10:08:38.852295 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c8964d89c-gfknm"]
Mar 10 10:08:39 crc kubenswrapper[4794]: I0310 10:08:39.877095 4794 generic.go:334] "Generic (PLEG): container finished" podID="bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" containerID="2b61af45d5cd8cc92786b42b8c4b712c37165e562d116857905c7a2806091415" exitCode=0
Mar 10 10:08:39 crc kubenswrapper[4794]: I0310 10:08:39.877200 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8964d89c-gfknm" event={"ID":"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af","Type":"ContainerDied","Data":"2b61af45d5cd8cc92786b42b8c4b712c37165e562d116857905c7a2806091415"}
Mar 10 10:08:39 crc kubenswrapper[4794]: I0310 10:08:39.877510 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8964d89c-gfknm" event={"ID":"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af","Type":"ContainerStarted","Data":"52ec40d533f039df7d6f7e0f4b6824af5e321cc9af8b25f91764a63cf1117c87"}
Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.389471 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
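The startup-probe failure recorded above ("timeout: failed to connect service ":50051" within 1s") is a gRPC health check against the registry-server container. A client-side equivalent using the standard grpc.health.v1 protocol with the same 1 s budget might look like the sketch below (an illustration, not the probe binary the marketplace image actually ships):

```go
// Illustrative equivalent of the failing startup probe above: dial the
// registry-server's gRPC endpoint on :50051 and call the standard
// health service under a 1s deadline.
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

func probe(addr string) error {
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()

	conn, err := grpc.DialContext(ctx, addr,
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		grpc.WithBlock()) // fail within the 1s budget if nothing is listening yet
	if err != nil {
		return fmt.Errorf("timeout: failed to connect service %q within 1s: %w", addr, err)
	}
	defer conn.Close()

	resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
	if err != nil {
		return err
	}
	if resp.GetStatus() != healthpb.HealthCheckResponse_SERVING {
		return fmt.Errorf("service not serving: %s", resp.GetStatus())
	}
	return nil
}

func main() {
	if err := probe(":50051"); err != nil {
		fmt.Println(err) // the kubelet records probe output verbatim, as above
	}
}
```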
containerID="cri-o://6182af082a3d40db9a59620e6e5cb812d179e2631dba1ec96c81e72af402c52c" gracePeriod=30 Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.390279 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="proxy-httpd" containerID="cri-o://c1b9f784504bacbb54ea792f2d4ca3996de2408d457b238ce8bcc3d1eb8dd25d" gracePeriod=30 Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.390276 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="sg-core" containerID="cri-o://282cec3f4ac5f0ec80cd1749c21b5faac6df3aedd361be42ad1f6f285a8c2c41" gracePeriod=30 Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.390255 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="ceilometer-notification-agent" containerID="cri-o://0d0784e834cbcbc15c6490e8cf81c08b40559ec8d3194f48f0be5ccf0f7a6bdc" gracePeriod=30 Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.533042 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.635304 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.905165 4794 generic.go:334] "Generic (PLEG): container finished" podID="230982f7-1c4d-40b6-9233-b035f68e7209" containerID="c1b9f784504bacbb54ea792f2d4ca3996de2408d457b238ce8bcc3d1eb8dd25d" exitCode=0 Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.905280 4794 generic.go:334] "Generic (PLEG): container finished" podID="230982f7-1c4d-40b6-9233-b035f68e7209" containerID="282cec3f4ac5f0ec80cd1749c21b5faac6df3aedd361be42ad1f6f285a8c2c41" exitCode=2 Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.905306 4794 generic.go:334] "Generic (PLEG): container finished" podID="230982f7-1c4d-40b6-9233-b035f68e7209" containerID="6182af082a3d40db9a59620e6e5cb812d179e2631dba1ec96c81e72af402c52c" exitCode=0 Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.905395 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"230982f7-1c4d-40b6-9233-b035f68e7209","Type":"ContainerDied","Data":"c1b9f784504bacbb54ea792f2d4ca3996de2408d457b238ce8bcc3d1eb8dd25d"} Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.905434 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"230982f7-1c4d-40b6-9233-b035f68e7209","Type":"ContainerDied","Data":"282cec3f4ac5f0ec80cd1749c21b5faac6df3aedd361be42ad1f6f285a8c2c41"} Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.905454 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"230982f7-1c4d-40b6-9233-b035f68e7209","Type":"ContainerDied","Data":"6182af082a3d40db9a59620e6e5cb812d179e2631dba1ec96c81e72af402c52c"} Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.910867 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="35c66301-907d-40ed-afb3-bc354acece6c" containerName="nova-api-log" containerID="cri-o://568f66f4a8b27c2454e806e1a7d04f1e8eae2e56111d1087a639f0edfdf2a055" gracePeriod=30 Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.911146 4794 
Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.911146 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="35c66301-907d-40ed-afb3-bc354acece6c" containerName="nova-api-api" containerID="cri-o://9acc5f862dab66c8a5e24eebcf3c82c037a59d49bfb52b3d73f73a14e9e9958a" gracePeriod=30
Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.911209 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8964d89c-gfknm" event={"ID":"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af","Type":"ContainerStarted","Data":"8b1add4deeef74619d980e1481eb91a0c5f66b424e4ad63cfb9113db133078f3"}
Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.911528 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c8964d89c-gfknm"
Mar 10 10:08:40 crc kubenswrapper[4794]: I0310 10:08:40.954578 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c8964d89c-gfknm" podStartSLOduration=3.954558761 podStartE2EDuration="3.954558761s" podCreationTimestamp="2026-03-10 10:08:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:08:40.949635168 +0000 UTC m=+1469.705805986" watchObservedRunningTime="2026-03-10 10:08:40.954558761 +0000 UTC m=+1469.710729599"
Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.231871 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.360390 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-ceilometer-tls-certs\") pod \"230982f7-1c4d-40b6-9233-b035f68e7209\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") "
Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.360446 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/230982f7-1c4d-40b6-9233-b035f68e7209-run-httpd\") pod \"230982f7-1c4d-40b6-9233-b035f68e7209\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") "
Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.360536 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/230982f7-1c4d-40b6-9233-b035f68e7209-log-httpd\") pod \"230982f7-1c4d-40b6-9233-b035f68e7209\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") "
Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.360669 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-config-data\") pod \"230982f7-1c4d-40b6-9233-b035f68e7209\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") "
Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.360701 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-sg-core-conf-yaml\") pod \"230982f7-1c4d-40b6-9233-b035f68e7209\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") "
Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.360743 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-combined-ca-bundle\") pod 
\"230982f7-1c4d-40b6-9233-b035f68e7209\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.360799 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-scripts\") pod \"230982f7-1c4d-40b6-9233-b035f68e7209\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.360850 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbgrk\" (UniqueName: \"kubernetes.io/projected/230982f7-1c4d-40b6-9233-b035f68e7209-kube-api-access-qbgrk\") pod \"230982f7-1c4d-40b6-9233-b035f68e7209\" (UID: \"230982f7-1c4d-40b6-9233-b035f68e7209\") " Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.361846 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/230982f7-1c4d-40b6-9233-b035f68e7209-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "230982f7-1c4d-40b6-9233-b035f68e7209" (UID: "230982f7-1c4d-40b6-9233-b035f68e7209"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.362664 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/230982f7-1c4d-40b6-9233-b035f68e7209-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "230982f7-1c4d-40b6-9233-b035f68e7209" (UID: "230982f7-1c4d-40b6-9233-b035f68e7209"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.366675 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/230982f7-1c4d-40b6-9233-b035f68e7209-kube-api-access-qbgrk" (OuterVolumeSpecName: "kube-api-access-qbgrk") pod "230982f7-1c4d-40b6-9233-b035f68e7209" (UID: "230982f7-1c4d-40b6-9233-b035f68e7209"). InnerVolumeSpecName "kube-api-access-qbgrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.371512 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-scripts" (OuterVolumeSpecName: "scripts") pod "230982f7-1c4d-40b6-9233-b035f68e7209" (UID: "230982f7-1c4d-40b6-9233-b035f68e7209"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.396556 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "230982f7-1c4d-40b6-9233-b035f68e7209" (UID: "230982f7-1c4d-40b6-9233-b035f68e7209"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.421429 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "230982f7-1c4d-40b6-9233-b035f68e7209" (UID: "230982f7-1c4d-40b6-9233-b035f68e7209"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.445053 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "230982f7-1c4d-40b6-9233-b035f68e7209" (UID: "230982f7-1c4d-40b6-9233-b035f68e7209"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.462833 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/230982f7-1c4d-40b6-9233-b035f68e7209-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.462869 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.462884 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.462898 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.462908 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbgrk\" (UniqueName: \"kubernetes.io/projected/230982f7-1c4d-40b6-9233-b035f68e7209-kube-api-access-qbgrk\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.462920 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.462930 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/230982f7-1c4d-40b6-9233-b035f68e7209-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.487450 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-config-data" (OuterVolumeSpecName: "config-data") pod "230982f7-1c4d-40b6-9233-b035f68e7209" (UID: "230982f7-1c4d-40b6-9233-b035f68e7209"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.570951 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/230982f7-1c4d-40b6-9233-b035f68e7209-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.919906 4794 generic.go:334] "Generic (PLEG): container finished" podID="35c66301-907d-40ed-afb3-bc354acece6c" containerID="568f66f4a8b27c2454e806e1a7d04f1e8eae2e56111d1087a639f0edfdf2a055" exitCode=143 Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.919982 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35c66301-907d-40ed-afb3-bc354acece6c","Type":"ContainerDied","Data":"568f66f4a8b27c2454e806e1a7d04f1e8eae2e56111d1087a639f0edfdf2a055"} Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.923099 4794 generic.go:334] "Generic (PLEG): container finished" podID="230982f7-1c4d-40b6-9233-b035f68e7209" containerID="0d0784e834cbcbc15c6490e8cf81c08b40559ec8d3194f48f0be5ccf0f7a6bdc" exitCode=0 Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.923175 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.923189 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"230982f7-1c4d-40b6-9233-b035f68e7209","Type":"ContainerDied","Data":"0d0784e834cbcbc15c6490e8cf81c08b40559ec8d3194f48f0be5ccf0f7a6bdc"} Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.923221 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"230982f7-1c4d-40b6-9233-b035f68e7209","Type":"ContainerDied","Data":"e0a966804fdc7d16db2f7239acc8ee2cd13d439925077525a326382dc93bf59d"} Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.923243 4794 scope.go:117] "RemoveContainer" containerID="c1b9f784504bacbb54ea792f2d4ca3996de2408d457b238ce8bcc3d1eb8dd25d" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.945453 4794 scope.go:117] "RemoveContainer" containerID="282cec3f4ac5f0ec80cd1749c21b5faac6df3aedd361be42ad1f6f285a8c2c41" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.958246 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.966544 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.967490 4794 scope.go:117] "RemoveContainer" containerID="0d0784e834cbcbc15c6490e8cf81c08b40559ec8d3194f48f0be5ccf0f7a6bdc" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.983849 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:41 crc kubenswrapper[4794]: E0310 10:08:41.984225 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="proxy-httpd" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.984242 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="proxy-httpd" Mar 10 10:08:41 crc kubenswrapper[4794]: E0310 10:08:41.984270 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="sg-core" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.984276 4794 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="sg-core" Mar 10 10:08:41 crc kubenswrapper[4794]: E0310 10:08:41.984284 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="ceilometer-notification-agent" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.984290 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="ceilometer-notification-agent" Mar 10 10:08:41 crc kubenswrapper[4794]: E0310 10:08:41.984303 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="ceilometer-central-agent" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.984309 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="ceilometer-central-agent" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.984532 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="ceilometer-central-agent" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.984556 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="ceilometer-notification-agent" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.984565 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="sg-core" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.984580 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" containerName="proxy-httpd" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.986185 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.988684 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.988745 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.988860 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 10:08:41 crc kubenswrapper[4794]: I0310 10:08:41.997369 4794 scope.go:117] "RemoveContainer" containerID="6182af082a3d40db9a59620e6e5cb812d179e2631dba1ec96c81e72af402c52c" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.018190 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="230982f7-1c4d-40b6-9233-b035f68e7209" path="/var/lib/kubelet/pods/230982f7-1c4d-40b6-9233-b035f68e7209/volumes" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.019782 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.021349 4794 scope.go:117] "RemoveContainer" containerID="c1b9f784504bacbb54ea792f2d4ca3996de2408d457b238ce8bcc3d1eb8dd25d" Mar 10 10:08:42 crc kubenswrapper[4794]: E0310 10:08:42.027006 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1b9f784504bacbb54ea792f2d4ca3996de2408d457b238ce8bcc3d1eb8dd25d\": container with ID starting with c1b9f784504bacbb54ea792f2d4ca3996de2408d457b238ce8bcc3d1eb8dd25d not found: ID does not exist" containerID="c1b9f784504bacbb54ea792f2d4ca3996de2408d457b238ce8bcc3d1eb8dd25d" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.027056 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b9f784504bacbb54ea792f2d4ca3996de2408d457b238ce8bcc3d1eb8dd25d"} err="failed to get container status \"c1b9f784504bacbb54ea792f2d4ca3996de2408d457b238ce8bcc3d1eb8dd25d\": rpc error: code = NotFound desc = could not find container \"c1b9f784504bacbb54ea792f2d4ca3996de2408d457b238ce8bcc3d1eb8dd25d\": container with ID starting with c1b9f784504bacbb54ea792f2d4ca3996de2408d457b238ce8bcc3d1eb8dd25d not found: ID does not exist" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.027081 4794 scope.go:117] "RemoveContainer" containerID="282cec3f4ac5f0ec80cd1749c21b5faac6df3aedd361be42ad1f6f285a8c2c41" Mar 10 10:08:42 crc kubenswrapper[4794]: E0310 10:08:42.028844 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"282cec3f4ac5f0ec80cd1749c21b5faac6df3aedd361be42ad1f6f285a8c2c41\": container with ID starting with 282cec3f4ac5f0ec80cd1749c21b5faac6df3aedd361be42ad1f6f285a8c2c41 not found: ID does not exist" containerID="282cec3f4ac5f0ec80cd1749c21b5faac6df3aedd361be42ad1f6f285a8c2c41" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.028869 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"282cec3f4ac5f0ec80cd1749c21b5faac6df3aedd361be42ad1f6f285a8c2c41"} err="failed to get container status \"282cec3f4ac5f0ec80cd1749c21b5faac6df3aedd361be42ad1f6f285a8c2c41\": rpc error: code = NotFound desc = could not find container \"282cec3f4ac5f0ec80cd1749c21b5faac6df3aedd361be42ad1f6f285a8c2c41\": container with ID starting with 
282cec3f4ac5f0ec80cd1749c21b5faac6df3aedd361be42ad1f6f285a8c2c41 not found: ID does not exist" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.028884 4794 scope.go:117] "RemoveContainer" containerID="0d0784e834cbcbc15c6490e8cf81c08b40559ec8d3194f48f0be5ccf0f7a6bdc" Mar 10 10:08:42 crc kubenswrapper[4794]: E0310 10:08:42.029201 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d0784e834cbcbc15c6490e8cf81c08b40559ec8d3194f48f0be5ccf0f7a6bdc\": container with ID starting with 0d0784e834cbcbc15c6490e8cf81c08b40559ec8d3194f48f0be5ccf0f7a6bdc not found: ID does not exist" containerID="0d0784e834cbcbc15c6490e8cf81c08b40559ec8d3194f48f0be5ccf0f7a6bdc" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.029229 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d0784e834cbcbc15c6490e8cf81c08b40559ec8d3194f48f0be5ccf0f7a6bdc"} err="failed to get container status \"0d0784e834cbcbc15c6490e8cf81c08b40559ec8d3194f48f0be5ccf0f7a6bdc\": rpc error: code = NotFound desc = could not find container \"0d0784e834cbcbc15c6490e8cf81c08b40559ec8d3194f48f0be5ccf0f7a6bdc\": container with ID starting with 0d0784e834cbcbc15c6490e8cf81c08b40559ec8d3194f48f0be5ccf0f7a6bdc not found: ID does not exist" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.029242 4794 scope.go:117] "RemoveContainer" containerID="6182af082a3d40db9a59620e6e5cb812d179e2631dba1ec96c81e72af402c52c" Mar 10 10:08:42 crc kubenswrapper[4794]: E0310 10:08:42.029567 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6182af082a3d40db9a59620e6e5cb812d179e2631dba1ec96c81e72af402c52c\": container with ID starting with 6182af082a3d40db9a59620e6e5cb812d179e2631dba1ec96c81e72af402c52c not found: ID does not exist" containerID="6182af082a3d40db9a59620e6e5cb812d179e2631dba1ec96c81e72af402c52c" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.029587 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6182af082a3d40db9a59620e6e5cb812d179e2631dba1ec96c81e72af402c52c"} err="failed to get container status \"6182af082a3d40db9a59620e6e5cb812d179e2631dba1ec96c81e72af402c52c\": rpc error: code = NotFound desc = could not find container \"6182af082a3d40db9a59620e6e5cb812d179e2631dba1ec96c81e72af402c52c\": container with ID starting with 6182af082a3d40db9a59620e6e5cb812d179e2631dba1ec96c81e72af402c52c not found: ID does not exist" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.084590 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.084673 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-scripts\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.084692 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bznzf\" (UniqueName: 
\"kubernetes.io/projected/bbacde9d-ef79-425b-9959-afcff9bb2a2f-kube-api-access-bznzf\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.084712 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.084750 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-config-data\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.084771 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbacde9d-ef79-425b-9959-afcff9bb2a2f-log-httpd\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.084870 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbacde9d-ef79-425b-9959-afcff9bb2a2f-run-httpd\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.084918 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.134255 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:42 crc kubenswrapper[4794]: E0310 10:08:42.134948 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-bznzf log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="bbacde9d-ef79-425b-9959-afcff9bb2a2f" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.187126 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbacde9d-ef79-425b-9959-afcff9bb2a2f-run-httpd\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.187195 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.187243 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.187293 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-scripts\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.187308 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bznzf\" (UniqueName: \"kubernetes.io/projected/bbacde9d-ef79-425b-9959-afcff9bb2a2f-kube-api-access-bznzf\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.187911 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.187998 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-config-data\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.188038 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbacde9d-ef79-425b-9959-afcff9bb2a2f-log-httpd\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.188183 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbacde9d-ef79-425b-9959-afcff9bb2a2f-run-httpd\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.188616 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbacde9d-ef79-425b-9959-afcff9bb2a2f-log-httpd\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.192873 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.193138 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.194010 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-scripts\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.194238 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-config-data\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.194807 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.205108 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bznzf\" (UniqueName: \"kubernetes.io/projected/bbacde9d-ef79-425b-9959-afcff9bb2a2f-kube-api-access-bznzf\") pod \"ceilometer-0\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.935871 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:08:42 crc kubenswrapper[4794]: I0310 10:08:42.947639 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.000956 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-scripts\") pod \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.001069 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-config-data\") pod \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.001115 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-combined-ca-bundle\") pod \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.001161 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-sg-core-conf-yaml\") pod \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.001179 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbacde9d-ef79-425b-9959-afcff9bb2a2f-run-httpd\") pod \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.001198 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbacde9d-ef79-425b-9959-afcff9bb2a2f-log-httpd\") 
pod \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.001240 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bznzf\" (UniqueName: \"kubernetes.io/projected/bbacde9d-ef79-425b-9959-afcff9bb2a2f-kube-api-access-bznzf\") pod \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.001280 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-ceilometer-tls-certs\") pod \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\" (UID: \"bbacde9d-ef79-425b-9959-afcff9bb2a2f\") " Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.004177 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbacde9d-ef79-425b-9959-afcff9bb2a2f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bbacde9d-ef79-425b-9959-afcff9bb2a2f" (UID: "bbacde9d-ef79-425b-9959-afcff9bb2a2f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.004266 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bbacde9d-ef79-425b-9959-afcff9bb2a2f" (UID: "bbacde9d-ef79-425b-9959-afcff9bb2a2f"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.004285 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbacde9d-ef79-425b-9959-afcff9bb2a2f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bbacde9d-ef79-425b-9959-afcff9bb2a2f" (UID: "bbacde9d-ef79-425b-9959-afcff9bb2a2f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.005976 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbacde9d-ef79-425b-9959-afcff9bb2a2f" (UID: "bbacde9d-ef79-425b-9959-afcff9bb2a2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.006066 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-scripts" (OuterVolumeSpecName: "scripts") pod "bbacde9d-ef79-425b-9959-afcff9bb2a2f" (UID: "bbacde9d-ef79-425b-9959-afcff9bb2a2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.007392 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bbacde9d-ef79-425b-9959-afcff9bb2a2f" (UID: "bbacde9d-ef79-425b-9959-afcff9bb2a2f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.008782 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbacde9d-ef79-425b-9959-afcff9bb2a2f-kube-api-access-bznzf" (OuterVolumeSpecName: "kube-api-access-bznzf") pod "bbacde9d-ef79-425b-9959-afcff9bb2a2f" (UID: "bbacde9d-ef79-425b-9959-afcff9bb2a2f"). InnerVolumeSpecName "kube-api-access-bznzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.011119 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-config-data" (OuterVolumeSpecName: "config-data") pod "bbacde9d-ef79-425b-9959-afcff9bb2a2f" (UID: "bbacde9d-ef79-425b-9959-afcff9bb2a2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.104603 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.105119 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.105268 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.105444 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.105514 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbacde9d-ef79-425b-9959-afcff9bb2a2f-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.105540 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbacde9d-ef79-425b-9959-afcff9bb2a2f-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.105558 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bznzf\" (UniqueName: \"kubernetes.io/projected/bbacde9d-ef79-425b-9959-afcff9bb2a2f-kube-api-access-bznzf\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.105578 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbacde9d-ef79-425b-9959-afcff9bb2a2f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.943436 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:08:43 crc kubenswrapper[4794]: I0310 10:08:43.989238 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.020466 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.036153 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.041709 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.044910 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.045075 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.045101 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.054398 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.128816 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.128862 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.129001 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ad7a22a-3e88-4447-b675-0a8339bd5f55-run-httpd\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.129117 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-config-data\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.129171 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ad7a22a-3e88-4447-b675-0a8339bd5f55-log-httpd\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.129403 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " 
pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.129440 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk52r\" (UniqueName: \"kubernetes.io/projected/5ad7a22a-3e88-4447-b675-0a8339bd5f55-kube-api-access-bk52r\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.129464 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-scripts\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.230864 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ad7a22a-3e88-4447-b675-0a8339bd5f55-log-httpd\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.231241 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.231268 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk52r\" (UniqueName: \"kubernetes.io/projected/5ad7a22a-3e88-4447-b675-0a8339bd5f55-kube-api-access-bk52r\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.231296 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-scripts\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.231367 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.231395 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.231438 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ad7a22a-3e88-4447-b675-0a8339bd5f55-run-httpd\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.231470 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ad7a22a-3e88-4447-b675-0a8339bd5f55-log-httpd\") pod \"ceilometer-0\" (UID: 
\"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.231606 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-config-data\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.232022 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ad7a22a-3e88-4447-b675-0a8339bd5f55-run-httpd\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.238225 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.239501 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.239497 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.240976 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-config-data\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.241782 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-scripts\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.261414 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk52r\" (UniqueName: \"kubernetes.io/projected/5ad7a22a-3e88-4447-b675-0a8339bd5f55-kube-api-access-bk52r\") pod \"ceilometer-0\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.360313 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.485075 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.536744 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcbx7\" (UniqueName: \"kubernetes.io/projected/35c66301-907d-40ed-afb3-bc354acece6c-kube-api-access-jcbx7\") pod \"35c66301-907d-40ed-afb3-bc354acece6c\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.536795 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35c66301-907d-40ed-afb3-bc354acece6c-logs\") pod \"35c66301-907d-40ed-afb3-bc354acece6c\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.536938 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c66301-907d-40ed-afb3-bc354acece6c-combined-ca-bundle\") pod \"35c66301-907d-40ed-afb3-bc354acece6c\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.536974 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c66301-907d-40ed-afb3-bc354acece6c-config-data\") pod \"35c66301-907d-40ed-afb3-bc354acece6c\" (UID: \"35c66301-907d-40ed-afb3-bc354acece6c\") " Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.538636 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c66301-907d-40ed-afb3-bc354acece6c-logs" (OuterVolumeSpecName: "logs") pod "35c66301-907d-40ed-afb3-bc354acece6c" (UID: "35c66301-907d-40ed-afb3-bc354acece6c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.541040 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c66301-907d-40ed-afb3-bc354acece6c-kube-api-access-jcbx7" (OuterVolumeSpecName: "kube-api-access-jcbx7") pod "35c66301-907d-40ed-afb3-bc354acece6c" (UID: "35c66301-907d-40ed-afb3-bc354acece6c"). InnerVolumeSpecName "kube-api-access-jcbx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.565917 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c66301-907d-40ed-afb3-bc354acece6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35c66301-907d-40ed-afb3-bc354acece6c" (UID: "35c66301-907d-40ed-afb3-bc354acece6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.583414 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c66301-907d-40ed-afb3-bc354acece6c-config-data" (OuterVolumeSpecName: "config-data") pod "35c66301-907d-40ed-afb3-bc354acece6c" (UID: "35c66301-907d-40ed-afb3-bc354acece6c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.638487 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcbx7\" (UniqueName: \"kubernetes.io/projected/35c66301-907d-40ed-afb3-bc354acece6c-kube-api-access-jcbx7\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.638517 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35c66301-907d-40ed-afb3-bc354acece6c-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.638527 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c66301-907d-40ed-afb3-bc354acece6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.638535 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c66301-907d-40ed-afb3-bc354acece6c-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:44 crc kubenswrapper[4794]: W0310 10:08:44.865021 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ad7a22a_3e88_4447_b675_0a8339bd5f55.slice/crio-0ecaf5d4f675c8d0c9dea14ae63403540987146039f8a7d1e5b775fd6df28002 WatchSource:0}: Error finding container 0ecaf5d4f675c8d0c9dea14ae63403540987146039f8a7d1e5b775fd6df28002: Status 404 returned error can't find the container with id 0ecaf5d4f675c8d0c9dea14ae63403540987146039f8a7d1e5b775fd6df28002 Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.873394 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.954891 4794 generic.go:334] "Generic (PLEG): container finished" podID="35c66301-907d-40ed-afb3-bc354acece6c" containerID="9acc5f862dab66c8a5e24eebcf3c82c037a59d49bfb52b3d73f73a14e9e9958a" exitCode=0 Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.954949 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.954966 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35c66301-907d-40ed-afb3-bc354acece6c","Type":"ContainerDied","Data":"9acc5f862dab66c8a5e24eebcf3c82c037a59d49bfb52b3d73f73a14e9e9958a"} Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.955310 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35c66301-907d-40ed-afb3-bc354acece6c","Type":"ContainerDied","Data":"ba2ce354562303a436d5a85cd3efdddac77398279ea32b798e7045886e2012a3"} Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.955404 4794 scope.go:117] "RemoveContainer" containerID="9acc5f862dab66c8a5e24eebcf3c82c037a59d49bfb52b3d73f73a14e9e9958a" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.957829 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ad7a22a-3e88-4447-b675-0a8339bd5f55","Type":"ContainerStarted","Data":"0ecaf5d4f675c8d0c9dea14ae63403540987146039f8a7d1e5b775fd6df28002"} Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.987648 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.990440 4794 scope.go:117] "RemoveContainer" containerID="568f66f4a8b27c2454e806e1a7d04f1e8eae2e56111d1087a639f0edfdf2a055" Mar 10 10:08:44 crc kubenswrapper[4794]: I0310 10:08:44.994933 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.007869 4794 scope.go:117] "RemoveContainer" containerID="9acc5f862dab66c8a5e24eebcf3c82c037a59d49bfb52b3d73f73a14e9e9958a" Mar 10 10:08:45 crc kubenswrapper[4794]: E0310 10:08:45.008274 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9acc5f862dab66c8a5e24eebcf3c82c037a59d49bfb52b3d73f73a14e9e9958a\": container with ID starting with 9acc5f862dab66c8a5e24eebcf3c82c037a59d49bfb52b3d73f73a14e9e9958a not found: ID does not exist" containerID="9acc5f862dab66c8a5e24eebcf3c82c037a59d49bfb52b3d73f73a14e9e9958a" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.008327 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9acc5f862dab66c8a5e24eebcf3c82c037a59d49bfb52b3d73f73a14e9e9958a"} err="failed to get container status \"9acc5f862dab66c8a5e24eebcf3c82c037a59d49bfb52b3d73f73a14e9e9958a\": rpc error: code = NotFound desc = could not find container \"9acc5f862dab66c8a5e24eebcf3c82c037a59d49bfb52b3d73f73a14e9e9958a\": container with ID starting with 9acc5f862dab66c8a5e24eebcf3c82c037a59d49bfb52b3d73f73a14e9e9958a not found: ID does not exist" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.008396 4794 scope.go:117] "RemoveContainer" containerID="568f66f4a8b27c2454e806e1a7d04f1e8eae2e56111d1087a639f0edfdf2a055" Mar 10 10:08:45 crc kubenswrapper[4794]: E0310 10:08:45.008635 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"568f66f4a8b27c2454e806e1a7d04f1e8eae2e56111d1087a639f0edfdf2a055\": container with ID starting with 568f66f4a8b27c2454e806e1a7d04f1e8eae2e56111d1087a639f0edfdf2a055 not found: ID does not exist" containerID="568f66f4a8b27c2454e806e1a7d04f1e8eae2e56111d1087a639f0edfdf2a055" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.008656 4794 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568f66f4a8b27c2454e806e1a7d04f1e8eae2e56111d1087a639f0edfdf2a055"} err="failed to get container status \"568f66f4a8b27c2454e806e1a7d04f1e8eae2e56111d1087a639f0edfdf2a055\": rpc error: code = NotFound desc = could not find container \"568f66f4a8b27c2454e806e1a7d04f1e8eae2e56111d1087a639f0edfdf2a055\": container with ID starting with 568f66f4a8b27c2454e806e1a7d04f1e8eae2e56111d1087a639f0edfdf2a055 not found: ID does not exist" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.011899 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 10:08:45 crc kubenswrapper[4794]: E0310 10:08:45.012245 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c66301-907d-40ed-afb3-bc354acece6c" containerName="nova-api-api" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.012260 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c66301-907d-40ed-afb3-bc354acece6c" containerName="nova-api-api" Mar 10 10:08:45 crc kubenswrapper[4794]: E0310 10:08:45.012275 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c66301-907d-40ed-afb3-bc354acece6c" containerName="nova-api-log" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.012281 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c66301-907d-40ed-afb3-bc354acece6c" containerName="nova-api-log" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.012448 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c66301-907d-40ed-afb3-bc354acece6c" containerName="nova-api-log" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.012477 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c66301-907d-40ed-afb3-bc354acece6c" containerName="nova-api-api" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.013287 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.019353 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.019373 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.019578 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.026369 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.150051 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.150092 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-internal-tls-certs\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.150274 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-config-data\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.151442 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61199562-76fc-4b20-a14e-6da4c1738232-logs\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.151521 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l8nh\" (UniqueName: \"kubernetes.io/projected/61199562-76fc-4b20-a14e-6da4c1738232-kube-api-access-9l8nh\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.151742 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-public-tls-certs\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.253945 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-public-tls-certs\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.254020 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.254044 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-internal-tls-certs\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.254126 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-config-data\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.254821 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61199562-76fc-4b20-a14e-6da4c1738232-logs\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.254868 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l8nh\" (UniqueName: \"kubernetes.io/projected/61199562-76fc-4b20-a14e-6da4c1738232-kube-api-access-9l8nh\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.255187 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61199562-76fc-4b20-a14e-6da4c1738232-logs\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.260008 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.260020 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-internal-tls-certs\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.261780 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-public-tls-certs\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.263357 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-config-data\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.275457 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l8nh\" (UniqueName: \"kubernetes.io/projected/61199562-76fc-4b20-a14e-6da4c1738232-kube-api-access-9l8nh\") pod \"nova-api-0\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " 
pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.331625 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.533205 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.552664 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.798022 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.981154 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61199562-76fc-4b20-a14e-6da4c1738232","Type":"ContainerStarted","Data":"ef4bba98527955cc98443dd4a1ddf94c6712a262eabb7c4ec9f8ed25f1388ef4"} Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.981206 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61199562-76fc-4b20-a14e-6da4c1738232","Type":"ContainerStarted","Data":"bafa54943158e9f1163fa92e2b76e4b121ee77cbaf837360f966963fe3031c56"} Mar 10 10:08:45 crc kubenswrapper[4794]: I0310 10:08:45.982875 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ad7a22a-3e88-4447-b675-0a8339bd5f55","Type":"ContainerStarted","Data":"e87552170b0f59d6ba42cedca611dc39fa99be4bf11ae3c34213d11839f96d1a"} Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.036584 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c66301-907d-40ed-afb3-bc354acece6c" path="/var/lib/kubelet/pods/35c66301-907d-40ed-afb3-bc354acece6c/volumes" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.037269 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbacde9d-ef79-425b-9959-afcff9bb2a2f" path="/var/lib/kubelet/pods/bbacde9d-ef79-425b-9959-afcff9bb2a2f/volumes" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.037789 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.225251 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-q5rgf"] Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.226857 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.231126 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.231482 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.240060 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q5rgf"] Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.278213 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-scripts\") pod \"nova-cell1-cell-mapping-q5rgf\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.278263 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q5rgf\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.278319 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-config-data\") pod \"nova-cell1-cell-mapping-q5rgf\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.278417 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk8jv\" (UniqueName: \"kubernetes.io/projected/c66c006d-3dd9-4544-8272-51226c41a2fd-kube-api-access-wk8jv\") pod \"nova-cell1-cell-mapping-q5rgf\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.379897 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-scripts\") pod \"nova-cell1-cell-mapping-q5rgf\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.379944 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q5rgf\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.379994 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-config-data\") pod \"nova-cell1-cell-mapping-q5rgf\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.380089 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk8jv\" (UniqueName: 
\"kubernetes.io/projected/c66c006d-3dd9-4544-8272-51226c41a2fd-kube-api-access-wk8jv\") pod \"nova-cell1-cell-mapping-q5rgf\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.385604 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q5rgf\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.397882 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-scripts\") pod \"nova-cell1-cell-mapping-q5rgf\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.400326 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-config-data\") pod \"nova-cell1-cell-mapping-q5rgf\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.402900 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk8jv\" (UniqueName: \"kubernetes.io/projected/c66c006d-3dd9-4544-8272-51226c41a2fd-kube-api-access-wk8jv\") pod \"nova-cell1-cell-mapping-q5rgf\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.647300 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:46 crc kubenswrapper[4794]: I0310 10:08:46.997035 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61199562-76fc-4b20-a14e-6da4c1738232","Type":"ContainerStarted","Data":"154560ffa9cfeb302b6ae22b1cdc0838499403243568c6df472ca1b6def1f145"} Mar 10 10:08:47 crc kubenswrapper[4794]: I0310 10:08:47.000136 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ad7a22a-3e88-4447-b675-0a8339bd5f55","Type":"ContainerStarted","Data":"cca51dc254ea928ff512f597522805c2f3a8b9ea69647a3b6f31bf5e631eac13"} Mar 10 10:08:47 crc kubenswrapper[4794]: I0310 10:08:47.021810 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.021788471 podStartE2EDuration="3.021788471s" podCreationTimestamp="2026-03-10 10:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:08:47.016936459 +0000 UTC m=+1475.773107277" watchObservedRunningTime="2026-03-10 10:08:47.021788471 +0000 UTC m=+1475.777959299" Mar 10 10:08:47 crc kubenswrapper[4794]: I0310 10:08:47.167862 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q5rgf"] Mar 10 10:08:47 crc kubenswrapper[4794]: I0310 10:08:47.469801 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:47 crc kubenswrapper[4794]: I0310 10:08:47.526184 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:47 crc kubenswrapper[4794]: I0310 10:08:47.704197 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hz6dc"] Mar 10 10:08:48 crc kubenswrapper[4794]: I0310 10:08:48.048726 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ad7a22a-3e88-4447-b675-0a8339bd5f55","Type":"ContainerStarted","Data":"efa120338ab691675da4aafa0ebfbff3b4647e0b19ef6e555a8542261261a114"} Mar 10 10:08:48 crc kubenswrapper[4794]: I0310 10:08:48.050091 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q5rgf" event={"ID":"c66c006d-3dd9-4544-8272-51226c41a2fd","Type":"ContainerStarted","Data":"6c9754ac702c4f23067de7bb692d3a96bce0ffba1d10777a374b930149ba51e5"} Mar 10 10:08:48 crc kubenswrapper[4794]: I0310 10:08:48.050119 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q5rgf" event={"ID":"c66c006d-3dd9-4544-8272-51226c41a2fd","Type":"ContainerStarted","Data":"cf051fd871316a9114e8b6cfa739ccdd2425dd9985881ad2fe57301b79e601f8"} Mar 10 10:08:48 crc kubenswrapper[4794]: I0310 10:08:48.065715 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-q5rgf" podStartSLOduration=2.065695601 podStartE2EDuration="2.065695601s" podCreationTimestamp="2026-03-10 10:08:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:08:48.061866322 +0000 UTC m=+1476.818037140" watchObservedRunningTime="2026-03-10 10:08:48.065695601 +0000 UTC m=+1476.821866429" Mar 10 10:08:48 crc kubenswrapper[4794]: I0310 10:08:48.365546 4794 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:08:48 crc kubenswrapper[4794]: I0310 10:08:48.457230 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7559df67df-6sc9d"] Mar 10 10:08:48 crc kubenswrapper[4794]: I0310 10:08:48.457816 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7559df67df-6sc9d" podUID="bc2846c8-b6db-4b5f-8bb8-998b50e64970" containerName="dnsmasq-dns" containerID="cri-o://c1a6f8e537285056480257d619eb73fbf78910635268ce5df54e252c9727c3b7" gracePeriod=10 Mar 10 10:08:48 crc kubenswrapper[4794]: I0310 10:08:48.952104 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.034506 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-config\") pod \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.034553 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-ovsdbserver-sb\") pod \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.034587 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-dns-svc\") pod \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.034609 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs787\" (UniqueName: \"kubernetes.io/projected/bc2846c8-b6db-4b5f-8bb8-998b50e64970-kube-api-access-xs787\") pod \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.034683 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-ovsdbserver-nb\") pod \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.034798 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-dns-swift-storage-0\") pod \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\" (UID: \"bc2846c8-b6db-4b5f-8bb8-998b50e64970\") " Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.040452 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc2846c8-b6db-4b5f-8bb8-998b50e64970-kube-api-access-xs787" (OuterVolumeSpecName: "kube-api-access-xs787") pod "bc2846c8-b6db-4b5f-8bb8-998b50e64970" (UID: "bc2846c8-b6db-4b5f-8bb8-998b50e64970"). InnerVolumeSpecName "kube-api-access-xs787". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.064302 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ad7a22a-3e88-4447-b675-0a8339bd5f55","Type":"ContainerStarted","Data":"eea46c0d3766a780d0c4aadf572fb639917b34eb0330835ed5b59a3cdae42cd7"} Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.065461 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.067456 4794 generic.go:334] "Generic (PLEG): container finished" podID="bc2846c8-b6db-4b5f-8bb8-998b50e64970" containerID="c1a6f8e537285056480257d619eb73fbf78910635268ce5df54e252c9727c3b7" exitCode=0 Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.067935 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7559df67df-6sc9d" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.068099 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7559df67df-6sc9d" event={"ID":"bc2846c8-b6db-4b5f-8bb8-998b50e64970","Type":"ContainerDied","Data":"c1a6f8e537285056480257d619eb73fbf78910635268ce5df54e252c9727c3b7"} Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.068118 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7559df67df-6sc9d" event={"ID":"bc2846c8-b6db-4b5f-8bb8-998b50e64970","Type":"ContainerDied","Data":"eceb2c4e331804ce49d828d9b9e8b23b418352076ff423d503b5a8aa40f5bb8c"} Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.068133 4794 scope.go:117] "RemoveContainer" containerID="c1a6f8e537285056480257d619eb73fbf78910635268ce5df54e252c9727c3b7" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.068307 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hz6dc" podUID="6d7ccc31-5316-4f75-999a-45cd1f36a9f1" containerName="registry-server" containerID="cri-o://27ff0203e6f08f5795cc198aaf2f05f8cca13fa2dcbf78def477d0d4dabdf1f4" gracePeriod=2 Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.092674 4794 scope.go:117] "RemoveContainer" containerID="cc8e47c20a1668aed38a81367397f1ee82817b86543da1bd05f2dd64f31ed821" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.099516 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.272489586 podStartE2EDuration="6.099496245s" podCreationTimestamp="2026-03-10 10:08:43 +0000 UTC" firstStartedPulling="2026-03-10 10:08:44.870745892 +0000 UTC m=+1473.626916720" lastFinishedPulling="2026-03-10 10:08:48.697752561 +0000 UTC m=+1477.453923379" observedRunningTime="2026-03-10 10:08:49.08721158 +0000 UTC m=+1477.843382398" watchObservedRunningTime="2026-03-10 10:08:49.099496245 +0000 UTC m=+1477.855667053" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.103661 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bc2846c8-b6db-4b5f-8bb8-998b50e64970" (UID: "bc2846c8-b6db-4b5f-8bb8-998b50e64970"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.119873 4794 scope.go:117] "RemoveContainer" containerID="c1a6f8e537285056480257d619eb73fbf78910635268ce5df54e252c9727c3b7" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.126024 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bc2846c8-b6db-4b5f-8bb8-998b50e64970" (UID: "bc2846c8-b6db-4b5f-8bb8-998b50e64970"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.126668 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-config" (OuterVolumeSpecName: "config") pod "bc2846c8-b6db-4b5f-8bb8-998b50e64970" (UID: "bc2846c8-b6db-4b5f-8bb8-998b50e64970"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:08:49 crc kubenswrapper[4794]: E0310 10:08:49.127070 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a6f8e537285056480257d619eb73fbf78910635268ce5df54e252c9727c3b7\": container with ID starting with c1a6f8e537285056480257d619eb73fbf78910635268ce5df54e252c9727c3b7 not found: ID does not exist" containerID="c1a6f8e537285056480257d619eb73fbf78910635268ce5df54e252c9727c3b7" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.127105 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a6f8e537285056480257d619eb73fbf78910635268ce5df54e252c9727c3b7"} err="failed to get container status \"c1a6f8e537285056480257d619eb73fbf78910635268ce5df54e252c9727c3b7\": rpc error: code = NotFound desc = could not find container \"c1a6f8e537285056480257d619eb73fbf78910635268ce5df54e252c9727c3b7\": container with ID starting with c1a6f8e537285056480257d619eb73fbf78910635268ce5df54e252c9727c3b7 not found: ID does not exist" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.127129 4794 scope.go:117] "RemoveContainer" containerID="cc8e47c20a1668aed38a81367397f1ee82817b86543da1bd05f2dd64f31ed821" Mar 10 10:08:49 crc kubenswrapper[4794]: E0310 10:08:49.127386 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc8e47c20a1668aed38a81367397f1ee82817b86543da1bd05f2dd64f31ed821\": container with ID starting with cc8e47c20a1668aed38a81367397f1ee82817b86543da1bd05f2dd64f31ed821 not found: ID does not exist" containerID="cc8e47c20a1668aed38a81367397f1ee82817b86543da1bd05f2dd64f31ed821" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.127421 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc8e47c20a1668aed38a81367397f1ee82817b86543da1bd05f2dd64f31ed821"} err="failed to get container status \"cc8e47c20a1668aed38a81367397f1ee82817b86543da1bd05f2dd64f31ed821\": rpc error: code = NotFound desc = could not find container \"cc8e47c20a1668aed38a81367397f1ee82817b86543da1bd05f2dd64f31ed821\": container with ID starting with cc8e47c20a1668aed38a81367397f1ee82817b86543da1bd05f2dd64f31ed821 not found: ID does not exist" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.136750 4794 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.136780 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.136789 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.136798 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs787\" (UniqueName: \"kubernetes.io/projected/bc2846c8-b6db-4b5f-8bb8-998b50e64970-kube-api-access-xs787\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.136842 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bc2846c8-b6db-4b5f-8bb8-998b50e64970" (UID: "bc2846c8-b6db-4b5f-8bb8-998b50e64970"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.141180 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc2846c8-b6db-4b5f-8bb8-998b50e64970" (UID: "bc2846c8-b6db-4b5f-8bb8-998b50e64970"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.238631 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.238944 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc2846c8-b6db-4b5f-8bb8-998b50e64970-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.413262 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7559df67df-6sc9d"] Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.424158 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7559df67df-6sc9d"] Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.438183 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.543600 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-catalog-content\") pod \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\" (UID: \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\") " Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.543692 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fghwl\" (UniqueName: \"kubernetes.io/projected/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-kube-api-access-fghwl\") pod \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\" (UID: \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\") " Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.544518 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-utilities\") pod \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\" (UID: \"6d7ccc31-5316-4f75-999a-45cd1f36a9f1\") " Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.545046 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-utilities" (OuterVolumeSpecName: "utilities") pod "6d7ccc31-5316-4f75-999a-45cd1f36a9f1" (UID: "6d7ccc31-5316-4f75-999a-45cd1f36a9f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.545323 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.547241 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-kube-api-access-fghwl" (OuterVolumeSpecName: "kube-api-access-fghwl") pod "6d7ccc31-5316-4f75-999a-45cd1f36a9f1" (UID: "6d7ccc31-5316-4f75-999a-45cd1f36a9f1"). InnerVolumeSpecName "kube-api-access-fghwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.597762 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d7ccc31-5316-4f75-999a-45cd1f36a9f1" (UID: "6d7ccc31-5316-4f75-999a-45cd1f36a9f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.647485 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:49 crc kubenswrapper[4794]: I0310 10:08:49.647525 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fghwl\" (UniqueName: \"kubernetes.io/projected/6d7ccc31-5316-4f75-999a-45cd1f36a9f1-kube-api-access-fghwl\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.009480 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc2846c8-b6db-4b5f-8bb8-998b50e64970" path="/var/lib/kubelet/pods/bc2846c8-b6db-4b5f-8bb8-998b50e64970/volumes" Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.080034 4794 generic.go:334] "Generic (PLEG): container finished" podID="6d7ccc31-5316-4f75-999a-45cd1f36a9f1" containerID="27ff0203e6f08f5795cc198aaf2f05f8cca13fa2dcbf78def477d0d4dabdf1f4" exitCode=0 Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.081087 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hz6dc" Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.081308 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hz6dc" event={"ID":"6d7ccc31-5316-4f75-999a-45cd1f36a9f1","Type":"ContainerDied","Data":"27ff0203e6f08f5795cc198aaf2f05f8cca13fa2dcbf78def477d0d4dabdf1f4"} Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.081360 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hz6dc" event={"ID":"6d7ccc31-5316-4f75-999a-45cd1f36a9f1","Type":"ContainerDied","Data":"e387f2db73044d1eb1371bf820417ce99825760618a0cb4c3033355183c6daea"} Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.081383 4794 scope.go:117] "RemoveContainer" containerID="27ff0203e6f08f5795cc198aaf2f05f8cca13fa2dcbf78def477d0d4dabdf1f4" Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.105829 4794 scope.go:117] "RemoveContainer" containerID="9f19ad9a25b8952d88b5b8df1a31437498f9669ecb56f4e5c9f282290fb28df8" Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.111147 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hz6dc"] Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.121797 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hz6dc"] Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.129272 4794 scope.go:117] "RemoveContainer" containerID="1909ca74782e471a986b8d7bbc83a5c2f0401ac34268d5ad41b636d8631c8704" Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.180492 4794 scope.go:117] "RemoveContainer" containerID="27ff0203e6f08f5795cc198aaf2f05f8cca13fa2dcbf78def477d0d4dabdf1f4" Mar 10 10:08:50 crc kubenswrapper[4794]: E0310 10:08:50.182491 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ff0203e6f08f5795cc198aaf2f05f8cca13fa2dcbf78def477d0d4dabdf1f4\": container with ID starting with 27ff0203e6f08f5795cc198aaf2f05f8cca13fa2dcbf78def477d0d4dabdf1f4 not found: ID does not exist" containerID="27ff0203e6f08f5795cc198aaf2f05f8cca13fa2dcbf78def477d0d4dabdf1f4" Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.182550 4794 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ff0203e6f08f5795cc198aaf2f05f8cca13fa2dcbf78def477d0d4dabdf1f4"} err="failed to get container status \"27ff0203e6f08f5795cc198aaf2f05f8cca13fa2dcbf78def477d0d4dabdf1f4\": rpc error: code = NotFound desc = could not find container \"27ff0203e6f08f5795cc198aaf2f05f8cca13fa2dcbf78def477d0d4dabdf1f4\": container with ID starting with 27ff0203e6f08f5795cc198aaf2f05f8cca13fa2dcbf78def477d0d4dabdf1f4 not found: ID does not exist" Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.182582 4794 scope.go:117] "RemoveContainer" containerID="9f19ad9a25b8952d88b5b8df1a31437498f9669ecb56f4e5c9f282290fb28df8" Mar 10 10:08:50 crc kubenswrapper[4794]: E0310 10:08:50.183078 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f19ad9a25b8952d88b5b8df1a31437498f9669ecb56f4e5c9f282290fb28df8\": container with ID starting with 9f19ad9a25b8952d88b5b8df1a31437498f9669ecb56f4e5c9f282290fb28df8 not found: ID does not exist" containerID="9f19ad9a25b8952d88b5b8df1a31437498f9669ecb56f4e5c9f282290fb28df8" Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.183116 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f19ad9a25b8952d88b5b8df1a31437498f9669ecb56f4e5c9f282290fb28df8"} err="failed to get container status \"9f19ad9a25b8952d88b5b8df1a31437498f9669ecb56f4e5c9f282290fb28df8\": rpc error: code = NotFound desc = could not find container \"9f19ad9a25b8952d88b5b8df1a31437498f9669ecb56f4e5c9f282290fb28df8\": container with ID starting with 9f19ad9a25b8952d88b5b8df1a31437498f9669ecb56f4e5c9f282290fb28df8 not found: ID does not exist" Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.183137 4794 scope.go:117] "RemoveContainer" containerID="1909ca74782e471a986b8d7bbc83a5c2f0401ac34268d5ad41b636d8631c8704" Mar 10 10:08:50 crc kubenswrapper[4794]: E0310 10:08:50.183516 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1909ca74782e471a986b8d7bbc83a5c2f0401ac34268d5ad41b636d8631c8704\": container with ID starting with 1909ca74782e471a986b8d7bbc83a5c2f0401ac34268d5ad41b636d8631c8704 not found: ID does not exist" containerID="1909ca74782e471a986b8d7bbc83a5c2f0401ac34268d5ad41b636d8631c8704" Mar 10 10:08:50 crc kubenswrapper[4794]: I0310 10:08:50.183549 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1909ca74782e471a986b8d7bbc83a5c2f0401ac34268d5ad41b636d8631c8704"} err="failed to get container status \"1909ca74782e471a986b8d7bbc83a5c2f0401ac34268d5ad41b636d8631c8704\": rpc error: code = NotFound desc = could not find container \"1909ca74782e471a986b8d7bbc83a5c2f0401ac34268d5ad41b636d8631c8704\": container with ID starting with 1909ca74782e471a986b8d7bbc83a5c2f0401ac34268d5ad41b636d8631c8704 not found: ID does not exist" Mar 10 10:08:52 crc kubenswrapper[4794]: I0310 10:08:52.030526 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d7ccc31-5316-4f75-999a-45cd1f36a9f1" path="/var/lib/kubelet/pods/6d7ccc31-5316-4f75-999a-45cd1f36a9f1/volumes" Mar 10 10:08:53 crc kubenswrapper[4794]: I0310 10:08:53.140147 4794 generic.go:334] "Generic (PLEG): container finished" podID="c66c006d-3dd9-4544-8272-51226c41a2fd" containerID="6c9754ac702c4f23067de7bb692d3a96bce0ffba1d10777a374b930149ba51e5" exitCode=0 Mar 10 10:08:53 crc kubenswrapper[4794]: I0310 
10:08:53.140215 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q5rgf" event={"ID":"c66c006d-3dd9-4544-8272-51226c41a2fd","Type":"ContainerDied","Data":"6c9754ac702c4f23067de7bb692d3a96bce0ffba1d10777a374b930149ba51e5"} Mar 10 10:08:54 crc kubenswrapper[4794]: I0310 10:08:54.526078 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:54 crc kubenswrapper[4794]: I0310 10:08:54.543015 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-config-data\") pod \"c66c006d-3dd9-4544-8272-51226c41a2fd\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " Mar 10 10:08:54 crc kubenswrapper[4794]: I0310 10:08:54.543158 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-combined-ca-bundle\") pod \"c66c006d-3dd9-4544-8272-51226c41a2fd\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " Mar 10 10:08:54 crc kubenswrapper[4794]: I0310 10:08:54.543387 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk8jv\" (UniqueName: \"kubernetes.io/projected/c66c006d-3dd9-4544-8272-51226c41a2fd-kube-api-access-wk8jv\") pod \"c66c006d-3dd9-4544-8272-51226c41a2fd\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " Mar 10 10:08:54 crc kubenswrapper[4794]: I0310 10:08:54.543466 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-scripts\") pod \"c66c006d-3dd9-4544-8272-51226c41a2fd\" (UID: \"c66c006d-3dd9-4544-8272-51226c41a2fd\") " Mar 10 10:08:54 crc kubenswrapper[4794]: I0310 10:08:54.549518 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-scripts" (OuterVolumeSpecName: "scripts") pod "c66c006d-3dd9-4544-8272-51226c41a2fd" (UID: "c66c006d-3dd9-4544-8272-51226c41a2fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:54 crc kubenswrapper[4794]: I0310 10:08:54.550624 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c66c006d-3dd9-4544-8272-51226c41a2fd-kube-api-access-wk8jv" (OuterVolumeSpecName: "kube-api-access-wk8jv") pod "c66c006d-3dd9-4544-8272-51226c41a2fd" (UID: "c66c006d-3dd9-4544-8272-51226c41a2fd"). InnerVolumeSpecName "kube-api-access-wk8jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:08:54 crc kubenswrapper[4794]: I0310 10:08:54.583183 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c66c006d-3dd9-4544-8272-51226c41a2fd" (UID: "c66c006d-3dd9-4544-8272-51226c41a2fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:54 crc kubenswrapper[4794]: I0310 10:08:54.594522 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-config-data" (OuterVolumeSpecName: "config-data") pod "c66c006d-3dd9-4544-8272-51226c41a2fd" (UID: "c66c006d-3dd9-4544-8272-51226c41a2fd"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:54 crc kubenswrapper[4794]: I0310 10:08:54.646412 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:54 crc kubenswrapper[4794]: I0310 10:08:54.646456 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:54 crc kubenswrapper[4794]: I0310 10:08:54.646470 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk8jv\" (UniqueName: \"kubernetes.io/projected/c66c006d-3dd9-4544-8272-51226c41a2fd-kube-api-access-wk8jv\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:54 crc kubenswrapper[4794]: I0310 10:08:54.646482 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66c006d-3dd9-4544-8272-51226c41a2fd-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:55 crc kubenswrapper[4794]: I0310 10:08:55.159965 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q5rgf" event={"ID":"c66c006d-3dd9-4544-8272-51226c41a2fd","Type":"ContainerDied","Data":"cf051fd871316a9114e8b6cfa739ccdd2425dd9985881ad2fe57301b79e601f8"} Mar 10 10:08:55 crc kubenswrapper[4794]: I0310 10:08:55.160258 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf051fd871316a9114e8b6cfa739ccdd2425dd9985881ad2fe57301b79e601f8" Mar 10 10:08:55 crc kubenswrapper[4794]: I0310 10:08:55.160050 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q5rgf" Mar 10 10:08:55 crc kubenswrapper[4794]: I0310 10:08:55.332092 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 10:08:55 crc kubenswrapper[4794]: I0310 10:08:55.332477 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 10:08:55 crc kubenswrapper[4794]: I0310 10:08:55.341734 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 10:08:55 crc kubenswrapper[4794]: I0310 10:08:55.342082 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="43594c25-d6b5-4e31-b751-3649526b219b" containerName="nova-scheduler-scheduler" containerID="cri-o://af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013" gracePeriod=30 Mar 10 10:08:55 crc kubenswrapper[4794]: I0310 10:08:55.356489 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:08:55 crc kubenswrapper[4794]: I0310 10:08:55.387869 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:08:55 crc kubenswrapper[4794]: I0310 10:08:55.388130 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" containerName="nova-metadata-log" containerID="cri-o://2f44a40ebd4b833eb003259a0a3572ebf0ae0a6431d4f908701d1fadfc35dc2b" gracePeriod=30 Mar 10 10:08:55 crc kubenswrapper[4794]: I0310 10:08:55.388287 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" containerName="nova-metadata-metadata" containerID="cri-o://090867c4656e57401d661ab880641f95ac9d80266e520315826cc074018acfd5" gracePeriod=30 Mar 10 10:08:56 crc kubenswrapper[4794]: I0310 10:08:56.171872 4794 generic.go:334] "Generic (PLEG): container finished" podID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" containerID="2f44a40ebd4b833eb003259a0a3572ebf0ae0a6431d4f908701d1fadfc35dc2b" exitCode=143 Mar 10 10:08:56 crc kubenswrapper[4794]: I0310 10:08:56.171969 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4886c6bf-be4d-4b43-8df2-90ce26e40bf1","Type":"ContainerDied","Data":"2f44a40ebd4b833eb003259a0a3572ebf0ae0a6431d4f908701d1fadfc35dc2b"} Mar 10 10:08:56 crc kubenswrapper[4794]: I0310 10:08:56.172080 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="61199562-76fc-4b20-a14e-6da4c1738232" containerName="nova-api-log" containerID="cri-o://ef4bba98527955cc98443dd4a1ddf94c6712a262eabb7c4ec9f8ed25f1388ef4" gracePeriod=30 Mar 10 10:08:56 crc kubenswrapper[4794]: I0310 10:08:56.172180 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="61199562-76fc-4b20-a14e-6da4c1738232" containerName="nova-api-api" containerID="cri-o://154560ffa9cfeb302b6ae22b1cdc0838499403243568c6df472ca1b6def1f145" gracePeriod=30 Mar 10 10:08:56 crc kubenswrapper[4794]: I0310 10:08:56.178836 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="61199562-76fc-4b20-a14e-6da4c1738232" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": EOF" Mar 10 10:08:56 crc kubenswrapper[4794]: I0310 10:08:56.178841 4794 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="61199562-76fc-4b20-a14e-6da4c1738232" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": EOF" Mar 10 10:08:56 crc kubenswrapper[4794]: E0310 10:08:56.256390 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 10:08:56 crc kubenswrapper[4794]: E0310 10:08:56.260353 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 10:08:56 crc kubenswrapper[4794]: E0310 10:08:56.261591 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 10:08:56 crc kubenswrapper[4794]: E0310 10:08:56.261689 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="43594c25-d6b5-4e31-b751-3649526b219b" containerName="nova-scheduler-scheduler" Mar 10 10:08:57 crc kubenswrapper[4794]: I0310 10:08:57.186851 4794 generic.go:334] "Generic (PLEG): container finished" podID="61199562-76fc-4b20-a14e-6da4c1738232" containerID="ef4bba98527955cc98443dd4a1ddf94c6712a262eabb7c4ec9f8ed25f1388ef4" exitCode=143 Mar 10 10:08:57 crc kubenswrapper[4794]: I0310 10:08:57.186928 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61199562-76fc-4b20-a14e-6da4c1738232","Type":"ContainerDied","Data":"ef4bba98527955cc98443dd4a1ddf94c6712a262eabb7c4ec9f8ed25f1388ef4"} Mar 10 10:08:58 crc kubenswrapper[4794]: I0310 10:08:58.530043 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:46128->10.217.0.199:8775: read: connection reset by peer" Mar 10 10:08:58 crc kubenswrapper[4794]: I0310 10:08:58.530103 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:46122->10.217.0.199:8775: read: connection reset by peer" Mar 10 10:08:58 crc kubenswrapper[4794]: I0310 10:08:58.992315 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.138552 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-combined-ca-bundle\") pod \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.138699 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-config-data\") pod \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.138727 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-nova-metadata-tls-certs\") pod \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.138752 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-logs\") pod \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.138814 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcjfb\" (UniqueName: \"kubernetes.io/projected/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-kube-api-access-wcjfb\") pod \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\" (UID: \"4886c6bf-be4d-4b43-8df2-90ce26e40bf1\") " Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.140195 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-logs" (OuterVolumeSpecName: "logs") pod "4886c6bf-be4d-4b43-8df2-90ce26e40bf1" (UID: "4886c6bf-be4d-4b43-8df2-90ce26e40bf1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.147886 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-kube-api-access-wcjfb" (OuterVolumeSpecName: "kube-api-access-wcjfb") pod "4886c6bf-be4d-4b43-8df2-90ce26e40bf1" (UID: "4886c6bf-be4d-4b43-8df2-90ce26e40bf1"). InnerVolumeSpecName "kube-api-access-wcjfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.173515 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-config-data" (OuterVolumeSpecName: "config-data") pod "4886c6bf-be4d-4b43-8df2-90ce26e40bf1" (UID: "4886c6bf-be4d-4b43-8df2-90ce26e40bf1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.188930 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4886c6bf-be4d-4b43-8df2-90ce26e40bf1" (UID: "4886c6bf-be4d-4b43-8df2-90ce26e40bf1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.197976 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4886c6bf-be4d-4b43-8df2-90ce26e40bf1" (UID: "4886c6bf-be4d-4b43-8df2-90ce26e40bf1"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.214884 4794 generic.go:334] "Generic (PLEG): container finished" podID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" containerID="090867c4656e57401d661ab880641f95ac9d80266e520315826cc074018acfd5" exitCode=0 Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.214933 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4886c6bf-be4d-4b43-8df2-90ce26e40bf1","Type":"ContainerDied","Data":"090867c4656e57401d661ab880641f95ac9d80266e520315826cc074018acfd5"} Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.214971 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4886c6bf-be4d-4b43-8df2-90ce26e40bf1","Type":"ContainerDied","Data":"d75a78d46d1db4d984400e32993c2b4f6e5d0b2e7918b14dca38cb401d3d0df2"} Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.214979 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.214994 4794 scope.go:117] "RemoveContainer" containerID="090867c4656e57401d661ab880641f95ac9d80266e520315826cc074018acfd5" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.240824 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcjfb\" (UniqueName: \"kubernetes.io/projected/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-kube-api-access-wcjfb\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.240858 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.240872 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.240883 4794 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.240894 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4886c6bf-be4d-4b43-8df2-90ce26e40bf1-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.268185 4794 scope.go:117] "RemoveContainer" containerID="2f44a40ebd4b833eb003259a0a3572ebf0ae0a6431d4f908701d1fadfc35dc2b" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.269770 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.284775 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] 
Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.297387 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:08:59 crc kubenswrapper[4794]: E0310 10:08:59.297834 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" containerName="nova-metadata-log" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.297859 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" containerName="nova-metadata-log" Mar 10 10:08:59 crc kubenswrapper[4794]: E0310 10:08:59.297870 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d7ccc31-5316-4f75-999a-45cd1f36a9f1" containerName="registry-server" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.297879 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d7ccc31-5316-4f75-999a-45cd1f36a9f1" containerName="registry-server" Mar 10 10:08:59 crc kubenswrapper[4794]: E0310 10:08:59.297897 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2846c8-b6db-4b5f-8bb8-998b50e64970" containerName="dnsmasq-dns" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.297906 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2846c8-b6db-4b5f-8bb8-998b50e64970" containerName="dnsmasq-dns" Mar 10 10:08:59 crc kubenswrapper[4794]: E0310 10:08:59.297922 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d7ccc31-5316-4f75-999a-45cd1f36a9f1" containerName="extract-content" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.297930 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d7ccc31-5316-4f75-999a-45cd1f36a9f1" containerName="extract-content" Mar 10 10:08:59 crc kubenswrapper[4794]: E0310 10:08:59.297963 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d7ccc31-5316-4f75-999a-45cd1f36a9f1" containerName="extract-utilities" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.297972 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d7ccc31-5316-4f75-999a-45cd1f36a9f1" containerName="extract-utilities" Mar 10 10:08:59 crc kubenswrapper[4794]: E0310 10:08:59.297989 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" containerName="nova-metadata-metadata" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.297997 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" containerName="nova-metadata-metadata" Mar 10 10:08:59 crc kubenswrapper[4794]: E0310 10:08:59.298015 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66c006d-3dd9-4544-8272-51226c41a2fd" containerName="nova-manage" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.298024 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66c006d-3dd9-4544-8272-51226c41a2fd" containerName="nova-manage" Mar 10 10:08:59 crc kubenswrapper[4794]: E0310 10:08:59.298036 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2846c8-b6db-4b5f-8bb8-998b50e64970" containerName="init" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.298043 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2846c8-b6db-4b5f-8bb8-998b50e64970" containerName="init" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.298262 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c66c006d-3dd9-4544-8272-51226c41a2fd" containerName="nova-manage" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 
10:08:59.298275 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d7ccc31-5316-4f75-999a-45cd1f36a9f1" containerName="registry-server" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.298291 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" containerName="nova-metadata-metadata" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.298302 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2846c8-b6db-4b5f-8bb8-998b50e64970" containerName="dnsmasq-dns" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.298318 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" containerName="nova-metadata-log" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.299458 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.301858 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.302178 4794 scope.go:117] "RemoveContainer" containerID="090867c4656e57401d661ab880641f95ac9d80266e520315826cc074018acfd5" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.302546 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 10:08:59 crc kubenswrapper[4794]: E0310 10:08:59.302755 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"090867c4656e57401d661ab880641f95ac9d80266e520315826cc074018acfd5\": container with ID starting with 090867c4656e57401d661ab880641f95ac9d80266e520315826cc074018acfd5 not found: ID does not exist" containerID="090867c4656e57401d661ab880641f95ac9d80266e520315826cc074018acfd5" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.302795 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"090867c4656e57401d661ab880641f95ac9d80266e520315826cc074018acfd5"} err="failed to get container status \"090867c4656e57401d661ab880641f95ac9d80266e520315826cc074018acfd5\": rpc error: code = NotFound desc = could not find container \"090867c4656e57401d661ab880641f95ac9d80266e520315826cc074018acfd5\": container with ID starting with 090867c4656e57401d661ab880641f95ac9d80266e520315826cc074018acfd5 not found: ID does not exist" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.302822 4794 scope.go:117] "RemoveContainer" containerID="2f44a40ebd4b833eb003259a0a3572ebf0ae0a6431d4f908701d1fadfc35dc2b" Mar 10 10:08:59 crc kubenswrapper[4794]: E0310 10:08:59.305755 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f44a40ebd4b833eb003259a0a3572ebf0ae0a6431d4f908701d1fadfc35dc2b\": container with ID starting with 2f44a40ebd4b833eb003259a0a3572ebf0ae0a6431d4f908701d1fadfc35dc2b not found: ID does not exist" containerID="2f44a40ebd4b833eb003259a0a3572ebf0ae0a6431d4f908701d1fadfc35dc2b" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.305786 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f44a40ebd4b833eb003259a0a3572ebf0ae0a6431d4f908701d1fadfc35dc2b"} err="failed to get container status \"2f44a40ebd4b833eb003259a0a3572ebf0ae0a6431d4f908701d1fadfc35dc2b\": rpc error: code = NotFound desc = could not find 
container \"2f44a40ebd4b833eb003259a0a3572ebf0ae0a6431d4f908701d1fadfc35dc2b\": container with ID starting with 2f44a40ebd4b833eb003259a0a3572ebf0ae0a6431d4f908701d1fadfc35dc2b not found: ID does not exist" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.307021 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.342444 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b5facf1-8bc0-497d-925f-ee382862cf22-logs\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.342665 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.342691 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.342772 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-config-data\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.342900 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc6qg\" (UniqueName: \"kubernetes.io/projected/6b5facf1-8bc0-497d-925f-ee382862cf22-kube-api-access-rc6qg\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.444511 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b5facf1-8bc0-497d-925f-ee382862cf22-logs\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.444946 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.445158 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b5facf1-8bc0-497d-925f-ee382862cf22-logs\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.445167 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.445270 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-config-data\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.445431 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc6qg\" (UniqueName: \"kubernetes.io/projected/6b5facf1-8bc0-497d-925f-ee382862cf22-kube-api-access-rc6qg\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.450427 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.450516 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.450929 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-config-data\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.461994 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc6qg\" (UniqueName: \"kubernetes.io/projected/6b5facf1-8bc0-497d-925f-ee382862cf22-kube-api-access-rc6qg\") pod \"nova-metadata-0\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " pod="openstack/nova-metadata-0" Mar 10 10:08:59 crc kubenswrapper[4794]: I0310 10:08:59.615406 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.013785 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4886c6bf-be4d-4b43-8df2-90ce26e40bf1" path="/var/lib/kubelet/pods/4886c6bf-be4d-4b43-8df2-90ce26e40bf1/volumes" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.059398 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.145986 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.227736 4794 generic.go:334] "Generic (PLEG): container finished" podID="43594c25-d6b5-4e31-b751-3649526b219b" containerID="af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013" exitCode=0 Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.227800 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43594c25-d6b5-4e31-b751-3649526b219b","Type":"ContainerDied","Data":"af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013"} Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.227826 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43594c25-d6b5-4e31-b751-3649526b219b","Type":"ContainerDied","Data":"f7b46753c26e2d65b8d2a6ace22e377c796e1085f9b723e0dcab48ab1b33210b"} Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.227828 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.227843 4794 scope.go:117] "RemoveContainer" containerID="af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.232751 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6b5facf1-8bc0-497d-925f-ee382862cf22","Type":"ContainerStarted","Data":"14dbdb7d7cc1e4aa9caa7e5b71ac3344e5c3eb3a18b977211732c84b49d8764c"} Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.250036 4794 scope.go:117] "RemoveContainer" containerID="af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013" Mar 10 10:09:00 crc kubenswrapper[4794]: E0310 10:09:00.250483 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013\": container with ID starting with af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013 not found: ID does not exist" containerID="af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.250550 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013"} err="failed to get container status \"af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013\": rpc error: code = NotFound desc = could not find container \"af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013\": container with ID starting with af3f12191565dd5dcc24b1a062feef4911592964922b1d240a36b5411cf7f013 not found: ID does not exist" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.263147 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43594c25-d6b5-4e31-b751-3649526b219b-config-data\") pod \"43594c25-d6b5-4e31-b751-3649526b219b\" (UID: \"43594c25-d6b5-4e31-b751-3649526b219b\") " Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.263202 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45s6m\" (UniqueName: \"kubernetes.io/projected/43594c25-d6b5-4e31-b751-3649526b219b-kube-api-access-45s6m\") pod \"43594c25-d6b5-4e31-b751-3649526b219b\" (UID: \"43594c25-d6b5-4e31-b751-3649526b219b\") " Mar 10 10:09:00 crc 
kubenswrapper[4794]: I0310 10:09:00.263299 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43594c25-d6b5-4e31-b751-3649526b219b-combined-ca-bundle\") pod \"43594c25-d6b5-4e31-b751-3649526b219b\" (UID: \"43594c25-d6b5-4e31-b751-3649526b219b\") " Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.267537 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43594c25-d6b5-4e31-b751-3649526b219b-kube-api-access-45s6m" (OuterVolumeSpecName: "kube-api-access-45s6m") pod "43594c25-d6b5-4e31-b751-3649526b219b" (UID: "43594c25-d6b5-4e31-b751-3649526b219b"). InnerVolumeSpecName "kube-api-access-45s6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.298149 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43594c25-d6b5-4e31-b751-3649526b219b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43594c25-d6b5-4e31-b751-3649526b219b" (UID: "43594c25-d6b5-4e31-b751-3649526b219b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.304953 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43594c25-d6b5-4e31-b751-3649526b219b-config-data" (OuterVolumeSpecName: "config-data") pod "43594c25-d6b5-4e31-b751-3649526b219b" (UID: "43594c25-d6b5-4e31-b751-3649526b219b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.365433 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43594c25-d6b5-4e31-b751-3649526b219b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.365475 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45s6m\" (UniqueName: \"kubernetes.io/projected/43594c25-d6b5-4e31-b751-3649526b219b-kube-api-access-45s6m\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.365486 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43594c25-d6b5-4e31-b751-3649526b219b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.557342 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.566184 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.576672 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 10:09:00 crc kubenswrapper[4794]: E0310 10:09:00.577224 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43594c25-d6b5-4e31-b751-3649526b219b" containerName="nova-scheduler-scheduler" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.577289 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="43594c25-d6b5-4e31-b751-3649526b219b" containerName="nova-scheduler-scheduler" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.577586 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="43594c25-d6b5-4e31-b751-3649526b219b" 
containerName="nova-scheduler-scheduler" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.578415 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.588710 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.589175 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.771353 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6e6c324-8bba-4585-9ffc-afad751594d7-config-data\") pod \"nova-scheduler-0\" (UID: \"d6e6c324-8bba-4585-9ffc-afad751594d7\") " pod="openstack/nova-scheduler-0" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.771425 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6c324-8bba-4585-9ffc-afad751594d7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d6e6c324-8bba-4585-9ffc-afad751594d7\") " pod="openstack/nova-scheduler-0" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.771495 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk7pd\" (UniqueName: \"kubernetes.io/projected/d6e6c324-8bba-4585-9ffc-afad751594d7-kube-api-access-mk7pd\") pod \"nova-scheduler-0\" (UID: \"d6e6c324-8bba-4585-9ffc-afad751594d7\") " pod="openstack/nova-scheduler-0" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.873510 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6e6c324-8bba-4585-9ffc-afad751594d7-config-data\") pod \"nova-scheduler-0\" (UID: \"d6e6c324-8bba-4585-9ffc-afad751594d7\") " pod="openstack/nova-scheduler-0" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.873584 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6c324-8bba-4585-9ffc-afad751594d7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d6e6c324-8bba-4585-9ffc-afad751594d7\") " pod="openstack/nova-scheduler-0" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.873661 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk7pd\" (UniqueName: \"kubernetes.io/projected/d6e6c324-8bba-4585-9ffc-afad751594d7-kube-api-access-mk7pd\") pod \"nova-scheduler-0\" (UID: \"d6e6c324-8bba-4585-9ffc-afad751594d7\") " pod="openstack/nova-scheduler-0" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.877476 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6c324-8bba-4585-9ffc-afad751594d7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d6e6c324-8bba-4585-9ffc-afad751594d7\") " pod="openstack/nova-scheduler-0" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.877520 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6e6c324-8bba-4585-9ffc-afad751594d7-config-data\") pod \"nova-scheduler-0\" (UID: \"d6e6c324-8bba-4585-9ffc-afad751594d7\") " pod="openstack/nova-scheduler-0" Mar 10 10:09:00 crc kubenswrapper[4794]: 
I0310 10:09:00.902877 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk7pd\" (UniqueName: \"kubernetes.io/projected/d6e6c324-8bba-4585-9ffc-afad751594d7-kube-api-access-mk7pd\") pod \"nova-scheduler-0\" (UID: \"d6e6c324-8bba-4585-9ffc-afad751594d7\") " pod="openstack/nova-scheduler-0" Mar 10 10:09:00 crc kubenswrapper[4794]: I0310 10:09:00.997210 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 10:09:01 crc kubenswrapper[4794]: I0310 10:09:01.245692 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6b5facf1-8bc0-497d-925f-ee382862cf22","Type":"ContainerStarted","Data":"2db723c104c847e70b7c742bd2157495764accb8230b8d1c494ef49084c7b620"} Mar 10 10:09:01 crc kubenswrapper[4794]: I0310 10:09:01.246016 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6b5facf1-8bc0-497d-925f-ee382862cf22","Type":"ContainerStarted","Data":"a0b3e91e1e10cd24f85913c9dfa160aee857a4f7ffe7e21e4b28aa608366d44a"} Mar 10 10:09:01 crc kubenswrapper[4794]: I0310 10:09:01.274886 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.274868013 podStartE2EDuration="2.274868013s" podCreationTimestamp="2026-03-10 10:08:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:09:01.263821057 +0000 UTC m=+1490.019991895" watchObservedRunningTime="2026-03-10 10:09:01.274868013 +0000 UTC m=+1490.031038831" Mar 10 10:09:01 crc kubenswrapper[4794]: I0310 10:09:01.411176 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 10:09:01 crc kubenswrapper[4794]: W0310 10:09:01.413029 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6e6c324_8bba_4585_9ffc_afad751594d7.slice/crio-8f90bc40a5012204eeb3c53e19ecdf2f7e253b0770393dff4e6c78ac9f448b9a WatchSource:0}: Error finding container 8f90bc40a5012204eeb3c53e19ecdf2f7e253b0770393dff4e6c78ac9f448b9a: Status 404 returned error can't find the container with id 8f90bc40a5012204eeb3c53e19ecdf2f7e253b0770393dff4e6c78ac9f448b9a Mar 10 10:09:01 crc kubenswrapper[4794]: I0310 10:09:01.978267 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.012535 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43594c25-d6b5-4e31-b751-3649526b219b" path="/var/lib/kubelet/pods/43594c25-d6b5-4e31-b751-3649526b219b/volumes" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.105929 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-public-tls-certs\") pod \"61199562-76fc-4b20-a14e-6da4c1738232\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.106094 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-combined-ca-bundle\") pod \"61199562-76fc-4b20-a14e-6da4c1738232\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.106128 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61199562-76fc-4b20-a14e-6da4c1738232-logs\") pod \"61199562-76fc-4b20-a14e-6da4c1738232\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.106160 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l8nh\" (UniqueName: \"kubernetes.io/projected/61199562-76fc-4b20-a14e-6da4c1738232-kube-api-access-9l8nh\") pod \"61199562-76fc-4b20-a14e-6da4c1738232\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.106245 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-config-data\") pod \"61199562-76fc-4b20-a14e-6da4c1738232\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.106374 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-internal-tls-certs\") pod \"61199562-76fc-4b20-a14e-6da4c1738232\" (UID: \"61199562-76fc-4b20-a14e-6da4c1738232\") " Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.109354 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61199562-76fc-4b20-a14e-6da4c1738232-logs" (OuterVolumeSpecName: "logs") pod "61199562-76fc-4b20-a14e-6da4c1738232" (UID: "61199562-76fc-4b20-a14e-6da4c1738232"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.117506 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61199562-76fc-4b20-a14e-6da4c1738232-kube-api-access-9l8nh" (OuterVolumeSpecName: "kube-api-access-9l8nh") pod "61199562-76fc-4b20-a14e-6da4c1738232" (UID: "61199562-76fc-4b20-a14e-6da4c1738232"). InnerVolumeSpecName "kube-api-access-9l8nh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.196273 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61199562-76fc-4b20-a14e-6da4c1738232" (UID: "61199562-76fc-4b20-a14e-6da4c1738232"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.208488 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.208520 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61199562-76fc-4b20-a14e-6da4c1738232-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.208531 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l8nh\" (UniqueName: \"kubernetes.io/projected/61199562-76fc-4b20-a14e-6da4c1738232-kube-api-access-9l8nh\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.212956 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "61199562-76fc-4b20-a14e-6da4c1738232" (UID: "61199562-76fc-4b20-a14e-6da4c1738232"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.215303 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-config-data" (OuterVolumeSpecName: "config-data") pod "61199562-76fc-4b20-a14e-6da4c1738232" (UID: "61199562-76fc-4b20-a14e-6da4c1738232"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.239925 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "61199562-76fc-4b20-a14e-6da4c1738232" (UID: "61199562-76fc-4b20-a14e-6da4c1738232"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.256987 4794 generic.go:334] "Generic (PLEG): container finished" podID="61199562-76fc-4b20-a14e-6da4c1738232" containerID="154560ffa9cfeb302b6ae22b1cdc0838499403243568c6df472ca1b6def1f145" exitCode=0 Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.257040 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.257052 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61199562-76fc-4b20-a14e-6da4c1738232","Type":"ContainerDied","Data":"154560ffa9cfeb302b6ae22b1cdc0838499403243568c6df472ca1b6def1f145"} Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.257080 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61199562-76fc-4b20-a14e-6da4c1738232","Type":"ContainerDied","Data":"bafa54943158e9f1163fa92e2b76e4b121ee77cbaf837360f966963fe3031c56"} Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.257099 4794 scope.go:117] "RemoveContainer" containerID="154560ffa9cfeb302b6ae22b1cdc0838499403243568c6df472ca1b6def1f145" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.259238 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d6e6c324-8bba-4585-9ffc-afad751594d7","Type":"ContainerStarted","Data":"a6cc45af0b715a94827c03b09ad8044d92a3ff4ebaf066479557fde441a9fe8f"} Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.259269 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d6e6c324-8bba-4585-9ffc-afad751594d7","Type":"ContainerStarted","Data":"8f90bc40a5012204eeb3c53e19ecdf2f7e253b0770393dff4e6c78ac9f448b9a"} Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.276664 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.276648015 podStartE2EDuration="2.276648015s" podCreationTimestamp="2026-03-10 10:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:09:02.273536057 +0000 UTC m=+1491.029706875" watchObservedRunningTime="2026-03-10 10:09:02.276648015 +0000 UTC m=+1491.032818833" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.279037 4794 scope.go:117] "RemoveContainer" containerID="ef4bba98527955cc98443dd4a1ddf94c6712a262eabb7c4ec9f8ed25f1388ef4" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.303000 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.309735 4794 scope.go:117] "RemoveContainer" containerID="154560ffa9cfeb302b6ae22b1cdc0838499403243568c6df472ca1b6def1f145" Mar 10 10:09:02 crc kubenswrapper[4794]: E0310 10:09:02.310457 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"154560ffa9cfeb302b6ae22b1cdc0838499403243568c6df472ca1b6def1f145\": container with ID starting with 154560ffa9cfeb302b6ae22b1cdc0838499403243568c6df472ca1b6def1f145 not found: ID does not exist" containerID="154560ffa9cfeb302b6ae22b1cdc0838499403243568c6df472ca1b6def1f145" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.310498 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"154560ffa9cfeb302b6ae22b1cdc0838499403243568c6df472ca1b6def1f145"} err="failed to get container status \"154560ffa9cfeb302b6ae22b1cdc0838499403243568c6df472ca1b6def1f145\": rpc error: code = NotFound desc = could not find container \"154560ffa9cfeb302b6ae22b1cdc0838499403243568c6df472ca1b6def1f145\": container with ID starting with 154560ffa9cfeb302b6ae22b1cdc0838499403243568c6df472ca1b6def1f145 not found: ID does not exist" Mar 10 
10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.310521 4794 scope.go:117] "RemoveContainer" containerID="ef4bba98527955cc98443dd4a1ddf94c6712a262eabb7c4ec9f8ed25f1388ef4" Mar 10 10:09:02 crc kubenswrapper[4794]: E0310 10:09:02.310788 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef4bba98527955cc98443dd4a1ddf94c6712a262eabb7c4ec9f8ed25f1388ef4\": container with ID starting with ef4bba98527955cc98443dd4a1ddf94c6712a262eabb7c4ec9f8ed25f1388ef4 not found: ID does not exist" containerID="ef4bba98527955cc98443dd4a1ddf94c6712a262eabb7c4ec9f8ed25f1388ef4" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.310808 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef4bba98527955cc98443dd4a1ddf94c6712a262eabb7c4ec9f8ed25f1388ef4"} err="failed to get container status \"ef4bba98527955cc98443dd4a1ddf94c6712a262eabb7c4ec9f8ed25f1388ef4\": rpc error: code = NotFound desc = could not find container \"ef4bba98527955cc98443dd4a1ddf94c6712a262eabb7c4ec9f8ed25f1388ef4\": container with ID starting with ef4bba98527955cc98443dd4a1ddf94c6712a262eabb7c4ec9f8ed25f1388ef4 not found: ID does not exist" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.312393 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.312432 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.312450 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61199562-76fc-4b20-a14e-6da4c1738232-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.314415 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.325665 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 10:09:02 crc kubenswrapper[4794]: E0310 10:09:02.326049 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61199562-76fc-4b20-a14e-6da4c1738232" containerName="nova-api-log" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.326069 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="61199562-76fc-4b20-a14e-6da4c1738232" containerName="nova-api-log" Mar 10 10:09:02 crc kubenswrapper[4794]: E0310 10:09:02.326131 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61199562-76fc-4b20-a14e-6da4c1738232" containerName="nova-api-api" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.326142 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="61199562-76fc-4b20-a14e-6da4c1738232" containerName="nova-api-api" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.326381 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="61199562-76fc-4b20-a14e-6da4c1738232" containerName="nova-api-api" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.326413 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="61199562-76fc-4b20-a14e-6da4c1738232" containerName="nova-api-log" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 
10:09:02.327468 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.329677 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.330250 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.332993 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.349399 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.413680 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.413753 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2446b2bc-c3c8-465d-a808-981664228cba-logs\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.413786 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql47l\" (UniqueName: \"kubernetes.io/projected/2446b2bc-c3c8-465d-a808-981664228cba-kube-api-access-ql47l\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.413854 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.413905 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-config-data\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.413938 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-public-tls-certs\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.515233 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2446b2bc-c3c8-465d-a808-981664228cba-logs\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.515619 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql47l\" (UniqueName: 
\"kubernetes.io/projected/2446b2bc-c3c8-465d-a808-981664228cba-kube-api-access-ql47l\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.515707 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.515776 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-config-data\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.515823 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2446b2bc-c3c8-465d-a808-981664228cba-logs\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.515834 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-public-tls-certs\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.516066 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.520455 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.520465 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-public-tls-certs\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.520750 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-config-data\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.520888 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.543960 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql47l\" (UniqueName: 
\"kubernetes.io/projected/2446b2bc-c3c8-465d-a808-981664228cba-kube-api-access-ql47l\") pod \"nova-api-0\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") " pod="openstack/nova-api-0" Mar 10 10:09:02 crc kubenswrapper[4794]: I0310 10:09:02.651104 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 10:09:03 crc kubenswrapper[4794]: I0310 10:09:03.100502 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:09:03 crc kubenswrapper[4794]: I0310 10:09:03.270666 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2446b2bc-c3c8-465d-a808-981664228cba","Type":"ContainerStarted","Data":"2fa7b2465b193518461982c6dd45b272c119057d175f0a6e5ef8f619368f117f"} Mar 10 10:09:04 crc kubenswrapper[4794]: I0310 10:09:04.011042 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61199562-76fc-4b20-a14e-6da4c1738232" path="/var/lib/kubelet/pods/61199562-76fc-4b20-a14e-6da4c1738232/volumes" Mar 10 10:09:04 crc kubenswrapper[4794]: I0310 10:09:04.287425 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2446b2bc-c3c8-465d-a808-981664228cba","Type":"ContainerStarted","Data":"826580e64ddfef3da4fda6ec39829dbc5dafbc69c66385fc8ca2a2bcd5ca60d8"} Mar 10 10:09:04 crc kubenswrapper[4794]: I0310 10:09:04.287468 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2446b2bc-c3c8-465d-a808-981664228cba","Type":"ContainerStarted","Data":"6e6cf54d16f75007332086d04328fb26dd120e8114419987c7377f33c0bef36c"} Mar 10 10:09:04 crc kubenswrapper[4794]: I0310 10:09:04.313220 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.313200567 podStartE2EDuration="2.313200567s" podCreationTimestamp="2026-03-10 10:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:09:04.303667899 +0000 UTC m=+1493.059838727" watchObservedRunningTime="2026-03-10 10:09:04.313200567 +0000 UTC m=+1493.069371385" Mar 10 10:09:04 crc kubenswrapper[4794]: I0310 10:09:04.616106 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 10:09:04 crc kubenswrapper[4794]: I0310 10:09:04.616188 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 10:09:05 crc kubenswrapper[4794]: I0310 10:09:05.998131 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 10:09:09 crc kubenswrapper[4794]: I0310 10:09:09.616258 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 10:09:09 crc kubenswrapper[4794]: I0310 10:09:09.616436 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 10:09:10 crc kubenswrapper[4794]: I0310 10:09:10.630536 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6b5facf1-8bc0-497d-925f-ee382862cf22" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 10:09:10 crc kubenswrapper[4794]: I0310 10:09:10.630539 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="6b5facf1-8bc0-497d-925f-ee382862cf22" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 10:09:10 crc kubenswrapper[4794]: I0310 10:09:10.997405 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 10:09:11 crc kubenswrapper[4794]: I0310 10:09:11.025080 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 10:09:11 crc kubenswrapper[4794]: I0310 10:09:11.386128 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 10:09:12 crc kubenswrapper[4794]: I0310 10:09:12.651224 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 10:09:12 crc kubenswrapper[4794]: I0310 10:09:12.651284 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 10:09:13 crc kubenswrapper[4794]: I0310 10:09:13.664478 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2446b2bc-c3c8-465d-a808-981664228cba" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 10:09:13 crc kubenswrapper[4794]: I0310 10:09:13.664619 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2446b2bc-c3c8-465d-a808-981664228cba" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 10:09:14 crc kubenswrapper[4794]: I0310 10:09:14.379507 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 10:09:19 crc kubenswrapper[4794]: I0310 10:09:19.621939 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 10:09:19 crc kubenswrapper[4794]: I0310 10:09:19.624300 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 10:09:19 crc kubenswrapper[4794]: I0310 10:09:19.627162 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 10:09:20 crc kubenswrapper[4794]: I0310 10:09:20.473053 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 10:09:22 crc kubenswrapper[4794]: I0310 10:09:22.660825 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 10:09:22 crc kubenswrapper[4794]: I0310 10:09:22.661911 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 10:09:22 crc kubenswrapper[4794]: I0310 10:09:22.665546 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 10:09:22 crc kubenswrapper[4794]: I0310 10:09:22.668290 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 10:09:22 crc kubenswrapper[4794]: I0310 10:09:22.967185 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:09:22 crc kubenswrapper[4794]: I0310 10:09:22.967248 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:09:23 crc kubenswrapper[4794]: I0310 10:09:23.505289 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 10:09:23 crc kubenswrapper[4794]: I0310 10:09:23.513151 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 10:09:31 crc kubenswrapper[4794]: I0310 10:09:31.314800 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ncstb"] Mar 10 10:09:31 crc kubenswrapper[4794]: I0310 10:09:31.317231 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:31 crc kubenswrapper[4794]: I0310 10:09:31.349618 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ncstb"] Mar 10 10:09:31 crc kubenswrapper[4794]: I0310 10:09:31.399465 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64gcn\" (UniqueName: \"kubernetes.io/projected/df649851-7d90-41e8-80e9-f7fd44d77af0-kube-api-access-64gcn\") pod \"certified-operators-ncstb\" (UID: \"df649851-7d90-41e8-80e9-f7fd44d77af0\") " pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:31 crc kubenswrapper[4794]: I0310 10:09:31.399591 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df649851-7d90-41e8-80e9-f7fd44d77af0-utilities\") pod \"certified-operators-ncstb\" (UID: \"df649851-7d90-41e8-80e9-f7fd44d77af0\") " pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:31 crc kubenswrapper[4794]: I0310 10:09:31.399624 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df649851-7d90-41e8-80e9-f7fd44d77af0-catalog-content\") pod \"certified-operators-ncstb\" (UID: \"df649851-7d90-41e8-80e9-f7fd44d77af0\") " pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:31 crc kubenswrapper[4794]: I0310 10:09:31.502287 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df649851-7d90-41e8-80e9-f7fd44d77af0-utilities\") pod \"certified-operators-ncstb\" (UID: \"df649851-7d90-41e8-80e9-f7fd44d77af0\") " pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:31 crc kubenswrapper[4794]: I0310 10:09:31.502486 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df649851-7d90-41e8-80e9-f7fd44d77af0-catalog-content\") pod \"certified-operators-ncstb\" (UID: \"df649851-7d90-41e8-80e9-f7fd44d77af0\") " pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:31 crc kubenswrapper[4794]: I0310 10:09:31.502690 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64gcn\" 
(UniqueName: \"kubernetes.io/projected/df649851-7d90-41e8-80e9-f7fd44d77af0-kube-api-access-64gcn\") pod \"certified-operators-ncstb\" (UID: \"df649851-7d90-41e8-80e9-f7fd44d77af0\") " pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:31 crc kubenswrapper[4794]: I0310 10:09:31.502907 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df649851-7d90-41e8-80e9-f7fd44d77af0-utilities\") pod \"certified-operators-ncstb\" (UID: \"df649851-7d90-41e8-80e9-f7fd44d77af0\") " pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:31 crc kubenswrapper[4794]: I0310 10:09:31.503152 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df649851-7d90-41e8-80e9-f7fd44d77af0-catalog-content\") pod \"certified-operators-ncstb\" (UID: \"df649851-7d90-41e8-80e9-f7fd44d77af0\") " pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:31 crc kubenswrapper[4794]: I0310 10:09:31.537213 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64gcn\" (UniqueName: \"kubernetes.io/projected/df649851-7d90-41e8-80e9-f7fd44d77af0-kube-api-access-64gcn\") pod \"certified-operators-ncstb\" (UID: \"df649851-7d90-41e8-80e9-f7fd44d77af0\") " pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:31 crc kubenswrapper[4794]: I0310 10:09:31.650639 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:32 crc kubenswrapper[4794]: I0310 10:09:32.161951 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ncstb"] Mar 10 10:09:32 crc kubenswrapper[4794]: I0310 10:09:32.603576 4794 generic.go:334] "Generic (PLEG): container finished" podID="df649851-7d90-41e8-80e9-f7fd44d77af0" containerID="2932a8cb91e0c1e8155dc36d90b5dde9d6ccef4a40218c833440f05cd348e0dc" exitCode=0 Mar 10 10:09:32 crc kubenswrapper[4794]: I0310 10:09:32.603718 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncstb" event={"ID":"df649851-7d90-41e8-80e9-f7fd44d77af0","Type":"ContainerDied","Data":"2932a8cb91e0c1e8155dc36d90b5dde9d6ccef4a40218c833440f05cd348e0dc"} Mar 10 10:09:32 crc kubenswrapper[4794]: I0310 10:09:32.603990 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncstb" event={"ID":"df649851-7d90-41e8-80e9-f7fd44d77af0","Type":"ContainerStarted","Data":"093e303226a7976dab0561b0896c1693e8abccc6b9d859c15ed43a19cd6bd9c3"} Mar 10 10:09:32 crc kubenswrapper[4794]: I0310 10:09:32.606925 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 10:09:33 crc kubenswrapper[4794]: I0310 10:09:33.616164 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncstb" event={"ID":"df649851-7d90-41e8-80e9-f7fd44d77af0","Type":"ContainerStarted","Data":"300562753a0ec20aa3aeb28f13de5e5e0c7a2cb2a3fd6a2132f69624d5cdf946"} Mar 10 10:09:34 crc kubenswrapper[4794]: I0310 10:09:34.632862 4794 generic.go:334] "Generic (PLEG): container finished" podID="df649851-7d90-41e8-80e9-f7fd44d77af0" containerID="300562753a0ec20aa3aeb28f13de5e5e0c7a2cb2a3fd6a2132f69624d5cdf946" exitCode=0 Mar 10 10:09:34 crc kubenswrapper[4794]: I0310 10:09:34.632911 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-ncstb" event={"ID":"df649851-7d90-41e8-80e9-f7fd44d77af0","Type":"ContainerDied","Data":"300562753a0ec20aa3aeb28f13de5e5e0c7a2cb2a3fd6a2132f69624d5cdf946"} Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.486504 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x6kv8"] Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.489022 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.514000 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x6kv8"] Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.582725 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgqrx\" (UniqueName: \"kubernetes.io/projected/0919f85e-d789-43ec-90d3-7df281f603c5-kube-api-access-rgqrx\") pod \"redhat-marketplace-x6kv8\" (UID: \"0919f85e-d789-43ec-90d3-7df281f603c5\") " pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.583123 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0919f85e-d789-43ec-90d3-7df281f603c5-utilities\") pod \"redhat-marketplace-x6kv8\" (UID: \"0919f85e-d789-43ec-90d3-7df281f603c5\") " pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.583440 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0919f85e-d789-43ec-90d3-7df281f603c5-catalog-content\") pod \"redhat-marketplace-x6kv8\" (UID: \"0919f85e-d789-43ec-90d3-7df281f603c5\") " pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.652182 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncstb" event={"ID":"df649851-7d90-41e8-80e9-f7fd44d77af0","Type":"ContainerStarted","Data":"5c587627c7f33469c641839031a8ba051d3fb1f6bbb5856feff3d2931910e1cd"} Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.677796 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ncstb" podStartSLOduration=2.228863962 podStartE2EDuration="4.677774956s" podCreationTimestamp="2026-03-10 10:09:31 +0000 UTC" firstStartedPulling="2026-03-10 10:09:32.606593503 +0000 UTC m=+1521.362764331" lastFinishedPulling="2026-03-10 10:09:35.055504477 +0000 UTC m=+1523.811675325" observedRunningTime="2026-03-10 10:09:35.676673402 +0000 UTC m=+1524.432844240" watchObservedRunningTime="2026-03-10 10:09:35.677774956 +0000 UTC m=+1524.433945784" Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.685210 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0919f85e-d789-43ec-90d3-7df281f603c5-catalog-content\") pod \"redhat-marketplace-x6kv8\" (UID: \"0919f85e-d789-43ec-90d3-7df281f603c5\") " pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.685365 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgqrx\" (UniqueName: 
\"kubernetes.io/projected/0919f85e-d789-43ec-90d3-7df281f603c5-kube-api-access-rgqrx\") pod \"redhat-marketplace-x6kv8\" (UID: \"0919f85e-d789-43ec-90d3-7df281f603c5\") " pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.685449 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0919f85e-d789-43ec-90d3-7df281f603c5-utilities\") pod \"redhat-marketplace-x6kv8\" (UID: \"0919f85e-d789-43ec-90d3-7df281f603c5\") " pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.685855 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0919f85e-d789-43ec-90d3-7df281f603c5-catalog-content\") pod \"redhat-marketplace-x6kv8\" (UID: \"0919f85e-d789-43ec-90d3-7df281f603c5\") " pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.685889 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0919f85e-d789-43ec-90d3-7df281f603c5-utilities\") pod \"redhat-marketplace-x6kv8\" (UID: \"0919f85e-d789-43ec-90d3-7df281f603c5\") " pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.706113 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgqrx\" (UniqueName: \"kubernetes.io/projected/0919f85e-d789-43ec-90d3-7df281f603c5-kube-api-access-rgqrx\") pod \"redhat-marketplace-x6kv8\" (UID: \"0919f85e-d789-43ec-90d3-7df281f603c5\") " pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:35 crc kubenswrapper[4794]: I0310 10:09:35.829755 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:36 crc kubenswrapper[4794]: I0310 10:09:36.323166 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x6kv8"] Mar 10 10:09:36 crc kubenswrapper[4794]: I0310 10:09:36.663277 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x6kv8" event={"ID":"0919f85e-d789-43ec-90d3-7df281f603c5","Type":"ContainerStarted","Data":"89b93a69dcf4450867e6a7f4b04a62bcc8b890b7da6063a0b9562828a92e866e"} Mar 10 10:09:37 crc kubenswrapper[4794]: I0310 10:09:37.677638 4794 generic.go:334] "Generic (PLEG): container finished" podID="0919f85e-d789-43ec-90d3-7df281f603c5" containerID="b90bc052c3d4a25868027ce9df7e66a9eafd23be70bfd49d08d79c81d6265ed9" exitCode=0 Mar 10 10:09:37 crc kubenswrapper[4794]: I0310 10:09:37.677724 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x6kv8" event={"ID":"0919f85e-d789-43ec-90d3-7df281f603c5","Type":"ContainerDied","Data":"b90bc052c3d4a25868027ce9df7e66a9eafd23be70bfd49d08d79c81d6265ed9"} Mar 10 10:09:38 crc kubenswrapper[4794]: I0310 10:09:38.694135 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x6kv8" event={"ID":"0919f85e-d789-43ec-90d3-7df281f603c5","Type":"ContainerStarted","Data":"9f4521ae14fc84ffc7327e5c29a6cdf8af244218cc5ae1d39d8183f14b4d4b5b"} Mar 10 10:09:39 crc kubenswrapper[4794]: I0310 10:09:39.708591 4794 generic.go:334] "Generic (PLEG): container finished" podID="0919f85e-d789-43ec-90d3-7df281f603c5" containerID="9f4521ae14fc84ffc7327e5c29a6cdf8af244218cc5ae1d39d8183f14b4d4b5b" exitCode=0 Mar 10 10:09:39 crc kubenswrapper[4794]: I0310 10:09:39.708632 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x6kv8" event={"ID":"0919f85e-d789-43ec-90d3-7df281f603c5","Type":"ContainerDied","Data":"9f4521ae14fc84ffc7327e5c29a6cdf8af244218cc5ae1d39d8183f14b4d4b5b"} Mar 10 10:09:40 crc kubenswrapper[4794]: I0310 10:09:40.719032 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x6kv8" event={"ID":"0919f85e-d789-43ec-90d3-7df281f603c5","Type":"ContainerStarted","Data":"afa0e1f6b3adf6aae632dc9a41dcb35cb3cdc5fc59c830b888bcffb71481d269"} Mar 10 10:09:40 crc kubenswrapper[4794]: I0310 10:09:40.862981 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x6kv8" podStartSLOduration=3.405736682 podStartE2EDuration="5.862964795s" podCreationTimestamp="2026-03-10 10:09:35 +0000 UTC" firstStartedPulling="2026-03-10 10:09:37.680446888 +0000 UTC m=+1526.436617706" lastFinishedPulling="2026-03-10 10:09:40.137674951 +0000 UTC m=+1528.893845819" observedRunningTime="2026-03-10 10:09:40.860667613 +0000 UTC m=+1529.616838451" watchObservedRunningTime="2026-03-10 10:09:40.862964795 +0000 UTC m=+1529.619135613" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.205954 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.206183 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="d235a0a5-57c9-4938-b742-5788ade30a12" containerName="openstackclient" containerID="cri-o://657c6e566fe3c3ec4a4a199f42966fd39347a8a6f061fe653f5274f12bb76445" gracePeriod=2 Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 
10:09:41.229462 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.245993 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-h8ctb"] Mar 10 10:09:41 crc kubenswrapper[4794]: E0310 10:09:41.246571 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d235a0a5-57c9-4938-b742-5788ade30a12" containerName="openstackclient" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.246647 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d235a0a5-57c9-4938-b742-5788ade30a12" containerName="openstackclient" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.246908 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d235a0a5-57c9-4938-b742-5788ade30a12" containerName="openstackclient" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.253833 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-h8ctb" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.266443 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.292951 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-g67ml"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.311648 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-g67ml"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.317760 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-h8ctb"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.336656 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-operator-scripts\") pod \"root-account-create-update-h8ctb\" (UID: \"4a681af5-3cbf-4d83-a8cd-42a552cdc06d\") " pod="openstack/root-account-create-update-h8ctb" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.336823 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw958\" (UniqueName: \"kubernetes.io/projected/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-kube-api-access-kw958\") pod \"root-account-create-update-h8ctb\" (UID: \"4a681af5-3cbf-4d83-a8cd-42a552cdc06d\") " pod="openstack/root-account-create-update-h8ctb" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.419067 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7da2-account-create-update-dtltz"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.434574 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.445698 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw958\" (UniqueName: \"kubernetes.io/projected/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-kube-api-access-kw958\") pod \"root-account-create-update-h8ctb\" (UID: \"4a681af5-3cbf-4d83-a8cd-42a552cdc06d\") " pod="openstack/root-account-create-update-h8ctb" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.446318 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-operator-scripts\") pod \"root-account-create-update-h8ctb\" (UID: \"4a681af5-3cbf-4d83-a8cd-42a552cdc06d\") " pod="openstack/root-account-create-update-h8ctb" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.447067 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-operator-scripts\") pod \"root-account-create-update-h8ctb\" (UID: \"4a681af5-3cbf-4d83-a8cd-42a552cdc06d\") " pod="openstack/root-account-create-update-h8ctb" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.474976 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-7da2-account-create-update-dtltz"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.495696 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw958\" (UniqueName: \"kubernetes.io/projected/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-kube-api-access-kw958\") pod \"root-account-create-update-h8ctb\" (UID: \"4a681af5-3cbf-4d83-a8cd-42a552cdc06d\") " pod="openstack/root-account-create-update-h8ctb" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.523518 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.523819 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="3a1da9f1-f33d-4327-b899-b5a38c6990d8" containerName="ovn-northd" containerID="cri-o://a27a1a3aea44b5edb2651ff49b0fcc031f55a8ebaeb3807a56fb618e61e23ded" gracePeriod=30 Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.524237 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="3a1da9f1-f33d-4327-b899-b5a38c6990d8" containerName="openstack-network-exporter" containerID="cri-o://804d445e998cb87856a9e74959cb49d74352b3c669f7c0cfc5587e7a39d5a062" gracePeriod=30 Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.548363 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 10:09:41 crc kubenswrapper[4794]: E0310 10:09:41.550570 4794 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 10 10:09:41 crc kubenswrapper[4794]: E0310 10:09:41.559032 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data podName:a45381ea-b5d8-49aa-b4b8-ab372b39b0d3 nodeName:}" failed. No retries permitted until 2026-03-10 10:09:42.059007524 +0000 UTC m=+1530.815178342 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data") pod "rabbitmq-server-0" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3") : configmap "rabbitmq-config-data" not found Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.559074 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="1fcb4385-7603-4d75-8c41-23f457fcae25" containerName="openstack-network-exporter" containerID="cri-o://7e4285317b0405a3de83fe6a6261bf54c91dad51c10abc1e8c435712c795e03f" gracePeriod=300 Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.562905 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-7da2-account-create-update-rd2nz"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.565453 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7da2-account-create-update-rd2nz" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.572508 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.600220 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7da2-account-create-update-rd2nz"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.621427 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-h8ctb" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.651749 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.651787 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.653019 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7krhj\" (UniqueName: \"kubernetes.io/projected/19885e30-4144-4598-be9f-99644e5d5d4a-kube-api-access-7krhj\") pod \"glance-7da2-account-create-update-rd2nz\" (UID: \"19885e30-4144-4598-be9f-99644e5d5d4a\") " pod="openstack/glance-7da2-account-create-update-rd2nz" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.653132 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19885e30-4144-4598-be9f-99644e5d5d4a-operator-scripts\") pod \"glance-7da2-account-create-update-rd2nz\" (UID: \"19885e30-4144-4598-be9f-99644e5d5d4a\") " pod="openstack/glance-7da2-account-create-update-rd2nz" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.734928 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-8552m"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.754459 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7krhj\" (UniqueName: \"kubernetes.io/projected/19885e30-4144-4598-be9f-99644e5d5d4a-kube-api-access-7krhj\") pod \"glance-7da2-account-create-update-rd2nz\" (UID: \"19885e30-4144-4598-be9f-99644e5d5d4a\") " pod="openstack/glance-7da2-account-create-update-rd2nz" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.754627 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/19885e30-4144-4598-be9f-99644e5d5d4a-operator-scripts\") pod \"glance-7da2-account-create-update-rd2nz\" (UID: \"19885e30-4144-4598-be9f-99644e5d5d4a\") " pod="openstack/glance-7da2-account-create-update-rd2nz" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.768517 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19885e30-4144-4598-be9f-99644e5d5d4a-operator-scripts\") pod \"glance-7da2-account-create-update-rd2nz\" (UID: \"19885e30-4144-4598-be9f-99644e5d5d4a\") " pod="openstack/glance-7da2-account-create-update-rd2nz" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.804087 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="1fcb4385-7603-4d75-8c41-23f457fcae25" containerName="ovsdbserver-sb" containerID="cri-o://09e39ed4ef8c86442deb9f63034c2d652a3971bc2b7caf7ccc84b744457e17cf" gracePeriod=300 Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.804309 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d8db-account-create-update-bxfq5"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.816701 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7krhj\" (UniqueName: \"kubernetes.io/projected/19885e30-4144-4598-be9f-99644e5d5d4a-kube-api-access-7krhj\") pod \"glance-7da2-account-create-update-rd2nz\" (UID: \"19885e30-4144-4598-be9f-99644e5d5d4a\") " pod="openstack/glance-7da2-account-create-update-rd2nz" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.842062 4794 generic.go:334] "Generic (PLEG): container finished" podID="3a1da9f1-f33d-4327-b899-b5a38c6990d8" containerID="804d445e998cb87856a9e74959cb49d74352b3c669f7c0cfc5587e7a39d5a062" exitCode=2 Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.846239 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3a1da9f1-f33d-4327-b899-b5a38c6990d8","Type":"ContainerDied","Data":"804d445e998cb87856a9e74959cb49d74352b3c669f7c0cfc5587e7a39d5a062"} Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.872843 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fvs8j"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.913166 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-290e-account-create-update-g7rwh"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.937746 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-p9r7k"] Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.938016 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-p9r7k" podUID="f68cd69f-6fe2-4189-ad03-9593a4e94337" containerName="openstack-network-exporter" containerID="cri-o://f4db0913dfb22b8fdf8a9875e0693880b022c639ea96ea1251770012a7e71a8f" gracePeriod=30 Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.972513 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7da2-account-create-update-rd2nz" Mar 10 10:09:41 crc kubenswrapper[4794]: I0310 10:09:41.978760 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d8db-account-create-update-bxfq5"] Mar 10 10:09:42 crc kubenswrapper[4794]: E0310 10:09:42.120023 4794 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 10 10:09:42 crc kubenswrapper[4794]: E0310 10:09:42.120296 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data podName:a45381ea-b5d8-49aa-b4b8-ab372b39b0d3 nodeName:}" failed. No retries permitted until 2026-03-10 10:09:43.120279215 +0000 UTC m=+1531.876450023 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data") pod "rabbitmq-server-0" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3") : configmap "rabbitmq-config-data" not found Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.216308 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="691c983b-4d1e-406d-bd02-358e7a635547" path="/var/lib/kubelet/pods/691c983b-4d1e-406d-bd02-358e7a635547/volumes" Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.223854 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76332524-3f6c-4d9c-80a7-11f41ef04ade" path="/var/lib/kubelet/pods/76332524-3f6c-4d9c-80a7-11f41ef04ade/volumes" Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.226570 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3cfd303-3976-4400-8c37-e64a4bed85f2" path="/var/lib/kubelet/pods/f3cfd303-3976-4400-8c37-e64a4bed85f2/volumes" Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.228375 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-290e-account-create-update-g7rwh"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.228482 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.228497 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.228510 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-c4b4g"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.248461 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-a23e-account-create-update-4mdw7"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.270932 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-a23e-account-create-update-4mdw7"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.293721 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-c4b4g"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.321042 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c8964d89c-gfknm"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.321347 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c8964d89c-gfknm" podUID="bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" containerName="dnsmasq-dns" containerID="cri-o://8b1add4deeef74619d980e1481eb91a0c5f66b424e4ad63cfb9113db133078f3" gracePeriod=10 Mar 10 
10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.331526 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-g7gmt"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.335775 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-g7gmt"] Mar 10 10:09:42 crc kubenswrapper[4794]: E0310 10:09:42.355236 4794 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 10 10:09:42 crc kubenswrapper[4794]: E0310 10:09:42.355517 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data podName:598e06ed-3156-4e09-976e-4dda0e35afc2 nodeName:}" failed. No retries permitted until 2026-03-10 10:09:42.855504385 +0000 UTC m=+1531.611675203 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data") pod "rabbitmq-cell1-server-0" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2") : configmap "rabbitmq-cell1-config-data" not found Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.370619 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-e13e-account-create-update-nqb9b"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.380261 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-e13e-account-create-update-nqb9b"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.402709 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-gg5z8"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.455562 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.464393 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-gg5z8"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.504790 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-hjnc2"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.539221 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-hjnc2"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.552431 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-61a4-account-create-update-t8jxm"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.583151 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-61a4-account-create-update-t8jxm"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.628403 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-a513-account-create-update-l8ctf"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.652477 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-a513-account-create-update-l8ctf"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.665936 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-v6v6k"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.677404 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-v6v6k"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.690254 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 10:09:42 crc kubenswrapper[4794]: 
I0310 10:09:42.690805 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="9575e254-d696-4a8a-b84f-c8f36d746ff8" containerName="openstack-network-exporter" containerID="cri-o://7c722e31acce807ad3627a24a3d71255a5f9253c9aaa80d754038f1c69a3dfc4" gracePeriod=300 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.727962 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-czl5f"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.785935 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-czl5f"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.794102 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="9575e254-d696-4a8a-b84f-c8f36d746ff8" containerName="ovsdbserver-nb" containerID="cri-o://568dc6ea988e0568a2d3d813b291ac89a4bc785eb376be0c5d4fc65492cc33f4" gracePeriod=300 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.828585 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-q5rgf"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.843027 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-q5rgf"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.874195 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-vb95j"] Mar 10 10:09:42 crc kubenswrapper[4794]: E0310 10:09:42.877717 4794 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 10 10:09:42 crc kubenswrapper[4794]: E0310 10:09:42.877791 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data podName:598e06ed-3156-4e09-976e-4dda0e35afc2 nodeName:}" failed. No retries permitted until 2026-03-10 10:09:43.877775556 +0000 UTC m=+1532.633946374 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data") pod "rabbitmq-cell1-server-0" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2") : configmap "rabbitmq-cell1-config-data" not found Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.889272 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-vb95j"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.900371 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.900676 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="53686d91-dc01-4a36-99c3-e6c84052e15e" containerName="glance-log" containerID="cri-o://d74fbbdb86c3cdb65171312cf2c9c803c458f47ad7b0f5c525579801ae96ec9d" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.901084 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="53686d91-dc01-4a36-99c3-e6c84052e15e" containerName="glance-httpd" containerID="cri-o://d36356b5c770ecad29603a57d1346e81bb0210ad811dc767118c368012779874" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.914234 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.914570 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" containerName="cinder-scheduler" containerID="cri-o://5a0b460bd15a1cc1517c39d79a934d02b21b241c99d789df7e899e079f6a12eb" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.914754 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" containerName="probe" containerID="cri-o://849ee9cf4f0b2ba0c3906171081ce847f0534ff16ad62c26ba765cd71b509f45" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.957488 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1fcb4385-7603-4d75-8c41-23f457fcae25/ovsdbserver-sb/0.log" Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.957543 4794 generic.go:334] "Generic (PLEG): container finished" podID="1fcb4385-7603-4d75-8c41-23f457fcae25" containerID="7e4285317b0405a3de83fe6a6261bf54c91dad51c10abc1e8c435712c795e03f" exitCode=2 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.957560 4794 generic.go:334] "Generic (PLEG): container finished" podID="1fcb4385-7603-4d75-8c41-23f457fcae25" containerID="09e39ed4ef8c86442deb9f63034c2d652a3971bc2b7caf7ccc84b744457e17cf" exitCode=143 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.957971 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.958009 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1fcb4385-7603-4d75-8c41-23f457fcae25","Type":"ContainerDied","Data":"7e4285317b0405a3de83fe6a6261bf54c91dad51c10abc1e8c435712c795e03f"} Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.958030 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"1fcb4385-7603-4d75-8c41-23f457fcae25","Type":"ContainerDied","Data":"09e39ed4ef8c86442deb9f63034c2d652a3971bc2b7caf7ccc84b744457e17cf"} Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.958538 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-server" containerID="cri-o://031388a5b0e7ab4e2d5a36045f55127c3e30f57424f750e9a01dea1da336e7c1" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.958925 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="swift-recon-cron" containerID="cri-o://5c65c1b8cc623038c03f29280e90a94e0a70fec6174c721b69512958e9487bcb" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.958990 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="rsync" containerID="cri-o://8f6939cc5c3159e417cd81d11fcb1a54fc7a6a362b3f62041010bad3ef1cdf82" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.959037 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-expirer" containerID="cri-o://32f094fa9f1ca547ebae717b8b5951d1771b5fdb77dfeec605d769593501a6a7" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.959078 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-updater" containerID="cri-o://3b9e079a22cd5b6888eff2291538cfe9ec6e987ec470fa124da7e69bbac3f8c2" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.959117 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-auditor" containerID="cri-o://250a700fa883215453330332d981b0fe632e9fa60d370e3a5759ce91865db4ab" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.959159 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-replicator" containerID="cri-o://bc6074c0953ac28f265a3e17ebd69da6a9d931779c7686e644dc485527b0a8fb" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.959197 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-server" containerID="cri-o://2190a44dadbbaa1a8486a80b4974382e233189c54458a517977ead0fca476329" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.959237 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-updater" containerID="cri-o://8e6a16e1a4b64e9512a2b7d0587b85ab036e797d9ba29438a7eebcaaa92c8d35" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.959273 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-auditor" 
containerID="cri-o://74c2be2cb91c7946838edd2c68684383f49526b9001079db31859fa44514bdac" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.959312 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-replicator" containerID="cri-o://46ff2925f4f5d1bf13f6a0203dfcdd7152671d57863769e69b695dd9c5e3fb06" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.959401 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-server" containerID="cri-o://21625e79ea0985db8645fee5a87d8192fa2df9962f7552b6121df93fb96d3e7f" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.959443 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-reaper" containerID="cri-o://d63f9f6d7599f958801005a6670033ad2c6f68cf62e9c0465c4d34044c669139" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.959479 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-auditor" containerID="cri-o://32aa0630d82d05463514a0c8a463bab43aad2c33f2b887d13c9714f7761c76c2" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.959516 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-replicator" containerID="cri-o://8cc99334a202e511edfdad4eef31c86941e591eb3f6215e5d7e786f323618184" gracePeriod=30 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.993021 4794 generic.go:334] "Generic (PLEG): container finished" podID="9575e254-d696-4a8a-b84f-c8f36d746ff8" containerID="7c722e31acce807ad3627a24a3d71255a5f9253c9aaa80d754038f1c69a3dfc4" exitCode=2 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.993256 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9575e254-d696-4a8a-b84f-c8f36d746ff8","Type":"ContainerDied","Data":"7c722e31acce807ad3627a24a3d71255a5f9253c9aaa80d754038f1c69a3dfc4"} Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.995521 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-p9r7k_f68cd69f-6fe2-4189-ad03-9593a4e94337/openstack-network-exporter/0.log" Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.995563 4794 generic.go:334] "Generic (PLEG): container finished" podID="f68cd69f-6fe2-4189-ad03-9593a4e94337" containerID="f4db0913dfb22b8fdf8a9875e0693880b022c639ea96ea1251770012a7e71a8f" exitCode=2 Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.995623 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p9r7k" event={"ID":"f68cd69f-6fe2-4189-ad03-9593a4e94337","Type":"ContainerDied","Data":"f4db0913dfb22b8fdf8a9875e0693880b022c639ea96ea1251770012a7e71a8f"} Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.996320 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 10:09:42 crc kubenswrapper[4794]: I0310 10:09:42.997898 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" 
podUID="39114d89-8cf8-4563-bc50-e96e2113349d" containerName="cinder-api" containerID="cri-o://1860e4a742cfe98abb0f65295ec8d6e591d40ce5a255968cfb355b03216be258" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.000687 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="39114d89-8cf8-4563-bc50-e96e2113349d" containerName="cinder-api-log" containerID="cri-o://841f2c7007209f71d0fcab9d21091a5638e79f7695a27cbb0864a109529f7bb5" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.001266 4794 generic.go:334] "Generic (PLEG): container finished" podID="bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" containerID="8b1add4deeef74619d980e1481eb91a0c5f66b424e4ad63cfb9113db133078f3" exitCode=0 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.001776 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8964d89c-gfknm" event={"ID":"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af","Type":"ContainerDied","Data":"8b1add4deeef74619d980e1481eb91a0c5f66b424e4ad63cfb9113db133078f3"} Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.018169 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bd7575545-w8qjp"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.018630 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bd7575545-w8qjp" podUID="55771788-f3c0-4cde-af2f-ca527c2e2965" containerName="neutron-api" containerID="cri-o://3a1060bd42d7158c308820f09f849a414458bab447ccd7c995609acf055ac995" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.019439 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bd7575545-w8qjp" podUID="55771788-f3c0-4cde-af2f-ca527c2e2965" containerName="neutron-httpd" containerID="cri-o://76dbe19e4daa257d305db13b3f34518e614449759a4f59af217356e156607317" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.107063 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.107702 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eb384e40-1917-4b9c-bcfa-440a3a10fd1d" containerName="glance-log" containerID="cri-o://ef5f845a9297d9baef67ae43283960d69bd559077bfce60e39f562cbd5f935df" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.108250 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eb384e40-1917-4b9c-bcfa-440a3a10fd1d" containerName="glance-httpd" containerID="cri-o://d75c771de3d291bbbb95bf0f193cefd57708bcdc53c0f2f718b3d8e320f642c8" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: E0310 10:09:43.150789 4794 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 10 10:09:43 crc kubenswrapper[4794]: E0310 10:09:43.150903 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data podName:a45381ea-b5d8-49aa-b4b8-ab372b39b0d3 nodeName:}" failed. No retries permitted until 2026-03-10 10:09:45.150883011 +0000 UTC m=+1533.907053829 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data") pod "rabbitmq-server-0" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3") : configmap "rabbitmq-config-data" not found Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.195558 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.220908 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-68cfd4d846-drn7b"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.221209 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-68cfd4d846-drn7b" podUID="ecae27ed-535f-47c8-93e4-07baac3bc64c" containerName="placement-log" containerID="cri-o://07b975998a50058fa523429e713118ddb3da394ec6e69a92bc5bda9b7839104f" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.222063 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-68cfd4d846-drn7b" podUID="ecae27ed-535f-47c8-93e4-07baac3bc64c" containerName="placement-api" containerID="cri-o://5b36ab842a7613453f011b16a0ad771867c79a6397fa893991fe94b1caac6d43" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.245740 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-b64684465-k4k4j"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.245936 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-b64684465-k4k4j" podUID="98b35dea-060e-4b8d-9829-37357853a9c4" containerName="barbican-worker-log" containerID="cri-o://f89f95fa1764bb4ed8927a1c6b5d7ec0737f1ea37babc6ae8e3dc09577573205" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.246243 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-b64684465-k4k4j" podUID="98b35dea-060e-4b8d-9829-37357853a9c4" containerName="barbican-worker" containerID="cri-o://13f70234e665cee8f47182684e880f742b068d9ada3d1f0b83237e5efa99c1ee" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.265994 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-65fdc45d8b-2t64g"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.266254 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" podUID="efa96620-3d4b-4780-92a0-eeefbe9dcf9a" containerName="barbican-keystone-listener-log" containerID="cri-o://c415d347d34421bcb42d60573a12da5f8047e3e7f977fb04d5829812ddc302a7" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.269646 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" podUID="efa96620-3d4b-4780-92a0-eeefbe9dcf9a" containerName="barbican-keystone-listener" containerID="cri-o://19687e523e2ddee57d1501fea9c4f3051cf91b15cc6073a1586ab9b0d2214569" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.293634 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" containerName="rabbitmq" containerID="cri-o://b01da559f24b75afd94d0c65d373dcf5b4d3bb07708a3909a930cd454c72cc4d" gracePeriod=604800 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 
10:09:43.296715 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.296979 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6b5facf1-8bc0-497d-925f-ee382862cf22" containerName="nova-metadata-log" containerID="cri-o://2db723c104c847e70b7c742bd2157495764accb8230b8d1c494ef49084c7b620" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.297462 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6b5facf1-8bc0-497d-925f-ee382862cf22" containerName="nova-metadata-metadata" containerID="cri-o://a0b3e91e1e10cd24f85913c9dfa160aee857a4f7ffe7e21e4b28aa608366d44a" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.322422 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-e78d-account-create-update-lzbkb"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.327465 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-e78d-account-create-update-lzbkb"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.343668 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-9zpcn"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.361492 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-qvz28"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.376472 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-r6tt9"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.390508 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-9zpcn"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.409387 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-qvz28"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.413028 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-r6tt9"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.423930 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.424161 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2446b2bc-c3c8-465d-a808-981664228cba" containerName="nova-api-log" containerID="cri-o://6e6cf54d16f75007332086d04328fb26dd120e8114419987c7377f33c0bef36c" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.424474 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2446b2bc-c3c8-465d-a808-981664228cba" containerName="nova-api-api" containerID="cri-o://826580e64ddfef3da4fda6ec39829dbc5dafbc69c66385fc8ca2a2bcd5ca60d8" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.440692 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-jtsc6"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.448141 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7da2-account-create-update-rd2nz"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.459444 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-jtsc6"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.464175 4794 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-api-5dd9f46c58-hkfks"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.464385 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5dd9f46c58-hkfks" podUID="cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" containerName="barbican-api-log" containerID="cri-o://d7f0a62cb5cfb4fded049c3aefb7fe44d4d036d9c535f290cbc47f08da15b658" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.468356 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5dd9f46c58-hkfks" podUID="cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" containerName="barbican-api" containerID="cri-o://839d455cbd220b0b4cbb46ee40c9764f0de266ff250390018a7424fa8aa36507" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.473172 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.480450 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-ngww6"] Mar 10 10:09:43 crc kubenswrapper[4794]: E0310 10:09:43.486102 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="568dc6ea988e0568a2d3d813b291ac89a4bc785eb376be0c5d4fc65492cc33f4" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 10 10:09:43 crc kubenswrapper[4794]: E0310 10:09:43.488016 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="568dc6ea988e0568a2d3d813b291ac89a4bc785eb376be0c5d4fc65492cc33f4" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 10 10:09:43 crc kubenswrapper[4794]: E0310 10:09:43.489443 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="568dc6ea988e0568a2d3d813b291ac89a4bc785eb376be0c5d4fc65492cc33f4" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 10 10:09:43 crc kubenswrapper[4794]: E0310 10:09:43.489466 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="9575e254-d696-4a8a-b84f-c8f36d746ff8" containerName="ovsdbserver-nb" Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.500898 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovs-vswitchd" containerID="cri-o://b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" gracePeriod=29 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.559389 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-ngww6"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.598648 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hsjqq"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.609144 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hsjqq"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.624392 4794 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/placement-db-create-dcx5m"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.624450 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-pkvrz"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.633837 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-pkvrz"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.638172 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-dcx5m"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.646322 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.646581 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9e44cda9b00f14d7a50fe91ef2bfbfd700f1b8ac60755da2c7898a76949812f8" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.658634 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.665813 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-h8ctb"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.685234 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.685443 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d6e6c324-8bba-4585-9ffc-afad751594d7" containerName="nova-scheduler-scheduler" containerID="cri-o://a6cc45af0b715a94827c03b09ad8044d92a3ff4ebaf066479557fde441a9fe8f" gracePeriod=30 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.702264 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7da2-account-create-update-rd2nz"] Mar 10 10:09:43 crc kubenswrapper[4794]: E0310 10:09:43.854230 4794 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 10 10:09:43 crc kubenswrapper[4794]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 10 10:09:43 crc kubenswrapper[4794]: + source /usr/local/bin/container-scripts/functions Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNBridge=br-int Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNRemote=tcp:localhost:6642 Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNEncapType=geneve Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNAvailabilityZones= Mar 10 10:09:43 crc kubenswrapper[4794]: ++ EnableChassisAsGateway=true Mar 10 10:09:43 crc kubenswrapper[4794]: ++ PhysicalNetworks= Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNHostName= Mar 10 10:09:43 crc kubenswrapper[4794]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 10 10:09:43 crc kubenswrapper[4794]: ++ ovs_dir=/var/lib/openvswitch Mar 10 10:09:43 crc kubenswrapper[4794]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 10 10:09:43 crc kubenswrapper[4794]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 10 10:09:43 crc kubenswrapper[4794]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 10:09:43 crc kubenswrapper[4794]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 10:09:43 crc kubenswrapper[4794]: + sleep 0.5 Mar 10 10:09:43 crc kubenswrapper[4794]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 10:09:43 crc kubenswrapper[4794]: + sleep 0.5 Mar 10 10:09:43 crc kubenswrapper[4794]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 10:09:43 crc kubenswrapper[4794]: + cleanup_ovsdb_server_semaphore Mar 10 10:09:43 crc kubenswrapper[4794]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 10:09:43 crc kubenswrapper[4794]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 10 10:09:43 crc kubenswrapper[4794]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-8552m" message=< Mar 10 10:09:43 crc kubenswrapper[4794]: Exiting ovsdb-server (5) [ OK ] Mar 10 10:09:43 crc kubenswrapper[4794]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 10 10:09:43 crc kubenswrapper[4794]: + source /usr/local/bin/container-scripts/functions Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNBridge=br-int Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNRemote=tcp:localhost:6642 Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNEncapType=geneve Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNAvailabilityZones= Mar 10 10:09:43 crc kubenswrapper[4794]: ++ EnableChassisAsGateway=true Mar 10 10:09:43 crc kubenswrapper[4794]: ++ PhysicalNetworks= Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNHostName= Mar 10 10:09:43 crc kubenswrapper[4794]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 10 10:09:43 crc kubenswrapper[4794]: ++ ovs_dir=/var/lib/openvswitch Mar 10 10:09:43 crc kubenswrapper[4794]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 10 10:09:43 crc kubenswrapper[4794]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 10 10:09:43 crc kubenswrapper[4794]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 10:09:43 crc kubenswrapper[4794]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 10:09:43 crc kubenswrapper[4794]: + sleep 0.5 Mar 10 10:09:43 crc kubenswrapper[4794]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 10:09:43 crc kubenswrapper[4794]: + sleep 0.5 Mar 10 10:09:43 crc kubenswrapper[4794]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 10:09:43 crc kubenswrapper[4794]: + cleanup_ovsdb_server_semaphore Mar 10 10:09:43 crc kubenswrapper[4794]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 10:09:43 crc kubenswrapper[4794]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 10 10:09:43 crc kubenswrapper[4794]: > Mar 10 10:09:43 crc kubenswrapper[4794]: E0310 10:09:43.854286 4794 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 10 10:09:43 crc kubenswrapper[4794]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 10 10:09:43 crc kubenswrapper[4794]: + source /usr/local/bin/container-scripts/functions Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNBridge=br-int Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNRemote=tcp:localhost:6642 Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNEncapType=geneve Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNAvailabilityZones= Mar 10 10:09:43 crc kubenswrapper[4794]: ++ EnableChassisAsGateway=true Mar 10 10:09:43 crc kubenswrapper[4794]: ++ PhysicalNetworks= Mar 10 10:09:43 crc kubenswrapper[4794]: ++ OVNHostName= Mar 10 10:09:43 crc kubenswrapper[4794]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 10 10:09:43 crc kubenswrapper[4794]: ++ ovs_dir=/var/lib/openvswitch Mar 10 10:09:43 crc kubenswrapper[4794]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 10 10:09:43 crc kubenswrapper[4794]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 10 10:09:43 crc kubenswrapper[4794]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 10:09:43 crc kubenswrapper[4794]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 10:09:43 crc kubenswrapper[4794]: + sleep 0.5 Mar 10 10:09:43 crc kubenswrapper[4794]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 10:09:43 crc kubenswrapper[4794]: + sleep 0.5 Mar 10 10:09:43 crc kubenswrapper[4794]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 10:09:43 crc kubenswrapper[4794]: + cleanup_ovsdb_server_semaphore Mar 10 10:09:43 crc kubenswrapper[4794]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 10:09:43 crc kubenswrapper[4794]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 10 10:09:43 crc kubenswrapper[4794]: > pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovsdb-server" containerID="cri-o://760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.854318 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovsdb-server" containerID="cri-o://760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" gracePeriod=28 Mar 10 10:09:43 crc kubenswrapper[4794]: I0310 10:09:43.875441 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ncstb"] Mar 10 10:09:43 crc kubenswrapper[4794]: E0310 10:09:43.968506 4794 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 10 10:09:43 crc kubenswrapper[4794]: E0310 10:09:43.968789 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data podName:598e06ed-3156-4e09-976e-4dda0e35afc2 nodeName:}" failed. No retries permitted until 2026-03-10 10:09:45.968770732 +0000 UTC m=+1534.724941550 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data") pod "rabbitmq-cell1-server-0" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2") : configmap "rabbitmq-cell1-config-data" not found Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.026542 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22069ba2-0135-4559-9c7f-2d73ae0dd81a" path="/var/lib/kubelet/pods/22069ba2-0135-4559-9c7f-2d73ae0dd81a/volumes" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.027300 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="269fc3af-0d8a-4d19-8753-8c4d07670864" path="/var/lib/kubelet/pods/269fc3af-0d8a-4d19-8753-8c4d07670864/volumes" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.027891 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27e140de-d875-48f9-87dd-2ea5908121c9" path="/var/lib/kubelet/pods/27e140de-d875-48f9-87dd-2ea5908121c9/volumes" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.028503 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4290ea58-8af5-478d-a452-421fe656fe01" path="/var/lib/kubelet/pods/4290ea58-8af5-478d-a452-421fe656fe01/volumes" Mar 10 10:09:44 crc kubenswrapper[4794]: W0310 10:09:44.028664 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19885e30_4144_4598_be9f_99644e5d5d4a.slice/crio-462746f7d4293095c4760c9bebe6c55150931da29629648c6e102ec58e5deb64 WatchSource:0}: Error finding container 462746f7d4293095c4760c9bebe6c55150931da29629648c6e102ec58e5deb64: Status 404 returned error can't find the container with id 462746f7d4293095c4760c9bebe6c55150931da29629648c6e102ec58e5deb64 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.033973 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="598e06ed-3156-4e09-976e-4dda0e35afc2" containerName="rabbitmq" containerID="cri-o://145a1a04aa37c9588ebc88932f6178e631ace726a39e4a8d191e2883dc15e900" gracePeriod=604800
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.034543 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="510d7963-19bc-4b66-8c32-0b7b92c8e7ad" path="/var/lib/kubelet/pods/510d7963-19bc-4b66-8c32-0b7b92c8e7ad/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.035873 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="543f505d-cd2e-491c-bab0-33efb1f71f57" path="/var/lib/kubelet/pods/543f505d-cd2e-491c-bab0-33efb1f71f57/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.036942 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56a0cbd8-93d4-4236-bf7d-434752b9d246" path="/var/lib/kubelet/pods/56a0cbd8-93d4-4236-bf7d-434752b9d246/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.037423 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598f9a9c-0b67-44b8-83a9-428f55be33a9" path="/var/lib/kubelet/pods/598f9a9c-0b67-44b8-83a9-428f55be33a9/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.043278 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="605a052b-1adf-461d-baf7-1a30a69d8de7" path="/var/lib/kubelet/pods/605a052b-1adf-461d-baf7-1a30a69d8de7/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.043854 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e0ce91-6453-4818-ae1f-39a24f8e6a66" path="/var/lib/kubelet/pods/65e0ce91-6453-4818-ae1f-39a24f8e6a66/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.044672 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f6993ef-0bbf-49a3-a1cb-1dd304ba6564" path="/var/lib/kubelet/pods/6f6993ef-0bbf-49a3-a1cb-1dd304ba6564/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.046064 4794 generic.go:334] "Generic (PLEG): container finished" podID="eb384e40-1917-4b9c-bcfa-440a3a10fd1d" containerID="ef5f845a9297d9baef67ae43283960d69bd559077bfce60e39f562cbd5f935df" exitCode=143
Mar 10 10:09:44 crc kubenswrapper[4794]: E0310 10:09:44.052561 4794 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 10 10:09:44 crc kubenswrapper[4794]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash
Mar 10 10:09:44 crc kubenswrapper[4794]:
Mar 10 10:09:44 crc kubenswrapper[4794]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Mar 10 10:09:44 crc kubenswrapper[4794]:
Mar 10 10:09:44 crc kubenswrapper[4794]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Mar 10 10:09:44 crc kubenswrapper[4794]:
Mar 10 10:09:44 crc kubenswrapper[4794]: MYSQL_CMD="mysql -h -u root -P 3306"
Mar 10 10:09:44 crc kubenswrapper[4794]:
Mar 10 10:09:44 crc kubenswrapper[4794]: if [ -n "glance" ]; then
Mar 10 10:09:44 crc kubenswrapper[4794]: GRANT_DATABASE="glance"
Mar 10 10:09:44 crc kubenswrapper[4794]: else
Mar 10 10:09:44 crc kubenswrapper[4794]: GRANT_DATABASE="*"
Mar 10 10:09:44 crc kubenswrapper[4794]: fi
Mar 10 10:09:44 crc kubenswrapper[4794]:
Mar 10 10:09:44 crc kubenswrapper[4794]: # going for maximum compatibility here:
Mar 10 10:09:44 crc kubenswrapper[4794]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Mar 10 10:09:44 crc kubenswrapper[4794]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Mar 10 10:09:44 crc kubenswrapper[4794]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Mar 10 10:09:44 crc kubenswrapper[4794]: # support updates
Mar 10 10:09:44 crc kubenswrapper[4794]:
Mar 10 10:09:44 crc kubenswrapper[4794]: $MYSQL_CMD < logger="UnhandledError"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.053031 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71cc55d3-bfda-4469-ad6c-6c5c0357360a" path="/var/lib/kubelet/pods/71cc55d3-bfda-4469-ad6c-6c5c0357360a/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.053771 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e280fc-27ef-4cd8-b46d-3913a229ba81" path="/var/lib/kubelet/pods/75e280fc-27ef-4cd8-b46d-3913a229ba81/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: E0310 10:09:44.053899 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-7da2-account-create-update-rd2nz" podUID="19885e30-4144-4598-be9f-99644e5d5d4a"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.056596 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="821b338a-8a20-4d93-8dfa-28727da3ecba" path="/var/lib/kubelet/pods/821b338a-8a20-4d93-8dfa-28727da3ecba/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.058858 4794 generic.go:334] "Generic (PLEG): container finished" podID="efa96620-3d4b-4780-92a0-eeefbe9dcf9a" containerID="c415d347d34421bcb42d60573a12da5f8047e3e7f977fb04d5829812ddc302a7" exitCode=143
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.060233 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a989a70f-ba12-4aa6-b96d-397cde6a5d48" path="/var/lib/kubelet/pods/a989a70f-ba12-4aa6-b96d-397cde6a5d48/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.060926 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9423b3f-3f25-484c-aacc-c83d78c2f731" path="/var/lib/kubelet/pods/b9423b3f-3f25-484c-aacc-c83d78c2f731/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.061727 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc31804f-5a50-4c0a-80e8-42d0752ee5b5" path="/var/lib/kubelet/pods/bc31804f-5a50-4c0a-80e8-42d0752ee5b5/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.065037 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c66c006d-3dd9-4544-8272-51226c41a2fd" path="/var/lib/kubelet/pods/c66c006d-3dd9-4544-8272-51226c41a2fd/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.065720 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb29878d-3dbb-496a-bd25-b6b4ff102b6f" path="/var/lib/kubelet/pods/cb29878d-3dbb-496a-bd25-b6b4ff102b6f/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.066239 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e8a45c-ff29-42fa-9999-5ba419470fd5" path="/var/lib/kubelet/pods/d1e8a45c-ff29-42fa-9999-5ba419470fd5/volumes"
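The Unhandled Error above quotes the failed container's command verbatim, so the log doubles as a listing of the mariadb-account-create-update job script, truncated at the heredoc ($MYSQL_CMD < ...). Two things are worth noting: the rendered MYSQL_CMD="mysql -h -u root -P 3306" carries an empty -h argument, consistent with the host variable rendering empty next to MYSQL_REMOTE_HOST=""; and the script's own comments spell out its portability strategy. The elided heredoc body is not recoverable from this log, but the pattern the comments describe would look roughly like the sketch below, with 'glance'@'%' and the REQUIRE clause as illustrative placeholders rather than values from the log.

# Illustrative sketch only; the real heredoc body is cut off in the log above.
$MYSQL_CMD <<EOF
CREATE USER IF NOT EXISTS 'glance'@'%';
ALTER USER 'glance'@'%' IDENTIFIED BY '${DatabasePassword}' REQUIRE NONE;
GRANT ALL PRIVILEGES ON ${GRANT_DATABASE}.* TO 'glance'@'%';
EOF

CREATE USER IF NOT EXISTS is accepted by both MariaDB and MySQL 8 (which no longer creates users implicitly on GRANT), and routing the password and TLS settings through ALTER USER keeps the job rerunnable for updates, which is exactly what points 1-3 in the traced comments call out. The job itself never got this far: as recorded a few entries above, it failed with CreateContainerConfigError because secret "glance-db-secret" was not found.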
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.067797 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7addf26-5bbf-4b13-aa27-070bab62a929" path="/var/lib/kubelet/pods/d7addf26-5bbf-4b13-aa27-070bab62a929/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.068949 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7641d95-8aa6-43ad-a191-7d8356b29bac" path="/var/lib/kubelet/pods/f7641d95-8aa6-43ad-a191-7d8356b29bac/volumes"
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.069868 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb384e40-1917-4b9c-bcfa-440a3a10fd1d","Type":"ContainerDied","Data":"ef5f845a9297d9baef67ae43283960d69bd559077bfce60e39f562cbd5f935df"}
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.069900 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" event={"ID":"efa96620-3d4b-4780-92a0-eeefbe9dcf9a","Type":"ContainerDied","Data":"c415d347d34421bcb42d60573a12da5f8047e3e7f977fb04d5829812ddc302a7"}
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.084793 4794 generic.go:334] "Generic (PLEG): container finished" podID="2446b2bc-c3c8-465d-a808-981664228cba" containerID="6e6cf54d16f75007332086d04328fb26dd120e8114419987c7377f33c0bef36c" exitCode=143
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.084853 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2446b2bc-c3c8-465d-a808-981664228cba","Type":"ContainerDied","Data":"6e6cf54d16f75007332086d04328fb26dd120e8114419987c7377f33c0bef36c"}
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.096460 4794 generic.go:334] "Generic (PLEG): container finished" podID="98b35dea-060e-4b8d-9829-37357853a9c4" containerID="f89f95fa1764bb4ed8927a1c6b5d7ec0737f1ea37babc6ae8e3dc09577573205" exitCode=143
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.096518 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b64684465-k4k4j" event={"ID":"98b35dea-060e-4b8d-9829-37357853a9c4","Type":"ContainerDied","Data":"f89f95fa1764bb4ed8927a1c6b5d7ec0737f1ea37babc6ae8e3dc09577573205"}
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.102789 4794 generic.go:334] "Generic (PLEG): container finished" podID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" exitCode=0
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.103007 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8552m" event={"ID":"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1","Type":"ContainerDied","Data":"760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb"}
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119476 4794 generic.go:334] "Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="8f6939cc5c3159e417cd81d11fcb1a54fc7a6a362b3f62041010bad3ef1cdf82" exitCode=0
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119506 4794 generic.go:334] "Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="32f094fa9f1ca547ebae717b8b5951d1771b5fdb77dfeec605d769593501a6a7" exitCode=0
Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119513 4794 generic.go:334] "Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="3b9e079a22cd5b6888eff2291538cfe9ec6e987ec470fa124da7e69bbac3f8c2" exitCode=0
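A key for the exitCode values in these PLEG entries: container runtimes follow the shell convention of reporting 128 plus the signal number when a process dies by signal, so the codes in this log decode as below.

# exitCode=0   -> the container exited cleanly
# exitCode=143 -> 128 + 15, terminated by SIGTERM within the grace period
# exitCode=137 -> 128 + 9, killed by SIGKILL (as with the PreStop hook above)
kill -l 143  # prints TERM
kill -l 137  # prints KILL

So the barbican and nova containers above went down gracefully on SIGTERM, while the swift-storage-0 containers all report a clean exit.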
"Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="250a700fa883215453330332d981b0fe632e9fa60d370e3a5759ce91865db4ab" exitCode=0 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119526 4794 generic.go:334] "Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="bc6074c0953ac28f265a3e17ebd69da6a9d931779c7686e644dc485527b0a8fb" exitCode=0 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119532 4794 generic.go:334] "Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="2190a44dadbbaa1a8486a80b4974382e233189c54458a517977ead0fca476329" exitCode=0 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119539 4794 generic.go:334] "Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="8e6a16e1a4b64e9512a2b7d0587b85ab036e797d9ba29438a7eebcaaa92c8d35" exitCode=0 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119545 4794 generic.go:334] "Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="74c2be2cb91c7946838edd2c68684383f49526b9001079db31859fa44514bdac" exitCode=0 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119551 4794 generic.go:334] "Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="46ff2925f4f5d1bf13f6a0203dfcdd7152671d57863769e69b695dd9c5e3fb06" exitCode=0 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119557 4794 generic.go:334] "Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="21625e79ea0985db8645fee5a87d8192fa2df9962f7552b6121df93fb96d3e7f" exitCode=0 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119565 4794 generic.go:334] "Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="d63f9f6d7599f958801005a6670033ad2c6f68cf62e9c0465c4d34044c669139" exitCode=0 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119571 4794 generic.go:334] "Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="32aa0630d82d05463514a0c8a463bab43aad2c33f2b887d13c9714f7761c76c2" exitCode=0 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119577 4794 generic.go:334] "Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="8cc99334a202e511edfdad4eef31c86941e591eb3f6215e5d7e786f323618184" exitCode=0 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119584 4794 generic.go:334] "Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="031388a5b0e7ab4e2d5a36045f55127c3e30f57424f750e9a01dea1da336e7c1" exitCode=0 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119656 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"8f6939cc5c3159e417cd81d11fcb1a54fc7a6a362b3f62041010bad3ef1cdf82"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119685 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"32f094fa9f1ca547ebae717b8b5951d1771b5fdb77dfeec605d769593501a6a7"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119697 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"3b9e079a22cd5b6888eff2291538cfe9ec6e987ec470fa124da7e69bbac3f8c2"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119707 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"250a700fa883215453330332d981b0fe632e9fa60d370e3a5759ce91865db4ab"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119718 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"bc6074c0953ac28f265a3e17ebd69da6a9d931779c7686e644dc485527b0a8fb"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119729 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"2190a44dadbbaa1a8486a80b4974382e233189c54458a517977ead0fca476329"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119738 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"8e6a16e1a4b64e9512a2b7d0587b85ab036e797d9ba29438a7eebcaaa92c8d35"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119748 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"74c2be2cb91c7946838edd2c68684383f49526b9001079db31859fa44514bdac"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119757 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"46ff2925f4f5d1bf13f6a0203dfcdd7152671d57863769e69b695dd9c5e3fb06"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119768 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"21625e79ea0985db8645fee5a87d8192fa2df9962f7552b6121df93fb96d3e7f"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119778 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"d63f9f6d7599f958801005a6670033ad2c6f68cf62e9c0465c4d34044c669139"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119788 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"32aa0630d82d05463514a0c8a463bab43aad2c33f2b887d13c9714f7761c76c2"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119797 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"8cc99334a202e511edfdad4eef31c86941e591eb3f6215e5d7e786f323618184"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.119806 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"031388a5b0e7ab4e2d5a36045f55127c3e30f57424f750e9a01dea1da336e7c1"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.121477 4794 generic.go:334] "Generic (PLEG): container 
finished" podID="d235a0a5-57c9-4938-b742-5788ade30a12" containerID="657c6e566fe3c3ec4a4a199f42966fd39347a8a6f061fe653f5274f12bb76445" exitCode=137 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.121551 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e362e973fd1ddbdde704946272f4e6b5130e556c24e23a93143cf72208cad466" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.123526 4794 generic.go:334] "Generic (PLEG): container finished" podID="cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" containerID="d7f0a62cb5cfb4fded049c3aefb7fe44d4d036d9c535f290cbc47f08da15b658" exitCode=143 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.123634 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dd9f46c58-hkfks" event={"ID":"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f","Type":"ContainerDied","Data":"d7f0a62cb5cfb4fded049c3aefb7fe44d4d036d9c535f290cbc47f08da15b658"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.139222 4794 generic.go:334] "Generic (PLEG): container finished" podID="55771788-f3c0-4cde-af2f-ca527c2e2965" containerID="76dbe19e4daa257d305db13b3f34518e614449759a4f59af217356e156607317" exitCode=0 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.139295 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd7575545-w8qjp" event={"ID":"55771788-f3c0-4cde-af2f-ca527c2e2965","Type":"ContainerDied","Data":"76dbe19e4daa257d305db13b3f34518e614449759a4f59af217356e156607317"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.145558 4794 generic.go:334] "Generic (PLEG): container finished" podID="ecae27ed-535f-47c8-93e4-07baac3bc64c" containerID="07b975998a50058fa523429e713118ddb3da394ec6e69a92bc5bda9b7839104f" exitCode=143 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.145618 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68cfd4d846-drn7b" event={"ID":"ecae27ed-535f-47c8-93e4-07baac3bc64c","Type":"ContainerDied","Data":"07b975998a50058fa523429e713118ddb3da394ec6e69a92bc5bda9b7839104f"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.157445 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1fcb4385-7603-4d75-8c41-23f457fcae25/ovsdbserver-sb/0.log" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.157567 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1fcb4385-7603-4d75-8c41-23f457fcae25","Type":"ContainerDied","Data":"0aaa2584749680bc8e1a1c028dcb6ca7fc34ef2cd230a42cb8f6e7a9c56b9f8b"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.157607 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aaa2584749680bc8e1a1c028dcb6ca7fc34ef2cd230a42cb8f6e7a9c56b9f8b" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.159686 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9575e254-d696-4a8a-b84f-c8f36d746ff8/ovsdbserver-nb/0.log" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.159718 4794 generic.go:334] "Generic (PLEG): container finished" podID="9575e254-d696-4a8a-b84f-c8f36d746ff8" containerID="568dc6ea988e0568a2d3d813b291ac89a4bc785eb376be0c5d4fc65492cc33f4" exitCode=143 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.159760 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"9575e254-d696-4a8a-b84f-c8f36d746ff8","Type":"ContainerDied","Data":"568dc6ea988e0568a2d3d813b291ac89a4bc785eb376be0c5d4fc65492cc33f4"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.220570 4794 generic.go:334] "Generic (PLEG): container finished" podID="39114d89-8cf8-4563-bc50-e96e2113349d" containerID="841f2c7007209f71d0fcab9d21091a5638e79f7695a27cbb0864a109529f7bb5" exitCode=143 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.220809 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"39114d89-8cf8-4563-bc50-e96e2113349d","Type":"ContainerDied","Data":"841f2c7007209f71d0fcab9d21091a5638e79f7695a27cbb0864a109529f7bb5"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.237963 4794 generic.go:334] "Generic (PLEG): container finished" podID="6b5facf1-8bc0-497d-925f-ee382862cf22" containerID="2db723c104c847e70b7c742bd2157495764accb8230b8d1c494ef49084c7b620" exitCode=143 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.238047 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6b5facf1-8bc0-497d-925f-ee382862cf22","Type":"ContainerDied","Data":"2db723c104c847e70b7c742bd2157495764accb8230b8d1c494ef49084c7b620"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.243731 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-p9r7k_f68cd69f-6fe2-4189-ad03-9593a4e94337/openstack-network-exporter/0.log" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.243800 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p9r7k" event={"ID":"f68cd69f-6fe2-4189-ad03-9593a4e94337","Type":"ContainerDied","Data":"1df7f3a49ec37831f1f34354206d89199e20087c4d0704329a60ba21a2b8f285"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.243825 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1df7f3a49ec37831f1f34354206d89199e20087c4d0704329a60ba21a2b8f285" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.256179 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8964d89c-gfknm" event={"ID":"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af","Type":"ContainerDied","Data":"52ec40d533f039df7d6f7e0f4b6824af5e321cc9af8b25f91764a63cf1117c87"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.256218 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52ec40d533f039df7d6f7e0f4b6824af5e321cc9af8b25f91764a63cf1117c87" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.264728 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="7f82e49b-0e4d-4cf5-8213-b30edcae94d4" containerName="galera" containerID="cri-o://c4c1209870650aefcffd90e94778414a3fb571c20ef3e08baeadd14915b7e8d6" gracePeriod=30 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.268867 4794 generic.go:334] "Generic (PLEG): container finished" podID="53686d91-dc01-4a36-99c3-e6c84052e15e" containerID="d74fbbdb86c3cdb65171312cf2c9c803c458f47ad7b0f5c525579801ae96ec9d" exitCode=143 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.268983 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"53686d91-dc01-4a36-99c3-e6c84052e15e","Type":"ContainerDied","Data":"d74fbbdb86c3cdb65171312cf2c9c803c458f47ad7b0f5c525579801ae96ec9d"} Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.269060 4794 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ncstb" podUID="df649851-7d90-41e8-80e9-f7fd44d77af0" containerName="registry-server" containerID="cri-o://5c587627c7f33469c641839031a8ba051d3fb1f6bbb5856feff3d2931910e1cd" gracePeriod=2 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.349409 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qzv7j"] Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.362476 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qzv7j"] Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.370463 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.370765 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="2e14b1f3-e9a2-41ae-96ee-88dc84f69921" containerName="nova-cell1-conductor-conductor" containerID="cri-o://c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08" gracePeriod=30 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.377419 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.377665 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="6bd64ab2-dcd8-4404-973e-551182005da1" containerName="nova-cell0-conductor-conductor" containerID="cri-o://eeacb7a4237f606fda5183e09ef7cdbc5464bd0f9e121d184887a224b87c2de7" gracePeriod=30 Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.384239 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wd2vl"] Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.391573 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wd2vl"] Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.468699 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-p9r7k_f68cd69f-6fe2-4189-ad03-9593a4e94337/openstack-network-exporter/0.log" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.468756 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.501026 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.536243 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1fcb4385-7603-4d75-8c41-23f457fcae25/ovsdbserver-sb/0.log" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.536357 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.594558 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.641425 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9575e254-d696-4a8a-b84f-c8f36d746ff8/ovsdbserver-nb/0.log" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.641488 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.644511 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-config\") pod \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.644535 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5gzb\" (UniqueName: \"kubernetes.io/projected/1fcb4385-7603-4d75-8c41-23f457fcae25-kube-api-access-s5gzb\") pod \"1fcb4385-7603-4d75-8c41-23f457fcae25\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.644774 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f68cd69f-6fe2-4189-ad03-9593a4e94337-config\") pod \"f68cd69f-6fe2-4189-ad03-9593a4e94337\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.644792 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-ovsdbserver-sb\") pod \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.644810 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f68cd69f-6fe2-4189-ad03-9593a4e94337-ovs-rundir\") pod \"f68cd69f-6fe2-4189-ad03-9593a4e94337\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.644832 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-combined-ca-bundle\") pod \"1fcb4385-7603-4d75-8c41-23f457fcae25\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.644848 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-dns-swift-storage-0\") pod \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.644881 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1fcb4385-7603-4d75-8c41-23f457fcae25-ovsdb-rundir\") pod \"1fcb4385-7603-4d75-8c41-23f457fcae25\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.644902 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fcb4385-7603-4d75-8c41-23f457fcae25-scripts\") pod \"1fcb4385-7603-4d75-8c41-23f457fcae25\" 
(UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.644927 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q625v\" (UniqueName: \"kubernetes.io/projected/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-kube-api-access-q625v\") pod \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.644969 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztmcd\" (UniqueName: \"kubernetes.io/projected/f68cd69f-6fe2-4189-ad03-9593a4e94337-kube-api-access-ztmcd\") pod \"f68cd69f-6fe2-4189-ad03-9593a4e94337\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.644993 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f68cd69f-6fe2-4189-ad03-9593a4e94337-metrics-certs-tls-certs\") pod \"f68cd69f-6fe2-4189-ad03-9593a4e94337\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.645013 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-metrics-certs-tls-certs\") pod \"1fcb4385-7603-4d75-8c41-23f457fcae25\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.645045 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f68cd69f-6fe2-4189-ad03-9593a4e94337-ovn-rundir\") pod \"f68cd69f-6fe2-4189-ad03-9593a4e94337\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.645063 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f68cd69f-6fe2-4189-ad03-9593a4e94337-combined-ca-bundle\") pod \"f68cd69f-6fe2-4189-ad03-9593a4e94337\" (UID: \"f68cd69f-6fe2-4189-ad03-9593a4e94337\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.645140 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-dns-svc\") pod \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.645162 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-ovsdbserver-sb-tls-certs\") pod \"1fcb4385-7603-4d75-8c41-23f457fcae25\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.645182 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-ovsdbserver-nb\") pod \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\" (UID: \"bb0dcfe9-2446-4f91-bc24-3cabcab9e8af\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.645209 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"1fcb4385-7603-4d75-8c41-23f457fcae25\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.645225 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fcb4385-7603-4d75-8c41-23f457fcae25-config\") pod \"1fcb4385-7603-4d75-8c41-23f457fcae25\" (UID: \"1fcb4385-7603-4d75-8c41-23f457fcae25\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.646230 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fcb4385-7603-4d75-8c41-23f457fcae25-config" (OuterVolumeSpecName: "config") pod "1fcb4385-7603-4d75-8c41-23f457fcae25" (UID: "1fcb4385-7603-4d75-8c41-23f457fcae25"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.647294 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f68cd69f-6fe2-4189-ad03-9593a4e94337-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "f68cd69f-6fe2-4189-ad03-9593a4e94337" (UID: "f68cd69f-6fe2-4189-ad03-9593a4e94337"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.651462 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f68cd69f-6fe2-4189-ad03-9593a4e94337-config" (OuterVolumeSpecName: "config") pod "f68cd69f-6fe2-4189-ad03-9593a4e94337" (UID: "f68cd69f-6fe2-4189-ad03-9593a4e94337"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.654541 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-kube-api-access-q625v" (OuterVolumeSpecName: "kube-api-access-q625v") pod "bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" (UID: "bb0dcfe9-2446-4f91-bc24-3cabcab9e8af"). InnerVolumeSpecName "kube-api-access-q625v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.654588 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f68cd69f-6fe2-4189-ad03-9593a4e94337-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "f68cd69f-6fe2-4189-ad03-9593a4e94337" (UID: "f68cd69f-6fe2-4189-ad03-9593a4e94337"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.657390 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fcb4385-7603-4d75-8c41-23f457fcae25-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "1fcb4385-7603-4d75-8c41-23f457fcae25" (UID: "1fcb4385-7603-4d75-8c41-23f457fcae25"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.658044 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fcb4385-7603-4d75-8c41-23f457fcae25-kube-api-access-s5gzb" (OuterVolumeSpecName: "kube-api-access-s5gzb") pod "1fcb4385-7603-4d75-8c41-23f457fcae25" (UID: "1fcb4385-7603-4d75-8c41-23f457fcae25"). InnerVolumeSpecName "kube-api-access-s5gzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.659297 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fcb4385-7603-4d75-8c41-23f457fcae25-scripts" (OuterVolumeSpecName: "scripts") pod "1fcb4385-7603-4d75-8c41-23f457fcae25" (UID: "1fcb4385-7603-4d75-8c41-23f457fcae25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.672626 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f68cd69f-6fe2-4189-ad03-9593a4e94337-kube-api-access-ztmcd" (OuterVolumeSpecName: "kube-api-access-ztmcd") pod "f68cd69f-6fe2-4189-ad03-9593a4e94337" (UID: "f68cd69f-6fe2-4189-ad03-9593a4e94337"). InnerVolumeSpecName "kube-api-access-ztmcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.689268 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "1fcb4385-7603-4d75-8c41-23f457fcae25" (UID: "1fcb4385-7603-4d75-8c41-23f457fcae25"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.735742 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f68cd69f-6fe2-4189-ad03-9593a4e94337-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f68cd69f-6fe2-4189-ad03-9593a4e94337" (UID: "f68cd69f-6fe2-4189-ad03-9593a4e94337"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.746885 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d235a0a5-57c9-4938-b742-5788ade30a12-openstack-config-secret\") pod \"d235a0a5-57c9-4938-b742-5788ade30a12\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.746930 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-ovsdbserver-nb-tls-certs\") pod \"9575e254-d696-4a8a-b84f-c8f36d746ff8\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.746959 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-combined-ca-bundle\") pod \"9575e254-d696-4a8a-b84f-c8f36d746ff8\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.746988 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d235a0a5-57c9-4938-b742-5788ade30a12-combined-ca-bundle\") pod \"d235a0a5-57c9-4938-b742-5788ade30a12\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747009 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9575e254-d696-4a8a-b84f-c8f36d746ff8-scripts\") pod 
\"9575e254-d696-4a8a-b84f-c8f36d746ff8\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747060 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d235a0a5-57c9-4938-b742-5788ade30a12-openstack-config\") pod \"d235a0a5-57c9-4938-b742-5788ade30a12\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747096 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-metrics-certs-tls-certs\") pod \"9575e254-d696-4a8a-b84f-c8f36d746ff8\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747157 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9bcv\" (UniqueName: \"kubernetes.io/projected/9575e254-d696-4a8a-b84f-c8f36d746ff8-kube-api-access-c9bcv\") pod \"9575e254-d696-4a8a-b84f-c8f36d746ff8\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747221 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"9575e254-d696-4a8a-b84f-c8f36d746ff8\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747265 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9575e254-d696-4a8a-b84f-c8f36d746ff8-ovsdb-rundir\") pod \"9575e254-d696-4a8a-b84f-c8f36d746ff8\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747295 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9575e254-d696-4a8a-b84f-c8f36d746ff8-config\") pod \"9575e254-d696-4a8a-b84f-c8f36d746ff8\" (UID: \"9575e254-d696-4a8a-b84f-c8f36d746ff8\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747369 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jclrs\" (UniqueName: \"kubernetes.io/projected/d235a0a5-57c9-4938-b742-5788ade30a12-kube-api-access-jclrs\") pod \"d235a0a5-57c9-4938-b742-5788ade30a12\" (UID: \"d235a0a5-57c9-4938-b742-5788ade30a12\") " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747865 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747881 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fcb4385-7603-4d75-8c41-23f457fcae25-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747900 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5gzb\" (UniqueName: \"kubernetes.io/projected/1fcb4385-7603-4d75-8c41-23f457fcae25-kube-api-access-s5gzb\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747909 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f68cd69f-6fe2-4189-ad03-9593a4e94337-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747919 4794 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f68cd69f-6fe2-4189-ad03-9593a4e94337-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747927 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1fcb4385-7603-4d75-8c41-23f457fcae25-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747935 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fcb4385-7603-4d75-8c41-23f457fcae25-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747943 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q625v\" (UniqueName: \"kubernetes.io/projected/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-kube-api-access-q625v\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747951 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztmcd\" (UniqueName: \"kubernetes.io/projected/f68cd69f-6fe2-4189-ad03-9593a4e94337-kube-api-access-ztmcd\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747959 4794 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f68cd69f-6fe2-4189-ad03-9593a4e94337-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.747967 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f68cd69f-6fe2-4189-ad03-9593a4e94337-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.749325 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9575e254-d696-4a8a-b84f-c8f36d746ff8-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "9575e254-d696-4a8a-b84f-c8f36d746ff8" (UID: "9575e254-d696-4a8a-b84f-c8f36d746ff8"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.749517 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9575e254-d696-4a8a-b84f-c8f36d746ff8-scripts" (OuterVolumeSpecName: "scripts") pod "9575e254-d696-4a8a-b84f-c8f36d746ff8" (UID: "9575e254-d696-4a8a-b84f-c8f36d746ff8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.751007 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9575e254-d696-4a8a-b84f-c8f36d746ff8-config" (OuterVolumeSpecName: "config") pod "9575e254-d696-4a8a-b84f-c8f36d746ff8" (UID: "9575e254-d696-4a8a-b84f-c8f36d746ff8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.754563 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d235a0a5-57c9-4938-b742-5788ade30a12-kube-api-access-jclrs" (OuterVolumeSpecName: "kube-api-access-jclrs") pod "d235a0a5-57c9-4938-b742-5788ade30a12" (UID: "d235a0a5-57c9-4938-b742-5788ade30a12"). InnerVolumeSpecName "kube-api-access-jclrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.762190 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "9575e254-d696-4a8a-b84f-c8f36d746ff8" (UID: "9575e254-d696-4a8a-b84f-c8f36d746ff8"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.770929 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9575e254-d696-4a8a-b84f-c8f36d746ff8-kube-api-access-c9bcv" (OuterVolumeSpecName: "kube-api-access-c9bcv") pod "9575e254-d696-4a8a-b84f-c8f36d746ff8" (UID: "9575e254-d696-4a8a-b84f-c8f36d746ff8"). InnerVolumeSpecName "kube-api-access-c9bcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.849769 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9bcv\" (UniqueName: \"kubernetes.io/projected/9575e254-d696-4a8a-b84f-c8f36d746ff8-kube-api-access-c9bcv\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.850296 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.850308 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9575e254-d696-4a8a-b84f-c8f36d746ff8-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.850316 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9575e254-d696-4a8a-b84f-c8f36d746ff8-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.850340 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jclrs\" (UniqueName: \"kubernetes.io/projected/d235a0a5-57c9-4938-b742-5788ade30a12-kube-api-access-jclrs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.850351 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9575e254-d696-4a8a-b84f-c8f36d746ff8-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.865550 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d235a0a5-57c9-4938-b742-5788ade30a12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d235a0a5-57c9-4938-b742-5788ade30a12" (UID: "d235a0a5-57c9-4938-b742-5788ade30a12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.877066 4794 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.910303 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fcb4385-7603-4d75-8c41-23f457fcae25" (UID: "1fcb4385-7603-4d75-8c41-23f457fcae25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.910883 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" (UID: "bb0dcfe9-2446-4f91-bc24-3cabcab9e8af"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.933725 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9575e254-d696-4a8a-b84f-c8f36d746ff8" (UID: "9575e254-d696-4a8a-b84f-c8f36d746ff8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.934110 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" (UID: "bb0dcfe9-2446-4f91-bc24-3cabcab9e8af"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.952047 4794 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.953410 4794 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.953432 4794 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.953443 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.953456 4794 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.953466 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.953478 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d235a0a5-57c9-4938-b742-5788ade30a12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.953489 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:44 crc kubenswrapper[4794]: I0310 10:09:44.968734 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d235a0a5-57c9-4938-b742-5788ade30a12-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d235a0a5-57c9-4938-b742-5788ade30a12" (UID: "d235a0a5-57c9-4938-b742-5788ade30a12"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.046166 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" (UID: "bb0dcfe9-2446-4f91-bc24-3cabcab9e8af"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.047075 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d235a0a5-57c9-4938-b742-5788ade30a12-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d235a0a5-57c9-4938-b742-5788ade30a12" (UID: "d235a0a5-57c9-4938-b742-5788ade30a12"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.057855 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "1fcb4385-7603-4d75-8c41-23f457fcae25" (UID: "1fcb4385-7603-4d75-8c41-23f457fcae25"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.060162 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.060193 4794 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.060205 4794 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d235a0a5-57c9-4938-b742-5788ade30a12-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.060214 4794 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d235a0a5-57c9-4938-b742-5788ade30a12-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.062206 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" (UID: "bb0dcfe9-2446-4f91-bc24-3cabcab9e8af"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.090629 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f68cd69f-6fe2-4189-ad03-9593a4e94337-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "f68cd69f-6fe2-4189-ad03-9593a4e94337" (UID: "f68cd69f-6fe2-4189-ad03-9593a4e94337"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.105212 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-config" (OuterVolumeSpecName: "config") pod "bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" (UID: "bb0dcfe9-2446-4f91-bc24-3cabcab9e8af"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.138492 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "1fcb4385-7603-4d75-8c41-23f457fcae25" (UID: "1fcb4385-7603-4d75-8c41-23f457fcae25"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.169549 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9575e254-d696-4a8a-b84f-c8f36d746ff8" (UID: "9575e254-d696-4a8a-b84f-c8f36d746ff8"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.170386 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcb4385-7603-4d75-8c41-23f457fcae25-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.170429 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.170441 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.170449 4794 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.170457 4794 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f68cd69f-6fe2-4189-ad03-9593a4e94337-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: E0310 10:09:45.171226 4794 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 10 10:09:45 crc kubenswrapper[4794]: E0310 10:09:45.171289 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data podName:a45381ea-b5d8-49aa-b4b8-ab372b39b0d3 nodeName:}" failed. No retries permitted until 2026-03-10 10:09:49.171268677 +0000 UTC m=+1537.927439575 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data") pod "rabbitmq-server-0" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3") : configmap "rabbitmq-config-data" not found Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.184986 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7575fbf969-gq2mq"] Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.185211 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7575fbf969-gq2mq" podUID="ed705a10-5bb5-4170-8536-57c6be1cb816" containerName="proxy-httpd" containerID="cri-o://f4dda11441d70fc4769e94a81ee22fc71ec078c6070b2dd0749628f78f387314" gracePeriod=30 Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.185921 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.186099 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7575fbf969-gq2mq" podUID="ed705a10-5bb5-4170-8536-57c6be1cb816" containerName="proxy-server" containerID="cri-o://a5ce358b520a173198341e8d239869c88b76f27ef305f00661c9c0926b396a4c" gracePeriod=30 Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.191458 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "9575e254-d696-4a8a-b84f-c8f36d746ff8" (UID: "9575e254-d696-4a8a-b84f-c8f36d746ff8"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.203468 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.272129 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df649851-7d90-41e8-80e9-f7fd44d77af0-utilities\") pod \"df649851-7d90-41e8-80e9-f7fd44d77af0\" (UID: \"df649851-7d90-41e8-80e9-f7fd44d77af0\") " Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.272178 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df649851-7d90-41e8-80e9-f7fd44d77af0-catalog-content\") pod \"df649851-7d90-41e8-80e9-f7fd44d77af0\" (UID: \"df649851-7d90-41e8-80e9-f7fd44d77af0\") " Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.272304 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64gcn\" (UniqueName: \"kubernetes.io/projected/df649851-7d90-41e8-80e9-f7fd44d77af0-kube-api-access-64gcn\") pod \"df649851-7d90-41e8-80e9-f7fd44d77af0\" (UID: \"df649851-7d90-41e8-80e9-f7fd44d77af0\") " Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.272706 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9575e254-d696-4a8a-b84f-c8f36d746ff8-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.274501 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df649851-7d90-41e8-80e9-f7fd44d77af0-utilities" (OuterVolumeSpecName: "utilities") pod "df649851-7d90-41e8-80e9-f7fd44d77af0" (UID: "df649851-7d90-41e8-80e9-f7fd44d77af0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.292221 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df649851-7d90-41e8-80e9-f7fd44d77af0-kube-api-access-64gcn" (OuterVolumeSpecName: "kube-api-access-64gcn") pod "df649851-7d90-41e8-80e9-f7fd44d77af0" (UID: "df649851-7d90-41e8-80e9-f7fd44d77af0"). InnerVolumeSpecName "kube-api-access-64gcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.301670 4794 generic.go:334] "Generic (PLEG): container finished" podID="df649851-7d90-41e8-80e9-f7fd44d77af0" containerID="5c587627c7f33469c641839031a8ba051d3fb1f6bbb5856feff3d2931910e1cd" exitCode=0 Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.301820 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncstb" event={"ID":"df649851-7d90-41e8-80e9-f7fd44d77af0","Type":"ContainerDied","Data":"5c587627c7f33469c641839031a8ba051d3fb1f6bbb5856feff3d2931910e1cd"} Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.301849 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncstb" event={"ID":"df649851-7d90-41e8-80e9-f7fd44d77af0","Type":"ContainerDied","Data":"093e303226a7976dab0561b0896c1693e8abccc6b9d859c15ed43a19cd6bd9c3"} Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.301866 4794 scope.go:117] "RemoveContainer" containerID="5c587627c7f33469c641839031a8ba051d3fb1f6bbb5856feff3d2931910e1cd" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.301992 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ncstb" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.306727 4794 generic.go:334] "Generic (PLEG): container finished" podID="2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a" containerID="9e44cda9b00f14d7a50fe91ef2bfbfd700f1b8ac60755da2c7898a76949812f8" exitCode=0 Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.306773 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a","Type":"ContainerDied","Data":"9e44cda9b00f14d7a50fe91ef2bfbfd700f1b8ac60755da2c7898a76949812f8"} Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.306790 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a","Type":"ContainerDied","Data":"eb147e24025bb532d58dd9ffc23a6ad01518c78d77d677f6b647e18c0b16474e"} Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.306829 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.311222 4794 generic.go:334] "Generic (PLEG): container finished" podID="4a681af5-3cbf-4d83-a8cd-42a552cdc06d" containerID="161acba2f6771b8ea310825bf85f5470591fd30d63eb4f7e8d35e0b63dbde45c" exitCode=1 Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.311260 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h8ctb" event={"ID":"4a681af5-3cbf-4d83-a8cd-42a552cdc06d","Type":"ContainerDied","Data":"161acba2f6771b8ea310825bf85f5470591fd30d63eb4f7e8d35e0b63dbde45c"} Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.311277 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h8ctb" event={"ID":"4a681af5-3cbf-4d83-a8cd-42a552cdc06d","Type":"ContainerStarted","Data":"27b43c473996656b05298e1bdc7a4df550fc2873a112768d66877b30f849fb5d"} Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.311678 4794 scope.go:117] "RemoveContainer" containerID="161acba2f6771b8ea310825bf85f5470591fd30d63eb4f7e8d35e0b63dbde45c" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.317969 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6e6c324-8bba-4585-9ffc-afad751594d7" containerID="a6cc45af0b715a94827c03b09ad8044d92a3ff4ebaf066479557fde441a9fe8f" exitCode=0 Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.318040 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d6e6c324-8bba-4585-9ffc-afad751594d7","Type":"ContainerDied","Data":"a6cc45af0b715a94827c03b09ad8044d92a3ff4ebaf066479557fde441a9fe8f"} Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.334207 4794 generic.go:334] "Generic (PLEG): container finished" podID="7f82e49b-0e4d-4cf5-8213-b30edcae94d4" containerID="c4c1209870650aefcffd90e94778414a3fb571c20ef3e08baeadd14915b7e8d6" exitCode=0 Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.334274 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7f82e49b-0e4d-4cf5-8213-b30edcae94d4","Type":"ContainerDied","Data":"c4c1209870650aefcffd90e94778414a3fb571c20ef3e08baeadd14915b7e8d6"} Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.339103 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9575e254-d696-4a8a-b84f-c8f36d746ff8/ovsdbserver-nb/0.log" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.339158 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9575e254-d696-4a8a-b84f-c8f36d746ff8","Type":"ContainerDied","Data":"8e38a5eade5e988013fb78ab62479c4344b7f756e8e7babc55613c3221471a07"} Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.339276 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.350670 4794 scope.go:117] "RemoveContainer" containerID="300562753a0ec20aa3aeb28f13de5e5e0c7a2cb2a3fd6a2132f69624d5cdf946" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.357654 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df649851-7d90-41e8-80e9-f7fd44d77af0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df649851-7d90-41e8-80e9-f7fd44d77af0" (UID: "df649851-7d90-41e8-80e9-f7fd44d77af0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.374640 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-config-data\") pod \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.374799 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-nova-novncproxy-tls-certs\") pod \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.374866 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmnv9\" (UniqueName: \"kubernetes.io/projected/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-kube-api-access-tmnv9\") pod \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.375706 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-combined-ca-bundle\") pod \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.375754 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-vencrypt-tls-certs\") pod \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\" (UID: \"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a\") " Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.376771 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64gcn\" (UniqueName: \"kubernetes.io/projected/df649851-7d90-41e8-80e9-f7fd44d77af0-kube-api-access-64gcn\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.376801 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df649851-7d90-41e8-80e9-f7fd44d77af0-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.376814 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df649851-7d90-41e8-80e9-f7fd44d77af0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.378456 4794 generic.go:334] "Generic (PLEG): container finished" podID="3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" containerID="849ee9cf4f0b2ba0c3906171081ce847f0534ff16ad62c26ba765cd71b509f45" exitCode=0 Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.378529 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e","Type":"ContainerDied","Data":"849ee9cf4f0b2ba0c3906171081ce847f0534ff16ad62c26ba765cd71b509f45"} Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.401508 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-kube-api-access-tmnv9" (OuterVolumeSpecName: "kube-api-access-tmnv9") pod "2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a" (UID: 
"2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a"). InnerVolumeSpecName "kube-api-access-tmnv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.403523 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.407860 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8964d89c-gfknm" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.407982 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7da2-account-create-update-rd2nz" event={"ID":"19885e30-4144-4598-be9f-99644e5d5d4a","Type":"ContainerStarted","Data":"462746f7d4293095c4760c9bebe6c55150931da29629648c6e102ec58e5deb64"} Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.409284 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p9r7k" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.409390 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.409715 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.440811 4794 scope.go:117] "RemoveContainer" containerID="2932a8cb91e0c1e8155dc36d90b5dde9d6ccef4a40218c833440f05cd348e0dc" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.450943 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-config-data" (OuterVolumeSpecName: "config-data") pod "2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a" (UID: "2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.451041 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.456454 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a" (UID: "2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.478942 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.479019 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmnv9\" (UniqueName: \"kubernetes.io/projected/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-kube-api-access-tmnv9\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.479031 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.492072 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a" (UID: "2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.541494 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a" (UID: "2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.555923 4794 scope.go:117] "RemoveContainer" containerID="5c587627c7f33469c641839031a8ba051d3fb1f6bbb5856feff3d2931910e1cd" Mar 10 10:09:45 crc kubenswrapper[4794]: E0310 10:09:45.556552 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c587627c7f33469c641839031a8ba051d3fb1f6bbb5856feff3d2931910e1cd\": container with ID starting with 5c587627c7f33469c641839031a8ba051d3fb1f6bbb5856feff3d2931910e1cd not found: ID does not exist" containerID="5c587627c7f33469c641839031a8ba051d3fb1f6bbb5856feff3d2931910e1cd" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.556581 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c587627c7f33469c641839031a8ba051d3fb1f6bbb5856feff3d2931910e1cd"} err="failed to get container status \"5c587627c7f33469c641839031a8ba051d3fb1f6bbb5856feff3d2931910e1cd\": rpc error: code = NotFound desc = could not find container \"5c587627c7f33469c641839031a8ba051d3fb1f6bbb5856feff3d2931910e1cd\": container with ID starting with 5c587627c7f33469c641839031a8ba051d3fb1f6bbb5856feff3d2931910e1cd not found: ID does not exist" Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.556599 4794 scope.go:117] "RemoveContainer" containerID="300562753a0ec20aa3aeb28f13de5e5e0c7a2cb2a3fd6a2132f69624d5cdf946" Mar 10 10:09:45 crc kubenswrapper[4794]: E0310 10:09:45.557153 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"300562753a0ec20aa3aeb28f13de5e5e0c7a2cb2a3fd6a2132f69624d5cdf946\": container with ID starting with 
Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.557173 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300562753a0ec20aa3aeb28f13de5e5e0c7a2cb2a3fd6a2132f69624d5cdf946"} err="failed to get container status \"300562753a0ec20aa3aeb28f13de5e5e0c7a2cb2a3fd6a2132f69624d5cdf946\": rpc error: code = NotFound desc = could not find container \"300562753a0ec20aa3aeb28f13de5e5e0c7a2cb2a3fd6a2132f69624d5cdf946\": container with ID starting with 300562753a0ec20aa3aeb28f13de5e5e0c7a2cb2a3fd6a2132f69624d5cdf946 not found: ID does not exist"
Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.557196 4794 scope.go:117] "RemoveContainer" containerID="2932a8cb91e0c1e8155dc36d90b5dde9d6ccef4a40218c833440f05cd348e0dc"
Mar 10 10:09:45 crc kubenswrapper[4794]: E0310 10:09:45.557448 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2932a8cb91e0c1e8155dc36d90b5dde9d6ccef4a40218c833440f05cd348e0dc\": container with ID starting with 2932a8cb91e0c1e8155dc36d90b5dde9d6ccef4a40218c833440f05cd348e0dc not found: ID does not exist" containerID="2932a8cb91e0c1e8155dc36d90b5dde9d6ccef4a40218c833440f05cd348e0dc"
Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.557465 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2932a8cb91e0c1e8155dc36d90b5dde9d6ccef4a40218c833440f05cd348e0dc"} err="failed to get container status \"2932a8cb91e0c1e8155dc36d90b5dde9d6ccef4a40218c833440f05cd348e0dc\": rpc error: code = NotFound desc = could not find container \"2932a8cb91e0c1e8155dc36d90b5dde9d6ccef4a40218c833440f05cd348e0dc\": container with ID starting with 2932a8cb91e0c1e8155dc36d90b5dde9d6ccef4a40218c833440f05cd348e0dc not found: ID does not exist"
Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.557480 4794 scope.go:117] "RemoveContainer" containerID="9e44cda9b00f14d7a50fe91ef2bfbfd700f1b8ac60755da2c7898a76949812f8"
Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.581042 4794 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.581075 4794 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.830118 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x6kv8"
Mar 10 10:09:45 crc kubenswrapper[4794]: E0310 10:09:45.832769 4794 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Mar 10 10:09:45 crc kubenswrapper[4794]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-10T10:09:43Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock)
Mar 10 10:09:45 crc kubenswrapper[4794]: /etc/init.d/functions: line 589: 400 Alarm clock "$@"
Mar 10 10:09:45 crc kubenswrapper[4794]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-fvs8j" message=<
Mar 10 10:09:45 crc kubenswrapper[4794]: Exiting ovn-controller (1) [FAILED]
Mar 10 10:09:45 crc kubenswrapper[4794]: Killing ovn-controller (1) [ OK ]
Mar 10 10:09:45 crc kubenswrapper[4794]: 2026-03-10T10:09:43Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock)
Mar 10 10:09:45 crc kubenswrapper[4794]: /etc/init.d/functions: line 589: 400 Alarm clock "$@"
Mar 10 10:09:45 crc kubenswrapper[4794]: >
Mar 10 10:09:45 crc kubenswrapper[4794]: E0310 10:09:45.832833 4794 kuberuntime_container.go:691] "PreStop hook failed" err=<
Mar 10 10:09:45 crc kubenswrapper[4794]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-10T10:09:43Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock)
Mar 10 10:09:45 crc kubenswrapper[4794]: /etc/init.d/functions: line 589: 400 Alarm clock "$@"
Mar 10 10:09:45 crc kubenswrapper[4794]: > pod="openstack/ovn-controller-fvs8j" podUID="c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" containerName="ovn-controller" containerID="cri-o://9ee666a84a51e6fb61d9cd9eaf3a5940b75beb1c27f7f0412bcb937917781eb8"
Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.832875 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-fvs8j" podUID="c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" containerName="ovn-controller" containerID="cri-o://9ee666a84a51e6fb61d9cd9eaf3a5940b75beb1c27f7f0412bcb937917781eb8" gracePeriod=27
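Note the arithmetic in the entries above: the swift-proxy containers were killed with gracePeriod=30, but ovn-controller gets gracePeriod=27. The pod's termination grace period covers the PreStop hook as well, so the roughly three seconds the failing ovn-ctl stop_controller hook consumed are subtracted before SIGTERM is sent. A sketch of that accounting; the two-second floor mirrors the kubelet's minimum grace period and is stated here as an assumption:

package main

import (
	"fmt"
	"time"
)

// remainingGrace returns how long the runtime is given to stop the container
// after the PreStop hook ran: total grace minus hook duration, floored.
func remainingGrace(total, hookTook time.Duration) time.Duration {
	const minimum = 2 * time.Second // assumed kubelet floor
	if rem := total - hookTook; rem > minimum {
		return rem
	}
	return minimum
}

func main() {
	// 30s pod grace period, PreStop hook burned ~3s -> kill with 27s,
	// matching gracePeriod=27 in the entry above.
	fmt.Println(remainingGrace(30*time.Second, 3*time.Second))
}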
Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.833203 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x6kv8"
Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.843392 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c8964d89c-gfknm"]
Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.851913 4794 scope.go:117] "RemoveContainer" containerID="9e44cda9b00f14d7a50fe91ef2bfbfd700f1b8ac60755da2c7898a76949812f8"
Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.852352 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c8964d89c-gfknm"]
Mar 10 10:09:45 crc kubenswrapper[4794]: E0310 10:09:45.852626 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e44cda9b00f14d7a50fe91ef2bfbfd700f1b8ac60755da2c7898a76949812f8\": container with ID starting with 9e44cda9b00f14d7a50fe91ef2bfbfd700f1b8ac60755da2c7898a76949812f8 not found: ID does not exist" containerID="9e44cda9b00f14d7a50fe91ef2bfbfd700f1b8ac60755da2c7898a76949812f8"
Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.857497 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e44cda9b00f14d7a50fe91ef2bfbfd700f1b8ac60755da2c7898a76949812f8"} err="failed to get container status \"9e44cda9b00f14d7a50fe91ef2bfbfd700f1b8ac60755da2c7898a76949812f8\": rpc error: code = NotFound desc = could not find container \"9e44cda9b00f14d7a50fe91ef2bfbfd700f1b8ac60755da2c7898a76949812f8\": container with ID starting with 9e44cda9b00f14d7a50fe91ef2bfbfd700f1b8ac60755da2c7898a76949812f8 not found: ID does not exist"
Mar 10 10:09:45 crc kubenswrapper[4794]: I0310 10:09:45.857736 4794 scope.go:117] "RemoveContainer" containerID="7c722e31acce807ad3627a24a3d71255a5f9253c9aaa80d754038f1c69a3dfc4"
Mar 10 10:09:45 crc kubenswrapper[4794]: E0310 10:09:45.894729 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08 is running failed: container process not found" containerID="c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 10 10:09:45 crc kubenswrapper[4794]: E0310 10:09:45.894966 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08 is running failed: container process not found" containerID="c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 10 10:09:45 crc kubenswrapper[4794]: E0310 10:09:45.895182 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08 is running failed: container process not found" containerID="c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 10 10:09:45 crc kubenswrapper[4794]: E0310 10:09:45.895208 4794 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="2e14b1f3-e9a2-41ae-96ee-88dc84f69921" containerName="nova-cell1-conductor-conductor"
Mar 10 10:09:45 crc kubenswrapper[4794]: E0310 10:09:45.993701 4794 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Mar 10 10:09:45 crc kubenswrapper[4794]: E0310 10:09:45.993780 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data podName:598e06ed-3156-4e09-976e-4dda0e35afc2 nodeName:}" failed. No retries permitted until 2026-03-10 10:09:49.993760281 +0000 UTC m=+1538.749931159 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data") pod "rabbitmq-cell1-server-0" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2") : configmap "rabbitmq-cell1-config-data" not found
Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.004187 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6cc45af0b715a94827c03b09ad8044d92a3ff4ebaf066479557fde441a9fe8f is running failed: container process not found" containerID="a6cc45af0b715a94827c03b09ad8044d92a3ff4ebaf066479557fde441a9fe8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.005025 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6cc45af0b715a94827c03b09ad8044d92a3ff4ebaf066479557fde441a9fe8f is running failed: container process not found" containerID="a6cc45af0b715a94827c03b09ad8044d92a3ff4ebaf066479557fde441a9fe8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.005290 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6cc45af0b715a94827c03b09ad8044d92a3ff4ebaf066479557fde441a9fe8f is running failed: container process not found" containerID="a6cc45af0b715a94827c03b09ad8044d92a3ff4ebaf066479557fde441a9fe8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.005343 4794 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6cc45af0b715a94827c03b09ad8044d92a3ff4ebaf066479557fde441a9fe8f is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d6e6c324-8bba-4585-9ffc-afad751594d7" containerName="nova-scheduler-scheduler"
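The readiness probes above exec /usr/bin/pgrep -r DRST <name> inside the container: success means some process matching the name is in run state D, R, S, or T. Once the container's processes are gone, the runtime cannot even start the exec, hence the NotFound probe errors rather than a plain probe failure. An illustrative Go rendering of the same check against /proc (Linux-only; assumes the process name fits in comm without spaces):

package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

// inRunState reports whether /proc/<pid> names the target process and its
// run-state letter (field 3 of /proc/<pid>/stat) is one of states.
func inRunState(pid, name, states string) bool {
	comm, err := os.ReadFile(filepath.Join("/proc", pid, "comm"))
	if err != nil || !strings.Contains(string(comm), name) {
		return false
	}
	stat, err := os.ReadFile(filepath.Join("/proc", pid, "stat"))
	if err != nil {
		return false
	}
	fields := strings.Fields(string(stat)) // fragile if comm contains spaces
	return len(fields) > 2 && strings.Contains(states, fields[2])
}

func main() {
	entries, err := os.ReadDir("/proc")
	if err != nil {
		panic(err)
	}
	for _, e := range entries {
		pid := e.Name()
		if pid[0] >= '0' && pid[0] <= '9' && inRunState(pid, "nova-scheduler", "DRST") {
			fmt.Println("matching process:", pid) // probe would exit 0
			return
		}
	}
	os.Exit(1) // probe failure: no such process
}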
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.033006 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9575e254-d696-4a8a-b84f-c8f36d746ff8" path="/var/lib/kubelet/pods/9575e254-d696-4a8a-b84f-c8f36d746ff8/volumes" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.033873 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55" path="/var/lib/kubelet/pods/a352ac02-d7f9-40c5-b5ad-40d7fb5d2a55/volumes" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.035142 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" path="/var/lib/kubelet/pods/bb0dcfe9-2446-4f91-bc24-3cabcab9e8af/volumes" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.036209 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d235a0a5-57c9-4938-b742-5788ade30a12" path="/var/lib/kubelet/pods/d235a0a5-57c9-4938-b742-5788ade30a12/volumes" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.038007 4794 scope.go:117] "RemoveContainer" containerID="568dc6ea988e0568a2d3d813b291ac89a4bc785eb376be0c5d4fc65492cc33f4" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.040560 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-p9r7k"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.061210 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7da2-account-create-update-rd2nz" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.082351 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-p9r7k"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.111842 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.120397 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.129444 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.135163 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.170682 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ncstb"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.176927 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ncstb"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.183059 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7575fbf969-gq2mq" podUID="ed705a10-5bb5-4170-8536-57c6be1cb816" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.174:8080/healthcheck\": dial tcp 10.217.0.174:8080: connect: connection refused" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.183059 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7575fbf969-gq2mq" podUID="ed705a10-5bb5-4170-8536-57c6be1cb816" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.174:8080/healthcheck\": dial tcp 10.217.0.174:8080: connect: connection refused" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.195462 
4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.197743 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19885e30-4144-4598-be9f-99644e5d5d4a-operator-scripts\") pod \"19885e30-4144-4598-be9f-99644e5d5d4a\" (UID: \"19885e30-4144-4598-be9f-99644e5d5d4a\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.197819 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6c324-8bba-4585-9ffc-afad751594d7-combined-ca-bundle\") pod \"d6e6c324-8bba-4585-9ffc-afad751594d7\" (UID: \"d6e6c324-8bba-4585-9ffc-afad751594d7\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.197868 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-operator-scripts\") pod \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.197904 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-config-data-generated\") pod \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.198023 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-config-data-default\") pod \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.198063 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7krhj\" (UniqueName: \"kubernetes.io/projected/19885e30-4144-4598-be9f-99644e5d5d4a-kube-api-access-7krhj\") pod \"19885e30-4144-4598-be9f-99644e5d5d4a\" (UID: \"19885e30-4144-4598-be9f-99644e5d5d4a\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.198080 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-galera-tls-certs\") pod \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.198474 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-kolla-config\") pod \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.198525 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gwsk\" (UniqueName: \"kubernetes.io/projected/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-kube-api-access-2gwsk\") pod \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.198557 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk7pd\" (UniqueName: 
\"kubernetes.io/projected/d6e6c324-8bba-4585-9ffc-afad751594d7-kube-api-access-mk7pd\") pod \"d6e6c324-8bba-4585-9ffc-afad751594d7\" (UID: \"d6e6c324-8bba-4585-9ffc-afad751594d7\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.198607 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6e6c324-8bba-4585-9ffc-afad751594d7-config-data\") pod \"d6e6c324-8bba-4585-9ffc-afad751594d7\" (UID: \"d6e6c324-8bba-4585-9ffc-afad751594d7\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.198676 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-combined-ca-bundle\") pod \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.198701 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\" (UID: \"7f82e49b-0e4d-4cf5-8213-b30edcae94d4\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.198978 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f82e49b-0e4d-4cf5-8213-b30edcae94d4" (UID: "7f82e49b-0e4d-4cf5-8213-b30edcae94d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.199572 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "7f82e49b-0e4d-4cf5-8213-b30edcae94d4" (UID: "7f82e49b-0e4d-4cf5-8213-b30edcae94d4"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.199799 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.200181 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19885e30-4144-4598-be9f-99644e5d5d4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19885e30-4144-4598-be9f-99644e5d5d4a" (UID: "19885e30-4144-4598-be9f-99644e5d5d4a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.201244 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "7f82e49b-0e4d-4cf5-8213-b30edcae94d4" (UID: "7f82e49b-0e4d-4cf5-8213-b30edcae94d4"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.202058 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "7f82e49b-0e4d-4cf5-8213-b30edcae94d4" (UID: "7f82e49b-0e4d-4cf5-8213-b30edcae94d4"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.205121 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19885e30-4144-4598-be9f-99644e5d5d4a-kube-api-access-7krhj" (OuterVolumeSpecName: "kube-api-access-7krhj") pod "19885e30-4144-4598-be9f-99644e5d5d4a" (UID: "19885e30-4144-4598-be9f-99644e5d5d4a"). InnerVolumeSpecName "kube-api-access-7krhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.205748 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-kube-api-access-2gwsk" (OuterVolumeSpecName: "kube-api-access-2gwsk") pod "7f82e49b-0e4d-4cf5-8213-b30edcae94d4" (UID: "7f82e49b-0e4d-4cf5-8213-b30edcae94d4"). InnerVolumeSpecName "kube-api-access-2gwsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.214092 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e6c324-8bba-4585-9ffc-afad751594d7-kube-api-access-mk7pd" (OuterVolumeSpecName: "kube-api-access-mk7pd") pod "d6e6c324-8bba-4585-9ffc-afad751594d7" (UID: "d6e6c324-8bba-4585-9ffc-afad751594d7"). InnerVolumeSpecName "kube-api-access-mk7pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.235198 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "7f82e49b-0e4d-4cf5-8213-b30edcae94d4" (UID: "7f82e49b-0e4d-4cf5-8213-b30edcae94d4"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.246699 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f82e49b-0e4d-4cf5-8213-b30edcae94d4" (UID: "7f82e49b-0e4d-4cf5-8213-b30edcae94d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.251807 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e6c324-8bba-4585-9ffc-afad751594d7-config-data" (OuterVolumeSpecName: "config-data") pod "d6e6c324-8bba-4585-9ffc-afad751594d7" (UID: "d6e6c324-8bba-4585-9ffc-afad751594d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.261727 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e6c324-8bba-4585-9ffc-afad751594d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6e6c324-8bba-4585-9ffc-afad751594d7" (UID: "d6e6c324-8bba-4585-9ffc-afad751594d7"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.300778 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-config-data\") pod \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\" (UID: \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.301227 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr2zs\" (UniqueName: \"kubernetes.io/projected/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-kube-api-access-cr2zs\") pod \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\" (UID: \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.301495 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-combined-ca-bundle\") pod \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\" (UID: \"2e14b1f3-e9a2-41ae-96ee-88dc84f69921\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.302278 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.302309 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.302362 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19885e30-4144-4598-be9f-99644e5d5d4a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.302403 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6c324-8bba-4585-9ffc-afad751594d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.302440 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.302453 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.302465 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7krhj\" (UniqueName: \"kubernetes.io/projected/19885e30-4144-4598-be9f-99644e5d5d4a-kube-api-access-7krhj\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.302476 4794 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.302487 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gwsk\" (UniqueName: 
\"kubernetes.io/projected/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-kube-api-access-2gwsk\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.302526 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk7pd\" (UniqueName: \"kubernetes.io/projected/d6e6c324-8bba-4585-9ffc-afad751594d7-kube-api-access-mk7pd\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.302538 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6e6c324-8bba-4585-9ffc-afad751594d7-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.314948 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-kube-api-access-cr2zs" (OuterVolumeSpecName: "kube-api-access-cr2zs") pod "2e14b1f3-e9a2-41ae-96ee-88dc84f69921" (UID: "2e14b1f3-e9a2-41ae-96ee-88dc84f69921"). InnerVolumeSpecName "kube-api-access-cr2zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.318927 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "7f82e49b-0e4d-4cf5-8213-b30edcae94d4" (UID: "7f82e49b-0e4d-4cf5-8213-b30edcae94d4"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.324842 4794 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.367821 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e14b1f3-e9a2-41ae-96ee-88dc84f69921" (UID: "2e14b1f3-e9a2-41ae-96ee-88dc84f69921"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.369830 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-config-data" (OuterVolumeSpecName: "config-data") pod "2e14b1f3-e9a2-41ae-96ee-88dc84f69921" (UID: "2e14b1f3-e9a2-41ae-96ee-88dc84f69921"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.373657 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fvs8j_c4ffbeb0-ab29-4b48-bbd2-65c776b08edd/ovn-controller/0.log" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.373707 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fvs8j" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.406511 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.406545 4794 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.406554 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.406566 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr2zs\" (UniqueName: \"kubernetes.io/projected/2e14b1f3-e9a2-41ae-96ee-88dc84f69921-kube-api-access-cr2zs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.406575 4794 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f82e49b-0e4d-4cf5-8213-b30edcae94d4-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.443550 4794 generic.go:334] "Generic (PLEG): container finished" podID="2e14b1f3-e9a2-41ae-96ee-88dc84f69921" containerID="c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08" exitCode=0 Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.443634 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2e14b1f3-e9a2-41ae-96ee-88dc84f69921","Type":"ContainerDied","Data":"c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08"} Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.443659 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2e14b1f3-e9a2-41ae-96ee-88dc84f69921","Type":"ContainerDied","Data":"9672b520e7da191c60f7e43b404a8c3e35d1e2b994bee5f95ba0d8a7406f3b83"} Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.443674 4794 scope.go:117] "RemoveContainer" containerID="c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.443775 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.452198 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7da2-account-create-update-rd2nz" event={"ID":"19885e30-4144-4598-be9f-99644e5d5d4a","Type":"ContainerDied","Data":"462746f7d4293095c4760c9bebe6c55150931da29629648c6e102ec58e5deb64"} Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.452277 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7da2-account-create-update-rd2nz" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.478570 4794 generic.go:334] "Generic (PLEG): container finished" podID="4a681af5-3cbf-4d83-a8cd-42a552cdc06d" containerID="bfd8291a4141335b8e769990025ae16ba71e4ee8c719b3ff7c16b2cbc268d255" exitCode=1 Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.479605 4794 scope.go:117] "RemoveContainer" containerID="bfd8291a4141335b8e769990025ae16ba71e4ee8c719b3ff7c16b2cbc268d255" Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.480150 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-h8ctb_openstack(4a681af5-3cbf-4d83-a8cd-42a552cdc06d)\"" pod="openstack/root-account-create-update-h8ctb" podUID="4a681af5-3cbf-4d83-a8cd-42a552cdc06d" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.482669 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h8ctb" event={"ID":"4a681af5-3cbf-4d83-a8cd-42a552cdc06d","Type":"ContainerDied","Data":"bfd8291a4141335b8e769990025ae16ba71e4ee8c719b3ff7c16b2cbc268d255"} Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.483499 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.487424 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d6e6c324-8bba-4585-9ffc-afad751594d7","Type":"ContainerDied","Data":"8f90bc40a5012204eeb3c53e19ecdf2f7e253b0770393dff4e6c78ac9f448b9a"} Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.490104 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.492131 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7f82e49b-0e4d-4cf5-8213-b30edcae94d4","Type":"ContainerDied","Data":"d384067b285f1fd56157911934d42a055836cecf1f196917852bff1fa1e15975"} Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.492219 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.497472 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.510297 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lddn\" (UniqueName: \"kubernetes.io/projected/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-kube-api-access-2lddn\") pod \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.510614 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-run\") pod \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.511095 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-run-ovn\") pod \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.511222 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-log-ovn\") pod \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.511156 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-run" (OuterVolumeSpecName: "var-run") pod "c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" (UID: "c4ffbeb0-ab29-4b48-bbd2-65c776b08edd"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.511182 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" (UID: "c4ffbeb0-ab29-4b48-bbd2-65c776b08edd"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.511283 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" (UID: "c4ffbeb0-ab29-4b48-bbd2-65c776b08edd"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.511598 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-combined-ca-bundle\") pod \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.511743 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-scripts\") pod \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.511952 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-ovn-controller-tls-certs\") pod \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\" (UID: \"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd\") " Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.512525 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-scripts" (OuterVolumeSpecName: "scripts") pod "c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" (UID: "c4ffbeb0-ab29-4b48-bbd2-65c776b08edd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.513381 4794 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.513492 4794 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.514894 4794 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.515003 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.517882 4794 scope.go:117] "RemoveContainer" containerID="c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.520774 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fvs8j_c4ffbeb0-ab29-4b48-bbd2-65c776b08edd/ovn-controller/0.log" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.520920 4794 generic.go:334] "Generic (PLEG): container finished" podID="c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" containerID="9ee666a84a51e6fb61d9cd9eaf3a5940b75beb1c27f7f0412bcb937917781eb8" exitCode=143 Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.520976 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fvs8j" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.520992 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvs8j" event={"ID":"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd","Type":"ContainerDied","Data":"9ee666a84a51e6fb61d9cd9eaf3a5940b75beb1c27f7f0412bcb937917781eb8"} Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.522081 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvs8j" event={"ID":"c4ffbeb0-ab29-4b48-bbd2-65c776b08edd","Type":"ContainerDied","Data":"200f1a127dbcf778780d3267801e67f85028cd39740627d3c9a68d705833a8d9"} Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.523595 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e14b1f3_e9a2_41ae_96ee_88dc84f69921.slice\": RecentStats: unable to find data in memory cache]" Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.523902 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08\": container with ID starting with c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08 not found: ID does not exist" containerID="c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.524030 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08"} err="failed to get container status \"c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08\": rpc error: code = NotFound desc = could not find container \"c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08\": container with ID starting with c0456f26d4f37efa9041727ed282574e1a91d5423940c7b04377a50eb6ff7d08 not found: ID does not exist" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.524150 4794 scope.go:117] "RemoveContainer" containerID="161acba2f6771b8ea310825bf85f5470591fd30d63eb4f7e8d35e0b63dbde45c" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.543537 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-kube-api-access-2lddn" (OuterVolumeSpecName: "kube-api-access-2lddn") pod "c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" (UID: "c4ffbeb0-ab29-4b48-bbd2-65c776b08edd"). InnerVolumeSpecName "kube-api-access-2lddn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.547738 4794 generic.go:334] "Generic (PLEG): container finished" podID="ed705a10-5bb5-4170-8536-57c6be1cb816" containerID="a5ce358b520a173198341e8d239869c88b76f27ef305f00661c9c0926b396a4c" exitCode=0 Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.547767 4794 generic.go:334] "Generic (PLEG): container finished" podID="ed705a10-5bb5-4170-8536-57c6be1cb816" containerID="f4dda11441d70fc4769e94a81ee22fc71ec078c6070b2dd0749628f78f387314" exitCode=0 Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.549218 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7575fbf969-gq2mq" event={"ID":"ed705a10-5bb5-4170-8536-57c6be1cb816","Type":"ContainerDied","Data":"a5ce358b520a173198341e8d239869c88b76f27ef305f00661c9c0926b396a4c"} Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.549274 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7575fbf969-gq2mq" event={"ID":"ed705a10-5bb5-4170-8536-57c6be1cb816","Type":"ContainerDied","Data":"f4dda11441d70fc4769e94a81ee22fc71ec078c6070b2dd0749628f78f387314"} Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.570387 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.579730 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7da2-account-create-update-rd2nz"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.583976 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" (UID: "c4ffbeb0-ab29-4b48-bbd2-65c776b08edd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.612669 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-7da2-account-create-update-rd2nz"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.624453 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lddn\" (UniqueName: \"kubernetes.io/projected/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-kube-api-access-2lddn\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.625736 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.638201 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.655724 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.698515 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.712869 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" (UID: "c4ffbeb0-ab29-4b48-bbd2-65c776b08edd"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.720862 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.728203 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.729823 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.733412 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.733456 4794 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.733456 4794 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovsdb-server"
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.735708 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-combined-ca-bundle\") pod \"ed705a10-5bb5-4170-8536-57c6be1cb816\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") "
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.735813 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-public-tls-certs\") pod \"ed705a10-5bb5-4170-8536-57c6be1cb816\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") "
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.735832 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ed705a10-5bb5-4170-8536-57c6be1cb816-etc-swift\") pod \"ed705a10-5bb5-4170-8536-57c6be1cb816\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") "
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.735853 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed705a10-5bb5-4170-8536-57c6be1cb816-log-httpd\") pod \"ed705a10-5bb5-4170-8536-57c6be1cb816\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") "
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.735929 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fg5z\" (UniqueName: \"kubernetes.io/projected/ed705a10-5bb5-4170-8536-57c6be1cb816-kube-api-access-5fg5z\") pod \"ed705a10-5bb5-4170-8536-57c6be1cb816\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") "
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.735986 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-internal-tls-certs\") pod \"ed705a10-5bb5-4170-8536-57c6be1cb816\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") "
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.736038 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-config-data\") pod \"ed705a10-5bb5-4170-8536-57c6be1cb816\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") "
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.736055 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed705a10-5bb5-4170-8536-57c6be1cb816-run-httpd\") pod \"ed705a10-5bb5-4170-8536-57c6be1cb816\" (UID: \"ed705a10-5bb5-4170-8536-57c6be1cb816\") "
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.736467 4794 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.736708 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed705a10-5bb5-4170-8536-57c6be1cb816-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ed705a10-5bb5-4170-8536-57c6be1cb816" (UID: "ed705a10-5bb5-4170-8536-57c6be1cb816"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.737552 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed705a10-5bb5-4170-8536-57c6be1cb816-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ed705a10-5bb5-4170-8536-57c6be1cb816" (UID: "ed705a10-5bb5-4170-8536-57c6be1cb816"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.739848 4794 scope.go:117] "RemoveContainer" containerID="a6cc45af0b715a94827c03b09ad8044d92a3ff4ebaf066479557fde441a9fe8f"
Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.742627 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.753798 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed705a10-5bb5-4170-8536-57c6be1cb816-kube-api-access-5fg5z" (OuterVolumeSpecName: "kube-api-access-5fg5z") pod "ed705a10-5bb5-4170-8536-57c6be1cb816" (UID: "ed705a10-5bb5-4170-8536-57c6be1cb816"). InnerVolumeSpecName "kube-api-access-5fg5z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.757679 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed705a10-5bb5-4170-8536-57c6be1cb816-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ed705a10-5bb5-4170-8536-57c6be1cb816" (UID: "ed705a10-5bb5-4170-8536-57c6be1cb816"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.766233 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.771023 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 10 10:09:46 crc kubenswrapper[4794]: E0310 10:09:46.771084 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovs-vswitchd"
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.834532 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="eb384e40-1917-4b9c-bcfa-440a3a10fd1d" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.182:9292/healthcheck\": read tcp 10.217.0.2:56144->10.217.0.182:9292: read: connection reset by peer"
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.834536 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="eb384e40-1917-4b9c-bcfa-440a3a10fd1d" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.182:9292/healthcheck\": read tcp 10.217.0.2:56146->10.217.0.182:9292: read: connection reset by peer"
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.838521 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed705a10-5bb5-4170-8536-57c6be1cb816-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.838554 4794 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ed705a10-5bb5-4170-8536-57c6be1cb816-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.838563 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed705a10-5bb5-4170-8536-57c6be1cb816-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.838571 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fg5z\" (UniqueName: \"kubernetes.io/projected/ed705a10-5bb5-4170-8536-57c6be1cb816-kube-api-access-5fg5z\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.838678 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-config-data" (OuterVolumeSpecName: "config-data") pod "ed705a10-5bb5-4170-8536-57c6be1cb816" (UID: "ed705a10-5bb5-4170-8536-57c6be1cb816"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.841538 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ed705a10-5bb5-4170-8536-57c6be1cb816" (UID: "ed705a10-5bb5-4170-8536-57c6be1cb816"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.842541 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed705a10-5bb5-4170-8536-57c6be1cb816" (UID: "ed705a10-5bb5-4170-8536-57c6be1cb816"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.844507 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ed705a10-5bb5-4170-8536-57c6be1cb816" (UID: "ed705a10-5bb5-4170-8536-57c6be1cb816"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.900901 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-x6kv8" podUID="0919f85e-d789-43ec-90d3-7df281f603c5" containerName="registry-server" probeResult="failure" output=<
Mar 10 10:09:46 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s
Mar 10 10:09:46 crc kubenswrapper[4794]: >
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.917110 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="39114d89-8cf8-4563-bc50-e96e2113349d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.169:8776/healthcheck\": read tcp 10.217.0.2:34082->10.217.0.169:8776: read: connection reset by peer"
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.928121 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6b5facf1-8bc0-497d-925f-ee382862cf22" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": read tcp 10.217.0.2:33456->10.217.0.211:8775: read: connection reset by peer"
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.928384 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6b5facf1-8bc0-497d-925f-ee382862cf22" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": read tcp 10.217.0.2:33464->10.217.0.211:8775: read: connection reset by peer"
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.940522 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.940563 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-config-data\") on node \"crc\" DevicePath \"\""
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:46 crc kubenswrapper[4794]: I0310 10:09:46.940580 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed705a10-5bb5-4170-8536-57c6be1cb816-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.123093 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.123902 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="ceilometer-central-agent" containerID="cri-o://e87552170b0f59d6ba42cedca611dc39fa99be4bf11ae3c34213d11839f96d1a" gracePeriod=30 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.124378 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="sg-core" containerID="cri-o://efa120338ab691675da4aafa0ebfbff3b4647e0b19ef6e555a8542261261a114" gracePeriod=30 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.124405 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="proxy-httpd" containerID="cri-o://eea46c0d3766a780d0c4aadf572fb639917b34eb0330835ed5b59a3cdae42cd7" gracePeriod=30 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.124461 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="ceilometer-notification-agent" containerID="cri-o://cca51dc254ea928ff512f597522805c2f3a8b9ea69647a3b6f31bf5e631eac13" gracePeriod=30 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.236130 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.239780 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="0a15b784-796e-4834-97e1-978b1f0d9690" containerName="kube-state-metrics" containerID="cri-o://1a9f488919341a6cbd7a3e003cf402c07ea892b34d5616b62a24305d40e123b4" gracePeriod=30 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.266374 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fvs8j"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.276069 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fvs8j"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.316461 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d7d5-account-create-update-2vwdw"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.348431 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.348681 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="71ee8a8d-89a0-495f-925b-071e52449063" containerName="memcached" containerID="cri-o://2cb4ade39b3ccc7065ae84be53a791f98290d564a27b4334ca648c8b39c8ca95" gracePeriod=30 Mar 10 10:09:47 crc kubenswrapper[4794]: 
I0310 10:09:47.377694 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d7d5-account-create-update-2vwdw"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.392941 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-sz7rz"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.406222 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-wq6b7"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.420144 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-wq6b7"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.420473 4794 scope.go:117] "RemoveContainer" containerID="c4c1209870650aefcffd90e94778414a3fb571c20ef3e08baeadd14915b7e8d6" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.429875 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-sz7rz"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.439289 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-547f85b784-tn9hj"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.439493 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-547f85b784-tn9hj" podUID="95ce97ce-b89c-4868-b9a8-48297e8e35e1" containerName="keystone-api" containerID="cri-o://ba7917d5c284239059542152be24644fc0561a4ad2613bc181bad5791fbc0849" gracePeriod=30 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.448435 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d7d5-account-create-update-sxmjz"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.456922 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5dd9f46c58-hkfks" podUID="cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": dial tcp 10.217.0.167:9311: connect: connection refused" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.457188 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5dd9f46c58-hkfks" podUID="cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": dial tcp 10.217.0.167:9311: connect: connection refused" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458641 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f82e49b-0e4d-4cf5-8213-b30edcae94d4" containerName="mysql-bootstrap" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458671 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f82e49b-0e4d-4cf5-8213-b30edcae94d4" containerName="mysql-bootstrap" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458687 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9575e254-d696-4a8a-b84f-c8f36d746ff8" containerName="openstack-network-exporter" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458692 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9575e254-d696-4a8a-b84f-c8f36d746ff8" containerName="openstack-network-exporter" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458706 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9575e254-d696-4a8a-b84f-c8f36d746ff8" containerName="ovsdbserver-nb" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458712 4794 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9575e254-d696-4a8a-b84f-c8f36d746ff8" containerName="ovsdbserver-nb" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458722 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e6c324-8bba-4585-9ffc-afad751594d7" containerName="nova-scheduler-scheduler" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458728 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e6c324-8bba-4585-9ffc-afad751594d7" containerName="nova-scheduler-scheduler" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458742 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df649851-7d90-41e8-80e9-f7fd44d77af0" containerName="registry-server" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458748 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="df649851-7d90-41e8-80e9-f7fd44d77af0" containerName="registry-server" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458758 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fcb4385-7603-4d75-8c41-23f457fcae25" containerName="openstack-network-exporter" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458764 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fcb4385-7603-4d75-8c41-23f457fcae25" containerName="openstack-network-exporter" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458771 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" containerName="init" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458777 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" containerName="init" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458789 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68cd69f-6fe2-4189-ad03-9593a4e94337" containerName="openstack-network-exporter" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458794 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68cd69f-6fe2-4189-ad03-9593a4e94337" containerName="openstack-network-exporter" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458801 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fcb4385-7603-4d75-8c41-23f457fcae25" containerName="ovsdbserver-sb" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458807 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fcb4385-7603-4d75-8c41-23f457fcae25" containerName="ovsdbserver-sb" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458817 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458824 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458835 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e14b1f3-e9a2-41ae-96ee-88dc84f69921" containerName="nova-cell1-conductor-conductor" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458841 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e14b1f3-e9a2-41ae-96ee-88dc84f69921" containerName="nova-cell1-conductor-conductor" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458851 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f82e49b-0e4d-4cf5-8213-b30edcae94d4" containerName="galera" Mar 
10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458856 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f82e49b-0e4d-4cf5-8213-b30edcae94d4" containerName="galera" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458869 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df649851-7d90-41e8-80e9-f7fd44d77af0" containerName="extract-utilities" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458876 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="df649851-7d90-41e8-80e9-f7fd44d77af0" containerName="extract-utilities" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458885 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" containerName="ovn-controller" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458891 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" containerName="ovn-controller" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458898 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df649851-7d90-41e8-80e9-f7fd44d77af0" containerName="extract-content" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458903 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="df649851-7d90-41e8-80e9-f7fd44d77af0" containerName="extract-content" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458913 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed705a10-5bb5-4170-8536-57c6be1cb816" containerName="proxy-httpd" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458919 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed705a10-5bb5-4170-8536-57c6be1cb816" containerName="proxy-httpd" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458930 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed705a10-5bb5-4170-8536-57c6be1cb816" containerName="proxy-server" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458936 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed705a10-5bb5-4170-8536-57c6be1cb816" containerName="proxy-server" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.458943 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" containerName="dnsmasq-dns" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.458948 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" containerName="dnsmasq-dns" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459186 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed705a10-5bb5-4170-8536-57c6be1cb816" containerName="proxy-httpd" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459204 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" containerName="ovn-controller" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459216 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e14b1f3-e9a2-41ae-96ee-88dc84f69921" containerName="nova-cell1-conductor-conductor" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459230 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fcb4385-7603-4d75-8c41-23f457fcae25" containerName="openstack-network-exporter" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459240 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed705a10-5bb5-4170-8536-57c6be1cb816" 
containerName="proxy-server" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459250 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="df649851-7d90-41e8-80e9-f7fd44d77af0" containerName="registry-server" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459256 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9575e254-d696-4a8a-b84f-c8f36d746ff8" containerName="openstack-network-exporter" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459263 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459271 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" containerName="dnsmasq-dns" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459280 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e6c324-8bba-4585-9ffc-afad751594d7" containerName="nova-scheduler-scheduler" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459294 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f82e49b-0e4d-4cf5-8213-b30edcae94d4" containerName="galera" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459302 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f68cd69f-6fe2-4189-ad03-9593a4e94337" containerName="openstack-network-exporter" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459312 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fcb4385-7603-4d75-8c41-23f457fcae25" containerName="ovsdbserver-sb" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459322 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9575e254-d696-4a8a-b84f-c8f36d746ff8" containerName="ovsdbserver-nb" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459872 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d7d5-account-create-update-sxmjz"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.459950 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d7d5-account-create-update-sxmjz" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.467654 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.483789 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.494910 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d7d5-account-create-update-sxmjz"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.510795 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mbt9m"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.516467 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mbt9m"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.517903 4794 scope.go:117] "RemoveContainer" containerID="1a56c890345feacde6604baef18e26d73094a63129df6f970f2058bcdb56403e" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.521948 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-ql7mv operator-scripts], unattached volumes=[], failed to process volumes=[kube-api-access-ql7mv operator-scripts]: context canceled" pod="openstack/keystone-d7d5-account-create-update-sxmjz" podUID="0c3485a5-8f4a-4634-9a11-ed7d3081277f" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.532263 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-h8ctb"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.570427 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c3485a5-8f4a-4634-9a11-ed7d3081277f-operator-scripts\") pod \"keystone-d7d5-account-create-update-sxmjz\" (UID: \"0c3485a5-8f4a-4634-9a11-ed7d3081277f\") " pod="openstack/keystone-d7d5-account-create-update-sxmjz" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.570870 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql7mv\" (UniqueName: \"kubernetes.io/projected/0c3485a5-8f4a-4634-9a11-ed7d3081277f-kube-api-access-ql7mv\") pod \"keystone-d7d5-account-create-update-sxmjz\" (UID: \"0c3485a5-8f4a-4634-9a11-ed7d3081277f\") " pod="openstack/keystone-d7d5-account-create-update-sxmjz" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.605357 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.622305 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eeacb7a4237f606fda5183e09ef7cdbc5464bd0f9e121d184887a224b87c2de7 is running failed: container process not found" containerID="eeacb7a4237f606fda5183e09ef7cdbc5464bd0f9e121d184887a224b87c2de7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.626893 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eeacb7a4237f606fda5183e09ef7cdbc5464bd0f9e121d184887a224b87c2de7 is running failed: container process not found" containerID="eeacb7a4237f606fda5183e09ef7cdbc5464bd0f9e121d184887a224b87c2de7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.627371 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eeacb7a4237f606fda5183e09ef7cdbc5464bd0f9e121d184887a224b87c2de7 is running failed: container process not found" containerID="eeacb7a4237f606fda5183e09ef7cdbc5464bd0f9e121d184887a224b87c2de7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.627407 4794 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eeacb7a4237f606fda5183e09ef7cdbc5464bd0f9e121d184887a224b87c2de7 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="6bd64ab2-dcd8-4404-973e-551182005da1" containerName="nova-cell0-conductor-conductor" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.638676 4794 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-h8ctb" secret="" err="secret \"galera-openstack-dockercfg-lwkxx\" not found" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.638719 4794 scope.go:117] "RemoveContainer" containerID="bfd8291a4141335b8e769990025ae16ba71e4ee8c719b3ff7c16b2cbc268d255" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.638977 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-h8ctb_openstack(4a681af5-3cbf-4d83-a8cd-42a552cdc06d)\"" pod="openstack/root-account-create-update-h8ctb" podUID="4a681af5-3cbf-4d83-a8cd-42a552cdc06d" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.654066 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7575fbf969-gq2mq" event={"ID":"ed705a10-5bb5-4170-8536-57c6be1cb816","Type":"ContainerDied","Data":"36653d4cabcbc5261ec1c6493c2eb93d199872239ce11c67f90424bfd49113b9"} Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.654183 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7575fbf969-gq2mq" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.678782 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-config-data\") pod \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.678869 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-logs\") pod \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.678957 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-combined-ca-bundle\") pod \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.678978 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55k6f\" (UniqueName: \"kubernetes.io/projected/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-kube-api-access-55k6f\") pod \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.679022 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-config-data-custom\") pod \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\" (UID: \"efa96620-3d4b-4780-92a0-eeefbe9dcf9a\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.679250 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql7mv\" (UniqueName: \"kubernetes.io/projected/0c3485a5-8f4a-4634-9a11-ed7d3081277f-kube-api-access-ql7mv\") pod \"keystone-d7d5-account-create-update-sxmjz\" (UID: \"0c3485a5-8f4a-4634-9a11-ed7d3081277f\") " pod="openstack/keystone-d7d5-account-create-update-sxmjz" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.679371 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c3485a5-8f4a-4634-9a11-ed7d3081277f-operator-scripts\") pod \"keystone-d7d5-account-create-update-sxmjz\" (UID: \"0c3485a5-8f4a-4634-9a11-ed7d3081277f\") " pod="openstack/keystone-d7d5-account-create-update-sxmjz" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.679480 4794 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.679524 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0c3485a5-8f4a-4634-9a11-ed7d3081277f-operator-scripts podName:0c3485a5-8f4a-4634-9a11-ed7d3081277f nodeName:}" failed. No retries permitted until 2026-03-10 10:09:48.179511237 +0000 UTC m=+1536.935682055 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0c3485a5-8f4a-4634-9a11-ed7d3081277f-operator-scripts") pod "keystone-d7d5-account-create-update-sxmjz" (UID: "0c3485a5-8f4a-4634-9a11-ed7d3081277f") : configmap "openstack-scripts" not found Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.680257 4794 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.680288 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-operator-scripts podName:4a681af5-3cbf-4d83-a8cd-42a552cdc06d nodeName:}" failed. No retries permitted until 2026-03-10 10:09:48.180279801 +0000 UTC m=+1536.936450619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-operator-scripts") pod "root-account-create-update-h8ctb" (UID: "4a681af5-3cbf-4d83-a8cd-42a552cdc06d") : configmap "openstack-scripts" not found Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.681717 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-logs" (OuterVolumeSpecName: "logs") pod "efa96620-3d4b-4780-92a0-eeefbe9dcf9a" (UID: "efa96620-3d4b-4780-92a0-eeefbe9dcf9a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.685515 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-kube-api-access-55k6f" (OuterVolumeSpecName: "kube-api-access-55k6f") pod "efa96620-3d4b-4780-92a0-eeefbe9dcf9a" (UID: "efa96620-3d4b-4780-92a0-eeefbe9dcf9a"). InnerVolumeSpecName "kube-api-access-55k6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.685596 4794 projected.go:194] Error preparing data for projected volume kube-api-access-ql7mv for pod openstack/keystone-d7d5-account-create-update-sxmjz: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 10:09:47 crc kubenswrapper[4794]: E0310 10:09:47.685655 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c3485a5-8f4a-4634-9a11-ed7d3081277f-kube-api-access-ql7mv podName:0c3485a5-8f4a-4634-9a11-ed7d3081277f nodeName:}" failed. No retries permitted until 2026-03-10 10:09:48.185632108 +0000 UTC m=+1536.941802916 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ql7mv" (UniqueName: "kubernetes.io/projected/0c3485a5-8f4a-4634-9a11-ed7d3081277f-kube-api-access-ql7mv") pod "keystone-d7d5-account-create-update-sxmjz" (UID: "0c3485a5-8f4a-4634-9a11-ed7d3081277f") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.685775 4794 generic.go:334] "Generic (PLEG): container finished" podID="39114d89-8cf8-4563-bc50-e96e2113349d" containerID="1860e4a742cfe98abb0f65295ec8d6e591d40ce5a255968cfb355b03216be258" exitCode=0 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.685856 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"39114d89-8cf8-4563-bc50-e96e2113349d","Type":"ContainerDied","Data":"1860e4a742cfe98abb0f65295ec8d6e591d40ce5a255968cfb355b03216be258"} Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.686225 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "efa96620-3d4b-4780-92a0-eeefbe9dcf9a" (UID: "efa96620-3d4b-4780-92a0-eeefbe9dcf9a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.708514 4794 generic.go:334] "Generic (PLEG): container finished" podID="98b35dea-060e-4b8d-9829-37357853a9c4" containerID="13f70234e665cee8f47182684e880f742b068d9ada3d1f0b83237e5efa99c1ee" exitCode=0 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.708584 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b64684465-k4k4j" event={"ID":"98b35dea-060e-4b8d-9829-37357853a9c4","Type":"ContainerDied","Data":"13f70234e665cee8f47182684e880f742b068d9ada3d1f0b83237e5efa99c1ee"} Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.719230 4794 generic.go:334] "Generic (PLEG): container finished" podID="6bd64ab2-dcd8-4404-973e-551182005da1" containerID="eeacb7a4237f606fda5183e09ef7cdbc5464bd0f9e121d184887a224b87c2de7" exitCode=0 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.719356 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6bd64ab2-dcd8-4404-973e-551182005da1","Type":"ContainerDied","Data":"eeacb7a4237f606fda5183e09ef7cdbc5464bd0f9e121d184887a224b87c2de7"} Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.723702 4794 generic.go:334] "Generic (PLEG): container finished" podID="53686d91-dc01-4a36-99c3-e6c84052e15e" containerID="d36356b5c770ecad29603a57d1346e81bb0210ad811dc767118c368012779874" exitCode=0 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.723755 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"53686d91-dc01-4a36-99c3-e6c84052e15e","Type":"ContainerDied","Data":"d36356b5c770ecad29603a57d1346e81bb0210ad811dc767118c368012779874"} Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.734715 4794 generic.go:334] "Generic (PLEG): container finished" podID="eb384e40-1917-4b9c-bcfa-440a3a10fd1d" containerID="d75c771de3d291bbbb95bf0f193cefd57708bcdc53c0f2f718b3d8e320f642c8" exitCode=0 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.734774 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"eb384e40-1917-4b9c-bcfa-440a3a10fd1d","Type":"ContainerDied","Data":"d75c771de3d291bbbb95bf0f193cefd57708bcdc53c0f2f718b3d8e320f642c8"} Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.740224 4794 generic.go:334] "Generic (PLEG): container finished" podID="cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" containerID="839d455cbd220b0b4cbb46ee40c9764f0de266ff250390018a7424fa8aa36507" exitCode=0 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.740278 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dd9f46c58-hkfks" event={"ID":"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f","Type":"ContainerDied","Data":"839d455cbd220b0b4cbb46ee40c9764f0de266ff250390018a7424fa8aa36507"} Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.743375 4794 generic.go:334] "Generic (PLEG): container finished" podID="efa96620-3d4b-4780-92a0-eeefbe9dcf9a" containerID="19687e523e2ddee57d1501fea9c4f3051cf91b15cc6073a1586ab9b0d2214569" exitCode=0 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.743426 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" event={"ID":"efa96620-3d4b-4780-92a0-eeefbe9dcf9a","Type":"ContainerDied","Data":"19687e523e2ddee57d1501fea9c4f3051cf91b15cc6073a1586ab9b0d2214569"} Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.743444 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" event={"ID":"efa96620-3d4b-4780-92a0-eeefbe9dcf9a","Type":"ContainerDied","Data":"59bb9b9b2cfda494c5673c15b608c1720d58e7201b8382d7449948d619553b39"} Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.743496 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-65fdc45d8b-2t64g" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.747931 4794 generic.go:334] "Generic (PLEG): container finished" podID="0a15b784-796e-4834-97e1-978b1f0d9690" containerID="1a9f488919341a6cbd7a3e003cf402c07ea892b34d5616b62a24305d40e123b4" exitCode=2 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.747981 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0a15b784-796e-4834-97e1-978b1f0d9690","Type":"ContainerDied","Data":"1a9f488919341a6cbd7a3e003cf402c07ea892b34d5616b62a24305d40e123b4"} Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.751002 4794 generic.go:334] "Generic (PLEG): container finished" podID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerID="efa120338ab691675da4aafa0ebfbff3b4647e0b19ef6e555a8542261261a114" exitCode=2 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.751042 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ad7a22a-3e88-4447-b675-0a8339bd5f55","Type":"ContainerDied","Data":"efa120338ab691675da4aafa0ebfbff3b4647e0b19ef6e555a8542261261a114"} Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.752617 4794 generic.go:334] "Generic (PLEG): container finished" podID="6b5facf1-8bc0-497d-925f-ee382862cf22" containerID="a0b3e91e1e10cd24f85913c9dfa160aee857a4f7ffe7e21e4b28aa608366d44a" exitCode=0 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.752664 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d7d5-account-create-update-sxmjz" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.752954 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6b5facf1-8bc0-497d-925f-ee382862cf22","Type":"ContainerDied","Data":"a0b3e91e1e10cd24f85913c9dfa160aee857a4f7ffe7e21e4b28aa608366d44a"} Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.777195 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-bd7575545-w8qjp" podUID="55771788-f3c0-4cde-af2f-ca527c2e2965" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.168:9696/\": dial tcp 10.217.0.168:9696: connect: connection refused" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.781553 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.781581 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55k6f\" (UniqueName: \"kubernetes.io/projected/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-kube-api-access-55k6f\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.781596 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.806531 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" containerName="galera" containerID="cri-o://4ee46b9ad300bd7296d35f0791b2f32932975b8845ba6b96c9a8eff329eb83f5" gracePeriod=30 Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.830311 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efa96620-3d4b-4780-92a0-eeefbe9dcf9a" (UID: "efa96620-3d4b-4780-92a0-eeefbe9dcf9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.860964 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-config-data" (OuterVolumeSpecName: "config-data") pod "efa96620-3d4b-4780-92a0-eeefbe9dcf9a" (UID: "efa96620-3d4b-4780-92a0-eeefbe9dcf9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.887059 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.887099 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa96620-3d4b-4780-92a0-eeefbe9dcf9a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.973611 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.974767 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.978622 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-b64684465-k4k4j" Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.994245 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7575fbf969-gq2mq"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.994323 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-7575fbf969-gq2mq"] Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.994551 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-config-data\") pod \"53686d91-dc01-4a36-99c3-e6c84052e15e\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.994602 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"53686d91-dc01-4a36-99c3-e6c84052e15e\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.994645 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bd64ab2-dcd8-4404-973e-551182005da1-config-data\") pod \"6bd64ab2-dcd8-4404-973e-551182005da1\" (UID: \"6bd64ab2-dcd8-4404-973e-551182005da1\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.994682 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd64ab2-dcd8-4404-973e-551182005da1-combined-ca-bundle\") pod \"6bd64ab2-dcd8-4404-973e-551182005da1\" (UID: \"6bd64ab2-dcd8-4404-973e-551182005da1\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.994748 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-scripts\") pod \"53686d91-dc01-4a36-99c3-e6c84052e15e\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.994779 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-public-tls-certs\") pod \"53686d91-dc01-4a36-99c3-e6c84052e15e\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.994813 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4d5h\" (UniqueName: \"kubernetes.io/projected/53686d91-dc01-4a36-99c3-e6c84052e15e-kube-api-access-m4d5h\") pod \"53686d91-dc01-4a36-99c3-e6c84052e15e\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.994890 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gcck\" (UniqueName: \"kubernetes.io/projected/6bd64ab2-dcd8-4404-973e-551182005da1-kube-api-access-2gcck\") pod \"6bd64ab2-dcd8-4404-973e-551182005da1\" (UID: 
\"6bd64ab2-dcd8-4404-973e-551182005da1\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.994931 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53686d91-dc01-4a36-99c3-e6c84052e15e-httpd-run\") pod \"53686d91-dc01-4a36-99c3-e6c84052e15e\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.994968 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53686d91-dc01-4a36-99c3-e6c84052e15e-logs\") pod \"53686d91-dc01-4a36-99c3-e6c84052e15e\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " Mar 10 10:09:47 crc kubenswrapper[4794]: I0310 10:09:47.995074 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-combined-ca-bundle\") pod \"53686d91-dc01-4a36-99c3-e6c84052e15e\" (UID: \"53686d91-dc01-4a36-99c3-e6c84052e15e\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.001075 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "53686d91-dc01-4a36-99c3-e6c84052e15e" (UID: "53686d91-dc01-4a36-99c3-e6c84052e15e"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.004604 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53686d91-dc01-4a36-99c3-e6c84052e15e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "53686d91-dc01-4a36-99c3-e6c84052e15e" (UID: "53686d91-dc01-4a36-99c3-e6c84052e15e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.004923 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53686d91-dc01-4a36-99c3-e6c84052e15e-logs" (OuterVolumeSpecName: "logs") pod "53686d91-dc01-4a36-99c3-e6c84052e15e" (UID: "53686d91-dc01-4a36-99c3-e6c84052e15e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.007820 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d7d5-account-create-update-sxmjz" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.009508 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd64ab2-dcd8-4404-973e-551182005da1-kube-api-access-2gcck" (OuterVolumeSpecName: "kube-api-access-2gcck") pod "6bd64ab2-dcd8-4404-973e-551182005da1" (UID: "6bd64ab2-dcd8-4404-973e-551182005da1"). InnerVolumeSpecName "kube-api-access-2gcck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.010962 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53686d91-dc01-4a36-99c3-e6c84052e15e-kube-api-access-m4d5h" (OuterVolumeSpecName: "kube-api-access-m4d5h") pod "53686d91-dc01-4a36-99c3-e6c84052e15e" (UID: "53686d91-dc01-4a36-99c3-e6c84052e15e"). InnerVolumeSpecName "kube-api-access-m4d5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.016113 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.027113 4794 scope.go:117] "RemoveContainer" containerID="9ee666a84a51e6fb61d9cd9eaf3a5940b75beb1c27f7f0412bcb937917781eb8" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.032457 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19885e30-4144-4598-be9f-99644e5d5d4a" path="/var/lib/kubelet/pods/19885e30-4144-4598-be9f-99644e5d5d4a/volumes" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.033109 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fcb4385-7603-4d75-8c41-23f457fcae25" path="/var/lib/kubelet/pods/1fcb4385-7603-4d75-8c41-23f457fcae25/volumes" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.033830 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a" path="/var/lib/kubelet/pods/2ceb2c7c-89c8-4fce-aa02-aac3840e3f5a/volumes" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.034951 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e14b1f3-e9a2-41ae-96ee-88dc84f69921" path="/var/lib/kubelet/pods/2e14b1f3-e9a2-41ae-96ee-88dc84f69921/volumes" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.035556 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34215f04-95c7-4644-8ecc-70147aa8c100" path="/var/lib/kubelet/pods/34215f04-95c7-4644-8ecc-70147aa8c100/volumes" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.036288 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74a5421b-362d-437b-98ce-c11e44a2e6f0" path="/var/lib/kubelet/pods/74a5421b-362d-437b-98ce-c11e44a2e6f0/volumes" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.037479 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f82e49b-0e4d-4cf5-8213-b30edcae94d4" path="/var/lib/kubelet/pods/7f82e49b-0e4d-4cf5-8213-b30edcae94d4/volumes" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.038834 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a812a094-c8c4-48cf-82c4-97fc75f2774f" path="/var/lib/kubelet/pods/a812a094-c8c4-48cf-82c4-97fc75f2774f/volumes" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.039625 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe" path="/var/lib/kubelet/pods/ab5bb4b1-f7a5-4bfb-a2a1-dfb63b32fcbe/volumes" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.040275 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ffbeb0-ab29-4b48-bbd2-65c776b08edd" path="/var/lib/kubelet/pods/c4ffbeb0-ab29-4b48-bbd2-65c776b08edd/volumes" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.040375 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-scripts" (OuterVolumeSpecName: "scripts") pod "53686d91-dc01-4a36-99c3-e6c84052e15e" (UID: "53686d91-dc01-4a36-99c3-e6c84052e15e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.068154 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e6c324-8bba-4585-9ffc-afad751594d7" path="/var/lib/kubelet/pods/d6e6c324-8bba-4585-9ffc-afad751594d7/volumes" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.070960 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53686d91-dc01-4a36-99c3-e6c84052e15e" (UID: "53686d91-dc01-4a36-99c3-e6c84052e15e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.071721 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bd64ab2-dcd8-4404-973e-551182005da1-config-data" (OuterVolumeSpecName: "config-data") pod "6bd64ab2-dcd8-4404-973e-551182005da1" (UID: "6bd64ab2-dcd8-4404-973e-551182005da1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.074243 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5dd9f46c58-hkfks" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.076886 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df649851-7d90-41e8-80e9-f7fd44d77af0" path="/var/lib/kubelet/pods/df649851-7d90-41e8-80e9-f7fd44d77af0/volumes" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.083790 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed705a10-5bb5-4170-8536-57c6be1cb816" path="/var/lib/kubelet/pods/ed705a10-5bb5-4170-8536-57c6be1cb816/volumes" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.089037 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f68cd69f-6fe2-4189-ad03-9593a4e94337" path="/var/lib/kubelet/pods/f68cd69f-6fe2-4189-ad03-9593a4e94337/volumes" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.093426 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-config-data" (OuterVolumeSpecName: "config-data") pod "53686d91-dc01-4a36-99c3-e6c84052e15e" (UID: "53686d91-dc01-4a36-99c3-e6c84052e15e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.112511 4794 scope.go:117] "RemoveContainer" containerID="9ee666a84a51e6fb61d9cd9eaf3a5940b75beb1c27f7f0412bcb937917781eb8" Mar 10 10:09:48 crc kubenswrapper[4794]: E0310 10:09:48.113640 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ee666a84a51e6fb61d9cd9eaf3a5940b75beb1c27f7f0412bcb937917781eb8\": container with ID starting with 9ee666a84a51e6fb61d9cd9eaf3a5940b75beb1c27f7f0412bcb937917781eb8 not found: ID does not exist" containerID="9ee666a84a51e6fb61d9cd9eaf3a5940b75beb1c27f7f0412bcb937917781eb8" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.113705 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ee666a84a51e6fb61d9cd9eaf3a5940b75beb1c27f7f0412bcb937917781eb8"} err="failed to get container status \"9ee666a84a51e6fb61d9cd9eaf3a5940b75beb1c27f7f0412bcb937917781eb8\": rpc error: code = NotFound desc = could not find container \"9ee666a84a51e6fb61d9cd9eaf3a5940b75beb1c27f7f0412bcb937917781eb8\": container with ID starting with 9ee666a84a51e6fb61d9cd9eaf3a5940b75beb1c27f7f0412bcb937917781eb8 not found: ID does not exist" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.113737 4794 scope.go:117] "RemoveContainer" containerID="a5ce358b520a173198341e8d239869c88b76f27ef305f00661c9c0926b396a4c" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.119452 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7gxf\" (UniqueName: \"kubernetes.io/projected/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-kube-api-access-n7gxf\") pod \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.119678 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-config-data\") pod \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.119934 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-public-tls-certs\") pod \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.119961 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-internal-tls-certs\") pod \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120116 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-logs\") pod \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120172 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-combined-ca-bundle\") pod \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\" (UID: 
\"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120194 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zkfk\" (UniqueName: \"kubernetes.io/projected/98b35dea-060e-4b8d-9829-37357853a9c4-kube-api-access-5zkfk\") pod \"98b35dea-060e-4b8d-9829-37357853a9c4\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120259 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-config-data\") pod \"98b35dea-060e-4b8d-9829-37357853a9c4\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120282 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-internal-tls-certs\") pod \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120407 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-combined-ca-bundle\") pod \"98b35dea-060e-4b8d-9829-37357853a9c4\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120474 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120491 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-config-data\") pod \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120541 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-httpd-run\") pod \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120572 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-scripts\") pod \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120695 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwst6\" (UniqueName: \"kubernetes.io/projected/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-kube-api-access-lwst6\") pod \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120728 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98b35dea-060e-4b8d-9829-37357853a9c4-logs\") pod \"98b35dea-060e-4b8d-9829-37357853a9c4\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " Mar 10 10:09:48 crc 
kubenswrapper[4794]: I0310 10:09:48.120779 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-config-data-custom\") pod \"98b35dea-060e-4b8d-9829-37357853a9c4\" (UID: \"98b35dea-060e-4b8d-9829-37357853a9c4\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120805 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-config-data-custom\") pod \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120845 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-logs\") pod \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\" (UID: \"eb384e40-1917-4b9c-bcfa-440a3a10fd1d\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.120863 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-combined-ca-bundle\") pod \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\" (UID: \"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.123168 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gcck\" (UniqueName: \"kubernetes.io/projected/6bd64ab2-dcd8-4404-973e-551182005da1-kube-api-access-2gcck\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.124173 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eb384e40-1917-4b9c-bcfa-440a3a10fd1d" (UID: "eb384e40-1917-4b9c-bcfa-440a3a10fd1d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.125367 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-logs" (OuterVolumeSpecName: "logs") pod "cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" (UID: "cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.130143 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98b35dea-060e-4b8d-9829-37357853a9c4-logs" (OuterVolumeSpecName: "logs") pod "98b35dea-060e-4b8d-9829-37357853a9c4" (UID: "98b35dea-060e-4b8d-9829-37357853a9c4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.134074 4794 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53686d91-dc01-4a36-99c3-e6c84052e15e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.134094 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53686d91-dc01-4a36-99c3-e6c84052e15e-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.134105 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.134115 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.134136 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.134146 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bd64ab2-dcd8-4404-973e-551182005da1-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.134155 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.134164 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4d5h\" (UniqueName: \"kubernetes.io/projected/53686d91-dc01-4a36-99c3-e6c84052e15e-kube-api-access-m4d5h\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.135629 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "53686d91-dc01-4a36-99c3-e6c84052e15e" (UID: "53686d91-dc01-4a36-99c3-e6c84052e15e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.146868 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-logs" (OuterVolumeSpecName: "logs") pod "eb384e40-1917-4b9c-bcfa-440a3a10fd1d" (UID: "eb384e40-1917-4b9c-bcfa-440a3a10fd1d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.150981 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-65fdc45d8b-2t64g"] Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.152065 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-65fdc45d8b-2t64g"] Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.159574 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" (UID: "cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.174603 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-kube-api-access-n7gxf" (OuterVolumeSpecName: "kube-api-access-n7gxf") pod "cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" (UID: "cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f"). InnerVolumeSpecName "kube-api-access-n7gxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.174706 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-kube-api-access-lwst6" (OuterVolumeSpecName: "kube-api-access-lwst6") pod "eb384e40-1917-4b9c-bcfa-440a3a10fd1d" (UID: "eb384e40-1917-4b9c-bcfa-440a3a10fd1d"). InnerVolumeSpecName "kube-api-access-lwst6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.174764 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "eb384e40-1917-4b9c-bcfa-440a3a10fd1d" (UID: "eb384e40-1917-4b9c-bcfa-440a3a10fd1d"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.174904 4794 scope.go:117] "RemoveContainer" containerID="f4dda11441d70fc4769e94a81ee22fc71ec078c6070b2dd0749628f78f387314" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.176023 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-scripts" (OuterVolumeSpecName: "scripts") pod "eb384e40-1917-4b9c-bcfa-440a3a10fd1d" (UID: "eb384e40-1917-4b9c-bcfa-440a3a10fd1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.218465 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "98b35dea-060e-4b8d-9829-37357853a9c4" (UID: "98b35dea-060e-4b8d-9829-37357853a9c4"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.222712 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bd64ab2-dcd8-4404-973e-551182005da1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bd64ab2-dcd8-4404-973e-551182005da1" (UID: "6bd64ab2-dcd8-4404-973e-551182005da1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.232096 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98b35dea-060e-4b8d-9829-37357853a9c4-kube-api-access-5zkfk" (OuterVolumeSpecName: "kube-api-access-5zkfk") pod "98b35dea-060e-4b8d-9829-37357853a9c4" (UID: "98b35dea-060e-4b8d-9829-37357853a9c4"). InnerVolumeSpecName "kube-api-access-5zkfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.235558 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c3485a5-8f4a-4634-9a11-ed7d3081277f-operator-scripts\") pod \"keystone-d7d5-account-create-update-sxmjz\" (UID: \"0c3485a5-8f4a-4634-9a11-ed7d3081277f\") " pod="openstack/keystone-d7d5-account-create-update-sxmjz" Mar 10 10:09:48 crc kubenswrapper[4794]: E0310 10:09:48.235664 4794 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 10:09:48 crc kubenswrapper[4794]: E0310 10:09:48.235742 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-operator-scripts podName:4a681af5-3cbf-4d83-a8cd-42a552cdc06d nodeName:}" failed. No retries permitted until 2026-03-10 10:09:49.235723411 +0000 UTC m=+1537.991894229 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-operator-scripts") pod "root-account-create-update-h8ctb" (UID: "4a681af5-3cbf-4d83-a8cd-42a552cdc06d") : configmap "openstack-scripts" not found Mar 10 10:09:48 crc kubenswrapper[4794]: E0310 10:09:48.235745 4794 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.235805 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql7mv\" (UniqueName: \"kubernetes.io/projected/0c3485a5-8f4a-4634-9a11-ed7d3081277f-kube-api-access-ql7mv\") pod \"keystone-d7d5-account-create-update-sxmjz\" (UID: \"0c3485a5-8f4a-4634-9a11-ed7d3081277f\") " pod="openstack/keystone-d7d5-account-create-update-sxmjz" Mar 10 10:09:48 crc kubenswrapper[4794]: E0310 10:09:48.235828 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0c3485a5-8f4a-4634-9a11-ed7d3081277f-operator-scripts podName:0c3485a5-8f4a-4634-9a11-ed7d3081277f nodeName:}" failed. No retries permitted until 2026-03-10 10:09:49.235804053 +0000 UTC m=+1537.991974951 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0c3485a5-8f4a-4634-9a11-ed7d3081277f-operator-scripts") pod "keystone-d7d5-account-create-update-sxmjz" (UID: "0c3485a5-8f4a-4634-9a11-ed7d3081277f") : configmap "openstack-scripts" not found Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.235866 4794 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.235881 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.235894 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwst6\" (UniqueName: \"kubernetes.io/projected/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-kube-api-access-lwst6\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.235907 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98b35dea-060e-4b8d-9829-37357853a9c4-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.235918 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.235929 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.235941 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.235952 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd64ab2-dcd8-4404-973e-551182005da1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.235963 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7gxf\" (UniqueName: \"kubernetes.io/projected/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-kube-api-access-n7gxf\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.235975 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53686d91-dc01-4a36-99c3-e6c84052e15e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.235984 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-logs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.235994 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zkfk\" (UniqueName: \"kubernetes.io/projected/98b35dea-060e-4b8d-9829-37357853a9c4-kube-api-access-5zkfk\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.236020 4794 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 10 10:09:48 crc kubenswrapper[4794]: E0310 10:09:48.240357 4794 projected.go:194] Error preparing data for projected volume kube-api-access-ql7mv for pod openstack/keystone-d7d5-account-create-update-sxmjz: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 10:09:48 crc kubenswrapper[4794]: E0310 10:09:48.240424 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c3485a5-8f4a-4634-9a11-ed7d3081277f-kube-api-access-ql7mv podName:0c3485a5-8f4a-4634-9a11-ed7d3081277f nodeName:}" failed. No retries permitted until 2026-03-10 10:09:49.240405547 +0000 UTC m=+1537.996576365 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ql7mv" (UniqueName: "kubernetes.io/projected/0c3485a5-8f4a-4634-9a11-ed7d3081277f-kube-api-access-ql7mv") pod "keystone-d7d5-account-create-update-sxmjz" (UID: "0c3485a5-8f4a-4634-9a11-ed7d3081277f") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.241139 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb384e40-1917-4b9c-bcfa-440a3a10fd1d" (UID: "eb384e40-1917-4b9c-bcfa-440a3a10fd1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.242904 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98b35dea-060e-4b8d-9829-37357853a9c4" (UID: "98b35dea-060e-4b8d-9829-37357853a9c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.249517 4794 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.253063 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.267066 4794 scope.go:117] "RemoveContainer" containerID="19687e523e2ddee57d1501fea9c4f3051cf91b15cc6073a1586ab9b0d2214569" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.270636 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" (UID: "cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.282503 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-config-data" (OuterVolumeSpecName: "config-data") pod "98b35dea-060e-4b8d-9829-37357853a9c4" (UID: "98b35dea-060e-4b8d-9829-37357853a9c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.286907 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.289190 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-config-data" (OuterVolumeSpecName: "config-data") pod "eb384e40-1917-4b9c-bcfa-440a3a10fd1d" (UID: "eb384e40-1917-4b9c-bcfa-440a3a10fd1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.291602 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.308653 4794 scope.go:117] "RemoveContainer" containerID="c415d347d34421bcb42d60573a12da5f8047e3e7f977fb04d5829812ddc302a7" Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.337269 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39114d89-8cf8-4563-bc50-e96e2113349d-etc-machine-id\") pod \"39114d89-8cf8-4563-bc50-e96e2113349d\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.337361 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-kube-state-metrics-tls-certs\") pod \"0a15b784-796e-4834-97e1-978b1f0d9690\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.337417 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-public-tls-certs\") pod \"39114d89-8cf8-4563-bc50-e96e2113349d\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.337446 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-nova-metadata-tls-certs\") pod \"6b5facf1-8bc0-497d-925f-ee382862cf22\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.337490 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-combined-ca-bundle\") pod \"0a15b784-796e-4834-97e1-978b1f0d9690\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.337510 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-combined-ca-bundle\") pod \"6b5facf1-8bc0-497d-925f-ee382862cf22\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.337544 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc6qg\" (UniqueName: \"kubernetes.io/projected/6b5facf1-8bc0-497d-925f-ee382862cf22-kube-api-access-rc6qg\") pod \"6b5facf1-8bc0-497d-925f-ee382862cf22\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") " 
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.337600 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-config-data\") pod \"6b5facf1-8bc0-497d-925f-ee382862cf22\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.337658 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b5facf1-8bc0-497d-925f-ee382862cf22-logs\") pod \"6b5facf1-8bc0-497d-925f-ee382862cf22\" (UID: \"6b5facf1-8bc0-497d-925f-ee382862cf22\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.337685 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39114d89-8cf8-4563-bc50-e96e2113349d-logs\") pod \"39114d89-8cf8-4563-bc50-e96e2113349d\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.337716 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-combined-ca-bundle\") pod \"39114d89-8cf8-4563-bc50-e96e2113349d\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.337734 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-scripts\") pod \"39114d89-8cf8-4563-bc50-e96e2113349d\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.337761 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxh88\" (UniqueName: \"kubernetes.io/projected/39114d89-8cf8-4563-bc50-e96e2113349d-kube-api-access-jxh88\") pod \"39114d89-8cf8-4563-bc50-e96e2113349d\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.337792 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-internal-tls-certs\") pod \"39114d89-8cf8-4563-bc50-e96e2113349d\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.338002 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2tbq\" (UniqueName: \"kubernetes.io/projected/0a15b784-796e-4834-97e1-978b1f0d9690-kube-api-access-n2tbq\") pod \"0a15b784-796e-4834-97e1-978b1f0d9690\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.338038 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-config-data\") pod \"39114d89-8cf8-4563-bc50-e96e2113349d\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.338067 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-config-data-custom\") pod \"39114d89-8cf8-4563-bc50-e96e2113349d\" (UID: \"39114d89-8cf8-4563-bc50-e96e2113349d\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.338093 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-kube-state-metrics-tls-config\") pod \"0a15b784-796e-4834-97e1-978b1f0d9690\" (UID: \"0a15b784-796e-4834-97e1-978b1f0d9690\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.338549 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.338572 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.338583 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b35dea-060e-4b8d-9829-37357853a9c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.338595 4794 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.338607 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.338619 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.340582 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/39114d89-8cf8-4563-bc50-e96e2113349d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "39114d89-8cf8-4563-bc50-e96e2113349d" (UID: "39114d89-8cf8-4563-bc50-e96e2113349d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.343002 4794 scope.go:117] "RemoveContainer" containerID="19687e523e2ddee57d1501fea9c4f3051cf91b15cc6073a1586ab9b0d2214569"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.343888 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b5facf1-8bc0-497d-925f-ee382862cf22-logs" (OuterVolumeSpecName: "logs") pod "6b5facf1-8bc0-497d-925f-ee382862cf22" (UID: "6b5facf1-8bc0-497d-925f-ee382862cf22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.344399 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39114d89-8cf8-4563-bc50-e96e2113349d-logs" (OuterVolumeSpecName: "logs") pod "39114d89-8cf8-4563-bc50-e96e2113349d" (UID: "39114d89-8cf8-4563-bc50-e96e2113349d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: E0310 10:09:48.344488 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19687e523e2ddee57d1501fea9c4f3051cf91b15cc6073a1586ab9b0d2214569\": container with ID starting with 19687e523e2ddee57d1501fea9c4f3051cf91b15cc6073a1586ab9b0d2214569 not found: ID does not exist" containerID="19687e523e2ddee57d1501fea9c4f3051cf91b15cc6073a1586ab9b0d2214569"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.344518 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19687e523e2ddee57d1501fea9c4f3051cf91b15cc6073a1586ab9b0d2214569"} err="failed to get container status \"19687e523e2ddee57d1501fea9c4f3051cf91b15cc6073a1586ab9b0d2214569\": rpc error: code = NotFound desc = could not find container \"19687e523e2ddee57d1501fea9c4f3051cf91b15cc6073a1586ab9b0d2214569\": container with ID starting with 19687e523e2ddee57d1501fea9c4f3051cf91b15cc6073a1586ab9b0d2214569 not found: ID does not exist"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.344541 4794 scope.go:117] "RemoveContainer" containerID="c415d347d34421bcb42d60573a12da5f8047e3e7f977fb04d5829812ddc302a7"
Mar 10 10:09:48 crc kubenswrapper[4794]: E0310 10:09:48.345237 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c415d347d34421bcb42d60573a12da5f8047e3e7f977fb04d5829812ddc302a7\": container with ID starting with c415d347d34421bcb42d60573a12da5f8047e3e7f977fb04d5829812ddc302a7 not found: ID does not exist" containerID="c415d347d34421bcb42d60573a12da5f8047e3e7f977fb04d5829812ddc302a7"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.345273 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c415d347d34421bcb42d60573a12da5f8047e3e7f977fb04d5829812ddc302a7"} err="failed to get container status \"c415d347d34421bcb42d60573a12da5f8047e3e7f977fb04d5829812ddc302a7\": rpc error: code = NotFound desc = could not find container \"c415d347d34421bcb42d60573a12da5f8047e3e7f977fb04d5829812ddc302a7\": container with ID starting with c415d347d34421bcb42d60573a12da5f8047e3e7f977fb04d5829812ddc302a7 not found: ID does not exist"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.361753 4794 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.365952 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-c8964d89c-gfknm" podUID="bb0dcfe9-2446-4f91-bc24-3cabcab9e8af" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.206:5353: i/o timeout"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.378774 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-scripts" (OuterVolumeSpecName: "scripts") pod "39114d89-8cf8-4563-bc50-e96e2113349d" (UID: "39114d89-8cf8-4563-bc50-e96e2113349d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.378803 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a15b784-796e-4834-97e1-978b1f0d9690-kube-api-access-n2tbq" (OuterVolumeSpecName: "kube-api-access-n2tbq") pod "0a15b784-796e-4834-97e1-978b1f0d9690" (UID: "0a15b784-796e-4834-97e1-978b1f0d9690"). InnerVolumeSpecName "kube-api-access-n2tbq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.380004 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39114d89-8cf8-4563-bc50-e96e2113349d-kube-api-access-jxh88" (OuterVolumeSpecName: "kube-api-access-jxh88") pod "39114d89-8cf8-4563-bc50-e96e2113349d" (UID: "39114d89-8cf8-4563-bc50-e96e2113349d"). InnerVolumeSpecName "kube-api-access-jxh88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.380054 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b5facf1-8bc0-497d-925f-ee382862cf22-kube-api-access-rc6qg" (OuterVolumeSpecName: "kube-api-access-rc6qg") pod "6b5facf1-8bc0-497d-925f-ee382862cf22" (UID: "6b5facf1-8bc0-497d-925f-ee382862cf22"). InnerVolumeSpecName "kube-api-access-rc6qg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.385441 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39114d89-8cf8-4563-bc50-e96e2113349d" (UID: "39114d89-8cf8-4563-bc50-e96e2113349d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.385802 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "39114d89-8cf8-4563-bc50-e96e2113349d" (UID: "39114d89-8cf8-4563-bc50-e96e2113349d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.401397 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3a1da9f1-f33d-4327-b899-b5a38c6990d8/ovn-northd/0.log"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.401529 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.402035 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" (UID: "cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.407875 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "0a15b784-796e-4834-97e1-978b1f0d9690" (UID: "0a15b784-796e-4834-97e1-978b1f0d9690"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.409112 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68cfd4d846-drn7b"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.415559 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a15b784-796e-4834-97e1-978b1f0d9690" (UID: "0a15b784-796e-4834-97e1-978b1f0d9690"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.415560 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "39114d89-8cf8-4563-bc50-e96e2113349d" (UID: "39114d89-8cf8-4563-bc50-e96e2113349d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.419853 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b5facf1-8bc0-497d-925f-ee382862cf22" (UID: "6b5facf1-8bc0-497d-925f-ee382862cf22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.427173 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6b5facf1-8bc0-497d-925f-ee382862cf22" (UID: "6b5facf1-8bc0-497d-925f-ee382862cf22"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.429393 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eb384e40-1917-4b9c-bcfa-440a3a10fd1d" (UID: "eb384e40-1917-4b9c-bcfa-440a3a10fd1d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.435464 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" (UID: "cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.436301 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-config-data" (OuterVolumeSpecName: "config-data") pod "cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" (UID: "cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.439918 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.439950 4794 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.439966 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.439977 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb384e40-1917-4b9c-bcfa-440a3a10fd1d-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.439990 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39114d89-8cf8-4563-bc50-e96e2113349d-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.440002 4794 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.440050 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.440059 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.440069 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.440079 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc6qg\" (UniqueName: \"kubernetes.io/projected/6b5facf1-8bc0-497d-925f-ee382862cf22-kube-api-access-rc6qg\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.440088 4794 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.440097 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.440104 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b5facf1-8bc0-497d-925f-ee382862cf22-logs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.440114 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39114d89-8cf8-4563-bc50-e96e2113349d-logs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.440122 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.440131 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.440140 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxh88\" (UniqueName: \"kubernetes.io/projected/39114d89-8cf8-4563-bc50-e96e2113349d-kube-api-access-jxh88\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.440148 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.440156 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2tbq\" (UniqueName: \"kubernetes.io/projected/0a15b784-796e-4834-97e1-978b1f0d9690-kube-api-access-n2tbq\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.454446 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-config-data" (OuterVolumeSpecName: "config-data") pod "6b5facf1-8bc0-497d-925f-ee382862cf22" (UID: "6b5facf1-8bc0-497d-925f-ee382862cf22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.459342 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "0a15b784-796e-4834-97e1-978b1f0d9690" (UID: "0a15b784-796e-4834-97e1-978b1f0d9690"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.462685 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-config-data" (OuterVolumeSpecName: "config-data") pod "39114d89-8cf8-4563-bc50-e96e2113349d" (UID: "39114d89-8cf8-4563-bc50-e96e2113349d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.464476 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "39114d89-8cf8-4563-bc50-e96e2113349d" (UID: "39114d89-8cf8-4563-bc50-e96e2113349d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.540846 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bglt2\" (UniqueName: \"kubernetes.io/projected/ecae27ed-535f-47c8-93e4-07baac3bc64c-kube-api-access-bglt2\") pod \"ecae27ed-535f-47c8-93e4-07baac3bc64c\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.540926 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecae27ed-535f-47c8-93e4-07baac3bc64c-logs\") pod \"ecae27ed-535f-47c8-93e4-07baac3bc64c\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.540962 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-public-tls-certs\") pod \"ecae27ed-535f-47c8-93e4-07baac3bc64c\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.540994 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a1da9f1-f33d-4327-b899-b5a38c6990d8-scripts\") pod \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.541012 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-ovn-northd-tls-certs\") pod \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.541027 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-combined-ca-bundle\") pod \"ecae27ed-535f-47c8-93e4-07baac3bc64c\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.541053 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-metrics-certs-tls-certs\") pod \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.541069 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqg4h\" (UniqueName: \"kubernetes.io/projected/3a1da9f1-f33d-4327-b899-b5a38c6990d8-kube-api-access-hqg4h\") pod \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.541090 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-internal-tls-certs\") pod \"ecae27ed-535f-47c8-93e4-07baac3bc64c\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.541130 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-config-data\") pod \"ecae27ed-535f-47c8-93e4-07baac3bc64c\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.541191 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a1da9f1-f33d-4327-b899-b5a38c6990d8-ovn-rundir\") pod \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.541207 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-combined-ca-bundle\") pod \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.541233 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a1da9f1-f33d-4327-b899-b5a38c6990d8-config\") pod \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\" (UID: \"3a1da9f1-f33d-4327-b899-b5a38c6990d8\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.541260 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-scripts\") pod \"ecae27ed-535f-47c8-93e4-07baac3bc64c\" (UID: \"ecae27ed-535f-47c8-93e4-07baac3bc64c\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.541664 4794 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a15b784-796e-4834-97e1-978b1f0d9690-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.541678 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.541686 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5facf1-8bc0-497d-925f-ee382862cf22-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.541696 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39114d89-8cf8-4563-bc50-e96e2113349d-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.543546 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a1da9f1-f33d-4327-b899-b5a38c6990d8-scripts" (OuterVolumeSpecName: "scripts") pod "3a1da9f1-f33d-4327-b899-b5a38c6990d8" (UID: "3a1da9f1-f33d-4327-b899-b5a38c6990d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.543877 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecae27ed-535f-47c8-93e4-07baac3bc64c-logs" (OuterVolumeSpecName: "logs") pod "ecae27ed-535f-47c8-93e4-07baac3bc64c" (UID: "ecae27ed-535f-47c8-93e4-07baac3bc64c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.545077 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a1da9f1-f33d-4327-b899-b5a38c6990d8-config" (OuterVolumeSpecName: "config") pod "3a1da9f1-f33d-4327-b899-b5a38c6990d8" (UID: "3a1da9f1-f33d-4327-b899-b5a38c6990d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.545460 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a1da9f1-f33d-4327-b899-b5a38c6990d8-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "3a1da9f1-f33d-4327-b899-b5a38c6990d8" (UID: "3a1da9f1-f33d-4327-b899-b5a38c6990d8"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.545626 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecae27ed-535f-47c8-93e4-07baac3bc64c-kube-api-access-bglt2" (OuterVolumeSpecName: "kube-api-access-bglt2") pod "ecae27ed-535f-47c8-93e4-07baac3bc64c" (UID: "ecae27ed-535f-47c8-93e4-07baac3bc64c"). InnerVolumeSpecName "kube-api-access-bglt2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.546732 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a1da9f1-f33d-4327-b899-b5a38c6990d8-kube-api-access-hqg4h" (OuterVolumeSpecName: "kube-api-access-hqg4h") pod "3a1da9f1-f33d-4327-b899-b5a38c6990d8" (UID: "3a1da9f1-f33d-4327-b899-b5a38c6990d8"). InnerVolumeSpecName "kube-api-access-hqg4h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.547441 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-scripts" (OuterVolumeSpecName: "scripts") pod "ecae27ed-535f-47c8-93e4-07baac3bc64c" (UID: "ecae27ed-535f-47c8-93e4-07baac3bc64c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.591627 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a1da9f1-f33d-4327-b899-b5a38c6990d8" (UID: "3a1da9f1-f33d-4327-b899-b5a38c6990d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.631930 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecae27ed-535f-47c8-93e4-07baac3bc64c" (UID: "ecae27ed-535f-47c8-93e4-07baac3bc64c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.644630 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bglt2\" (UniqueName: \"kubernetes.io/projected/ecae27ed-535f-47c8-93e4-07baac3bc64c-kube-api-access-bglt2\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.644660 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecae27ed-535f-47c8-93e4-07baac3bc64c-logs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.644669 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a1da9f1-f33d-4327-b899-b5a38c6990d8-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.644678 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.644687 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqg4h\" (UniqueName: \"kubernetes.io/projected/3a1da9f1-f33d-4327-b899-b5a38c6990d8-kube-api-access-hqg4h\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.644760 4794 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a1da9f1-f33d-4327-b899-b5a38c6990d8-ovn-rundir\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.644790 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.644798 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a1da9f1-f33d-4327-b899-b5a38c6990d8-config\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.644806 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.673992 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-config-data" (OuterVolumeSpecName: "config-data") pod "ecae27ed-535f-47c8-93e4-07baac3bc64c" (UID: "ecae27ed-535f-47c8-93e4-07baac3bc64c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.693444 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "3a1da9f1-f33d-4327-b899-b5a38c6990d8" (UID: "3a1da9f1-f33d-4327-b899-b5a38c6990d8"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.695667 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "3a1da9f1-f33d-4327-b899-b5a38c6990d8" (UID: "3a1da9f1-f33d-4327-b899-b5a38c6990d8"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.696495 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ecae27ed-535f-47c8-93e4-07baac3bc64c" (UID: "ecae27ed-535f-47c8-93e4-07baac3bc64c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.702789 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="598e06ed-3156-4e09-976e-4dda0e35afc2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.711258 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ecae27ed-535f-47c8-93e4-07baac3bc64c" (UID: "ecae27ed-535f-47c8-93e4-07baac3bc64c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.745877 4794 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.745910 4794 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1da9f1-f33d-4327-b899-b5a38c6990d8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.745922 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.745932 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.745941 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecae27ed-535f-47c8-93e4-07baac3bc64c-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.766487 4794 generic.go:334] "Generic (PLEG): container finished" podID="ecae27ed-535f-47c8-93e4-07baac3bc64c" containerID="5b36ab842a7613453f011b16a0ad771867c79a6397fa893991fe94b1caac6d43" exitCode=0
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.766558 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68cfd4d846-drn7b" event={"ID":"ecae27ed-535f-47c8-93e4-07baac3bc64c","Type":"ContainerDied","Data":"5b36ab842a7613453f011b16a0ad771867c79a6397fa893991fe94b1caac6d43"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.766566 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68cfd4d846-drn7b"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.766641 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68cfd4d846-drn7b" event={"ID":"ecae27ed-535f-47c8-93e4-07baac3bc64c","Type":"ContainerDied","Data":"bce16e9cc32c94b4d11d6002586ef136ea6b8659d3c91269d03e6df727d5a323"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.766663 4794 scope.go:117] "RemoveContainer" containerID="5b36ab842a7613453f011b16a0ad771867c79a6397fa893991fe94b1caac6d43"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.770837 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0a15b784-796e-4834-97e1-978b1f0d9690","Type":"ContainerDied","Data":"cfaa5e833f20da11cb041d945a3128f76cb8fe90d425062179af059b0fe4f9bc"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.770905 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.802646 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"39114d89-8cf8-4563-bc50-e96e2113349d","Type":"ContainerDied","Data":"05309f8870b7523828ec7fdf40578ab276ca5685ead9a1535e9e59e74c80aa2e"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.802675 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.816759 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.817181 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb384e40-1917-4b9c-bcfa-440a3a10fd1d","Type":"ContainerDied","Data":"aae6a2afa3749d02039ce43aa61c5aacff2989b05d29eb1a47e756aa2162d339"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.821658 4794 generic.go:334] "Generic (PLEG): container finished" podID="71ee8a8d-89a0-495f-925b-071e52449063" containerID="2cb4ade39b3ccc7065ae84be53a791f98290d564a27b4334ca648c8b39c8ca95" exitCode=0
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.821704 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"71ee8a8d-89a0-495f-925b-071e52449063","Type":"ContainerDied","Data":"2cb4ade39b3ccc7065ae84be53a791f98290d564a27b4334ca648c8b39c8ca95"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.824646 4794 generic.go:334] "Generic (PLEG): container finished" podID="2446b2bc-c3c8-465d-a808-981664228cba" containerID="826580e64ddfef3da4fda6ec39829dbc5dafbc69c66385fc8ca2a2bcd5ca60d8" exitCode=0
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.824684 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2446b2bc-c3c8-465d-a808-981664228cba","Type":"ContainerDied","Data":"826580e64ddfef3da4fda6ec39829dbc5dafbc69c66385fc8ca2a2bcd5ca60d8"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.824705 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2446b2bc-c3c8-465d-a808-981664228cba","Type":"ContainerDied","Data":"2fa7b2465b193518461982c6dd45b272c119057d175f0a6e5ef8f619368f117f"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.824715 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fa7b2465b193518461982c6dd45b272c119057d175f0a6e5ef8f619368f117f"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.829474 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6bd64ab2-dcd8-4404-973e-551182005da1","Type":"ContainerDied","Data":"b17db50fcecf9f756739db2c43fd7495686b6dc284d13e468b782301f2adb2d4"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.829594 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.847773 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b64684465-k4k4j" event={"ID":"98b35dea-060e-4b8d-9829-37357853a9c4","Type":"ContainerDied","Data":"5466df36889b2f19d045b78a869f2f6c21971a503e33a7043fd1a0f26a98cfa3"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.847868 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-b64684465-k4k4j"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.853823 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"53686d91-dc01-4a36-99c3-e6c84052e15e","Type":"ContainerDied","Data":"a93a1cdce0c6102672e520db125516311ff3a3151f376512b31940fae9eb6766"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.853915 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.865844 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.866414 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6b5facf1-8bc0-497d-925f-ee382862cf22","Type":"ContainerDied","Data":"14dbdb7d7cc1e4aa9caa7e5b71ac3344e5c3eb3a18b977211732c84b49d8764c"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.866478 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.874475 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dd9f46c58-hkfks" event={"ID":"cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f","Type":"ContainerDied","Data":"00270a947c119d653e8057284300c51c48f65a49f3fce81fd665ac1ea41f87dd"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.874645 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5dd9f46c58-hkfks"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.876618 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3a1da9f1-f33d-4327-b899-b5a38c6990d8/ovn-northd/0.log"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.876663 4794 generic.go:334] "Generic (PLEG): container finished" podID="3a1da9f1-f33d-4327-b899-b5a38c6990d8" containerID="a27a1a3aea44b5edb2651ff49b0fcc031f55a8ebaeb3807a56fb618e61e23ded" exitCode=139
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.876711 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3a1da9f1-f33d-4327-b899-b5a38c6990d8","Type":"ContainerDied","Data":"a27a1a3aea44b5edb2651ff49b0fcc031f55a8ebaeb3807a56fb618e61e23ded"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.876742 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3a1da9f1-f33d-4327-b899-b5a38c6990d8","Type":"ContainerDied","Data":"2855314e11c632caa2a2618519660341ead19e4cd33121471f0e27f25c3b5915"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.876715 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.882441 4794 generic.go:334] "Generic (PLEG): container finished" podID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerID="eea46c0d3766a780d0c4aadf572fb639917b34eb0330835ed5b59a3cdae42cd7" exitCode=0
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.882464 4794 generic.go:334] "Generic (PLEG): container finished" podID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerID="e87552170b0f59d6ba42cedca611dc39fa99be4bf11ae3c34213d11839f96d1a" exitCode=0
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.887544 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ad7a22a-3e88-4447-b675-0a8339bd5f55","Type":"ContainerDied","Data":"eea46c0d3766a780d0c4aadf572fb639917b34eb0330835ed5b59a3cdae42cd7"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.887589 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ad7a22a-3e88-4447-b675-0a8339bd5f55","Type":"ContainerDied","Data":"e87552170b0f59d6ba42cedca611dc39fa99be4bf11ae3c34213d11839f96d1a"}
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.887705 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d7d5-account-create-update-sxmjz"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.888274 4794 scope.go:117] "RemoveContainer" containerID="07b975998a50058fa523429e713118ddb3da394ec6e69a92bc5bda9b7839104f"
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.957023 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2446b2bc-c3c8-465d-a808-981664228cba-logs\") pod \"2446b2bc-c3c8-465d-a808-981664228cba\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.957297 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-combined-ca-bundle\") pod \"2446b2bc-c3c8-465d-a808-981664228cba\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.957353 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql47l\" (UniqueName: \"kubernetes.io/projected/2446b2bc-c3c8-465d-a808-981664228cba-kube-api-access-ql47l\") pod \"2446b2bc-c3c8-465d-a808-981664228cba\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.957388 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-internal-tls-certs\") pod \"2446b2bc-c3c8-465d-a808-981664228cba\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.957427 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-public-tls-certs\") pod \"2446b2bc-c3c8-465d-a808-981664228cba\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.957464 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-config-data\") pod \"2446b2bc-c3c8-465d-a808-981664228cba\" (UID: \"2446b2bc-c3c8-465d-a808-981664228cba\") "
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.958660 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2446b2bc-c3c8-465d-a808-981664228cba-logs" (OuterVolumeSpecName: "logs") pod "2446b2bc-c3c8-465d-a808-981664228cba" (UID: "2446b2bc-c3c8-465d-a808-981664228cba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.974668 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2446b2bc-c3c8-465d-a808-981664228cba-kube-api-access-ql47l" (OuterVolumeSpecName: "kube-api-access-ql47l") pod "2446b2bc-c3c8-465d-a808-981664228cba" (UID: "2446b2bc-c3c8-465d-a808-981664228cba"). InnerVolumeSpecName "kube-api-access-ql47l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.997540 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-config-data" (OuterVolumeSpecName: "config-data") pod "2446b2bc-c3c8-465d-a808-981664228cba" (UID: "2446b2bc-c3c8-465d-a808-981664228cba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:48 crc kubenswrapper[4794]: I0310 10:09:48.997844 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused"
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.013951 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2446b2bc-c3c8-465d-a808-981664228cba" (UID: "2446b2bc-c3c8-465d-a808-981664228cba"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.017200 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2446b2bc-c3c8-465d-a808-981664228cba" (UID: "2446b2bc-c3c8-465d-a808-981664228cba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.022817 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2446b2bc-c3c8-465d-a808-981664228cba" (UID: "2446b2bc-c3c8-465d-a808-981664228cba"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.059919 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2446b2bc-c3c8-465d-a808-981664228cba-logs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.059949 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.059966 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql47l\" (UniqueName: \"kubernetes.io/projected/2446b2bc-c3c8-465d-a808-981664228cba-kube-api-access-ql47l\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.059978 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.059989 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.060003 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2446b2bc-c3c8-465d-a808-981664228cba-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.138691 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.140755 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.145901 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.147553 4794 scope.go:117] "RemoveContainer" containerID="5b36ab842a7613453f011b16a0ad771867c79a6397fa893991fe94b1caac6d43"
Mar 10 10:09:49 crc kubenswrapper[4794]: E0310 10:09:49.159547 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b36ab842a7613453f011b16a0ad771867c79a6397fa893991fe94b1caac6d43\": container with ID starting with 5b36ab842a7613453f011b16a0ad771867c79a6397fa893991fe94b1caac6d43 not found: ID does not exist" containerID="5b36ab842a7613453f011b16a0ad771867c79a6397fa893991fe94b1caac6d43"
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.159596 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b36ab842a7613453f011b16a0ad771867c79a6397fa893991fe94b1caac6d43"} err="failed to get container status \"5b36ab842a7613453f011b16a0ad771867c79a6397fa893991fe94b1caac6d43\": rpc error: code = NotFound desc = could not find container \"5b36ab842a7613453f011b16a0ad771867c79a6397fa893991fe94b1caac6d43\": container with ID starting with 5b36ab842a7613453f011b16a0ad771867c79a6397fa893991fe94b1caac6d43 not found: ID does not exist"
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.159628 4794 scope.go:117] "RemoveContainer" containerID="07b975998a50058fa523429e713118ddb3da394ec6e69a92bc5bda9b7839104f"
Mar 10 10:09:49 crc kubenswrapper[4794]: E0310 10:09:49.161588 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07b975998a50058fa523429e713118ddb3da394ec6e69a92bc5bda9b7839104f\": container with ID starting with 07b975998a50058fa523429e713118ddb3da394ec6e69a92bc5bda9b7839104f not found: ID does not exist" containerID="07b975998a50058fa523429e713118ddb3da394ec6e69a92bc5bda9b7839104f"
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.161636 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b975998a50058fa523429e713118ddb3da394ec6e69a92bc5bda9b7839104f"} err="failed to get container status \"07b975998a50058fa523429e713118ddb3da394ec6e69a92bc5bda9b7839104f\": rpc error: code = NotFound desc = could not find container \"07b975998a50058fa523429e713118ddb3da394ec6e69a92bc5bda9b7839104f\": container with ID starting with 07b975998a50058fa523429e713118ddb3da394ec6e69a92bc5bda9b7839104f not found: ID does not exist"
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.161659 4794 scope.go:117] "RemoveContainer" containerID="1a9f488919341a6cbd7a3e003cf402c07ea892b34d5616b62a24305d40e123b4"
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.161769 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-b64684465-k4k4j"]
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.174478 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-b64684465-k4k4j"]
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.185890 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.193650 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.198824 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.206567 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.224842 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d7d5-account-create-update-sxmjz"]
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.227075 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d7d5-account-create-update-sxmjz"]
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.229179 4794 scope.go:117] "RemoveContainer" containerID="1860e4a742cfe98abb0f65295ec8d6e591d40ce5a255968cfb355b03216be258"
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.232903 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.240238 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.258638 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-68cfd4d846-drn7b"]
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.270683 4794 scope.go:117] "RemoveContainer" containerID="841f2c7007209f71d0fcab9d21091a5638e79f7695a27cbb0864a109529f7bb5"
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.280677 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/71ee8a8d-89a0-495f-925b-071e52449063-memcached-tls-certs\") pod \"71ee8a8d-89a0-495f-925b-071e52449063\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") "
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.283065 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71ee8a8d-89a0-495f-925b-071e52449063-config-data\") pod \"71ee8a8d-89a0-495f-925b-071e52449063\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") "
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.283309 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ee8a8d-89a0-495f-925b-071e52449063-combined-ca-bundle\") pod \"71ee8a8d-89a0-495f-925b-071e52449063\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") "
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.288053 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71ee8a8d-89a0-495f-925b-071e52449063-kolla-config\") pod \"71ee8a8d-89a0-495f-925b-071e52449063\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") "
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.288296 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5grc\" (UniqueName: \"kubernetes.io/projected/71ee8a8d-89a0-495f-925b-071e52449063-kube-api-access-p5grc\") pod \"71ee8a8d-89a0-495f-925b-071e52449063\" (UID: \"71ee8a8d-89a0-495f-925b-071e52449063\") "
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.288940 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71ee8a8d-89a0-495f-925b-071e52449063-config-data" (OuterVolumeSpecName: "config-data") pod "71ee8a8d-89a0-495f-925b-071e52449063" (UID: "71ee8a8d-89a0-495f-925b-071e52449063"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.289726 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c3485a5-8f4a-4634-9a11-ed7d3081277f-operator-scripts\") pod \"keystone-d7d5-account-create-update-sxmjz\" (UID: \"0c3485a5-8f4a-4634-9a11-ed7d3081277f\") " pod="openstack/keystone-d7d5-account-create-update-sxmjz"
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.290079 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql7mv\" (UniqueName: \"kubernetes.io/projected/0c3485a5-8f4a-4634-9a11-ed7d3081277f-kube-api-access-ql7mv\") pod \"keystone-d7d5-account-create-update-sxmjz\" (UID: \"0c3485a5-8f4a-4634-9a11-ed7d3081277f\") " pod="openstack/keystone-d7d5-account-create-update-sxmjz"
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.290252 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71ee8a8d-89a0-495f-925b-071e52449063-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.290771 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-68cfd4d846-drn7b"]
Mar 10 10:09:49 crc kubenswrapper[4794]: E0310 10:09:49.290983 4794 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Mar 10 10:09:49 crc kubenswrapper[4794]: E0310 10:09:49.291068 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-operator-scripts podName:4a681af5-3cbf-4d83-a8cd-42a552cdc06d nodeName:}" failed. No retries permitted until 2026-03-10 10:09:51.29104205 +0000 UTC m=+1540.047212868 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-operator-scripts") pod "root-account-create-update-h8ctb" (UID: "4a681af5-3cbf-4d83-a8cd-42a552cdc06d") : configmap "openstack-scripts" not found
Mar 10 10:09:49 crc kubenswrapper[4794]: E0310 10:09:49.292357 4794 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Mar 10 10:09:49 crc kubenswrapper[4794]: E0310 10:09:49.292423 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0c3485a5-8f4a-4634-9a11-ed7d3081277f-operator-scripts podName:0c3485a5-8f4a-4634-9a11-ed7d3081277f nodeName:}" failed. No retries permitted until 2026-03-10 10:09:51.292398272 +0000 UTC m=+1540.048569090 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0c3485a5-8f4a-4634-9a11-ed7d3081277f-operator-scripts") pod "keystone-d7d5-account-create-update-sxmjz" (UID: "0c3485a5-8f4a-4634-9a11-ed7d3081277f") : configmap "openstack-scripts" not found
Mar 10 10:09:49 crc kubenswrapper[4794]: E0310 10:09:49.292504 4794 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Mar 10 10:09:49 crc kubenswrapper[4794]: E0310 10:09:49.292546 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data podName:a45381ea-b5d8-49aa-b4b8-ab372b39b0d3 nodeName:}" failed. No retries permitted until 2026-03-10 10:09:57.292533886 +0000 UTC m=+1546.048704704 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data") pod "rabbitmq-server-0" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3") : configmap "rabbitmq-config-data" not found
Mar 10 10:09:49 crc kubenswrapper[4794]: E0310 10:09:49.295692 4794 projected.go:194] Error preparing data for projected volume kube-api-access-ql7mv for pod openstack/keystone-d7d5-account-create-update-sxmjz: failed to fetch token: pod "keystone-d7d5-account-create-update-sxmjz" not found
Mar 10 10:09:49 crc kubenswrapper[4794]: E0310 10:09:49.295811 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c3485a5-8f4a-4634-9a11-ed7d3081277f-kube-api-access-ql7mv podName:0c3485a5-8f4a-4634-9a11-ed7d3081277f nodeName:}" failed. No retries permitted until 2026-03-10 10:09:51.295783638 +0000 UTC m=+1540.051954496 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ql7mv" (UniqueName: "kubernetes.io/projected/0c3485a5-8f4a-4634-9a11-ed7d3081277f-kube-api-access-ql7mv") pod "keystone-d7d5-account-create-update-sxmjz" (UID: "0c3485a5-8f4a-4634-9a11-ed7d3081277f") : failed to fetch token: pod "keystone-d7d5-account-create-update-sxmjz" not found
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.295831 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71ee8a8d-89a0-495f-925b-071e52449063-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "71ee8a8d-89a0-495f-925b-071e52449063" (UID: "71ee8a8d-89a0-495f-925b-071e52449063"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.297974 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ee8a8d-89a0-495f-925b-071e52449063-kube-api-access-p5grc" (OuterVolumeSpecName: "kube-api-access-p5grc") pod "71ee8a8d-89a0-495f-925b-071e52449063" (UID: "71ee8a8d-89a0-495f-925b-071e52449063"). InnerVolumeSpecName "kube-api-access-p5grc".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.305295 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.318170 4794 scope.go:117] "RemoveContainer" containerID="d75c771de3d291bbbb95bf0f193cefd57708bcdc53c0f2f718b3d8e320f642c8" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.319023 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.334448 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71ee8a8d-89a0-495f-925b-071e52449063-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71ee8a8d-89a0-495f-925b-071e52449063" (UID: "71ee8a8d-89a0-495f-925b-071e52449063"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.338864 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-h8ctb" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.339588 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.348832 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.351748 4794 scope.go:117] "RemoveContainer" containerID="ef5f845a9297d9baef67ae43283960d69bd559077bfce60e39f562cbd5f935df" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.353630 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71ee8a8d-89a0-495f-925b-071e52449063-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "71ee8a8d-89a0-495f-925b-071e52449063" (UID: "71ee8a8d-89a0-495f-925b-071e52449063"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.363860 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5dd9f46c58-hkfks"] Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.373648 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5dd9f46c58-hkfks"] Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.377708 4794 scope.go:117] "RemoveContainer" containerID="eeacb7a4237f606fda5183e09ef7cdbc5464bd0f9e121d184887a224b87c2de7" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.385853 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.389522 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.392248 4794 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/71ee8a8d-89a0-495f-925b-071e52449063-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.392284 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql7mv\" (UniqueName: \"kubernetes.io/projected/0c3485a5-8f4a-4634-9a11-ed7d3081277f-kube-api-access-ql7mv\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.392298 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ee8a8d-89a0-495f-925b-071e52449063-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.392310 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c3485a5-8f4a-4634-9a11-ed7d3081277f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.392322 4794 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71ee8a8d-89a0-495f-925b-071e52449063-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.392376 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5grc\" (UniqueName: \"kubernetes.io/projected/71ee8a8d-89a0-495f-925b-071e52449063-kube-api-access-p5grc\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.404977 4794 scope.go:117] "RemoveContainer" containerID="13f70234e665cee8f47182684e880f742b068d9ada3d1f0b83237e5efa99c1ee" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.434410 4794 scope.go:117] "RemoveContainer" containerID="f89f95fa1764bb4ed8927a1c6b5d7ec0737f1ea37babc6ae8e3dc09577573205" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.493899 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw958\" (UniqueName: \"kubernetes.io/projected/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-kube-api-access-kw958\") pod \"4a681af5-3cbf-4d83-a8cd-42a552cdc06d\" (UID: \"4a681af5-3cbf-4d83-a8cd-42a552cdc06d\") " Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.494096 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-operator-scripts\") pod \"4a681af5-3cbf-4d83-a8cd-42a552cdc06d\" (UID: 
\"4a681af5-3cbf-4d83-a8cd-42a552cdc06d\") " Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.494825 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a681af5-3cbf-4d83-a8cd-42a552cdc06d" (UID: "4a681af5-3cbf-4d83-a8cd-42a552cdc06d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.501514 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-kube-api-access-kw958" (OuterVolumeSpecName: "kube-api-access-kw958") pod "4a681af5-3cbf-4d83-a8cd-42a552cdc06d" (UID: "4a681af5-3cbf-4d83-a8cd-42a552cdc06d"). InnerVolumeSpecName "kube-api-access-kw958". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.595771 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.595802 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw958\" (UniqueName: \"kubernetes.io/projected/4a681af5-3cbf-4d83-a8cd-42a552cdc06d-kube-api-access-kw958\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.629460 4794 scope.go:117] "RemoveContainer" containerID="d36356b5c770ecad29603a57d1346e81bb0210ad811dc767118c368012779874" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.657306 4794 scope.go:117] "RemoveContainer" containerID="d74fbbdb86c3cdb65171312cf2c9c803c458f47ad7b0f5c525579801ae96ec9d" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.673422 4794 scope.go:117] "RemoveContainer" containerID="a0b3e91e1e10cd24f85913c9dfa160aee857a4f7ffe7e21e4b28aa608366d44a" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.690209 4794 scope.go:117] "RemoveContainer" containerID="2db723c104c847e70b7c742bd2157495764accb8230b8d1c494ef49084c7b620" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.707596 4794 scope.go:117] "RemoveContainer" containerID="839d455cbd220b0b4cbb46ee40c9764f0de266ff250390018a7424fa8aa36507" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.745816 4794 scope.go:117] "RemoveContainer" containerID="d7f0a62cb5cfb4fded049c3aefb7fe44d4d036d9c535f290cbc47f08da15b658" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.768696 4794 scope.go:117] "RemoveContainer" containerID="804d445e998cb87856a9e74959cb49d74352b3c669f7c0cfc5587e7a39d5a062" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.794778 4794 scope.go:117] "RemoveContainer" containerID="a27a1a3aea44b5edb2651ff49b0fcc031f55a8ebaeb3807a56fb618e61e23ded" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.814627 4794 scope.go:117] "RemoveContainer" containerID="804d445e998cb87856a9e74959cb49d74352b3c669f7c0cfc5587e7a39d5a062" Mar 10 10:09:49 crc kubenswrapper[4794]: E0310 10:09:49.815142 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"804d445e998cb87856a9e74959cb49d74352b3c669f7c0cfc5587e7a39d5a062\": container with ID starting with 804d445e998cb87856a9e74959cb49d74352b3c669f7c0cfc5587e7a39d5a062 not found: ID does not exist" 
containerID="804d445e998cb87856a9e74959cb49d74352b3c669f7c0cfc5587e7a39d5a062" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.815174 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804d445e998cb87856a9e74959cb49d74352b3c669f7c0cfc5587e7a39d5a062"} err="failed to get container status \"804d445e998cb87856a9e74959cb49d74352b3c669f7c0cfc5587e7a39d5a062\": rpc error: code = NotFound desc = could not find container \"804d445e998cb87856a9e74959cb49d74352b3c669f7c0cfc5587e7a39d5a062\": container with ID starting with 804d445e998cb87856a9e74959cb49d74352b3c669f7c0cfc5587e7a39d5a062 not found: ID does not exist" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.815204 4794 scope.go:117] "RemoveContainer" containerID="a27a1a3aea44b5edb2651ff49b0fcc031f55a8ebaeb3807a56fb618e61e23ded" Mar 10 10:09:49 crc kubenswrapper[4794]: E0310 10:09:49.815637 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a27a1a3aea44b5edb2651ff49b0fcc031f55a8ebaeb3807a56fb618e61e23ded\": container with ID starting with a27a1a3aea44b5edb2651ff49b0fcc031f55a8ebaeb3807a56fb618e61e23ded not found: ID does not exist" containerID="a27a1a3aea44b5edb2651ff49b0fcc031f55a8ebaeb3807a56fb618e61e23ded" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.815661 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a27a1a3aea44b5edb2651ff49b0fcc031f55a8ebaeb3807a56fb618e61e23ded"} err="failed to get container status \"a27a1a3aea44b5edb2651ff49b0fcc031f55a8ebaeb3807a56fb618e61e23ded\": rpc error: code = NotFound desc = could not find container \"a27a1a3aea44b5edb2651ff49b0fcc031f55a8ebaeb3807a56fb618e61e23ded\": container with ID starting with a27a1a3aea44b5edb2651ff49b0fcc031f55a8ebaeb3807a56fb618e61e23ded not found: ID does not exist" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.894864 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"71ee8a8d-89a0-495f-925b-071e52449063","Type":"ContainerDied","Data":"6fa8d1949110896a40b5b1cfbf46eb189d5e00b9d4d7ed875abf5ab436383ec5"} Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.894915 4794 scope.go:117] "RemoveContainer" containerID="2cb4ade39b3ccc7065ae84be53a791f98290d564a27b4334ca648c8b39c8ca95" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.895063 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.901293 4794 generic.go:334] "Generic (PLEG): container finished" podID="a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" containerID="b01da559f24b75afd94d0c65d373dcf5b4d3bb07708a3909a930cd454c72cc4d" exitCode=0 Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.901369 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3","Type":"ContainerDied","Data":"b01da559f24b75afd94d0c65d373dcf5b4d3bb07708a3909a930cd454c72cc4d"} Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.904086 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h8ctb" event={"ID":"4a681af5-3cbf-4d83-a8cd-42a552cdc06d","Type":"ContainerDied","Data":"27b43c473996656b05298e1bdc7a4df550fc2873a112768d66877b30f849fb5d"} Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.904170 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-h8ctb" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.920615 4794 generic.go:334] "Generic (PLEG): container finished" podID="2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" containerID="4ee46b9ad300bd7296d35f0791b2f32932975b8845ba6b96c9a8eff329eb83f5" exitCode=0 Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.920696 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b","Type":"ContainerDied","Data":"4ee46b9ad300bd7296d35f0791b2f32932975b8845ba6b96c9a8eff329eb83f5"} Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.920725 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b","Type":"ContainerDied","Data":"7dab7f9e9746bf17b56c4b83e051254e9901df841f512cb3effa974e3664d6fa"} Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.920738 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dab7f9e9746bf17b56c4b83e051254e9901df841f512cb3effa974e3664d6fa" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.922636 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.967174 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.968466 4794 scope.go:117] "RemoveContainer" containerID="bfd8291a4141335b8e769990025ae16ba71e4ee8c719b3ff7c16b2cbc268d255" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.979229 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-h8ctb"] Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.979769 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.987596 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-h8ctb"] Mar 10 10:09:49 crc kubenswrapper[4794]: I0310 10:09:49.996060 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 10 10:09:50 crc kubenswrapper[4794]: E0310 10:09:50.006817 4794 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 10 10:09:50 crc kubenswrapper[4794]: E0310 10:09:50.006883 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data podName:598e06ed-3156-4e09-976e-4dda0e35afc2 nodeName:}" failed. No retries permitted until 2026-03-10 10:09:58.006863787 +0000 UTC m=+1546.763034665 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data") pod "rabbitmq-cell1-server-0" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2") : configmap "rabbitmq-cell1-config-data" not found Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.008711 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a15b784-796e-4834-97e1-978b1f0d9690" path="/var/lib/kubelet/pods/0a15b784-796e-4834-97e1-978b1f0d9690/volumes" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.009082 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c3485a5-8f4a-4634-9a11-ed7d3081277f" path="/var/lib/kubelet/pods/0c3485a5-8f4a-4634-9a11-ed7d3081277f/volumes" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.009419 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39114d89-8cf8-4563-bc50-e96e2113349d" path="/var/lib/kubelet/pods/39114d89-8cf8-4563-bc50-e96e2113349d/volumes" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.010285 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a1da9f1-f33d-4327-b899-b5a38c6990d8" path="/var/lib/kubelet/pods/3a1da9f1-f33d-4327-b899-b5a38c6990d8/volumes" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.011422 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a681af5-3cbf-4d83-a8cd-42a552cdc06d" path="/var/lib/kubelet/pods/4a681af5-3cbf-4d83-a8cd-42a552cdc06d/volumes" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.011970 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53686d91-dc01-4a36-99c3-e6c84052e15e" path="/var/lib/kubelet/pods/53686d91-dc01-4a36-99c3-e6c84052e15e/volumes" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.013007 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b5facf1-8bc0-497d-925f-ee382862cf22" path="/var/lib/kubelet/pods/6b5facf1-8bc0-497d-925f-ee382862cf22/volumes" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.013577 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bd64ab2-dcd8-4404-973e-551182005da1" path="/var/lib/kubelet/pods/6bd64ab2-dcd8-4404-973e-551182005da1/volumes" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.014042 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98b35dea-060e-4b8d-9829-37357853a9c4" path="/var/lib/kubelet/pods/98b35dea-060e-4b8d-9829-37357853a9c4/volumes" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.015633 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" path="/var/lib/kubelet/pods/cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f/volumes" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.018188 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb384e40-1917-4b9c-bcfa-440a3a10fd1d" path="/var/lib/kubelet/pods/eb384e40-1917-4b9c-bcfa-440a3a10fd1d/volumes" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.018930 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecae27ed-535f-47c8-93e4-07baac3bc64c" path="/var/lib/kubelet/pods/ecae27ed-535f-47c8-93e4-07baac3bc64c/volumes" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.019564 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efa96620-3d4b-4780-92a0-eeefbe9dcf9a" path="/var/lib/kubelet/pods/efa96620-3d4b-4780-92a0-eeefbe9dcf9a/volumes" Mar 10 10:09:50 crc 
kubenswrapper[4794]: I0310 10:09:50.034127 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.057029 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.062023 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.107731 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-pod-info\") pod \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.107819 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27qt6\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-kube-api-access-27qt6\") pod \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.107844 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data\") pod \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.107863 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-operator-scripts\") pod \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.107886 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-kolla-config\") pod \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.107917 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-plugins-conf\") pod \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.107931 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjbsb\" (UniqueName: \"kubernetes.io/projected/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-kube-api-access-qjbsb\") pod \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.107952 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-config-data-generated\") pod \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.107997 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\" (UID: 
\"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.108028 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-combined-ca-bundle\") pod \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.108044 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-galera-tls-certs\") pod \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.108080 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-erlang-cookie-secret\") pod \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.108096 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-tls\") pod \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.108112 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-config-data-default\") pod \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\" (UID: \"2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.108129 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-erlang-cookie\") pod \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.108151 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-server-conf\") pod \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.108193 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-confd\") pod \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.108211 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-plugins\") pod \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\" (UID: \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.108228 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\" (UID: 
\"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.109030 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.110171 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" (UID: "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.110396 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" (UID: "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.110397 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" (UID: "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.110931 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" (UID: "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.111203 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-pod-info" (OuterVolumeSpecName: "pod-info") pod "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.111364 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.112222 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3"). 
InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.113084 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-kube-api-access-27qt6" (OuterVolumeSpecName: "kube-api-access-27qt6") pod "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3"). InnerVolumeSpecName "kube-api-access-27qt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.113236 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.113347 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.113790 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-kube-api-access-qjbsb" (OuterVolumeSpecName: "kube-api-access-qjbsb") pod "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" (UID: "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b"). InnerVolumeSpecName "kube-api-access-qjbsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.114474 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.120796 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" (UID: "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.131897 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" (UID: "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.133930 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data" (OuterVolumeSpecName: "config-data") pod "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.150737 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-server-conf" (OuterVolumeSpecName: "server-conf") pod "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.162029 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" (UID: "2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.187631 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" (UID: "a45381ea-b5d8-49aa-b4b8-ab372b39b0d3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209559 4794 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209592 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209602 4794 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209612 4794 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209622 4794 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209630 4794 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209638 4794 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209666 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on 
node \"crc\" " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209675 4794 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209683 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27qt6\" (UniqueName: \"kubernetes.io/projected/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-kube-api-access-27qt6\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209693 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209702 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209709 4794 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209718 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjbsb\" (UniqueName: \"kubernetes.io/projected/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-kube-api-access-qjbsb\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209725 4794 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209733 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209747 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209756 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.209764 4794 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.225371 4794 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.242619 4794 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.311093 4794 reconciler_common.go:293] "Volume detached for 
volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.311116 4794 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.565511 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.716997 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/598e06ed-3156-4e09-976e-4dda0e35afc2-pod-info\") pod \"598e06ed-3156-4e09-976e-4dda0e35afc2\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.717465 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data\") pod \"598e06ed-3156-4e09-976e-4dda0e35afc2\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.717512 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-confd\") pod \"598e06ed-3156-4e09-976e-4dda0e35afc2\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.717551 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"598e06ed-3156-4e09-976e-4dda0e35afc2\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.717615 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-erlang-cookie\") pod \"598e06ed-3156-4e09-976e-4dda0e35afc2\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.717667 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-tls\") pod \"598e06ed-3156-4e09-976e-4dda0e35afc2\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.717695 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-plugins-conf\") pod \"598e06ed-3156-4e09-976e-4dda0e35afc2\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.717731 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-server-conf\") pod \"598e06ed-3156-4e09-976e-4dda0e35afc2\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.717762 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/598e06ed-3156-4e09-976e-4dda0e35afc2-erlang-cookie-secret\") pod \"598e06ed-3156-4e09-976e-4dda0e35afc2\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.717785 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-plugins\") pod \"598e06ed-3156-4e09-976e-4dda0e35afc2\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.717811 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5gx7\" (UniqueName: \"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-kube-api-access-g5gx7\") pod \"598e06ed-3156-4e09-976e-4dda0e35afc2\" (UID: \"598e06ed-3156-4e09-976e-4dda0e35afc2\") " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.719113 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "598e06ed-3156-4e09-976e-4dda0e35afc2" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.722632 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "598e06ed-3156-4e09-976e-4dda0e35afc2" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.722808 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "598e06ed-3156-4e09-976e-4dda0e35afc2" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.727321 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-kube-api-access-g5gx7" (OuterVolumeSpecName: "kube-api-access-g5gx7") pod "598e06ed-3156-4e09-976e-4dda0e35afc2" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2"). InnerVolumeSpecName "kube-api-access-g5gx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.727352 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "598e06ed-3156-4e09-976e-4dda0e35afc2" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.727432 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598e06ed-3156-4e09-976e-4dda0e35afc2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "598e06ed-3156-4e09-976e-4dda0e35afc2" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.729995 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/598e06ed-3156-4e09-976e-4dda0e35afc2-pod-info" (OuterVolumeSpecName: "pod-info") pod "598e06ed-3156-4e09-976e-4dda0e35afc2" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.743607 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "598e06ed-3156-4e09-976e-4dda0e35afc2" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.760006 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data" (OuterVolumeSpecName: "config-data") pod "598e06ed-3156-4e09-976e-4dda0e35afc2" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.793389 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-server-conf" (OuterVolumeSpecName: "server-conf") pod "598e06ed-3156-4e09-976e-4dda0e35afc2" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.823890 4794 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.823917 4794 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.823926 4794 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.823934 4794 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.823942 4794 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/598e06ed-3156-4e09-976e-4dda0e35afc2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.823951 4794 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.823961 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5gx7\" (UniqueName: 
\"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-kube-api-access-g5gx7\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.823970 4794 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/598e06ed-3156-4e09-976e-4dda0e35afc2-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.823978 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/598e06ed-3156-4e09-976e-4dda0e35afc2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.823997 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.832536 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "598e06ed-3156-4e09-976e-4dda0e35afc2" (UID: "598e06ed-3156-4e09-976e-4dda0e35afc2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.846642 4794 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.925419 4794 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/598e06ed-3156-4e09-976e-4dda0e35afc2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.925443 4794 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.932269 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a45381ea-b5d8-49aa-b4b8-ab372b39b0d3","Type":"ContainerDied","Data":"0149e10f7e01ef9d92ad296ac2010105b1999fbb92de901a0dc1173a78ddb7ab"} Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.932506 4794 scope.go:117] "RemoveContainer" containerID="b01da559f24b75afd94d0c65d373dcf5b4d3bb07708a3909a930cd454c72cc4d" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.932318 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.933635 4794 generic.go:334] "Generic (PLEG): container finished" podID="95ce97ce-b89c-4868-b9a8-48297e8e35e1" containerID="ba7917d5c284239059542152be24644fc0561a4ad2613bc181bad5791fbc0849" exitCode=0 Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.933686 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-547f85b784-tn9hj" event={"ID":"95ce97ce-b89c-4868-b9a8-48297e8e35e1","Type":"ContainerDied","Data":"ba7917d5c284239059542152be24644fc0561a4ad2613bc181bad5791fbc0849"} Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.935649 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.939723 4794 generic.go:334] "Generic (PLEG): container finished" podID="598e06ed-3156-4e09-976e-4dda0e35afc2" containerID="145a1a04aa37c9588ebc88932f6178e631ace726a39e4a8d191e2883dc15e900" exitCode=0 Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.939787 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"598e06ed-3156-4e09-976e-4dda0e35afc2","Type":"ContainerDied","Data":"145a1a04aa37c9588ebc88932f6178e631ace726a39e4a8d191e2883dc15e900"} Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.939811 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"598e06ed-3156-4e09-976e-4dda0e35afc2","Type":"ContainerDied","Data":"04250b32ee461c407bf2af76f554ef3ce4c4f1ccb97739e259bbd8a00dce17b4"} Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.939864 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.943912 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.963978 4794 scope.go:117] "RemoveContainer" containerID="a840482fd73ba7f63de99b82bc1c4a4c3093d855770bd6ce8ca9c72f090ea3e7" Mar 10 10:09:50 crc kubenswrapper[4794]: I0310 10:09:50.994258 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.001038 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.014036 4794 scope.go:117] "RemoveContainer" containerID="145a1a04aa37c9588ebc88932f6178e631ace726a39e4a8d191e2883dc15e900" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.020885 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.025986 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-internal-tls-certs\") pod \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.026032 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-config-data\") pod \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.026074 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-fernet-keys\") pod \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.026176 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9hcf\" (UniqueName: \"kubernetes.io/projected/95ce97ce-b89c-4868-b9a8-48297e8e35e1-kube-api-access-b9hcf\") pod \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " Mar 10 10:09:51 
crc kubenswrapper[4794]: I0310 10:09:51.026235 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-credential-keys\") pod \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.026290 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-public-tls-certs\") pod \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.026322 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-combined-ca-bundle\") pod \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.026395 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-scripts\") pod \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\" (UID: \"95ce97ce-b89c-4868-b9a8-48297e8e35e1\") " Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.030928 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-scripts" (OuterVolumeSpecName: "scripts") pod "95ce97ce-b89c-4868-b9a8-48297e8e35e1" (UID: "95ce97ce-b89c-4868-b9a8-48297e8e35e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.031977 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ce97ce-b89c-4868-b9a8-48297e8e35e1-kube-api-access-b9hcf" (OuterVolumeSpecName: "kube-api-access-b9hcf") pod "95ce97ce-b89c-4868-b9a8-48297e8e35e1" (UID: "95ce97ce-b89c-4868-b9a8-48297e8e35e1"). InnerVolumeSpecName "kube-api-access-b9hcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.034749 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "95ce97ce-b89c-4868-b9a8-48297e8e35e1" (UID: "95ce97ce-b89c-4868-b9a8-48297e8e35e1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.034799 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.042820 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "95ce97ce-b89c-4868-b9a8-48297e8e35e1" (UID: "95ce97ce-b89c-4868-b9a8-48297e8e35e1"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.043469 4794 scope.go:117] "RemoveContainer" containerID="d12835c0bc7b99c0b1d3a6ef8ae9785eba0f3a435f78fe3cb1b5f9e6c4bd6dea" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.066733 4794 scope.go:117] "RemoveContainer" containerID="145a1a04aa37c9588ebc88932f6178e631ace726a39e4a8d191e2883dc15e900" Mar 10 10:09:51 crc kubenswrapper[4794]: E0310 10:09:51.067805 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"145a1a04aa37c9588ebc88932f6178e631ace726a39e4a8d191e2883dc15e900\": container with ID starting with 145a1a04aa37c9588ebc88932f6178e631ace726a39e4a8d191e2883dc15e900 not found: ID does not exist" containerID="145a1a04aa37c9588ebc88932f6178e631ace726a39e4a8d191e2883dc15e900" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.067835 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"145a1a04aa37c9588ebc88932f6178e631ace726a39e4a8d191e2883dc15e900"} err="failed to get container status \"145a1a04aa37c9588ebc88932f6178e631ace726a39e4a8d191e2883dc15e900\": rpc error: code = NotFound desc = could not find container \"145a1a04aa37c9588ebc88932f6178e631ace726a39e4a8d191e2883dc15e900\": container with ID starting with 145a1a04aa37c9588ebc88932f6178e631ace726a39e4a8d191e2883dc15e900 not found: ID does not exist" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.067857 4794 scope.go:117] "RemoveContainer" containerID="d12835c0bc7b99c0b1d3a6ef8ae9785eba0f3a435f78fe3cb1b5f9e6c4bd6dea" Mar 10 10:09:51 crc kubenswrapper[4794]: E0310 10:09:51.068643 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12835c0bc7b99c0b1d3a6ef8ae9785eba0f3a435f78fe3cb1b5f9e6c4bd6dea\": container with ID starting with d12835c0bc7b99c0b1d3a6ef8ae9785eba0f3a435f78fe3cb1b5f9e6c4bd6dea not found: ID does not exist" containerID="d12835c0bc7b99c0b1d3a6ef8ae9785eba0f3a435f78fe3cb1b5f9e6c4bd6dea" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.068661 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d12835c0bc7b99c0b1d3a6ef8ae9785eba0f3a435f78fe3cb1b5f9e6c4bd6dea"} err="failed to get container status \"d12835c0bc7b99c0b1d3a6ef8ae9785eba0f3a435f78fe3cb1b5f9e6c4bd6dea\": rpc error: code = NotFound desc = could not find container \"d12835c0bc7b99c0b1d3a6ef8ae9785eba0f3a435f78fe3cb1b5f9e6c4bd6dea\": container with ID starting with d12835c0bc7b99c0b1d3a6ef8ae9785eba0f3a435f78fe3cb1b5f9e6c4bd6dea not found: ID does not exist" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.077151 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.092665 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.106080 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-config-data" (OuterVolumeSpecName: "config-data") pod "95ce97ce-b89c-4868-b9a8-48297e8e35e1" (UID: "95ce97ce-b89c-4868-b9a8-48297e8e35e1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.110277 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95ce97ce-b89c-4868-b9a8-48297e8e35e1" (UID: "95ce97ce-b89c-4868-b9a8-48297e8e35e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.127614 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "95ce97ce-b89c-4868-b9a8-48297e8e35e1" (UID: "95ce97ce-b89c-4868-b9a8-48297e8e35e1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.127821 4794 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.127850 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.127860 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.127869 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.127877 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.127887 4794 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.127896 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9hcf\" (UniqueName: \"kubernetes.io/projected/95ce97ce-b89c-4868-b9a8-48297e8e35e1-kube-api-access-b9hcf\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.133283 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "95ce97ce-b89c-4868-b9a8-48297e8e35e1" (UID: "95ce97ce-b89c-4868-b9a8-48297e8e35e1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.229300 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ce97ce-b89c-4868-b9a8-48297e8e35e1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:51 crc kubenswrapper[4794]: E0310 10:09:51.723689 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:09:51 crc kubenswrapper[4794]: E0310 10:09:51.724669 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:09:51 crc kubenswrapper[4794]: E0310 10:09:51.724955 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:09:51 crc kubenswrapper[4794]: E0310 10:09:51.724985 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:09:51 crc kubenswrapper[4794]: E0310 10:09:51.725051 4794 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovsdb-server" Mar 10 10:09:51 crc kubenswrapper[4794]: E0310 10:09:51.726705 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:09:51 crc kubenswrapper[4794]: E0310 10:09:51.728618 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:09:51 crc kubenswrapper[4794]: E0310 10:09:51.728651 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovs-vswitchd" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.957957 4794 generic.go:334] "Generic (PLEG): container finished" podID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerID="cca51dc254ea928ff512f597522805c2f3a8b9ea69647a3b6f31bf5e631eac13" exitCode=0 Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.958034 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ad7a22a-3e88-4447-b675-0a8339bd5f55","Type":"ContainerDied","Data":"cca51dc254ea928ff512f597522805c2f3a8b9ea69647a3b6f31bf5e631eac13"} Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.961304 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-547f85b784-tn9hj" event={"ID":"95ce97ce-b89c-4868-b9a8-48297e8e35e1","Type":"ContainerDied","Data":"337a49f733a79e868893a5f1af6eeadfb39f47c72873ba9f315bf933e2e0249a"} Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.961361 4794 scope.go:117] "RemoveContainer" containerID="ba7917d5c284239059542152be24644fc0561a4ad2613bc181bad5791fbc0849" Mar 10 10:09:51 crc kubenswrapper[4794]: I0310 10:09:51.961400 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-547f85b784-tn9hj" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.014443 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2446b2bc-c3c8-465d-a808-981664228cba" path="/var/lib/kubelet/pods/2446b2bc-c3c8-465d-a808-981664228cba/volumes" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.015288 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" path="/var/lib/kubelet/pods/2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b/volumes" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.016602 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598e06ed-3156-4e09-976e-4dda0e35afc2" path="/var/lib/kubelet/pods/598e06ed-3156-4e09-976e-4dda0e35afc2/volumes" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.017313 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ee8a8d-89a0-495f-925b-071e52449063" path="/var/lib/kubelet/pods/71ee8a8d-89a0-495f-925b-071e52449063/volumes" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.018520 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" path="/var/lib/kubelet/pods/a45381ea-b5d8-49aa-b4b8-ab372b39b0d3/volumes" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.019044 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-547f85b784-tn9hj"] Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.019652 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-547f85b784-tn9hj"] Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.080639 4794 util.go:48] "No ready sandbox for pod can be found. 
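The ExecSync failures above are readiness probes racing container shutdown on ovn-controller-ovs-8552m: the ovsdb-server process is already gone (NotFound) and ovs-vswitchd refuses new exec PIDs while stopping. Both are expected while the pod is being torn down. A sketch that groups such probe errors by pod, container, and probe type follows; the klog field names (probeType=, pod=, containerName=) are assumptions read off these lines:

```go
// Group kubelet probe errors/failures by probe type, pod, and container.
// Minimal sketch; the field names are assumed from the log above.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	probeRE = regexp.MustCompile(`Probe (errored|failed)`)
	kvRE    = regexp.MustCompile(`(probeType|pod|containerName)="([^"]+)"`)
)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20)
	for sc.Scan() {
		line := sc.Text()
		if !probeRE.MatchString(line) {
			continue
		}
		key := ""
		for _, m := range kvRE.FindAllStringSubmatch(line, -1) {
			key += m[1] + "=" + m[2] + " "
		}
		counts[key]++
	}
	for k, n := range counts {
		fmt.Printf("%3d  %s\n", n, k)
	}
}
```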
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.171586 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-scripts\") pod \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.171876 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-ceilometer-tls-certs\") pod \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.172032 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-combined-ca-bundle\") pod \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.172167 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-config-data\") pod \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.172271 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-sg-core-conf-yaml\") pod \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.172668 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ad7a22a-3e88-4447-b675-0a8339bd5f55-run-httpd\") pod \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.172764 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ad7a22a-3e88-4447-b675-0a8339bd5f55-log-httpd\") pod \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.172864 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk52r\" (UniqueName: \"kubernetes.io/projected/5ad7a22a-3e88-4447-b675-0a8339bd5f55-kube-api-access-bk52r\") pod \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\" (UID: \"5ad7a22a-3e88-4447-b675-0a8339bd5f55\") " Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.173088 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ad7a22a-3e88-4447-b675-0a8339bd5f55-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5ad7a22a-3e88-4447-b675-0a8339bd5f55" (UID: "5ad7a22a-3e88-4447-b675-0a8339bd5f55"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.173223 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ad7a22a-3e88-4447-b675-0a8339bd5f55-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5ad7a22a-3e88-4447-b675-0a8339bd5f55" (UID: "5ad7a22a-3e88-4447-b675-0a8339bd5f55"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.173368 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ad7a22a-3e88-4447-b675-0a8339bd5f55-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.184815 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-scripts" (OuterVolumeSpecName: "scripts") pod "5ad7a22a-3e88-4447-b675-0a8339bd5f55" (UID: "5ad7a22a-3e88-4447-b675-0a8339bd5f55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.189759 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ad7a22a-3e88-4447-b675-0a8339bd5f55-kube-api-access-bk52r" (OuterVolumeSpecName: "kube-api-access-bk52r") pod "5ad7a22a-3e88-4447-b675-0a8339bd5f55" (UID: "5ad7a22a-3e88-4447-b675-0a8339bd5f55"). InnerVolumeSpecName "kube-api-access-bk52r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.192387 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5ad7a22a-3e88-4447-b675-0a8339bd5f55" (UID: "5ad7a22a-3e88-4447-b675-0a8339bd5f55"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.209981 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5ad7a22a-3e88-4447-b675-0a8339bd5f55" (UID: "5ad7a22a-3e88-4447-b675-0a8339bd5f55"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.235775 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ad7a22a-3e88-4447-b675-0a8339bd5f55" (UID: "5ad7a22a-3e88-4447-b675-0a8339bd5f55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.263276 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-config-data" (OuterVolumeSpecName: "config-data") pod "5ad7a22a-3e88-4447-b675-0a8339bd5f55" (UID: "5ad7a22a-3e88-4447-b675-0a8339bd5f55"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.275400 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.275489 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ad7a22a-3e88-4447-b675-0a8339bd5f55-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.275502 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk52r\" (UniqueName: \"kubernetes.io/projected/5ad7a22a-3e88-4447-b675-0a8339bd5f55-kube-api-access-bk52r\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.275514 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.275527 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.275537 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.275546 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ad7a22a-3e88-4447-b675-0a8339bd5f55-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.967945 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.968802 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.976811 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ad7a22a-3e88-4447-b675-0a8339bd5f55","Type":"ContainerDied","Data":"0ecaf5d4f675c8d0c9dea14ae63403540987146039f8a7d1e5b775fd6df28002"} Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.977052 4794 scope.go:117] "RemoveContainer" containerID="eea46c0d3766a780d0c4aadf572fb639917b34eb0330835ed5b59a3cdae42cd7" Mar 10 10:09:52 crc kubenswrapper[4794]: I0310 10:09:52.977558 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.040440 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.043623 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.055131 4794 scope.go:117] "RemoveContainer" containerID="efa120338ab691675da4aafa0ebfbff3b4647e0b19ef6e555a8542261261a114" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.075084 4794 scope.go:117] "RemoveContainer" containerID="cca51dc254ea928ff512f597522805c2f3a8b9ea69647a3b6f31bf5e631eac13" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.094726 4794 scope.go:117] "RemoveContainer" containerID="e87552170b0f59d6ba42cedca611dc39fa99be4bf11ae3c34213d11839f96d1a" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.106405 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="39114d89-8cf8-4563-bc50-e96e2113349d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.169:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.299916 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rc2sw"] Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300223 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd64ab2-dcd8-4404-973e-551182005da1" containerName="nova-cell0-conductor-conductor" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300234 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd64ab2-dcd8-4404-973e-551182005da1" containerName="nova-cell0-conductor-conductor" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300243 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98b35dea-060e-4b8d-9829-37357853a9c4" containerName="barbican-worker" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300249 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="98b35dea-060e-4b8d-9829-37357853a9c4" containerName="barbican-worker" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300256 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="proxy-httpd" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300263 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="proxy-httpd" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300302 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2446b2bc-c3c8-465d-a808-981664228cba" containerName="nova-api-api" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300308 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2446b2bc-c3c8-465d-a808-981664228cba" containerName="nova-api-api" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300323 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a1da9f1-f33d-4327-b899-b5a38c6990d8" containerName="ovn-northd" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300344 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a1da9f1-f33d-4327-b899-b5a38c6990d8" containerName="ovn-northd" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300354 4794 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" containerName="setup-container" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300360 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" containerName="setup-container" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300371 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a681af5-3cbf-4d83-a8cd-42a552cdc06d" containerName="mariadb-account-create-update" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300387 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a681af5-3cbf-4d83-a8cd-42a552cdc06d" containerName="mariadb-account-create-update" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300395 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2446b2bc-c3c8-465d-a808-981664228cba" containerName="nova-api-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300401 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2446b2bc-c3c8-465d-a808-981664228cba" containerName="nova-api-log" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300410 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98b35dea-060e-4b8d-9829-37357853a9c4" containerName="barbican-worker-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300416 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="98b35dea-060e-4b8d-9829-37357853a9c4" containerName="barbican-worker-log" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300425 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb384e40-1917-4b9c-bcfa-440a3a10fd1d" containerName="glance-httpd" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300430 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb384e40-1917-4b9c-bcfa-440a3a10fd1d" containerName="glance-httpd" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300439 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb384e40-1917-4b9c-bcfa-440a3a10fd1d" containerName="glance-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300444 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb384e40-1917-4b9c-bcfa-440a3a10fd1d" containerName="glance-log" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300457 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53686d91-dc01-4a36-99c3-e6c84052e15e" containerName="glance-httpd" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300463 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="53686d91-dc01-4a36-99c3-e6c84052e15e" containerName="glance-httpd" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300479 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" containerName="galera" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300485 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" containerName="galera" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300495 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" containerName="barbican-api" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300500 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" containerName="barbican-api" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300511 4794 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" containerName="rabbitmq" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300517 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" containerName="rabbitmq" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300526 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39114d89-8cf8-4563-bc50-e96e2113349d" containerName="cinder-api-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300534 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="39114d89-8cf8-4563-bc50-e96e2113349d" containerName="cinder-api-log" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300547 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598e06ed-3156-4e09-976e-4dda0e35afc2" containerName="rabbitmq" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300555 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="598e06ed-3156-4e09-976e-4dda0e35afc2" containerName="rabbitmq" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300562 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa96620-3d4b-4780-92a0-eeefbe9dcf9a" containerName="barbican-keystone-listener" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300568 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa96620-3d4b-4780-92a0-eeefbe9dcf9a" containerName="barbican-keystone-listener" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300579 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="sg-core" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300585 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="sg-core" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300596 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ce97ce-b89c-4868-b9a8-48297e8e35e1" containerName="keystone-api" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300602 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ce97ce-b89c-4868-b9a8-48297e8e35e1" containerName="keystone-api" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300611 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a15b784-796e-4834-97e1-978b1f0d9690" containerName="kube-state-metrics" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300617 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a15b784-796e-4834-97e1-978b1f0d9690" containerName="kube-state-metrics" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300627 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39114d89-8cf8-4563-bc50-e96e2113349d" containerName="cinder-api" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300634 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="39114d89-8cf8-4563-bc50-e96e2113349d" containerName="cinder-api" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300642 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a1da9f1-f33d-4327-b899-b5a38c6990d8" containerName="openstack-network-exporter" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300648 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a1da9f1-f33d-4327-b899-b5a38c6990d8" containerName="openstack-network-exporter" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 
10:09:53.300656 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecae27ed-535f-47c8-93e4-07baac3bc64c" containerName="placement-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300662 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecae27ed-535f-47c8-93e4-07baac3bc64c" containerName="placement-log" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300671 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5facf1-8bc0-497d-925f-ee382862cf22" containerName="nova-metadata-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300679 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5facf1-8bc0-497d-925f-ee382862cf22" containerName="nova-metadata-log" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300700 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="ceilometer-notification-agent" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300706 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="ceilometer-notification-agent" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300715 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" containerName="mysql-bootstrap" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300721 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" containerName="mysql-bootstrap" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300731 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecae27ed-535f-47c8-93e4-07baac3bc64c" containerName="placement-api" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300736 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecae27ed-535f-47c8-93e4-07baac3bc64c" containerName="placement-api" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300746 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5facf1-8bc0-497d-925f-ee382862cf22" containerName="nova-metadata-metadata" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300752 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5facf1-8bc0-497d-925f-ee382862cf22" containerName="nova-metadata-metadata" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300760 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53686d91-dc01-4a36-99c3-e6c84052e15e" containerName="glance-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300766 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="53686d91-dc01-4a36-99c3-e6c84052e15e" containerName="glance-log" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300776 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a681af5-3cbf-4d83-a8cd-42a552cdc06d" containerName="mariadb-account-create-update" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300781 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a681af5-3cbf-4d83-a8cd-42a552cdc06d" containerName="mariadb-account-create-update" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300792 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ee8a8d-89a0-495f-925b-071e52449063" containerName="memcached" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300797 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ee8a8d-89a0-495f-925b-071e52449063" 
containerName="memcached" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300805 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598e06ed-3156-4e09-976e-4dda0e35afc2" containerName="setup-container" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300811 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="598e06ed-3156-4e09-976e-4dda0e35afc2" containerName="setup-container" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300817 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" containerName="barbican-api-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300823 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" containerName="barbican-api-log" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300833 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="ceilometer-central-agent" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300839 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="ceilometer-central-agent" Mar 10 10:09:53 crc kubenswrapper[4794]: E0310 10:09:53.300850 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa96620-3d4b-4780-92a0-eeefbe9dcf9a" containerName="barbican-keystone-listener-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300855 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa96620-3d4b-4780-92a0-eeefbe9dcf9a" containerName="barbican-keystone-listener-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300986 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="ceilometer-central-agent" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.300998 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa96620-3d4b-4780-92a0-eeefbe9dcf9a" containerName="barbican-keystone-listener-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301009 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a681af5-3cbf-4d83-a8cd-42a552cdc06d" containerName="mariadb-account-create-update" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301017 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="ceilometer-notification-agent" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301028 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a1da9f1-f33d-4327-b899-b5a38c6990d8" containerName="openstack-network-exporter" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301037 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bd64ab2-dcd8-4404-973e-551182005da1" containerName="nova-cell0-conductor-conductor" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301044 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ce97ce-b89c-4868-b9a8-48297e8e35e1" containerName="keystone-api" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301053 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a681af5-3cbf-4d83-a8cd-42a552cdc06d" containerName="mariadb-account-create-update" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301062 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6b3a2f-0f4d-4b7f-a507-450f0ffff42b" 
containerName="galera" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301070 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="39114d89-8cf8-4563-bc50-e96e2113349d" containerName="cinder-api" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301081 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b5facf1-8bc0-497d-925f-ee382862cf22" containerName="nova-metadata-metadata" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301088 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" containerName="barbican-api" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301096 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="598e06ed-3156-4e09-976e-4dda0e35afc2" containerName="rabbitmq" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301103 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa96620-3d4b-4780-92a0-eeefbe9dcf9a" containerName="barbican-keystone-listener" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301114 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="sg-core" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301122 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b5facf1-8bc0-497d-925f-ee382862cf22" containerName="nova-metadata-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301130 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ee8a8d-89a0-495f-925b-071e52449063" containerName="memcached" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301146 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a15b784-796e-4834-97e1-978b1f0d9690" containerName="kube-state-metrics" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301155 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2446b2bc-c3c8-465d-a808-981664228cba" containerName="nova-api-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301164 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecae27ed-535f-47c8-93e4-07baac3bc64c" containerName="placement-api" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301171 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45381ea-b5d8-49aa-b4b8-ab372b39b0d3" containerName="rabbitmq" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301177 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="98b35dea-060e-4b8d-9829-37357853a9c4" containerName="barbican-worker" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301184 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb384e40-1917-4b9c-bcfa-440a3a10fd1d" containerName="glance-httpd" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301194 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb384e40-1917-4b9c-bcfa-440a3a10fd1d" containerName="glance-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301204 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a1da9f1-f33d-4327-b899-b5a38c6990d8" containerName="ovn-northd" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301213 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="39114d89-8cf8-4563-bc50-e96e2113349d" containerName="cinder-api-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301220 4794 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="53686d91-dc01-4a36-99c3-e6c84052e15e" containerName="glance-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301228 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="98b35dea-060e-4b8d-9829-37357853a9c4" containerName="barbican-worker-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301236 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecae27ed-535f-47c8-93e4-07baac3bc64c" containerName="placement-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301244 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2446b2bc-c3c8-465d-a808-981664228cba" containerName="nova-api-api" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301251 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="53686d91-dc01-4a36-99c3-e6c84052e15e" containerName="glance-httpd" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301257 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cffdb4d7-881d-4c1b-a15c-d8d9e8b7756f" containerName="barbican-api-log" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.301264 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" containerName="proxy-httpd" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.302206 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.316642 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rc2sw"] Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.496543 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95xh2\" (UniqueName: \"kubernetes.io/projected/735dee14-1f8d-4cbf-b66a-553892775e5c-kube-api-access-95xh2\") pod \"redhat-operators-rc2sw\" (UID: \"735dee14-1f8d-4cbf-b66a-553892775e5c\") " pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.496640 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/735dee14-1f8d-4cbf-b66a-553892775e5c-utilities\") pod \"redhat-operators-rc2sw\" (UID: \"735dee14-1f8d-4cbf-b66a-553892775e5c\") " pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.496752 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/735dee14-1f8d-4cbf-b66a-553892775e5c-catalog-content\") pod \"redhat-operators-rc2sw\" (UID: \"735dee14-1f8d-4cbf-b66a-553892775e5c\") " pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.598312 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95xh2\" (UniqueName: \"kubernetes.io/projected/735dee14-1f8d-4cbf-b66a-553892775e5c-kube-api-access-95xh2\") pod \"redhat-operators-rc2sw\" (UID: \"735dee14-1f8d-4cbf-b66a-553892775e5c\") " pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.598395 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/735dee14-1f8d-4cbf-b66a-553892775e5c-utilities\") pod 
\"redhat-operators-rc2sw\" (UID: \"735dee14-1f8d-4cbf-b66a-553892775e5c\") " pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.598467 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/735dee14-1f8d-4cbf-b66a-553892775e5c-catalog-content\") pod \"redhat-operators-rc2sw\" (UID: \"735dee14-1f8d-4cbf-b66a-553892775e5c\") " pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.598838 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/735dee14-1f8d-4cbf-b66a-553892775e5c-utilities\") pod \"redhat-operators-rc2sw\" (UID: \"735dee14-1f8d-4cbf-b66a-553892775e5c\") " pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.598877 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/735dee14-1f8d-4cbf-b66a-553892775e5c-catalog-content\") pod \"redhat-operators-rc2sw\" (UID: \"735dee14-1f8d-4cbf-b66a-553892775e5c\") " pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.616213 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95xh2\" (UniqueName: \"kubernetes.io/projected/735dee14-1f8d-4cbf-b66a-553892775e5c-kube-api-access-95xh2\") pod \"redhat-operators-rc2sw\" (UID: \"735dee14-1f8d-4cbf-b66a-553892775e5c\") " pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:09:53 crc kubenswrapper[4794]: I0310 10:09:53.619937 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.007857 4794 generic.go:334] "Generic (PLEG): container finished" podID="55771788-f3c0-4cde-af2f-ca527c2e2965" containerID="3a1060bd42d7158c308820f09f849a414458bab447ccd7c995609acf055ac995" exitCode=0 Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.010259 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ad7a22a-3e88-4447-b675-0a8339bd5f55" path="/var/lib/kubelet/pods/5ad7a22a-3e88-4447-b675-0a8339bd5f55/volumes" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.011631 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95ce97ce-b89c-4868-b9a8-48297e8e35e1" path="/var/lib/kubelet/pods/95ce97ce-b89c-4868-b9a8-48297e8e35e1/volumes" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.014749 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd7575545-w8qjp" event={"ID":"55771788-f3c0-4cde-af2f-ca527c2e2965","Type":"ContainerDied","Data":"3a1060bd42d7158c308820f09f849a414458bab447ccd7c995609acf055ac995"} Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.136538 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rc2sw"] Mar 10 10:09:54 crc kubenswrapper[4794]: W0310 10:09:54.140896 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod735dee14_1f8d_4cbf_b66a_553892775e5c.slice/crio-ca163e48db376412b7df84bbe63b14da4f2baee558b5e0063cff05d5e42124ce WatchSource:0}: Error finding container ca163e48db376412b7df84bbe63b14da4f2baee558b5e0063cff05d5e42124ce: Status 404 returned error can't find the container with id ca163e48db376412b7df84bbe63b14da4f2baee558b5e0063cff05d5e42124ce Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.333593 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.510521 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdjtn\" (UniqueName: \"kubernetes.io/projected/55771788-f3c0-4cde-af2f-ca527c2e2965-kube-api-access-hdjtn\") pod \"55771788-f3c0-4cde-af2f-ca527c2e2965\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.510779 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-ovndb-tls-certs\") pod \"55771788-f3c0-4cde-af2f-ca527c2e2965\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.510829 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-public-tls-certs\") pod \"55771788-f3c0-4cde-af2f-ca527c2e2965\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.510847 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-httpd-config\") pod \"55771788-f3c0-4cde-af2f-ca527c2e2965\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.511625 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-config\") pod \"55771788-f3c0-4cde-af2f-ca527c2e2965\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.511715 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-internal-tls-certs\") pod \"55771788-f3c0-4cde-af2f-ca527c2e2965\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.511778 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-combined-ca-bundle\") pod \"55771788-f3c0-4cde-af2f-ca527c2e2965\" (UID: \"55771788-f3c0-4cde-af2f-ca527c2e2965\") " Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.515928 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "55771788-f3c0-4cde-af2f-ca527c2e2965" (UID: "55771788-f3c0-4cde-af2f-ca527c2e2965"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.516423 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55771788-f3c0-4cde-af2f-ca527c2e2965-kube-api-access-hdjtn" (OuterVolumeSpecName: "kube-api-access-hdjtn") pod "55771788-f3c0-4cde-af2f-ca527c2e2965" (UID: "55771788-f3c0-4cde-af2f-ca527c2e2965"). InnerVolumeSpecName "kube-api-access-hdjtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.557154 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "55771788-f3c0-4cde-af2f-ca527c2e2965" (UID: "55771788-f3c0-4cde-af2f-ca527c2e2965"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.560739 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-config" (OuterVolumeSpecName: "config") pod "55771788-f3c0-4cde-af2f-ca527c2e2965" (UID: "55771788-f3c0-4cde-af2f-ca527c2e2965"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.561903 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55771788-f3c0-4cde-af2f-ca527c2e2965" (UID: "55771788-f3c0-4cde-af2f-ca527c2e2965"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.578484 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "55771788-f3c0-4cde-af2f-ca527c2e2965" (UID: "55771788-f3c0-4cde-af2f-ca527c2e2965"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.580445 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "55771788-f3c0-4cde-af2f-ca527c2e2965" (UID: "55771788-f3c0-4cde-af2f-ca527c2e2965"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.612952 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdjtn\" (UniqueName: \"kubernetes.io/projected/55771788-f3c0-4cde-af2f-ca527c2e2965-kube-api-access-hdjtn\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.612994 4794 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.613008 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.613020 4794 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.613037 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-config\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.613049 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:54 crc kubenswrapper[4794]: I0310 10:09:54.613059 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55771788-f3c0-4cde-af2f-ca527c2e2965-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:55 crc kubenswrapper[4794]: I0310 10:09:55.020093 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd7575545-w8qjp" event={"ID":"55771788-f3c0-4cde-af2f-ca527c2e2965","Type":"ContainerDied","Data":"6e93094adb86c32b925a12f87d735ca2d15349396683c2bb8f480ad6ff2646d3"} Mar 10 10:09:55 crc kubenswrapper[4794]: I0310 10:09:55.020118 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bd7575545-w8qjp" Mar 10 10:09:55 crc kubenswrapper[4794]: I0310 10:09:55.020167 4794 scope.go:117] "RemoveContainer" containerID="76dbe19e4daa257d305db13b3f34518e614449759a4f59af217356e156607317" Mar 10 10:09:55 crc kubenswrapper[4794]: I0310 10:09:55.023127 4794 generic.go:334] "Generic (PLEG): container finished" podID="735dee14-1f8d-4cbf-b66a-553892775e5c" containerID="e544585b984d1e07a73109284d4d6af79a46ddb946ed617533fd68015932665c" exitCode=0 Mar 10 10:09:55 crc kubenswrapper[4794]: I0310 10:09:55.023170 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rc2sw" event={"ID":"735dee14-1f8d-4cbf-b66a-553892775e5c","Type":"ContainerDied","Data":"e544585b984d1e07a73109284d4d6af79a46ddb946ed617533fd68015932665c"} Mar 10 10:09:55 crc kubenswrapper[4794]: I0310 10:09:55.023236 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rc2sw" event={"ID":"735dee14-1f8d-4cbf-b66a-553892775e5c","Type":"ContainerStarted","Data":"ca163e48db376412b7df84bbe63b14da4f2baee558b5e0063cff05d5e42124ce"} Mar 10 10:09:55 crc kubenswrapper[4794]: I0310 10:09:55.045993 4794 scope.go:117] "RemoveContainer" containerID="3a1060bd42d7158c308820f09f849a414458bab447ccd7c995609acf055ac995" Mar 10 10:09:55 crc kubenswrapper[4794]: I0310 10:09:55.071085 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bd7575545-w8qjp"] Mar 10 10:09:55 crc kubenswrapper[4794]: I0310 10:09:55.078922 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-bd7575545-w8qjp"] Mar 10 10:09:55 crc kubenswrapper[4794]: I0310 10:09:55.881379 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:55 crc kubenswrapper[4794]: I0310 10:09:55.944519 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:56 crc kubenswrapper[4794]: I0310 10:09:56.016184 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55771788-f3c0-4cde-af2f-ca527c2e2965" path="/var/lib/kubelet/pods/55771788-f3c0-4cde-af2f-ca527c2e2965/volumes" Mar 10 10:09:56 crc kubenswrapper[4794]: I0310 10:09:56.039904 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rc2sw" event={"ID":"735dee14-1f8d-4cbf-b66a-553892775e5c","Type":"ContainerStarted","Data":"9703f3e7fd8cb6014a6b161bb42125738ba49cd0f305403e1ed2dc459d084bba"} Mar 10 10:09:56 crc kubenswrapper[4794]: E0310 10:09:56.723372 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:09:56 crc kubenswrapper[4794]: E0310 10:09:56.725769 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:09:56 crc kubenswrapper[4794]: E0310 10:09:56.726426 4794 log.go:32] "ExecSync 
cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:09:56 crc kubenswrapper[4794]: E0310 10:09:56.726953 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:09:56 crc kubenswrapper[4794]: E0310 10:09:56.727085 4794 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovsdb-server" Mar 10 10:09:56 crc kubenswrapper[4794]: E0310 10:09:56.728613 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:09:56 crc kubenswrapper[4794]: E0310 10:09:56.730134 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:09:56 crc kubenswrapper[4794]: E0310 10:09:56.730181 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovs-vswitchd" Mar 10 10:09:56 crc kubenswrapper[4794]: E0310 10:09:56.753753 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod735dee14_1f8d_4cbf_b66a_553892775e5c.slice/crio-conmon-9703f3e7fd8cb6014a6b161bb42125738ba49cd0f305403e1ed2dc459d084bba.scope\": RecentStats: unable to find data in memory cache]" Mar 10 10:09:57 crc kubenswrapper[4794]: I0310 10:09:57.050488 4794 generic.go:334] "Generic (PLEG): container finished" podID="735dee14-1f8d-4cbf-b66a-553892775e5c" containerID="9703f3e7fd8cb6014a6b161bb42125738ba49cd0f305403e1ed2dc459d084bba" exitCode=0 Mar 10 10:09:57 crc kubenswrapper[4794]: I0310 10:09:57.050543 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rc2sw" event={"ID":"735dee14-1f8d-4cbf-b66a-553892775e5c","Type":"ContainerDied","Data":"9703f3e7fd8cb6014a6b161bb42125738ba49cd0f305403e1ed2dc459d084bba"} Mar 10 10:09:58 crc kubenswrapper[4794]: I0310 10:09:58.061405 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rc2sw" event={"ID":"735dee14-1f8d-4cbf-b66a-553892775e5c","Type":"ContainerStarted","Data":"2c9c54730eb704c58a05847dfd2a78c990adcc648e7a012ca755e85670d4a4f7"} Mar 10 10:09:58 crc kubenswrapper[4794]: I0310 10:09:58.093910 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rc2sw" podStartSLOduration=2.654856677 podStartE2EDuration="5.09389429s" podCreationTimestamp="2026-03-10 10:09:53 +0000 UTC" firstStartedPulling="2026-03-10 10:09:55.024762962 +0000 UTC m=+1543.780933790" lastFinishedPulling="2026-03-10 10:09:57.463800585 +0000 UTC m=+1546.219971403" observedRunningTime="2026-03-10 10:09:58.088362317 +0000 UTC m=+1546.844533145" watchObservedRunningTime="2026-03-10 10:09:58.09389429 +0000 UTC m=+1546.850065108" Mar 10 10:09:58 crc kubenswrapper[4794]: I0310 10:09:58.275777 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x6kv8"] Mar 10 10:09:58 crc kubenswrapper[4794]: I0310 10:09:58.276032 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x6kv8" podUID="0919f85e-d789-43ec-90d3-7df281f603c5" containerName="registry-server" containerID="cri-o://afa0e1f6b3adf6aae632dc9a41dcb35cb3cdc5fc59c830b888bcffb71481d269" gracePeriod=2 Mar 10 10:09:58 crc kubenswrapper[4794]: I0310 10:09:58.738707 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:58 crc kubenswrapper[4794]: I0310 10:09:58.889898 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0919f85e-d789-43ec-90d3-7df281f603c5-catalog-content\") pod \"0919f85e-d789-43ec-90d3-7df281f603c5\" (UID: \"0919f85e-d789-43ec-90d3-7df281f603c5\") " Mar 10 10:09:58 crc kubenswrapper[4794]: I0310 10:09:58.890042 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgqrx\" (UniqueName: \"kubernetes.io/projected/0919f85e-d789-43ec-90d3-7df281f603c5-kube-api-access-rgqrx\") pod \"0919f85e-d789-43ec-90d3-7df281f603c5\" (UID: \"0919f85e-d789-43ec-90d3-7df281f603c5\") " Mar 10 10:09:58 crc kubenswrapper[4794]: I0310 10:09:58.890139 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0919f85e-d789-43ec-90d3-7df281f603c5-utilities\") pod \"0919f85e-d789-43ec-90d3-7df281f603c5\" (UID: \"0919f85e-d789-43ec-90d3-7df281f603c5\") " Mar 10 10:09:58 crc kubenswrapper[4794]: I0310 10:09:58.890820 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0919f85e-d789-43ec-90d3-7df281f603c5-utilities" (OuterVolumeSpecName: "utilities") pod "0919f85e-d789-43ec-90d3-7df281f603c5" (UID: "0919f85e-d789-43ec-90d3-7df281f603c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:58 crc kubenswrapper[4794]: I0310 10:09:58.894611 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0919f85e-d789-43ec-90d3-7df281f603c5-kube-api-access-rgqrx" (OuterVolumeSpecName: "kube-api-access-rgqrx") pod "0919f85e-d789-43ec-90d3-7df281f603c5" (UID: "0919f85e-d789-43ec-90d3-7df281f603c5"). InnerVolumeSpecName "kube-api-access-rgqrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:09:58 crc kubenswrapper[4794]: I0310 10:09:58.928629 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0919f85e-d789-43ec-90d3-7df281f603c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0919f85e-d789-43ec-90d3-7df281f603c5" (UID: "0919f85e-d789-43ec-90d3-7df281f603c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:09:58 crc kubenswrapper[4794]: I0310 10:09:58.991545 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgqrx\" (UniqueName: \"kubernetes.io/projected/0919f85e-d789-43ec-90d3-7df281f603c5-kube-api-access-rgqrx\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:58 crc kubenswrapper[4794]: I0310 10:09:58.991611 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0919f85e-d789-43ec-90d3-7df281f603c5-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:58 crc kubenswrapper[4794]: I0310 10:09:58.991631 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0919f85e-d789-43ec-90d3-7df281f603c5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.081706 4794 generic.go:334] "Generic (PLEG): container finished" podID="0919f85e-d789-43ec-90d3-7df281f603c5" containerID="afa0e1f6b3adf6aae632dc9a41dcb35cb3cdc5fc59c830b888bcffb71481d269" exitCode=0 Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.082419 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x6kv8" Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.082564 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x6kv8" event={"ID":"0919f85e-d789-43ec-90d3-7df281f603c5","Type":"ContainerDied","Data":"afa0e1f6b3adf6aae632dc9a41dcb35cb3cdc5fc59c830b888bcffb71481d269"} Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.082604 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x6kv8" event={"ID":"0919f85e-d789-43ec-90d3-7df281f603c5","Type":"ContainerDied","Data":"89b93a69dcf4450867e6a7f4b04a62bcc8b890b7da6063a0b9562828a92e866e"} Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.082624 4794 scope.go:117] "RemoveContainer" containerID="afa0e1f6b3adf6aae632dc9a41dcb35cb3cdc5fc59c830b888bcffb71481d269" Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.114960 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x6kv8"] Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.122463 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x6kv8"] Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.123010 4794 scope.go:117] "RemoveContainer" containerID="9f4521ae14fc84ffc7327e5c29a6cdf8af244218cc5ae1d39d8183f14b4d4b5b" Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.147584 4794 scope.go:117] "RemoveContainer" containerID="b90bc052c3d4a25868027ce9df7e66a9eafd23be70bfd49d08d79c81d6265ed9" Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.178050 4794 scope.go:117] "RemoveContainer" containerID="afa0e1f6b3adf6aae632dc9a41dcb35cb3cdc5fc59c830b888bcffb71481d269" Mar 10 10:09:59 crc kubenswrapper[4794]: E0310 10:09:59.178757 4794 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afa0e1f6b3adf6aae632dc9a41dcb35cb3cdc5fc59c830b888bcffb71481d269\": container with ID starting with afa0e1f6b3adf6aae632dc9a41dcb35cb3cdc5fc59c830b888bcffb71481d269 not found: ID does not exist" containerID="afa0e1f6b3adf6aae632dc9a41dcb35cb3cdc5fc59c830b888bcffb71481d269" Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.178814 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afa0e1f6b3adf6aae632dc9a41dcb35cb3cdc5fc59c830b888bcffb71481d269"} err="failed to get container status \"afa0e1f6b3adf6aae632dc9a41dcb35cb3cdc5fc59c830b888bcffb71481d269\": rpc error: code = NotFound desc = could not find container \"afa0e1f6b3adf6aae632dc9a41dcb35cb3cdc5fc59c830b888bcffb71481d269\": container with ID starting with afa0e1f6b3adf6aae632dc9a41dcb35cb3cdc5fc59c830b888bcffb71481d269 not found: ID does not exist" Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.178853 4794 scope.go:117] "RemoveContainer" containerID="9f4521ae14fc84ffc7327e5c29a6cdf8af244218cc5ae1d39d8183f14b4d4b5b" Mar 10 10:09:59 crc kubenswrapper[4794]: E0310 10:09:59.179269 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f4521ae14fc84ffc7327e5c29a6cdf8af244218cc5ae1d39d8183f14b4d4b5b\": container with ID starting with 9f4521ae14fc84ffc7327e5c29a6cdf8af244218cc5ae1d39d8183f14b4d4b5b not found: ID does not exist" containerID="9f4521ae14fc84ffc7327e5c29a6cdf8af244218cc5ae1d39d8183f14b4d4b5b" Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.179308 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f4521ae14fc84ffc7327e5c29a6cdf8af244218cc5ae1d39d8183f14b4d4b5b"} err="failed to get container status \"9f4521ae14fc84ffc7327e5c29a6cdf8af244218cc5ae1d39d8183f14b4d4b5b\": rpc error: code = NotFound desc = could not find container \"9f4521ae14fc84ffc7327e5c29a6cdf8af244218cc5ae1d39d8183f14b4d4b5b\": container with ID starting with 9f4521ae14fc84ffc7327e5c29a6cdf8af244218cc5ae1d39d8183f14b4d4b5b not found: ID does not exist" Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.179351 4794 scope.go:117] "RemoveContainer" containerID="b90bc052c3d4a25868027ce9df7e66a9eafd23be70bfd49d08d79c81d6265ed9" Mar 10 10:09:59 crc kubenswrapper[4794]: E0310 10:09:59.179673 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b90bc052c3d4a25868027ce9df7e66a9eafd23be70bfd49d08d79c81d6265ed9\": container with ID starting with b90bc052c3d4a25868027ce9df7e66a9eafd23be70bfd49d08d79c81d6265ed9 not found: ID does not exist" containerID="b90bc052c3d4a25868027ce9df7e66a9eafd23be70bfd49d08d79c81d6265ed9" Mar 10 10:09:59 crc kubenswrapper[4794]: I0310 10:09:59.179698 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b90bc052c3d4a25868027ce9df7e66a9eafd23be70bfd49d08d79c81d6265ed9"} err="failed to get container status \"b90bc052c3d4a25868027ce9df7e66a9eafd23be70bfd49d08d79c81d6265ed9\": rpc error: code = NotFound desc = could not find container \"b90bc052c3d4a25868027ce9df7e66a9eafd23be70bfd49d08d79c81d6265ed9\": container with ID starting with b90bc052c3d4a25868027ce9df7e66a9eafd23be70bfd49d08d79c81d6265ed9 not found: ID does not exist" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.011061 4794 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="0919f85e-d789-43ec-90d3-7df281f603c5" path="/var/lib/kubelet/pods/0919f85e-d789-43ec-90d3-7df281f603c5/volumes" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.140311 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552290-9sfrg"] Mar 10 10:10:00 crc kubenswrapper[4794]: E0310 10:10:00.140688 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0919f85e-d789-43ec-90d3-7df281f603c5" containerName="registry-server" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.140702 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0919f85e-d789-43ec-90d3-7df281f603c5" containerName="registry-server" Mar 10 10:10:00 crc kubenswrapper[4794]: E0310 10:10:00.140719 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0919f85e-d789-43ec-90d3-7df281f603c5" containerName="extract-utilities" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.140728 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0919f85e-d789-43ec-90d3-7df281f603c5" containerName="extract-utilities" Mar 10 10:10:00 crc kubenswrapper[4794]: E0310 10:10:00.140742 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0919f85e-d789-43ec-90d3-7df281f603c5" containerName="extract-content" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.140753 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0919f85e-d789-43ec-90d3-7df281f603c5" containerName="extract-content" Mar 10 10:10:00 crc kubenswrapper[4794]: E0310 10:10:00.140775 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55771788-f3c0-4cde-af2f-ca527c2e2965" containerName="neutron-httpd" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.140783 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="55771788-f3c0-4cde-af2f-ca527c2e2965" containerName="neutron-httpd" Mar 10 10:10:00 crc kubenswrapper[4794]: E0310 10:10:00.140805 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55771788-f3c0-4cde-af2f-ca527c2e2965" containerName="neutron-api" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.140813 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="55771788-f3c0-4cde-af2f-ca527c2e2965" containerName="neutron-api" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.140986 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0919f85e-d789-43ec-90d3-7df281f603c5" containerName="registry-server" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.141006 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="55771788-f3c0-4cde-af2f-ca527c2e2965" containerName="neutron-httpd" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.141030 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="55771788-f3c0-4cde-af2f-ca527c2e2965" containerName="neutron-api" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.141621 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552290-9sfrg" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.144526 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.144814 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.144934 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.156016 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552290-9sfrg"] Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.206874 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25pwq\" (UniqueName: \"kubernetes.io/projected/cc3b15ea-8912-46a4-a69a-5a8f6323b078-kube-api-access-25pwq\") pod \"auto-csr-approver-29552290-9sfrg\" (UID: \"cc3b15ea-8912-46a4-a69a-5a8f6323b078\") " pod="openshift-infra/auto-csr-approver-29552290-9sfrg" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.308044 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25pwq\" (UniqueName: \"kubernetes.io/projected/cc3b15ea-8912-46a4-a69a-5a8f6323b078-kube-api-access-25pwq\") pod \"auto-csr-approver-29552290-9sfrg\" (UID: \"cc3b15ea-8912-46a4-a69a-5a8f6323b078\") " pod="openshift-infra/auto-csr-approver-29552290-9sfrg" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.328224 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25pwq\" (UniqueName: \"kubernetes.io/projected/cc3b15ea-8912-46a4-a69a-5a8f6323b078-kube-api-access-25pwq\") pod \"auto-csr-approver-29552290-9sfrg\" (UID: \"cc3b15ea-8912-46a4-a69a-5a8f6323b078\") " pod="openshift-infra/auto-csr-approver-29552290-9sfrg" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.460990 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552290-9sfrg" Mar 10 10:10:00 crc kubenswrapper[4794]: I0310 10:10:00.892294 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552290-9sfrg"] Mar 10 10:10:00 crc kubenswrapper[4794]: W0310 10:10:00.902633 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc3b15ea_8912_46a4_a69a_5a8f6323b078.slice/crio-6f366645e67a4380a9ee3019f6fd8bb5bebfca551bbb5952517d347a142dc5bb WatchSource:0}: Error finding container 6f366645e67a4380a9ee3019f6fd8bb5bebfca551bbb5952517d347a142dc5bb: Status 404 returned error can't find the container with id 6f366645e67a4380a9ee3019f6fd8bb5bebfca551bbb5952517d347a142dc5bb Mar 10 10:10:01 crc kubenswrapper[4794]: I0310 10:10:01.103167 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552290-9sfrg" event={"ID":"cc3b15ea-8912-46a4-a69a-5a8f6323b078","Type":"ContainerStarted","Data":"6f366645e67a4380a9ee3019f6fd8bb5bebfca551bbb5952517d347a142dc5bb"} Mar 10 10:10:01 crc kubenswrapper[4794]: E0310 10:10:01.723759 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:10:01 crc kubenswrapper[4794]: E0310 10:10:01.724224 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:10:01 crc kubenswrapper[4794]: E0310 10:10:01.724782 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:10:01 crc kubenswrapper[4794]: E0310 10:10:01.724875 4794 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovsdb-server" Mar 10 10:10:01 crc kubenswrapper[4794]: E0310 10:10:01.725307 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:10:01 crc kubenswrapper[4794]: E0310 10:10:01.732842 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc 
= command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:10:01 crc kubenswrapper[4794]: E0310 10:10:01.735204 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:10:01 crc kubenswrapper[4794]: E0310 10:10:01.735249 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovs-vswitchd" Mar 10 10:10:03 crc kubenswrapper[4794]: I0310 10:10:03.124430 4794 generic.go:334] "Generic (PLEG): container finished" podID="cc3b15ea-8912-46a4-a69a-5a8f6323b078" containerID="17d92794a693024017718809f962f24b0d1af0701939fc313e048469be1fea41" exitCode=0 Mar 10 10:10:03 crc kubenswrapper[4794]: I0310 10:10:03.124494 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552290-9sfrg" event={"ID":"cc3b15ea-8912-46a4-a69a-5a8f6323b078","Type":"ContainerDied","Data":"17d92794a693024017718809f962f24b0d1af0701939fc313e048469be1fea41"} Mar 10 10:10:03 crc kubenswrapper[4794]: I0310 10:10:03.620877 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:10:03 crc kubenswrapper[4794]: I0310 10:10:03.621228 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:10:04 crc kubenswrapper[4794]: I0310 10:10:04.436386 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552290-9sfrg" Mar 10 10:10:04 crc kubenswrapper[4794]: I0310 10:10:04.570822 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25pwq\" (UniqueName: \"kubernetes.io/projected/cc3b15ea-8912-46a4-a69a-5a8f6323b078-kube-api-access-25pwq\") pod \"cc3b15ea-8912-46a4-a69a-5a8f6323b078\" (UID: \"cc3b15ea-8912-46a4-a69a-5a8f6323b078\") " Mar 10 10:10:04 crc kubenswrapper[4794]: I0310 10:10:04.579630 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc3b15ea-8912-46a4-a69a-5a8f6323b078-kube-api-access-25pwq" (OuterVolumeSpecName: "kube-api-access-25pwq") pod "cc3b15ea-8912-46a4-a69a-5a8f6323b078" (UID: "cc3b15ea-8912-46a4-a69a-5a8f6323b078"). InnerVolumeSpecName "kube-api-access-25pwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:10:04 crc kubenswrapper[4794]: I0310 10:10:04.667029 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rc2sw" podUID="735dee14-1f8d-4cbf-b66a-553892775e5c" containerName="registry-server" probeResult="failure" output=< Mar 10 10:10:04 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 10:10:04 crc kubenswrapper[4794]: > Mar 10 10:10:04 crc kubenswrapper[4794]: I0310 10:10:04.672164 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25pwq\" (UniqueName: \"kubernetes.io/projected/cc3b15ea-8912-46a4-a69a-5a8f6323b078-kube-api-access-25pwq\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:05 crc kubenswrapper[4794]: I0310 10:10:05.147422 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552290-9sfrg" event={"ID":"cc3b15ea-8912-46a4-a69a-5a8f6323b078","Type":"ContainerDied","Data":"6f366645e67a4380a9ee3019f6fd8bb5bebfca551bbb5952517d347a142dc5bb"} Mar 10 10:10:05 crc kubenswrapper[4794]: I0310 10:10:05.147467 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f366645e67a4380a9ee3019f6fd8bb5bebfca551bbb5952517d347a142dc5bb" Mar 10 10:10:05 crc kubenswrapper[4794]: I0310 10:10:05.147497 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552290-9sfrg" Mar 10 10:10:05 crc kubenswrapper[4794]: I0310 10:10:05.526458 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552284-4ph2z"] Mar 10 10:10:05 crc kubenswrapper[4794]: I0310 10:10:05.535151 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552284-4ph2z"] Mar 10 10:10:06 crc kubenswrapper[4794]: I0310 10:10:06.014931 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf038607-8e68-452a-a8af-162d60ea7061" path="/var/lib/kubelet/pods/bf038607-8e68-452a-a8af-162d60ea7061/volumes" Mar 10 10:10:06 crc kubenswrapper[4794]: E0310 10:10:06.723613 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:10:06 crc kubenswrapper[4794]: E0310 10:10:06.724210 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:10:06 crc kubenswrapper[4794]: E0310 10:10:06.724737 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:10:06 crc 
kubenswrapper[4794]: E0310 10:10:06.724775 4794 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovsdb-server" Mar 10 10:10:06 crc kubenswrapper[4794]: E0310 10:10:06.727232 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:10:06 crc kubenswrapper[4794]: E0310 10:10:06.729274 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:10:06 crc kubenswrapper[4794]: E0310 10:10:06.732030 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:10:06 crc kubenswrapper[4794]: E0310 10:10:06.732083 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovs-vswitchd" Mar 10 10:10:11 crc kubenswrapper[4794]: E0310 10:10:11.724251 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:10:11 crc kubenswrapper[4794]: E0310 10:10:11.725308 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:10:11 crc kubenswrapper[4794]: E0310 10:10:11.725501 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:10:11 crc kubenswrapper[4794]: E0310 10:10:11.726078 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 10:10:11 crc kubenswrapper[4794]: E0310 10:10:11.726137 4794 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovsdb-server" Mar 10 10:10:11 crc kubenswrapper[4794]: E0310 10:10:11.726937 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:10:11 crc kubenswrapper[4794]: E0310 10:10:11.728229 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 10:10:11 crc kubenswrapper[4794]: E0310 10:10:11.728272 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8552m" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovs-vswitchd" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.220156 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8552m_4fbaedcd-8c1d-452e-a36b-1a12e47f48d1/ovs-vswitchd/0.log" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.222003 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.260483 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8552m_4fbaedcd-8c1d-452e-a36b-1a12e47f48d1/ovs-vswitchd/0.log" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.262268 4794 generic.go:334] "Generic (PLEG): container finished" podID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" exitCode=137 Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.262357 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8552m" event={"ID":"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1","Type":"ContainerDied","Data":"b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e"} Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.262384 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8552m" event={"ID":"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1","Type":"ContainerDied","Data":"ba97aebd530c2adb07290b9a30334411952d96573f9a64ca5c288c8642ed252e"} Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.262387 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-8552m" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.262399 4794 scope.go:117] "RemoveContainer" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.270695 4794 generic.go:334] "Generic (PLEG): container finished" podID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerID="5c65c1b8cc623038c03f29280e90a94e0a70fec6174c721b69512958e9487bcb" exitCode=137 Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.270765 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"5c65c1b8cc623038c03f29280e90a94e0a70fec6174c721b69512958e9487bcb"} Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.272406 4794 generic.go:334] "Generic (PLEG): container finished" podID="3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" containerID="5a0b460bd15a1cc1517c39d79a934d02b21b241c99d789df7e899e079f6a12eb" exitCode=137 Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.272431 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e","Type":"ContainerDied","Data":"5a0b460bd15a1cc1517c39d79a934d02b21b241c99d789df7e899e079f6a12eb"} Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.311467 4794 scope.go:117] "RemoveContainer" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.319639 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-lib\") pod \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.319686 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-run\") pod \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.319741 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-log\") pod \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.319795 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl2kw\" (UniqueName: \"kubernetes.io/projected/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-kube-api-access-bl2kw\") pod \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.319833 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-lib" (OuterVolumeSpecName: "var-lib") pod "4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" (UID: "4fbaedcd-8c1d-452e-a36b-1a12e47f48d1"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.319862 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-etc-ovs\") pod \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.319858 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-run" (OuterVolumeSpecName: "var-run") pod "4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" (UID: "4fbaedcd-8c1d-452e-a36b-1a12e47f48d1"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.319886 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-scripts\") pod \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\" (UID: \"4fbaedcd-8c1d-452e-a36b-1a12e47f48d1\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.319914 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" (UID: "4fbaedcd-8c1d-452e-a36b-1a12e47f48d1"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.319894 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-log" (OuterVolumeSpecName: "var-log") pod "4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" (UID: "4fbaedcd-8c1d-452e-a36b-1a12e47f48d1"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.320205 4794 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.320227 4794 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-lib\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.320236 4794 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.320244 4794 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-var-log\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.321530 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-scripts" (OuterVolumeSpecName: "scripts") pod "4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" (UID: "4fbaedcd-8c1d-452e-a36b-1a12e47f48d1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.324671 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-kube-api-access-bl2kw" (OuterVolumeSpecName: "kube-api-access-bl2kw") pod "4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" (UID: "4fbaedcd-8c1d-452e-a36b-1a12e47f48d1"). InnerVolumeSpecName "kube-api-access-bl2kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.340430 4794 scope.go:117] "RemoveContainer" containerID="f77929cd0c4936dedf4ae780db89375e59fd10d1eb9a6d9a9595f0dfbea94734" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.391034 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.411652 4794 scope.go:117] "RemoveContainer" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" Mar 10 10:10:13 crc kubenswrapper[4794]: E0310 10:10:13.412021 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e\": container with ID starting with b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e not found: ID does not exist" containerID="b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.412046 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e"} err="failed to get container status \"b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e\": rpc error: code = NotFound desc = could not find container \"b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e\": container with ID starting with b7dfa84825a07e2ea2e0b6f2310752db3a4b9e635a5fe6eebca49deff4d3269e not found: ID does not exist" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.412063 4794 scope.go:117] "RemoveContainer" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" Mar 10 10:10:13 crc kubenswrapper[4794]: E0310 10:10:13.412224 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb\": container with ID starting with 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb not found: ID does not exist" containerID="760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.412238 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb"} err="failed to get container status \"760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb\": rpc error: code = NotFound desc = could not find container \"760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb\": container with ID starting with 760dbf63b8a7310e794a1834c303840d43e2dfc6133fa29596b18c0cf1a30fcb not found: ID does not exist" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.412251 4794 scope.go:117] "RemoveContainer" containerID="f77929cd0c4936dedf4ae780db89375e59fd10d1eb9a6d9a9595f0dfbea94734" Mar 10 10:10:13 crc 
kubenswrapper[4794]: E0310 10:10:13.412627 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f77929cd0c4936dedf4ae780db89375e59fd10d1eb9a6d9a9595f0dfbea94734\": container with ID starting with f77929cd0c4936dedf4ae780db89375e59fd10d1eb9a6d9a9595f0dfbea94734 not found: ID does not exist" containerID="f77929cd0c4936dedf4ae780db89375e59fd10d1eb9a6d9a9595f0dfbea94734" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.412647 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f77929cd0c4936dedf4ae780db89375e59fd10d1eb9a6d9a9595f0dfbea94734"} err="failed to get container status \"f77929cd0c4936dedf4ae780db89375e59fd10d1eb9a6d9a9595f0dfbea94734\": rpc error: code = NotFound desc = could not find container \"f77929cd0c4936dedf4ae780db89375e59fd10d1eb9a6d9a9595f0dfbea94734\": container with ID starting with f77929cd0c4936dedf4ae780db89375e59fd10d1eb9a6d9a9595f0dfbea94734 not found: ID does not exist" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.421854 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl2kw\" (UniqueName: \"kubernetes.io/projected/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-kube-api-access-bl2kw\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.421881 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.445571 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.522854 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift\") pod \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.522920 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-etc-machine-id\") pod \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.522954 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-cache\") pod \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.522979 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dx2x\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-kube-api-access-2dx2x\") pod \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.523010 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-combined-ca-bundle\") pod \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " Mar 10 10:10:13 crc kubenswrapper[4794]: 
I0310 10:10:13.523040 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" (UID: "3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.523052 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-config-data\") pod \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.523138 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-scripts\") pod \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.523206 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-lock\") pod \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.523240 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-config-data-custom\") pod \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.523265 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.523293 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-combined-ca-bundle\") pod \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\" (UID: \"6cbde6fd-ed6b-49a8-96ae-642b15a1802b\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.523364 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjtmv\" (UniqueName: \"kubernetes.io/projected/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-kube-api-access-sjtmv\") pod \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\" (UID: \"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e\") " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.523989 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.524512 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-lock" (OuterVolumeSpecName: "lock") pod "6cbde6fd-ed6b-49a8-96ae-642b15a1802b" (UID: "6cbde6fd-ed6b-49a8-96ae-642b15a1802b"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.524515 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-cache" (OuterVolumeSpecName: "cache") pod "6cbde6fd-ed6b-49a8-96ae-642b15a1802b" (UID: "6cbde6fd-ed6b-49a8-96ae-642b15a1802b"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.527005 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-kube-api-access-sjtmv" (OuterVolumeSpecName: "kube-api-access-sjtmv") pod "3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" (UID: "3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e"). InnerVolumeSpecName "kube-api-access-sjtmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.527110 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-kube-api-access-2dx2x" (OuterVolumeSpecName: "kube-api-access-2dx2x") pod "6cbde6fd-ed6b-49a8-96ae-642b15a1802b" (UID: "6cbde6fd-ed6b-49a8-96ae-642b15a1802b"). InnerVolumeSpecName "kube-api-access-2dx2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.528462 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "swift") pod "6cbde6fd-ed6b-49a8-96ae-642b15a1802b" (UID: "6cbde6fd-ed6b-49a8-96ae-642b15a1802b"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.528535 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6cbde6fd-ed6b-49a8-96ae-642b15a1802b" (UID: "6cbde6fd-ed6b-49a8-96ae-642b15a1802b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.529432 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-scripts" (OuterVolumeSpecName: "scripts") pod "3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" (UID: "3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.530425 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" (UID: "3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.556094 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" (UID: "3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.598947 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-config-data" (OuterVolumeSpecName: "config-data") pod "3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" (UID: "3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.626493 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjtmv\" (UniqueName: \"kubernetes.io/projected/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-kube-api-access-sjtmv\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.626560 4794 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.626574 4794 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-cache\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.626588 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dx2x\" (UniqueName: \"kubernetes.io/projected/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-kube-api-access-2dx2x\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.626615 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.626630 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.626640 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.626652 4794 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-lock\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.626662 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.626705 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.646077 4794 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.671508 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:10:13 crc 
kubenswrapper[4794]: I0310 10:10:13.725808 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-8552m"] Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.728138 4794 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.731621 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.733499 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-8552m"] Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.819036 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cbde6fd-ed6b-49a8-96ae-642b15a1802b" (UID: "6cbde6fd-ed6b-49a8-96ae-642b15a1802b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.829316 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cbde6fd-ed6b-49a8-96ae-642b15a1802b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:13 crc kubenswrapper[4794]: I0310 10:10:13.901794 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rc2sw"] Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.012750 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" path="/var/lib/kubelet/pods/4fbaedcd-8c1d-452e-a36b-1a12e47f48d1/volumes" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.293858 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6cbde6fd-ed6b-49a8-96ae-642b15a1802b","Type":"ContainerDied","Data":"98c7f33d0acbb031c2637b524355a2476576c8b482ba68322e539058c1cc15d7"} Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.293907 4794 scope.go:117] "RemoveContainer" containerID="5c65c1b8cc623038c03f29280e90a94e0a70fec6174c721b69512958e9487bcb" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.293957 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.300350 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e","Type":"ContainerDied","Data":"d1dca3e4366d2a714eda333d52b2463f788c4bb17ba1264be9387aa062133e42"} Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.301119 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.331994 4794 scope.go:117] "RemoveContainer" containerID="8f6939cc5c3159e417cd81d11fcb1a54fc7a6a362b3f62041010bad3ef1cdf82" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.338323 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.353600 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.359823 4794 scope.go:117] "RemoveContainer" containerID="32f094fa9f1ca547ebae717b8b5951d1771b5fdb77dfeec605d769593501a6a7" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.359974 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.368741 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.392206 4794 scope.go:117] "RemoveContainer" containerID="3b9e079a22cd5b6888eff2291538cfe9ec6e987ec470fa124da7e69bbac3f8c2" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.411753 4794 scope.go:117] "RemoveContainer" containerID="250a700fa883215453330332d981b0fe632e9fa60d370e3a5759ce91865db4ab" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.437154 4794 scope.go:117] "RemoveContainer" containerID="bc6074c0953ac28f265a3e17ebd69da6a9d931779c7686e644dc485527b0a8fb" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.463616 4794 scope.go:117] "RemoveContainer" containerID="2190a44dadbbaa1a8486a80b4974382e233189c54458a517977ead0fca476329" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.494021 4794 scope.go:117] "RemoveContainer" containerID="8e6a16e1a4b64e9512a2b7d0587b85ab036e797d9ba29438a7eebcaaa92c8d35" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.510547 4794 scope.go:117] "RemoveContainer" containerID="74c2be2cb91c7946838edd2c68684383f49526b9001079db31859fa44514bdac" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.529070 4794 scope.go:117] "RemoveContainer" containerID="46ff2925f4f5d1bf13f6a0203dfcdd7152671d57863769e69b695dd9c5e3fb06" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.548320 4794 scope.go:117] "RemoveContainer" containerID="21625e79ea0985db8645fee5a87d8192fa2df9962f7552b6121df93fb96d3e7f" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.577627 4794 scope.go:117] "RemoveContainer" containerID="d63f9f6d7599f958801005a6670033ad2c6f68cf62e9c0465c4d34044c669139" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.628027 4794 scope.go:117] "RemoveContainer" containerID="32aa0630d82d05463514a0c8a463bab43aad2c33f2b887d13c9714f7761c76c2" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.656658 4794 scope.go:117] "RemoveContainer" containerID="8cc99334a202e511edfdad4eef31c86941e591eb3f6215e5d7e786f323618184" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.679741 4794 scope.go:117] "RemoveContainer" containerID="031388a5b0e7ab4e2d5a36045f55127c3e30f57424f750e9a01dea1da336e7c1" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.709390 4794 scope.go:117] "RemoveContainer" containerID="849ee9cf4f0b2ba0c3906171081ce847f0534ff16ad62c26ba765cd71b509f45" Mar 10 10:10:14 crc kubenswrapper[4794]: I0310 10:10:14.735267 4794 scope.go:117] "RemoveContainer" containerID="5a0b460bd15a1cc1517c39d79a934d02b21b241c99d789df7e899e079f6a12eb" Mar 10 10:10:15 crc 
kubenswrapper[4794]: I0310 10:10:15.315901 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rc2sw" podUID="735dee14-1f8d-4cbf-b66a-553892775e5c" containerName="registry-server" containerID="cri-o://2c9c54730eb704c58a05847dfd2a78c990adcc648e7a012ca755e85670d4a4f7" gracePeriod=2 Mar 10 10:10:15 crc kubenswrapper[4794]: I0310 10:10:15.786665 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:10:15 crc kubenswrapper[4794]: I0310 10:10:15.862573 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/735dee14-1f8d-4cbf-b66a-553892775e5c-catalog-content\") pod \"735dee14-1f8d-4cbf-b66a-553892775e5c\" (UID: \"735dee14-1f8d-4cbf-b66a-553892775e5c\") " Mar 10 10:10:15 crc kubenswrapper[4794]: I0310 10:10:15.862639 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/735dee14-1f8d-4cbf-b66a-553892775e5c-utilities\") pod \"735dee14-1f8d-4cbf-b66a-553892775e5c\" (UID: \"735dee14-1f8d-4cbf-b66a-553892775e5c\") " Mar 10 10:10:15 crc kubenswrapper[4794]: I0310 10:10:15.862667 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95xh2\" (UniqueName: \"kubernetes.io/projected/735dee14-1f8d-4cbf-b66a-553892775e5c-kube-api-access-95xh2\") pod \"735dee14-1f8d-4cbf-b66a-553892775e5c\" (UID: \"735dee14-1f8d-4cbf-b66a-553892775e5c\") " Mar 10 10:10:15 crc kubenswrapper[4794]: I0310 10:10:15.866575 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/735dee14-1f8d-4cbf-b66a-553892775e5c-utilities" (OuterVolumeSpecName: "utilities") pod "735dee14-1f8d-4cbf-b66a-553892775e5c" (UID: "735dee14-1f8d-4cbf-b66a-553892775e5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:10:15 crc kubenswrapper[4794]: I0310 10:10:15.867399 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/735dee14-1f8d-4cbf-b66a-553892775e5c-kube-api-access-95xh2" (OuterVolumeSpecName: "kube-api-access-95xh2") pod "735dee14-1f8d-4cbf-b66a-553892775e5c" (UID: "735dee14-1f8d-4cbf-b66a-553892775e5c"). InnerVolumeSpecName "kube-api-access-95xh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:10:15 crc kubenswrapper[4794]: I0310 10:10:15.964634 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/735dee14-1f8d-4cbf-b66a-553892775e5c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:15 crc kubenswrapper[4794]: I0310 10:10:15.964684 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95xh2\" (UniqueName: \"kubernetes.io/projected/735dee14-1f8d-4cbf-b66a-553892775e5c-kube-api-access-95xh2\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.013753 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" path="/var/lib/kubelet/pods/3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e/volumes" Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.014670 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" path="/var/lib/kubelet/pods/6cbde6fd-ed6b-49a8-96ae-642b15a1802b/volumes" Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.019826 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/735dee14-1f8d-4cbf-b66a-553892775e5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "735dee14-1f8d-4cbf-b66a-553892775e5c" (UID: "735dee14-1f8d-4cbf-b66a-553892775e5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.066567 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/735dee14-1f8d-4cbf-b66a-553892775e5c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.331348 4794 generic.go:334] "Generic (PLEG): container finished" podID="735dee14-1f8d-4cbf-b66a-553892775e5c" containerID="2c9c54730eb704c58a05847dfd2a78c990adcc648e7a012ca755e85670d4a4f7" exitCode=0 Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.331395 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rc2sw" event={"ID":"735dee14-1f8d-4cbf-b66a-553892775e5c","Type":"ContainerDied","Data":"2c9c54730eb704c58a05847dfd2a78c990adcc648e7a012ca755e85670d4a4f7"} Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.331449 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rc2sw" event={"ID":"735dee14-1f8d-4cbf-b66a-553892775e5c","Type":"ContainerDied","Data":"ca163e48db376412b7df84bbe63b14da4f2baee558b5e0063cff05d5e42124ce"} Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.331480 4794 scope.go:117] "RemoveContainer" containerID="2c9c54730eb704c58a05847dfd2a78c990adcc648e7a012ca755e85670d4a4f7" Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.331410 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rc2sw" Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.361594 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rc2sw"] Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.361790 4794 scope.go:117] "RemoveContainer" containerID="9703f3e7fd8cb6014a6b161bb42125738ba49cd0f305403e1ed2dc459d084bba" Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.372385 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rc2sw"] Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.390086 4794 scope.go:117] "RemoveContainer" containerID="e544585b984d1e07a73109284d4d6af79a46ddb946ed617533fd68015932665c" Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.417194 4794 scope.go:117] "RemoveContainer" containerID="2c9c54730eb704c58a05847dfd2a78c990adcc648e7a012ca755e85670d4a4f7" Mar 10 10:10:16 crc kubenswrapper[4794]: E0310 10:10:16.417688 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c9c54730eb704c58a05847dfd2a78c990adcc648e7a012ca755e85670d4a4f7\": container with ID starting with 2c9c54730eb704c58a05847dfd2a78c990adcc648e7a012ca755e85670d4a4f7 not found: ID does not exist" containerID="2c9c54730eb704c58a05847dfd2a78c990adcc648e7a012ca755e85670d4a4f7" Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.417723 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c9c54730eb704c58a05847dfd2a78c990adcc648e7a012ca755e85670d4a4f7"} err="failed to get container status \"2c9c54730eb704c58a05847dfd2a78c990adcc648e7a012ca755e85670d4a4f7\": rpc error: code = NotFound desc = could not find container \"2c9c54730eb704c58a05847dfd2a78c990adcc648e7a012ca755e85670d4a4f7\": container with ID starting with 2c9c54730eb704c58a05847dfd2a78c990adcc648e7a012ca755e85670d4a4f7 not found: ID does not exist" Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.417764 4794 scope.go:117] "RemoveContainer" containerID="9703f3e7fd8cb6014a6b161bb42125738ba49cd0f305403e1ed2dc459d084bba" Mar 10 10:10:16 crc kubenswrapper[4794]: E0310 10:10:16.418171 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9703f3e7fd8cb6014a6b161bb42125738ba49cd0f305403e1ed2dc459d084bba\": container with ID starting with 9703f3e7fd8cb6014a6b161bb42125738ba49cd0f305403e1ed2dc459d084bba not found: ID does not exist" containerID="9703f3e7fd8cb6014a6b161bb42125738ba49cd0f305403e1ed2dc459d084bba" Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.418207 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9703f3e7fd8cb6014a6b161bb42125738ba49cd0f305403e1ed2dc459d084bba"} err="failed to get container status \"9703f3e7fd8cb6014a6b161bb42125738ba49cd0f305403e1ed2dc459d084bba\": rpc error: code = NotFound desc = could not find container \"9703f3e7fd8cb6014a6b161bb42125738ba49cd0f305403e1ed2dc459d084bba\": container with ID starting with 9703f3e7fd8cb6014a6b161bb42125738ba49cd0f305403e1ed2dc459d084bba not found: ID does not exist" Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.418254 4794 scope.go:117] "RemoveContainer" containerID="e544585b984d1e07a73109284d4d6af79a46ddb946ed617533fd68015932665c" Mar 10 10:10:16 crc kubenswrapper[4794]: E0310 10:10:16.418654 4794 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"e544585b984d1e07a73109284d4d6af79a46ddb946ed617533fd68015932665c\": container with ID starting with e544585b984d1e07a73109284d4d6af79a46ddb946ed617533fd68015932665c not found: ID does not exist" containerID="e544585b984d1e07a73109284d4d6af79a46ddb946ed617533fd68015932665c" Mar 10 10:10:16 crc kubenswrapper[4794]: I0310 10:10:16.418678 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e544585b984d1e07a73109284d4d6af79a46ddb946ed617533fd68015932665c"} err="failed to get container status \"e544585b984d1e07a73109284d4d6af79a46ddb946ed617533fd68015932665c\": rpc error: code = NotFound desc = could not find container \"e544585b984d1e07a73109284d4d6af79a46ddb946ed617533fd68015932665c\": container with ID starting with e544585b984d1e07a73109284d4d6af79a46ddb946ed617533fd68015932665c not found: ID does not exist" Mar 10 10:10:17 crc kubenswrapper[4794]: I0310 10:10:17.067971 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="53686d91-dc01-4a36-99c3-e6c84052e15e" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.183:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 10:10:17 crc kubenswrapper[4794]: I0310 10:10:17.068067 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="53686d91-dc01-4a36-99c3-e6c84052e15e" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.183:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 10:10:18 crc kubenswrapper[4794]: I0310 10:10:18.019429 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="735dee14-1f8d-4cbf-b66a-553892775e5c" path="/var/lib/kubelet/pods/735dee14-1f8d-4cbf-b66a-553892775e5c/volumes" Mar 10 10:10:22 crc kubenswrapper[4794]: I0310 10:10:22.967951 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:10:22 crc kubenswrapper[4794]: I0310 10:10:22.968233 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:10:22 crc kubenswrapper[4794]: I0310 10:10:22.968268 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 10:10:22 crc kubenswrapper[4794]: I0310 10:10:22.968921 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b8035e63199f09775c99e31be585fd326d355f434e689db9d33f71ae65c45f9f"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 10:10:22 crc kubenswrapper[4794]: I0310 10:10:22.968968 4794 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://b8035e63199f09775c99e31be585fd326d355f434e689db9d33f71ae65c45f9f" gracePeriod=600 Mar 10 10:10:23 crc kubenswrapper[4794]: I0310 10:10:23.400973 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="b8035e63199f09775c99e31be585fd326d355f434e689db9d33f71ae65c45f9f" exitCode=0 Mar 10 10:10:23 crc kubenswrapper[4794]: I0310 10:10:23.401034 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"b8035e63199f09775c99e31be585fd326d355f434e689db9d33f71ae65c45f9f"} Mar 10 10:10:23 crc kubenswrapper[4794]: I0310 10:10:23.401294 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d"} Mar 10 10:10:23 crc kubenswrapper[4794]: I0310 10:10:23.401312 4794 scope.go:117] "RemoveContainer" containerID="4fbbb2d33125ccb00592b9b895dbb76529b93f7f4dbc98756e86d7dc556b940a" Mar 10 10:10:34 crc kubenswrapper[4794]: I0310 10:10:34.825086 4794 scope.go:117] "RemoveContainer" containerID="d7d3ebbfdeb38990606a0487ab1bcb1e035cb5d8c6aaa0c6504e779a44a224c7" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.536844 4794 scope.go:117] "RemoveContainer" containerID="0b930984fbf109b32ed28954bcbccc55e0338f35e6b3df584a5cad91d11280de" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.575178 4794 scope.go:117] "RemoveContainer" containerID="98685f94761e1e0498851e6b6d04b23abfedb6dc9cc964602015da98d51a50db" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.614211 4794 scope.go:117] "RemoveContainer" containerID="7e4285317b0405a3de83fe6a6261bf54c91dad51c10abc1e8c435712c795e03f" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.633704 4794 scope.go:117] "RemoveContainer" containerID="8d0146b1fe6aebb86ceb3b4c055564825a0bd4b041da3364fefe0c8d956a6d6c" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.663033 4794 scope.go:117] "RemoveContainer" containerID="4ee46b9ad300bd7296d35f0791b2f32932975b8845ba6b96c9a8eff329eb83f5" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.693142 4794 scope.go:117] "RemoveContainer" containerID="3de7e1e91921c511d1d59278200c580a671c8fa44d00a53740c8205df1d31761" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.720523 4794 scope.go:117] "RemoveContainer" containerID="93f84ceecd4613c693b9db23e3eaa377c57bb4bbb5320eba6b3f1970cdfa1c72" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.741513 4794 scope.go:117] "RemoveContainer" containerID="e28e928400da89d2d5a42bae0c7bca5c24862a455b433690c6883af7e54ea226" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.773536 4794 scope.go:117] "RemoveContainer" containerID="4c601955445959dac10e09eee2e220072dd4672a1b804c16d5f6f9003c83d0ab" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.797897 4794 scope.go:117] "RemoveContainer" containerID="09e39ed4ef8c86442deb9f63034c2d652a3971bc2b7caf7ccc84b744457e17cf" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.816987 4794 scope.go:117] "RemoveContainer" containerID="ce01843be5c5828e8979187dc904fe2a134b8a8f3591dc1be315393462a40d7d" Mar 10 10:11:35 crc kubenswrapper[4794]: 
I0310 10:11:35.836113 4794 scope.go:117] "RemoveContainer" containerID="4801069e3ec4ee45b0400bd772935ecc7c61384c2fafb5723da3323b36de9214" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.852861 4794 scope.go:117] "RemoveContainer" containerID="e91b423fcccb339cdeb18818c51cf8ecd72433c4ea29ae829ab212dc3d885f87" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.874162 4794 scope.go:117] "RemoveContainer" containerID="00869dd26164d229d447f55eb54f29605e2bcbe41e9378c894e892c957a95916" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.898493 4794 scope.go:117] "RemoveContainer" containerID="032e3981c05101d1f5fd31ac52721100475578399d8f015d14bd04d5f8f63fe7" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.919419 4794 scope.go:117] "RemoveContainer" containerID="8d1ff7da9b574a00cbfb6c54f68bf954b4ed60e409ce634a6b3ccf1e121b7c9d" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.957396 4794 scope.go:117] "RemoveContainer" containerID="19f6cd89c73f96aeb050448147fc3ab69b927ded2c3caffb20293da23bbe586e" Mar 10 10:11:35 crc kubenswrapper[4794]: I0310 10:11:35.986311 4794 scope.go:117] "RemoveContainer" containerID="f4db0913dfb22b8fdf8a9875e0693880b022c639ea96ea1251770012a7e71a8f" Mar 10 10:11:36 crc kubenswrapper[4794]: I0310 10:11:36.023900 4794 scope.go:117] "RemoveContainer" containerID="5999c3f70eba4e38ddc5fbb492eefb265a6b6a1bfc7c6e88e7609ef9105892a8" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.151522 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552292-mz24x"] Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.153423 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovs-vswitchd" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.153458 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovs-vswitchd" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.153504 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-auditor" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.153525 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-auditor" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.153570 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" containerName="cinder-scheduler" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.153591 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" containerName="cinder-scheduler" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.153637 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-auditor" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.153656 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-auditor" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.153681 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-server" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.153712 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-server" Mar 10 10:12:00 
crc kubenswrapper[4794]: E0310 10:12:00.153739 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="rsync" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.153757 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="rsync" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.153817 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-expirer" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.153835 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-expirer" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.153874 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="swift-recon-cron" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.153892 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="swift-recon-cron" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.154144 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-server" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.154163 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-server" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.154208 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-replicator" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.154227 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-replicator" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.154267 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-updater" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.154287 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-updater" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.154322 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc3b15ea-8912-46a4-a69a-5a8f6323b078" containerName="oc" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.154374 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc3b15ea-8912-46a4-a69a-5a8f6323b078" containerName="oc" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.154419 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="735dee14-1f8d-4cbf-b66a-553892775e5c" containerName="extract-content" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.154449 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="735dee14-1f8d-4cbf-b66a-553892775e5c" containerName="extract-content" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.154491 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-replicator" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.154512 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-replicator" Mar 10 10:12:00 crc 
kubenswrapper[4794]: E0310 10:12:00.154576 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-auditor" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.154597 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-auditor" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.154640 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="735dee14-1f8d-4cbf-b66a-553892775e5c" containerName="registry-server" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.154658 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="735dee14-1f8d-4cbf-b66a-553892775e5c" containerName="registry-server" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.154695 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="735dee14-1f8d-4cbf-b66a-553892775e5c" containerName="extract-utilities" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.154714 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="735dee14-1f8d-4cbf-b66a-553892775e5c" containerName="extract-utilities" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.154752 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" containerName="probe" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.154770 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" containerName="probe" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.154818 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-replicator" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.154847 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-replicator" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.154892 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovsdb-server" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.154910 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovsdb-server" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.154949 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-reaper" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.154969 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-reaper" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.155014 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovsdb-server-init" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.155032 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovsdb-server-init" Mar 10 10:12:00 crc kubenswrapper[4794]: E0310 10:12:00.155086 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-server" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.155105 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-server" Mar 10 10:12:00 
crc kubenswrapper[4794]: E0310 10:12:00.155151 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-updater" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.155169 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-updater" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.155954 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-replicator" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156002 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-replicator" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156041 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-server" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156104 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-auditor" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156148 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovsdb-server" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156190 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-replicator" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156233 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-auditor" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156268 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-server" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156306 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-auditor" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156366 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="rsync" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156418 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-server" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156444 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" containerName="probe" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156484 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-expirer" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156522 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c6c5ff5-c0d2-49d5-a2d0-f49871e6444e" containerName="cinder-scheduler" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156557 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="container-updater" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156575 4794 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="735dee14-1f8d-4cbf-b66a-553892775e5c" containerName="registry-server" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156594 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="swift-recon-cron" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156629 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc3b15ea-8912-46a4-a69a-5a8f6323b078" containerName="oc" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156666 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="account-reaper" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156705 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fbaedcd-8c1d-452e-a36b-1a12e47f48d1" containerName="ovs-vswitchd" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.156723 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbde6fd-ed6b-49a8-96ae-642b15a1802b" containerName="object-updater" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.158163 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552292-mz24x" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.165512 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.166716 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.181678 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.194797 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552292-mz24x"] Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.278250 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fct4z\" (UniqueName: \"kubernetes.io/projected/fba92096-5c77-4876-9587-adcca946875a-kube-api-access-fct4z\") pod \"auto-csr-approver-29552292-mz24x\" (UID: \"fba92096-5c77-4876-9587-adcca946875a\") " pod="openshift-infra/auto-csr-approver-29552292-mz24x" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.379977 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fct4z\" (UniqueName: \"kubernetes.io/projected/fba92096-5c77-4876-9587-adcca946875a-kube-api-access-fct4z\") pod \"auto-csr-approver-29552292-mz24x\" (UID: \"fba92096-5c77-4876-9587-adcca946875a\") " pod="openshift-infra/auto-csr-approver-29552292-mz24x" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.419817 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fct4z\" (UniqueName: \"kubernetes.io/projected/fba92096-5c77-4876-9587-adcca946875a-kube-api-access-fct4z\") pod \"auto-csr-approver-29552292-mz24x\" (UID: \"fba92096-5c77-4876-9587-adcca946875a\") " pod="openshift-infra/auto-csr-approver-29552292-mz24x" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.489748 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552292-mz24x" Mar 10 10:12:00 crc kubenswrapper[4794]: I0310 10:12:00.927065 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552292-mz24x"] Mar 10 10:12:01 crc kubenswrapper[4794]: I0310 10:12:01.452831 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552292-mz24x" event={"ID":"fba92096-5c77-4876-9587-adcca946875a","Type":"ContainerStarted","Data":"76dc1e69e3b2818dd6aa55e254c20eedf2b7f54185b8043b86c17c7d855e0710"} Mar 10 10:12:03 crc kubenswrapper[4794]: I0310 10:12:03.471446 4794 generic.go:334] "Generic (PLEG): container finished" podID="fba92096-5c77-4876-9587-adcca946875a" containerID="294b27da4d8580373329333669c62af029345b2aa7ddee9921325199df016a58" exitCode=0 Mar 10 10:12:03 crc kubenswrapper[4794]: I0310 10:12:03.471669 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552292-mz24x" event={"ID":"fba92096-5c77-4876-9587-adcca946875a","Type":"ContainerDied","Data":"294b27da4d8580373329333669c62af029345b2aa7ddee9921325199df016a58"} Mar 10 10:12:04 crc kubenswrapper[4794]: I0310 10:12:04.820527 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552292-mz24x" Mar 10 10:12:04 crc kubenswrapper[4794]: I0310 10:12:04.941596 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fct4z\" (UniqueName: \"kubernetes.io/projected/fba92096-5c77-4876-9587-adcca946875a-kube-api-access-fct4z\") pod \"fba92096-5c77-4876-9587-adcca946875a\" (UID: \"fba92096-5c77-4876-9587-adcca946875a\") " Mar 10 10:12:04 crc kubenswrapper[4794]: I0310 10:12:04.955058 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fba92096-5c77-4876-9587-adcca946875a-kube-api-access-fct4z" (OuterVolumeSpecName: "kube-api-access-fct4z") pod "fba92096-5c77-4876-9587-adcca946875a" (UID: "fba92096-5c77-4876-9587-adcca946875a"). InnerVolumeSpecName "kube-api-access-fct4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:12:05 crc kubenswrapper[4794]: I0310 10:12:05.043024 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fct4z\" (UniqueName: \"kubernetes.io/projected/fba92096-5c77-4876-9587-adcca946875a-kube-api-access-fct4z\") on node \"crc\" DevicePath \"\"" Mar 10 10:12:05 crc kubenswrapper[4794]: I0310 10:12:05.495520 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552292-mz24x" event={"ID":"fba92096-5c77-4876-9587-adcca946875a","Type":"ContainerDied","Data":"76dc1e69e3b2818dd6aa55e254c20eedf2b7f54185b8043b86c17c7d855e0710"} Mar 10 10:12:05 crc kubenswrapper[4794]: I0310 10:12:05.495900 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76dc1e69e3b2818dd6aa55e254c20eedf2b7f54185b8043b86c17c7d855e0710" Mar 10 10:12:05 crc kubenswrapper[4794]: I0310 10:12:05.495709 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552292-mz24x" Mar 10 10:12:05 crc kubenswrapper[4794]: I0310 10:12:05.905900 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552286-6rbvj"] Mar 10 10:12:05 crc kubenswrapper[4794]: I0310 10:12:05.912643 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552286-6rbvj"] Mar 10 10:12:06 crc kubenswrapper[4794]: I0310 10:12:06.009802 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9689ae4-36b4-41d8-b75d-805a57b17041" path="/var/lib/kubelet/pods/f9689ae4-36b4-41d8-b75d-805a57b17041/volumes" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.347371 4794 scope.go:117] "RemoveContainer" containerID="fe8fb67a806020a24997b1d8ba664de48c197527d827a7320e86b917c0811868" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.402329 4794 scope.go:117] "RemoveContainer" containerID="6afdec6836ae216dcd698ba4415c1d439445dabd433ed2a5a743f82c64c471d6" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.462428 4794 scope.go:117] "RemoveContainer" containerID="12c0f08ec9fff9a449d27dcfe0c2a18494ac9de59ece4c141755f344ab722b72" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.498837 4794 scope.go:117] "RemoveContainer" containerID="538774c14100ed00991d4835f5659f6f2efefa4b4d07d2baab06ba0d13be29dc" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.519934 4794 scope.go:117] "RemoveContainer" containerID="9a8f466fe0f05d67d968612dd2a52437cb959e872211abcc0038fb9ddc671c06" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.570408 4794 scope.go:117] "RemoveContainer" containerID="09cd8aa2d139fb0c80d695b9f0a857597678db3d626060120182e8665cb195f3" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.596067 4794 scope.go:117] "RemoveContainer" containerID="00ea922430a28d8905d03ab4683719c7f0ddb1de198f027cdcd8c2578d18e421" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.633843 4794 scope.go:117] "RemoveContainer" containerID="f34d846d29fa22b51cfd81f5eb8d939afff71854f04f27647f2aaabad501e40f" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.675529 4794 scope.go:117] "RemoveContainer" containerID="5a33b91b527b1919053bf43cda8f3cb23a6d76721c506dda2a7bfbc73df9e4fb" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.705702 4794 scope.go:117] "RemoveContainer" containerID="44d4d4f0866a0ad74a78bd1b8960fddbed29b1a1d629ac2ff1cc5f7193fc557c" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.744821 4794 scope.go:117] "RemoveContainer" containerID="11afbcbf2e1ea029e0946f9cfb8d80bf8c8658a761240173341846934dc9c6c5" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.765698 4794 scope.go:117] "RemoveContainer" containerID="1e3c68f525250d31acefa52ae32155d2762006dff412a57a1e872f7a90a1f9ed" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.789503 4794 scope.go:117] "RemoveContainer" containerID="7e53340b15e3ba50ade3cd2da6c770411ad4bac15ca846d1a1785f37fda6ab2b" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.807773 4794 scope.go:117] "RemoveContainer" containerID="9bdd46d650f18fd8daa1d10e48bda92310a8b529240b1904c24330c1b16cecd5" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.837231 4794 scope.go:117] "RemoveContainer" containerID="e93ec00740ada705822837c538ffa9760f9d8c6961b642225b14741f8c12ddcc" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.859999 4794 scope.go:117] "RemoveContainer" containerID="83ea99766083c37e566fe16c905e9d98e5f40860228f0197c875205a8e8c07b7" Mar 10 
10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.884107 4794 scope.go:117] "RemoveContainer" containerID="b0a4fdbb275ad365cd528dfeb3598b4be49751b4aabd80a0da95803def06acbd" Mar 10 10:12:36 crc kubenswrapper[4794]: I0310 10:12:36.910303 4794 scope.go:117] "RemoveContainer" containerID="ddda66dbb1d59fd19b03a30fbf32119c6446e237479d0ac3e5dc4dcefbdd15fb" Mar 10 10:12:52 crc kubenswrapper[4794]: I0310 10:12:52.967893 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:12:52 crc kubenswrapper[4794]: I0310 10:12:52.968442 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:13:22 crc kubenswrapper[4794]: I0310 10:13:22.967225 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:13:22 crc kubenswrapper[4794]: I0310 10:13:22.967953 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:13:37 crc kubenswrapper[4794]: I0310 10:13:37.237516 4794 scope.go:117] "RemoveContainer" containerID="533e00e14c8524408bf878f2e40fcf2843d0d66d7ba048ed96beac141f3440ab" Mar 10 10:13:37 crc kubenswrapper[4794]: I0310 10:13:37.272589 4794 scope.go:117] "RemoveContainer" containerID="540c3c25fe0dd253f107d25fb50dacca81557b8d7b51277e3dafbce6effea553" Mar 10 10:13:37 crc kubenswrapper[4794]: I0310 10:13:37.330669 4794 scope.go:117] "RemoveContainer" containerID="7049bf67f9f820cd9e74cbe5c408865922f9627182ed1b04dc96fc9839d624ee" Mar 10 10:13:37 crc kubenswrapper[4794]: I0310 10:13:37.378837 4794 scope.go:117] "RemoveContainer" containerID="657c6e566fe3c3ec4a4a199f42966fd39347a8a6f061fe653f5274f12bb76445" Mar 10 10:13:37 crc kubenswrapper[4794]: I0310 10:13:37.400976 4794 scope.go:117] "RemoveContainer" containerID="2e1344b23f34c965294ce2a354b0152c4c935ae846652a17c50980e22b14974d" Mar 10 10:13:37 crc kubenswrapper[4794]: I0310 10:13:37.419518 4794 scope.go:117] "RemoveContainer" containerID="4fc4d6cd8ca31d8694a05ea7825a17b8db43c3dc50594cab99e3bba48437e073" Mar 10 10:13:37 crc kubenswrapper[4794]: I0310 10:13:37.448636 4794 scope.go:117] "RemoveContainer" containerID="28bbc46ddd80c7238bde4d825e65659d1f1fa5c11c308841de1e1d86d3bd6f52" Mar 10 10:13:37 crc kubenswrapper[4794]: I0310 10:13:37.482214 4794 scope.go:117] "RemoveContainer" containerID="e2edb92bbd38d8a9c294b9b3ad89c0346bb305954930f87be72067c7c60f2e7f" Mar 10 10:13:37 crc kubenswrapper[4794]: I0310 10:13:37.530890 4794 scope.go:117] "RemoveContainer" containerID="e70254b5af8bdd5f0d931f675058f876b6f118dd848b3a87ff25ec203ef4bf4d" Mar 10 10:13:37 crc kubenswrapper[4794]: I0310 10:13:37.567742 4794 scope.go:117] 
"RemoveContainer" containerID="c71befb5940658f0f62fdcfde2739c45a38416fb2d1194d3aeeb9466f078e8ad" Mar 10 10:13:52 crc kubenswrapper[4794]: I0310 10:13:52.967642 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:13:52 crc kubenswrapper[4794]: I0310 10:13:52.968146 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:13:52 crc kubenswrapper[4794]: I0310 10:13:52.968184 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 10:13:52 crc kubenswrapper[4794]: I0310 10:13:52.968840 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 10:13:52 crc kubenswrapper[4794]: I0310 10:13:52.968897 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" gracePeriod=600 Mar 10 10:13:53 crc kubenswrapper[4794]: E0310 10:13:53.097770 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:13:53 crc kubenswrapper[4794]: I0310 10:13:53.494114 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" exitCode=0 Mar 10 10:13:53 crc kubenswrapper[4794]: I0310 10:13:53.494172 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d"} Mar 10 10:13:53 crc kubenswrapper[4794]: I0310 10:13:53.494212 4794 scope.go:117] "RemoveContainer" containerID="b8035e63199f09775c99e31be585fd326d355f434e689db9d33f71ae65c45f9f" Mar 10 10:13:53 crc kubenswrapper[4794]: I0310 10:13:53.494969 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:13:53 crc kubenswrapper[4794]: E0310 10:13:53.495495 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:14:00 crc kubenswrapper[4794]: I0310 10:14:00.153929 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552294-nwmdb"] Mar 10 10:14:00 crc kubenswrapper[4794]: E0310 10:14:00.154970 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fba92096-5c77-4876-9587-adcca946875a" containerName="oc" Mar 10 10:14:00 crc kubenswrapper[4794]: I0310 10:14:00.154992 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="fba92096-5c77-4876-9587-adcca946875a" containerName="oc" Mar 10 10:14:00 crc kubenswrapper[4794]: I0310 10:14:00.155269 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="fba92096-5c77-4876-9587-adcca946875a" containerName="oc" Mar 10 10:14:00 crc kubenswrapper[4794]: I0310 10:14:00.156046 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552294-nwmdb" Mar 10 10:14:00 crc kubenswrapper[4794]: I0310 10:14:00.158909 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:14:00 crc kubenswrapper[4794]: I0310 10:14:00.158965 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:14:00 crc kubenswrapper[4794]: I0310 10:14:00.158920 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:14:00 crc kubenswrapper[4794]: I0310 10:14:00.161220 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552294-nwmdb"] Mar 10 10:14:00 crc kubenswrapper[4794]: I0310 10:14:00.225328 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl995\" (UniqueName: \"kubernetes.io/projected/a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1-kube-api-access-tl995\") pod \"auto-csr-approver-29552294-nwmdb\" (UID: \"a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1\") " pod="openshift-infra/auto-csr-approver-29552294-nwmdb" Mar 10 10:14:00 crc kubenswrapper[4794]: I0310 10:14:00.327454 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl995\" (UniqueName: \"kubernetes.io/projected/a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1-kube-api-access-tl995\") pod \"auto-csr-approver-29552294-nwmdb\" (UID: \"a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1\") " pod="openshift-infra/auto-csr-approver-29552294-nwmdb" Mar 10 10:14:00 crc kubenswrapper[4794]: I0310 10:14:00.356187 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl995\" (UniqueName: \"kubernetes.io/projected/a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1-kube-api-access-tl995\") pod \"auto-csr-approver-29552294-nwmdb\" (UID: \"a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1\") " pod="openshift-infra/auto-csr-approver-29552294-nwmdb" Mar 10 10:14:00 crc kubenswrapper[4794]: I0310 10:14:00.476197 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552294-nwmdb" Mar 10 10:14:00 crc kubenswrapper[4794]: I0310 10:14:00.925176 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552294-nwmdb"] Mar 10 10:14:01 crc kubenswrapper[4794]: I0310 10:14:01.565721 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552294-nwmdb" event={"ID":"a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1","Type":"ContainerStarted","Data":"4f6ff2825ccf31ab8de8fda7a31205bcaa98ddd090374ff71e6cc39d694d0ed3"} Mar 10 10:14:02 crc kubenswrapper[4794]: I0310 10:14:02.585269 4794 generic.go:334] "Generic (PLEG): container finished" podID="a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1" containerID="fefd5652bcce429465fd023412735698b02b5e345af55d22393e6b6e3bf6de74" exitCode=0 Mar 10 10:14:02 crc kubenswrapper[4794]: I0310 10:14:02.585597 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552294-nwmdb" event={"ID":"a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1","Type":"ContainerDied","Data":"fefd5652bcce429465fd023412735698b02b5e345af55d22393e6b6e3bf6de74"} Mar 10 10:14:03 crc kubenswrapper[4794]: I0310 10:14:03.901442 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552294-nwmdb" Mar 10 10:14:03 crc kubenswrapper[4794]: I0310 10:14:03.980547 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl995\" (UniqueName: \"kubernetes.io/projected/a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1-kube-api-access-tl995\") pod \"a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1\" (UID: \"a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1\") " Mar 10 10:14:03 crc kubenswrapper[4794]: I0310 10:14:03.986499 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1-kube-api-access-tl995" (OuterVolumeSpecName: "kube-api-access-tl995") pod "a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1" (UID: "a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1"). InnerVolumeSpecName "kube-api-access-tl995". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:14:04 crc kubenswrapper[4794]: I0310 10:14:04.083635 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl995\" (UniqueName: \"kubernetes.io/projected/a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1-kube-api-access-tl995\") on node \"crc\" DevicePath \"\"" Mar 10 10:14:04 crc kubenswrapper[4794]: I0310 10:14:04.607014 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552294-nwmdb" event={"ID":"a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1","Type":"ContainerDied","Data":"4f6ff2825ccf31ab8de8fda7a31205bcaa98ddd090374ff71e6cc39d694d0ed3"} Mar 10 10:14:04 crc kubenswrapper[4794]: I0310 10:14:04.607433 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f6ff2825ccf31ab8de8fda7a31205bcaa98ddd090374ff71e6cc39d694d0ed3" Mar 10 10:14:04 crc kubenswrapper[4794]: I0310 10:14:04.607089 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552294-nwmdb" Mar 10 10:14:04 crc kubenswrapper[4794]: I0310 10:14:04.982722 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552288-fck7s"] Mar 10 10:14:04 crc kubenswrapper[4794]: I0310 10:14:04.989148 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552288-fck7s"] Mar 10 10:14:05 crc kubenswrapper[4794]: I0310 10:14:05.999413 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:14:06 crc kubenswrapper[4794]: E0310 10:14:05.999706 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:14:06 crc kubenswrapper[4794]: I0310 10:14:06.010385 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8db1d2e-9690-4d13-bce2-b8602ffb7583" path="/var/lib/kubelet/pods/e8db1d2e-9690-4d13-bce2-b8602ffb7583/volumes" Mar 10 10:14:20 crc kubenswrapper[4794]: I0310 10:14:20.999036 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:14:21 crc kubenswrapper[4794]: E0310 10:14:20.999870 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:14:32 crc kubenswrapper[4794]: I0310 10:14:32.998868 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:14:33 crc kubenswrapper[4794]: E0310 10:14:32.999636 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:14:37 crc kubenswrapper[4794]: I0310 10:14:37.712722 4794 scope.go:117] "RemoveContainer" containerID="ad95ef6605bbb6d66ad66594f4383cbe222c3fc9ee15ad33d45b92de3f7d5de8" Mar 10 10:14:37 crc kubenswrapper[4794]: I0310 10:14:37.776652 4794 scope.go:117] "RemoveContainer" containerID="1edfc188e8a48dc22fef269f386093bd77ed504ab9f840271be29afeff769334" Mar 10 10:14:37 crc kubenswrapper[4794]: I0310 10:14:37.844818 4794 scope.go:117] "RemoveContainer" containerID="f014916484fd1fca04a820b617c80d6df43e5e2743a3252f69ec6b699226813e" Mar 10 10:14:44 crc kubenswrapper[4794]: I0310 10:14:44.008264 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:14:44 crc kubenswrapper[4794]: E0310 10:14:44.009190 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:14:59 crc kubenswrapper[4794]: I0310 10:14:58.999456 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:14:59 crc kubenswrapper[4794]: E0310 10:14:59.000757 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.178086 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg"] Mar 10 10:15:00 crc kubenswrapper[4794]: E0310 10:15:00.178730 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1" containerName="oc" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.178762 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1" containerName="oc" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.179189 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1" containerName="oc" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.180281 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.182634 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.183418 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.187487 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg"] Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.262058 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0afae991-6837-47dd-948b-cb1a298f8ce3-config-volume\") pod \"collect-profiles-29552295-mkdsg\" (UID: \"0afae991-6837-47dd-948b-cb1a298f8ce3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.262528 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jhbr\" (UniqueName: \"kubernetes.io/projected/0afae991-6837-47dd-948b-cb1a298f8ce3-kube-api-access-5jhbr\") pod \"collect-profiles-29552295-mkdsg\" (UID: \"0afae991-6837-47dd-948b-cb1a298f8ce3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.262567 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0afae991-6837-47dd-948b-cb1a298f8ce3-secret-volume\") pod \"collect-profiles-29552295-mkdsg\" (UID: \"0afae991-6837-47dd-948b-cb1a298f8ce3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.363678 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0afae991-6837-47dd-948b-cb1a298f8ce3-config-volume\") pod \"collect-profiles-29552295-mkdsg\" (UID: \"0afae991-6837-47dd-948b-cb1a298f8ce3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.363785 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jhbr\" (UniqueName: \"kubernetes.io/projected/0afae991-6837-47dd-948b-cb1a298f8ce3-kube-api-access-5jhbr\") pod \"collect-profiles-29552295-mkdsg\" (UID: \"0afae991-6837-47dd-948b-cb1a298f8ce3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.363814 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0afae991-6837-47dd-948b-cb1a298f8ce3-secret-volume\") pod \"collect-profiles-29552295-mkdsg\" (UID: \"0afae991-6837-47dd-948b-cb1a298f8ce3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.364606 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0afae991-6837-47dd-948b-cb1a298f8ce3-config-volume\") pod 
\"collect-profiles-29552295-mkdsg\" (UID: \"0afae991-6837-47dd-948b-cb1a298f8ce3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.379164 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0afae991-6837-47dd-948b-cb1a298f8ce3-secret-volume\") pod \"collect-profiles-29552295-mkdsg\" (UID: \"0afae991-6837-47dd-948b-cb1a298f8ce3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.380893 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jhbr\" (UniqueName: \"kubernetes.io/projected/0afae991-6837-47dd-948b-cb1a298f8ce3-kube-api-access-5jhbr\") pod \"collect-profiles-29552295-mkdsg\" (UID: \"0afae991-6837-47dd-948b-cb1a298f8ce3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.511967 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" Mar 10 10:15:00 crc kubenswrapper[4794]: I0310 10:15:00.995776 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg"] Mar 10 10:15:01 crc kubenswrapper[4794]: W0310 10:15:01.006597 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0afae991_6837_47dd_948b_cb1a298f8ce3.slice/crio-c2ffa746081888bf1262ff6a26810f0adca0c539c64a188cc9bd07c5d9a6f7bf WatchSource:0}: Error finding container c2ffa746081888bf1262ff6a26810f0adca0c539c64a188cc9bd07c5d9a6f7bf: Status 404 returned error can't find the container with id c2ffa746081888bf1262ff6a26810f0adca0c539c64a188cc9bd07c5d9a6f7bf Mar 10 10:15:01 crc kubenswrapper[4794]: I0310 10:15:01.164999 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" event={"ID":"0afae991-6837-47dd-948b-cb1a298f8ce3","Type":"ContainerStarted","Data":"6aed1b4e3e53a82bcb59402d0112ac3efd038a185687b506ba60c28add5a99c7"} Mar 10 10:15:01 crc kubenswrapper[4794]: I0310 10:15:01.165048 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" event={"ID":"0afae991-6837-47dd-948b-cb1a298f8ce3","Type":"ContainerStarted","Data":"c2ffa746081888bf1262ff6a26810f0adca0c539c64a188cc9bd07c5d9a6f7bf"} Mar 10 10:15:01 crc kubenswrapper[4794]: I0310 10:15:01.187904 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" podStartSLOduration=1.187885115 podStartE2EDuration="1.187885115s" podCreationTimestamp="2026-03-10 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:15:01.17937885 +0000 UTC m=+1849.935549678" watchObservedRunningTime="2026-03-10 10:15:01.187885115 +0000 UTC m=+1849.944055943" Mar 10 10:15:02 crc kubenswrapper[4794]: I0310 10:15:02.178304 4794 generic.go:334] "Generic (PLEG): container finished" podID="0afae991-6837-47dd-948b-cb1a298f8ce3" containerID="6aed1b4e3e53a82bcb59402d0112ac3efd038a185687b506ba60c28add5a99c7" exitCode=0 Mar 10 10:15:02 crc kubenswrapper[4794]: I0310 10:15:02.178424 
4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" event={"ID":"0afae991-6837-47dd-948b-cb1a298f8ce3","Type":"ContainerDied","Data":"6aed1b4e3e53a82bcb59402d0112ac3efd038a185687b506ba60c28add5a99c7"} Mar 10 10:15:03 crc kubenswrapper[4794]: I0310 10:15:03.558194 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" Mar 10 10:15:03 crc kubenswrapper[4794]: I0310 10:15:03.718274 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0afae991-6837-47dd-948b-cb1a298f8ce3-config-volume\") pod \"0afae991-6837-47dd-948b-cb1a298f8ce3\" (UID: \"0afae991-6837-47dd-948b-cb1a298f8ce3\") " Mar 10 10:15:03 crc kubenswrapper[4794]: I0310 10:15:03.718778 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0afae991-6837-47dd-948b-cb1a298f8ce3-secret-volume\") pod \"0afae991-6837-47dd-948b-cb1a298f8ce3\" (UID: \"0afae991-6837-47dd-948b-cb1a298f8ce3\") " Mar 10 10:15:03 crc kubenswrapper[4794]: I0310 10:15:03.718871 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afae991-6837-47dd-948b-cb1a298f8ce3-config-volume" (OuterVolumeSpecName: "config-volume") pod "0afae991-6837-47dd-948b-cb1a298f8ce3" (UID: "0afae991-6837-47dd-948b-cb1a298f8ce3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:15:03 crc kubenswrapper[4794]: I0310 10:15:03.718961 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jhbr\" (UniqueName: \"kubernetes.io/projected/0afae991-6837-47dd-948b-cb1a298f8ce3-kube-api-access-5jhbr\") pod \"0afae991-6837-47dd-948b-cb1a298f8ce3\" (UID: \"0afae991-6837-47dd-948b-cb1a298f8ce3\") " Mar 10 10:15:03 crc kubenswrapper[4794]: I0310 10:15:03.719701 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0afae991-6837-47dd-948b-cb1a298f8ce3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 10:15:03 crc kubenswrapper[4794]: I0310 10:15:03.723786 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0afae991-6837-47dd-948b-cb1a298f8ce3-kube-api-access-5jhbr" (OuterVolumeSpecName: "kube-api-access-5jhbr") pod "0afae991-6837-47dd-948b-cb1a298f8ce3" (UID: "0afae991-6837-47dd-948b-cb1a298f8ce3"). InnerVolumeSpecName "kube-api-access-5jhbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:15:03 crc kubenswrapper[4794]: I0310 10:15:03.724015 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0afae991-6837-47dd-948b-cb1a298f8ce3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0afae991-6837-47dd-948b-cb1a298f8ce3" (UID: "0afae991-6837-47dd-948b-cb1a298f8ce3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:15:03 crc kubenswrapper[4794]: I0310 10:15:03.821015 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0afae991-6837-47dd-948b-cb1a298f8ce3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 10:15:03 crc kubenswrapper[4794]: I0310 10:15:03.821226 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jhbr\" (UniqueName: \"kubernetes.io/projected/0afae991-6837-47dd-948b-cb1a298f8ce3-kube-api-access-5jhbr\") on node \"crc\" DevicePath \"\"" Mar 10 10:15:04 crc kubenswrapper[4794]: I0310 10:15:04.200696 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" event={"ID":"0afae991-6837-47dd-948b-cb1a298f8ce3","Type":"ContainerDied","Data":"c2ffa746081888bf1262ff6a26810f0adca0c539c64a188cc9bd07c5d9a6f7bf"} Mar 10 10:15:04 crc kubenswrapper[4794]: I0310 10:15:04.200758 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2ffa746081888bf1262ff6a26810f0adca0c539c64a188cc9bd07c5d9a6f7bf" Mar 10 10:15:04 crc kubenswrapper[4794]: I0310 10:15:04.200765 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg" Mar 10 10:15:12 crc kubenswrapper[4794]: I0310 10:15:12.999156 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:15:13 crc kubenswrapper[4794]: E0310 10:15:13.000076 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:15:23 crc kubenswrapper[4794]: I0310 10:15:23.999135 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:15:24 crc kubenswrapper[4794]: E0310 10:15:24.000668 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:15:37 crc kubenswrapper[4794]: I0310 10:15:37.934814 4794 scope.go:117] "RemoveContainer" containerID="826580e64ddfef3da4fda6ec39829dbc5dafbc69c66385fc8ca2a2bcd5ca60d8" Mar 10 10:15:37 crc kubenswrapper[4794]: I0310 10:15:37.964411 4794 scope.go:117] "RemoveContainer" containerID="8b1add4deeef74619d980e1481eb91a0c5f66b424e4ad63cfb9113db133078f3" Mar 10 10:15:37 crc kubenswrapper[4794]: I0310 10:15:37.987626 4794 scope.go:117] "RemoveContainer" containerID="2b61af45d5cd8cc92786b42b8c4b712c37165e562d116857905c7a2806091415" Mar 10 10:15:38 crc kubenswrapper[4794]: I0310 10:15:38.015895 4794 scope.go:117] "RemoveContainer" containerID="6e6cf54d16f75007332086d04328fb26dd120e8114419987c7377f33c0bef36c" Mar 10 10:15:38 crc kubenswrapper[4794]: I0310 10:15:38.044475 4794 scope.go:117] "RemoveContainer" 
containerID="6c9754ac702c4f23067de7bb692d3a96bce0ffba1d10777a374b930149ba51e5" Mar 10 10:15:38 crc kubenswrapper[4794]: I0310 10:15:38.998633 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:15:38 crc kubenswrapper[4794]: E0310 10:15:38.998905 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:15:50 crc kubenswrapper[4794]: I0310 10:15:50.998922 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:15:51 crc kubenswrapper[4794]: E0310 10:15:50.999508 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:16:00 crc kubenswrapper[4794]: I0310 10:16:00.171911 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552296-s5sss"] Mar 10 10:16:00 crc kubenswrapper[4794]: E0310 10:16:00.173310 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afae991-6837-47dd-948b-cb1a298f8ce3" containerName="collect-profiles" Mar 10 10:16:00 crc kubenswrapper[4794]: I0310 10:16:00.173373 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afae991-6837-47dd-948b-cb1a298f8ce3" containerName="collect-profiles" Mar 10 10:16:00 crc kubenswrapper[4794]: I0310 10:16:00.173662 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0afae991-6837-47dd-948b-cb1a298f8ce3" containerName="collect-profiles" Mar 10 10:16:00 crc kubenswrapper[4794]: I0310 10:16:00.174558 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552296-s5sss" Mar 10 10:16:00 crc kubenswrapper[4794]: I0310 10:16:00.177315 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:16:00 crc kubenswrapper[4794]: I0310 10:16:00.177629 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:16:00 crc kubenswrapper[4794]: I0310 10:16:00.181364 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:16:00 crc kubenswrapper[4794]: I0310 10:16:00.182269 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552296-s5sss"] Mar 10 10:16:00 crc kubenswrapper[4794]: I0310 10:16:00.306061 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x885\" (UniqueName: \"kubernetes.io/projected/1a10473d-463b-4d38-b940-07be581de25a-kube-api-access-8x885\") pod \"auto-csr-approver-29552296-s5sss\" (UID: \"1a10473d-463b-4d38-b940-07be581de25a\") " pod="openshift-infra/auto-csr-approver-29552296-s5sss" Mar 10 10:16:00 crc kubenswrapper[4794]: I0310 10:16:00.408175 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x885\" (UniqueName: \"kubernetes.io/projected/1a10473d-463b-4d38-b940-07be581de25a-kube-api-access-8x885\") pod \"auto-csr-approver-29552296-s5sss\" (UID: \"1a10473d-463b-4d38-b940-07be581de25a\") " pod="openshift-infra/auto-csr-approver-29552296-s5sss" Mar 10 10:16:00 crc kubenswrapper[4794]: I0310 10:16:00.449847 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x885\" (UniqueName: \"kubernetes.io/projected/1a10473d-463b-4d38-b940-07be581de25a-kube-api-access-8x885\") pod \"auto-csr-approver-29552296-s5sss\" (UID: \"1a10473d-463b-4d38-b940-07be581de25a\") " pod="openshift-infra/auto-csr-approver-29552296-s5sss" Mar 10 10:16:00 crc kubenswrapper[4794]: I0310 10:16:00.515295 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552296-s5sss" Mar 10 10:16:00 crc kubenswrapper[4794]: I0310 10:16:00.952433 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552296-s5sss"] Mar 10 10:16:00 crc kubenswrapper[4794]: I0310 10:16:00.955373 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 10:16:01 crc kubenswrapper[4794]: I0310 10:16:01.753436 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552296-s5sss" event={"ID":"1a10473d-463b-4d38-b940-07be581de25a","Type":"ContainerStarted","Data":"fb1b656778b429acabc6c19ad3ec28cea205a1290a06c23def3b3a2f62d73bdd"} Mar 10 10:16:02 crc kubenswrapper[4794]: I0310 10:16:02.763725 4794 generic.go:334] "Generic (PLEG): container finished" podID="1a10473d-463b-4d38-b940-07be581de25a" containerID="4d7c9c6915a24bd64369c4f60746e302d0bd841017387f151a97521d36b05e21" exitCode=0 Mar 10 10:16:02 crc kubenswrapper[4794]: I0310 10:16:02.763807 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552296-s5sss" event={"ID":"1a10473d-463b-4d38-b940-07be581de25a","Type":"ContainerDied","Data":"4d7c9c6915a24bd64369c4f60746e302d0bd841017387f151a97521d36b05e21"} Mar 10 10:16:03 crc kubenswrapper[4794]: I0310 10:16:03.999530 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:16:04 crc kubenswrapper[4794]: E0310 10:16:04.000232 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:16:04 crc kubenswrapper[4794]: I0310 10:16:04.069092 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552296-s5sss" Mar 10 10:16:04 crc kubenswrapper[4794]: I0310 10:16:04.174987 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x885\" (UniqueName: \"kubernetes.io/projected/1a10473d-463b-4d38-b940-07be581de25a-kube-api-access-8x885\") pod \"1a10473d-463b-4d38-b940-07be581de25a\" (UID: \"1a10473d-463b-4d38-b940-07be581de25a\") " Mar 10 10:16:04 crc kubenswrapper[4794]: I0310 10:16:04.181613 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a10473d-463b-4d38-b940-07be581de25a-kube-api-access-8x885" (OuterVolumeSpecName: "kube-api-access-8x885") pod "1a10473d-463b-4d38-b940-07be581de25a" (UID: "1a10473d-463b-4d38-b940-07be581de25a"). InnerVolumeSpecName "kube-api-access-8x885". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:16:04 crc kubenswrapper[4794]: I0310 10:16:04.277495 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x885\" (UniqueName: \"kubernetes.io/projected/1a10473d-463b-4d38-b940-07be581de25a-kube-api-access-8x885\") on node \"crc\" DevicePath \"\"" Mar 10 10:16:04 crc kubenswrapper[4794]: I0310 10:16:04.784275 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552296-s5sss" event={"ID":"1a10473d-463b-4d38-b940-07be581de25a","Type":"ContainerDied","Data":"fb1b656778b429acabc6c19ad3ec28cea205a1290a06c23def3b3a2f62d73bdd"} Mar 10 10:16:04 crc kubenswrapper[4794]: I0310 10:16:04.784325 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb1b656778b429acabc6c19ad3ec28cea205a1290a06c23def3b3a2f62d73bdd" Mar 10 10:16:04 crc kubenswrapper[4794]: I0310 10:16:04.784393 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552296-s5sss" Mar 10 10:16:05 crc kubenswrapper[4794]: I0310 10:16:05.149823 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552290-9sfrg"] Mar 10 10:16:05 crc kubenswrapper[4794]: I0310 10:16:05.157552 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552290-9sfrg"] Mar 10 10:16:06 crc kubenswrapper[4794]: I0310 10:16:06.016944 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc3b15ea-8912-46a4-a69a-5a8f6323b078" path="/var/lib/kubelet/pods/cc3b15ea-8912-46a4-a69a-5a8f6323b078/volumes" Mar 10 10:16:16 crc kubenswrapper[4794]: I0310 10:16:16.998969 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:16:17 crc kubenswrapper[4794]: E0310 10:16:16.999654 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:16:29 crc kubenswrapper[4794]: I0310 10:16:28.999487 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:16:29 crc kubenswrapper[4794]: E0310 10:16:29.000498 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:16:38 crc kubenswrapper[4794]: I0310 10:16:38.145516 4794 scope.go:117] "RemoveContainer" containerID="17d92794a693024017718809f962f24b0d1af0701939fc313e048469be1fea41" Mar 10 10:16:42 crc kubenswrapper[4794]: I0310 10:16:42.999252 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:16:43 crc kubenswrapper[4794]: E0310 10:16:42.999776 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:16:58 crc kubenswrapper[4794]: I0310 10:16:58.000305 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:16:58 crc kubenswrapper[4794]: E0310 10:16:58.001558 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:17:09 crc kubenswrapper[4794]: I0310 10:17:09.999005 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:17:10 crc kubenswrapper[4794]: E0310 10:17:09.999956 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:17:24 crc kubenswrapper[4794]: I0310 10:17:24.999318 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:17:25 crc kubenswrapper[4794]: E0310 10:17:25.000212 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:17:37 crc kubenswrapper[4794]: I0310 10:17:36.999795 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:17:37 crc kubenswrapper[4794]: E0310 10:17:37.000962 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:17:52 crc kubenswrapper[4794]: I0310 10:17:52.009183 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:17:52 crc kubenswrapper[4794]: E0310 10:17:52.010447 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:18:00 crc kubenswrapper[4794]: I0310 10:18:00.158850 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552298-ftk4m"] Mar 10 10:18:00 crc kubenswrapper[4794]: E0310 10:18:00.160135 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a10473d-463b-4d38-b940-07be581de25a" containerName="oc" Mar 10 10:18:00 crc kubenswrapper[4794]: I0310 10:18:00.160155 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a10473d-463b-4d38-b940-07be581de25a" containerName="oc" Mar 10 10:18:00 crc kubenswrapper[4794]: I0310 10:18:00.160322 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a10473d-463b-4d38-b940-07be581de25a" containerName="oc" Mar 10 10:18:00 crc kubenswrapper[4794]: I0310 10:18:00.160883 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552298-ftk4m" Mar 10 10:18:00 crc kubenswrapper[4794]: I0310 10:18:00.168064 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552298-ftk4m"] Mar 10 10:18:00 crc kubenswrapper[4794]: I0310 10:18:00.175287 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:18:00 crc kubenswrapper[4794]: I0310 10:18:00.175324 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:18:00 crc kubenswrapper[4794]: I0310 10:18:00.176007 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:18:00 crc kubenswrapper[4794]: I0310 10:18:00.204664 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lknp\" (UniqueName: \"kubernetes.io/projected/e220ad51-de67-4b15-b34f-765bb6fd2a82-kube-api-access-8lknp\") pod \"auto-csr-approver-29552298-ftk4m\" (UID: \"e220ad51-de67-4b15-b34f-765bb6fd2a82\") " pod="openshift-infra/auto-csr-approver-29552298-ftk4m" Mar 10 10:18:00 crc kubenswrapper[4794]: I0310 10:18:00.305716 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lknp\" (UniqueName: \"kubernetes.io/projected/e220ad51-de67-4b15-b34f-765bb6fd2a82-kube-api-access-8lknp\") pod \"auto-csr-approver-29552298-ftk4m\" (UID: \"e220ad51-de67-4b15-b34f-765bb6fd2a82\") " pod="openshift-infra/auto-csr-approver-29552298-ftk4m" Mar 10 10:18:00 crc kubenswrapper[4794]: I0310 10:18:00.327436 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lknp\" (UniqueName: \"kubernetes.io/projected/e220ad51-de67-4b15-b34f-765bb6fd2a82-kube-api-access-8lknp\") pod \"auto-csr-approver-29552298-ftk4m\" (UID: \"e220ad51-de67-4b15-b34f-765bb6fd2a82\") " pod="openshift-infra/auto-csr-approver-29552298-ftk4m" Mar 10 10:18:00 crc kubenswrapper[4794]: I0310 10:18:00.498063 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552298-ftk4m" Mar 10 10:18:00 crc kubenswrapper[4794]: I0310 10:18:00.928862 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552298-ftk4m"] Mar 10 10:18:00 crc kubenswrapper[4794]: I0310 10:18:00.991973 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552298-ftk4m" event={"ID":"e220ad51-de67-4b15-b34f-765bb6fd2a82","Type":"ContainerStarted","Data":"5d163e6ea5ce130776ab69bb91a84c56a265cd53fb91ff86801f7ddb4e6db963"} Mar 10 10:18:03 crc kubenswrapper[4794]: I0310 10:18:03.010429 4794 generic.go:334] "Generic (PLEG): container finished" podID="e220ad51-de67-4b15-b34f-765bb6fd2a82" containerID="93b918edb4bbe861ad47ece5dd9c8dca8a3943f42e398eb544f0d8a92a4f5843" exitCode=0 Mar 10 10:18:03 crc kubenswrapper[4794]: I0310 10:18:03.010516 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552298-ftk4m" event={"ID":"e220ad51-de67-4b15-b34f-765bb6fd2a82","Type":"ContainerDied","Data":"93b918edb4bbe861ad47ece5dd9c8dca8a3943f42e398eb544f0d8a92a4f5843"} Mar 10 10:18:04 crc kubenswrapper[4794]: I0310 10:18:03.999857 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:18:04 crc kubenswrapper[4794]: E0310 10:18:04.000592 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:18:04 crc kubenswrapper[4794]: I0310 10:18:04.376584 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552298-ftk4m" Mar 10 10:18:04 crc kubenswrapper[4794]: I0310 10:18:04.570856 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lknp\" (UniqueName: \"kubernetes.io/projected/e220ad51-de67-4b15-b34f-765bb6fd2a82-kube-api-access-8lknp\") pod \"e220ad51-de67-4b15-b34f-765bb6fd2a82\" (UID: \"e220ad51-de67-4b15-b34f-765bb6fd2a82\") " Mar 10 10:18:04 crc kubenswrapper[4794]: I0310 10:18:04.579712 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e220ad51-de67-4b15-b34f-765bb6fd2a82-kube-api-access-8lknp" (OuterVolumeSpecName: "kube-api-access-8lknp") pod "e220ad51-de67-4b15-b34f-765bb6fd2a82" (UID: "e220ad51-de67-4b15-b34f-765bb6fd2a82"). InnerVolumeSpecName "kube-api-access-8lknp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:18:04 crc kubenswrapper[4794]: I0310 10:18:04.672783 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lknp\" (UniqueName: \"kubernetes.io/projected/e220ad51-de67-4b15-b34f-765bb6fd2a82-kube-api-access-8lknp\") on node \"crc\" DevicePath \"\"" Mar 10 10:18:05 crc kubenswrapper[4794]: I0310 10:18:05.031401 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552298-ftk4m" event={"ID":"e220ad51-de67-4b15-b34f-765bb6fd2a82","Type":"ContainerDied","Data":"5d163e6ea5ce130776ab69bb91a84c56a265cd53fb91ff86801f7ddb4e6db963"} Mar 10 10:18:05 crc kubenswrapper[4794]: I0310 10:18:05.032294 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d163e6ea5ce130776ab69bb91a84c56a265cd53fb91ff86801f7ddb4e6db963" Mar 10 10:18:05 crc kubenswrapper[4794]: I0310 10:18:05.031480 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552298-ftk4m" Mar 10 10:18:05 crc kubenswrapper[4794]: I0310 10:18:05.450747 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552292-mz24x"] Mar 10 10:18:05 crc kubenswrapper[4794]: I0310 10:18:05.457241 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552292-mz24x"] Mar 10 10:18:06 crc kubenswrapper[4794]: I0310 10:18:06.013260 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fba92096-5c77-4876-9587-adcca946875a" path="/var/lib/kubelet/pods/fba92096-5c77-4876-9587-adcca946875a/volumes" Mar 10 10:18:15 crc kubenswrapper[4794]: I0310 10:18:14.999389 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:18:15 crc kubenswrapper[4794]: E0310 10:18:15.001745 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:18:28 crc kubenswrapper[4794]: I0310 10:18:27.999772 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:18:28 crc kubenswrapper[4794]: E0310 10:18:28.002058 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:18:38 crc kubenswrapper[4794]: I0310 10:18:38.257922 4794 scope.go:117] "RemoveContainer" containerID="294b27da4d8580373329333669c62af029345b2aa7ddee9921325199df016a58" Mar 10 10:18:38 crc kubenswrapper[4794]: I0310 10:18:38.999656 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:18:39 crc kubenswrapper[4794]: E0310 10:18:39.000630 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:18:50 crc kubenswrapper[4794]: I0310 10:18:50.000203 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:18:50 crc kubenswrapper[4794]: E0310 10:18:50.001200 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:19:04 crc kubenswrapper[4794]: I0310 10:19:04.998979 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d" Mar 10 10:19:06 crc kubenswrapper[4794]: I0310 10:19:06.116095 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"f1e2ed57babac0296daa078712b218f38f6d04891fd0d4501ddc9eee4a38ca67"} Mar 10 10:20:00 crc kubenswrapper[4794]: I0310 10:20:00.143814 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552300-j49hg"] Mar 10 10:20:00 crc kubenswrapper[4794]: E0310 10:20:00.144676 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e220ad51-de67-4b15-b34f-765bb6fd2a82" containerName="oc" Mar 10 10:20:00 crc kubenswrapper[4794]: I0310 10:20:00.144691 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e220ad51-de67-4b15-b34f-765bb6fd2a82" containerName="oc" Mar 10 10:20:00 crc kubenswrapper[4794]: I0310 10:20:00.144887 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e220ad51-de67-4b15-b34f-765bb6fd2a82" containerName="oc" Mar 10 10:20:00 crc kubenswrapper[4794]: I0310 10:20:00.145443 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552300-j49hg" Mar 10 10:20:00 crc kubenswrapper[4794]: I0310 10:20:00.147474 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:20:00 crc kubenswrapper[4794]: I0310 10:20:00.147716 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:20:00 crc kubenswrapper[4794]: I0310 10:20:00.147926 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:20:00 crc kubenswrapper[4794]: I0310 10:20:00.151475 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552300-j49hg"] Mar 10 10:20:00 crc kubenswrapper[4794]: I0310 10:20:00.345570 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxl82\" (UniqueName: \"kubernetes.io/projected/e559a103-2981-408e-8c74-72067a88f425-kube-api-access-nxl82\") pod \"auto-csr-approver-29552300-j49hg\" (UID: \"e559a103-2981-408e-8c74-72067a88f425\") " pod="openshift-infra/auto-csr-approver-29552300-j49hg" Mar 10 10:20:00 crc kubenswrapper[4794]: I0310 10:20:00.447981 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxl82\" (UniqueName: \"kubernetes.io/projected/e559a103-2981-408e-8c74-72067a88f425-kube-api-access-nxl82\") pod \"auto-csr-approver-29552300-j49hg\" (UID: \"e559a103-2981-408e-8c74-72067a88f425\") " pod="openshift-infra/auto-csr-approver-29552300-j49hg" Mar 10 10:20:00 crc kubenswrapper[4794]: I0310 10:20:00.479472 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxl82\" (UniqueName: \"kubernetes.io/projected/e559a103-2981-408e-8c74-72067a88f425-kube-api-access-nxl82\") pod \"auto-csr-approver-29552300-j49hg\" (UID: \"e559a103-2981-408e-8c74-72067a88f425\") " pod="openshift-infra/auto-csr-approver-29552300-j49hg" Mar 10 10:20:00 crc kubenswrapper[4794]: I0310 10:20:00.764893 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552300-j49hg" Mar 10 10:20:01 crc kubenswrapper[4794]: I0310 10:20:01.024174 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552300-j49hg"] Mar 10 10:20:01 crc kubenswrapper[4794]: I0310 10:20:01.610091 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552300-j49hg" event={"ID":"e559a103-2981-408e-8c74-72067a88f425","Type":"ContainerStarted","Data":"4e738a0f1a38af71f5e667c753457cf098f7914905f73f06c586e02c835c5f76"} Mar 10 10:20:03 crc kubenswrapper[4794]: I0310 10:20:03.628771 4794 generic.go:334] "Generic (PLEG): container finished" podID="e559a103-2981-408e-8c74-72067a88f425" containerID="39a4af9fc60313b55a9936560ffdf562d569ad6a3f55df49f5d292093409d7f8" exitCode=0 Mar 10 10:20:03 crc kubenswrapper[4794]: I0310 10:20:03.628856 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552300-j49hg" event={"ID":"e559a103-2981-408e-8c74-72067a88f425","Type":"ContainerDied","Data":"39a4af9fc60313b55a9936560ffdf562d569ad6a3f55df49f5d292093409d7f8"} Mar 10 10:20:05 crc kubenswrapper[4794]: I0310 10:20:05.032099 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552300-j49hg" Mar 10 10:20:05 crc kubenswrapper[4794]: I0310 10:20:05.215688 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxl82\" (UniqueName: \"kubernetes.io/projected/e559a103-2981-408e-8c74-72067a88f425-kube-api-access-nxl82\") pod \"e559a103-2981-408e-8c74-72067a88f425\" (UID: \"e559a103-2981-408e-8c74-72067a88f425\") " Mar 10 10:20:05 crc kubenswrapper[4794]: I0310 10:20:05.220705 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e559a103-2981-408e-8c74-72067a88f425-kube-api-access-nxl82" (OuterVolumeSpecName: "kube-api-access-nxl82") pod "e559a103-2981-408e-8c74-72067a88f425" (UID: "e559a103-2981-408e-8c74-72067a88f425"). InnerVolumeSpecName "kube-api-access-nxl82". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:20:05 crc kubenswrapper[4794]: I0310 10:20:05.317122 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxl82\" (UniqueName: \"kubernetes.io/projected/e559a103-2981-408e-8c74-72067a88f425-kube-api-access-nxl82\") on node \"crc\" DevicePath \"\"" Mar 10 10:20:05 crc kubenswrapper[4794]: I0310 10:20:05.647090 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552300-j49hg" event={"ID":"e559a103-2981-408e-8c74-72067a88f425","Type":"ContainerDied","Data":"4e738a0f1a38af71f5e667c753457cf098f7914905f73f06c586e02c835c5f76"} Mar 10 10:20:05 crc kubenswrapper[4794]: I0310 10:20:05.647143 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e738a0f1a38af71f5e667c753457cf098f7914905f73f06c586e02c835c5f76" Mar 10 10:20:05 crc kubenswrapper[4794]: I0310 10:20:05.647156 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552300-j49hg" Mar 10 10:20:06 crc kubenswrapper[4794]: I0310 10:20:06.111084 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552294-nwmdb"] Mar 10 10:20:06 crc kubenswrapper[4794]: I0310 10:20:06.138434 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552294-nwmdb"] Mar 10 10:20:08 crc kubenswrapper[4794]: I0310 10:20:08.013813 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1" path="/var/lib/kubelet/pods/a4cabbd0-9c5c-4f5f-9233-938cd7b1ced1/volumes" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.372737 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rcpk7"] Mar 10 10:20:09 crc kubenswrapper[4794]: E0310 10:20:09.373792 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e559a103-2981-408e-8c74-72067a88f425" containerName="oc" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.373821 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e559a103-2981-408e-8c74-72067a88f425" containerName="oc" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.374079 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e559a103-2981-408e-8c74-72067a88f425" containerName="oc" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.376179 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.384535 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rcpk7"] Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.412207 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ed83901-fbcb-404c-8982-0408a1aa8427-catalog-content\") pod \"redhat-operators-rcpk7\" (UID: \"7ed83901-fbcb-404c-8982-0408a1aa8427\") " pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.412277 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrw9h\" (UniqueName: \"kubernetes.io/projected/7ed83901-fbcb-404c-8982-0408a1aa8427-kube-api-access-nrw9h\") pod \"redhat-operators-rcpk7\" (UID: \"7ed83901-fbcb-404c-8982-0408a1aa8427\") " pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.412316 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ed83901-fbcb-404c-8982-0408a1aa8427-utilities\") pod \"redhat-operators-rcpk7\" (UID: \"7ed83901-fbcb-404c-8982-0408a1aa8427\") " pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.513615 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ed83901-fbcb-404c-8982-0408a1aa8427-catalog-content\") pod \"redhat-operators-rcpk7\" (UID: \"7ed83901-fbcb-404c-8982-0408a1aa8427\") " pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.513702 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrw9h\" (UniqueName: \"kubernetes.io/projected/7ed83901-fbcb-404c-8982-0408a1aa8427-kube-api-access-nrw9h\") pod \"redhat-operators-rcpk7\" (UID: \"7ed83901-fbcb-404c-8982-0408a1aa8427\") " pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.513753 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ed83901-fbcb-404c-8982-0408a1aa8427-utilities\") pod \"redhat-operators-rcpk7\" (UID: \"7ed83901-fbcb-404c-8982-0408a1aa8427\") " pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.514159 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ed83901-fbcb-404c-8982-0408a1aa8427-catalog-content\") pod \"redhat-operators-rcpk7\" (UID: \"7ed83901-fbcb-404c-8982-0408a1aa8427\") " pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.514298 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ed83901-fbcb-404c-8982-0408a1aa8427-utilities\") pod \"redhat-operators-rcpk7\" (UID: \"7ed83901-fbcb-404c-8982-0408a1aa8427\") " pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.534004 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nrw9h\" (UniqueName: \"kubernetes.io/projected/7ed83901-fbcb-404c-8982-0408a1aa8427-kube-api-access-nrw9h\") pod \"redhat-operators-rcpk7\" (UID: \"7ed83901-fbcb-404c-8982-0408a1aa8427\") " pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.727668 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:09 crc kubenswrapper[4794]: I0310 10:20:09.950116 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rcpk7"] Mar 10 10:20:10 crc kubenswrapper[4794]: I0310 10:20:10.692698 4794 generic.go:334] "Generic (PLEG): container finished" podID="7ed83901-fbcb-404c-8982-0408a1aa8427" containerID="812859b9f97e79af083f6ffef71d413eb4a784369d73a346ffbb09568e4c9f19" exitCode=0 Mar 10 10:20:10 crc kubenswrapper[4794]: I0310 10:20:10.692994 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcpk7" event={"ID":"7ed83901-fbcb-404c-8982-0408a1aa8427","Type":"ContainerDied","Data":"812859b9f97e79af083f6ffef71d413eb4a784369d73a346ffbb09568e4c9f19"} Mar 10 10:20:10 crc kubenswrapper[4794]: I0310 10:20:10.693029 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcpk7" event={"ID":"7ed83901-fbcb-404c-8982-0408a1aa8427","Type":"ContainerStarted","Data":"3852f1bba9c452f50463b0332df04e24c8ae6c482c5877bc846d62010f2b0f33"} Mar 10 10:20:11 crc kubenswrapper[4794]: I0310 10:20:11.708593 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcpk7" event={"ID":"7ed83901-fbcb-404c-8982-0408a1aa8427","Type":"ContainerStarted","Data":"39c1757c024c2e38bb82a32b385755071b99ed76df2d729df9855ba3d1c6d73a"} Mar 10 10:20:12 crc kubenswrapper[4794]: I0310 10:20:12.720190 4794 generic.go:334] "Generic (PLEG): container finished" podID="7ed83901-fbcb-404c-8982-0408a1aa8427" containerID="39c1757c024c2e38bb82a32b385755071b99ed76df2d729df9855ba3d1c6d73a" exitCode=0 Mar 10 10:20:12 crc kubenswrapper[4794]: I0310 10:20:12.720325 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcpk7" event={"ID":"7ed83901-fbcb-404c-8982-0408a1aa8427","Type":"ContainerDied","Data":"39c1757c024c2e38bb82a32b385755071b99ed76df2d729df9855ba3d1c6d73a"} Mar 10 10:20:13 crc kubenswrapper[4794]: I0310 10:20:13.743928 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcpk7" event={"ID":"7ed83901-fbcb-404c-8982-0408a1aa8427","Type":"ContainerStarted","Data":"65f7dafce2f3c72955be0ea3afad70457325efb7467959ff0172bcb3c979b3fb"} Mar 10 10:20:13 crc kubenswrapper[4794]: I0310 10:20:13.778249 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rcpk7" podStartSLOduration=2.365505606 podStartE2EDuration="4.778224656s" podCreationTimestamp="2026-03-10 10:20:09 +0000 UTC" firstStartedPulling="2026-03-10 10:20:10.694969219 +0000 UTC m=+2159.451140027" lastFinishedPulling="2026-03-10 10:20:13.107688259 +0000 UTC m=+2161.863859077" observedRunningTime="2026-03-10 10:20:13.777688649 +0000 UTC m=+2162.533859517" watchObservedRunningTime="2026-03-10 10:20:13.778224656 +0000 UTC m=+2162.534395514" Mar 10 10:20:19 crc kubenswrapper[4794]: I0310 10:20:19.728267 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rcpk7" 
Mar 10 10:20:19 crc kubenswrapper[4794]: I0310 10:20:19.728915 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:20 crc kubenswrapper[4794]: I0310 10:20:20.776608 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rcpk7" podUID="7ed83901-fbcb-404c-8982-0408a1aa8427" containerName="registry-server" probeResult="failure" output=< Mar 10 10:20:20 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 10:20:20 crc kubenswrapper[4794]: > Mar 10 10:20:29 crc kubenswrapper[4794]: I0310 10:20:29.779063 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:29 crc kubenswrapper[4794]: I0310 10:20:29.902630 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:30 crc kubenswrapper[4794]: I0310 10:20:30.041893 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rcpk7"] Mar 10 10:20:30 crc kubenswrapper[4794]: I0310 10:20:30.900867 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rcpk7" podUID="7ed83901-fbcb-404c-8982-0408a1aa8427" containerName="registry-server" containerID="cri-o://65f7dafce2f3c72955be0ea3afad70457325efb7467959ff0172bcb3c979b3fb" gracePeriod=2 Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.308298 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.367075 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ed83901-fbcb-404c-8982-0408a1aa8427-catalog-content\") pod \"7ed83901-fbcb-404c-8982-0408a1aa8427\" (UID: \"7ed83901-fbcb-404c-8982-0408a1aa8427\") " Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.367157 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ed83901-fbcb-404c-8982-0408a1aa8427-utilities\") pod \"7ed83901-fbcb-404c-8982-0408a1aa8427\" (UID: \"7ed83901-fbcb-404c-8982-0408a1aa8427\") " Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.367174 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrw9h\" (UniqueName: \"kubernetes.io/projected/7ed83901-fbcb-404c-8982-0408a1aa8427-kube-api-access-nrw9h\") pod \"7ed83901-fbcb-404c-8982-0408a1aa8427\" (UID: \"7ed83901-fbcb-404c-8982-0408a1aa8427\") " Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.368240 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ed83901-fbcb-404c-8982-0408a1aa8427-utilities" (OuterVolumeSpecName: "utilities") pod "7ed83901-fbcb-404c-8982-0408a1aa8427" (UID: "7ed83901-fbcb-404c-8982-0408a1aa8427"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.373901 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed83901-fbcb-404c-8982-0408a1aa8427-kube-api-access-nrw9h" (OuterVolumeSpecName: "kube-api-access-nrw9h") pod "7ed83901-fbcb-404c-8982-0408a1aa8427" (UID: "7ed83901-fbcb-404c-8982-0408a1aa8427"). InnerVolumeSpecName "kube-api-access-nrw9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.469188 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ed83901-fbcb-404c-8982-0408a1aa8427-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.469230 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrw9h\" (UniqueName: \"kubernetes.io/projected/7ed83901-fbcb-404c-8982-0408a1aa8427-kube-api-access-nrw9h\") on node \"crc\" DevicePath \"\"" Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.503740 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ed83901-fbcb-404c-8982-0408a1aa8427-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ed83901-fbcb-404c-8982-0408a1aa8427" (UID: "7ed83901-fbcb-404c-8982-0408a1aa8427"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.570449 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ed83901-fbcb-404c-8982-0408a1aa8427-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.919701 4794 generic.go:334] "Generic (PLEG): container finished" podID="7ed83901-fbcb-404c-8982-0408a1aa8427" containerID="65f7dafce2f3c72955be0ea3afad70457325efb7467959ff0172bcb3c979b3fb" exitCode=0 Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.919747 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcpk7" event={"ID":"7ed83901-fbcb-404c-8982-0408a1aa8427","Type":"ContainerDied","Data":"65f7dafce2f3c72955be0ea3afad70457325efb7467959ff0172bcb3c979b3fb"} Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.919778 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcpk7" event={"ID":"7ed83901-fbcb-404c-8982-0408a1aa8427","Type":"ContainerDied","Data":"3852f1bba9c452f50463b0332df04e24c8ae6c482c5877bc846d62010f2b0f33"} Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.919799 4794 scope.go:117] "RemoveContainer" containerID="65f7dafce2f3c72955be0ea3afad70457325efb7467959ff0172bcb3c979b3fb" Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.919850 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rcpk7" Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.946003 4794 scope.go:117] "RemoveContainer" containerID="39c1757c024c2e38bb82a32b385755071b99ed76df2d729df9855ba3d1c6d73a" Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.970123 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rcpk7"] Mar 10 10:20:31 crc kubenswrapper[4794]: I0310 10:20:31.980587 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rcpk7"] Mar 10 10:20:32 crc kubenswrapper[4794]: I0310 10:20:32.002573 4794 scope.go:117] "RemoveContainer" containerID="812859b9f97e79af083f6ffef71d413eb4a784369d73a346ffbb09568e4c9f19" Mar 10 10:20:32 crc kubenswrapper[4794]: I0310 10:20:32.022806 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed83901-fbcb-404c-8982-0408a1aa8427" path="/var/lib/kubelet/pods/7ed83901-fbcb-404c-8982-0408a1aa8427/volumes" Mar 10 10:20:32 crc kubenswrapper[4794]: I0310 10:20:32.032370 4794 scope.go:117] "RemoveContainer" containerID="65f7dafce2f3c72955be0ea3afad70457325efb7467959ff0172bcb3c979b3fb" Mar 10 10:20:32 crc kubenswrapper[4794]: E0310 10:20:32.033263 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65f7dafce2f3c72955be0ea3afad70457325efb7467959ff0172bcb3c979b3fb\": container with ID starting with 65f7dafce2f3c72955be0ea3afad70457325efb7467959ff0172bcb3c979b3fb not found: ID does not exist" containerID="65f7dafce2f3c72955be0ea3afad70457325efb7467959ff0172bcb3c979b3fb" Mar 10 10:20:32 crc kubenswrapper[4794]: I0310 10:20:32.033315 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f7dafce2f3c72955be0ea3afad70457325efb7467959ff0172bcb3c979b3fb"} err="failed to get container status \"65f7dafce2f3c72955be0ea3afad70457325efb7467959ff0172bcb3c979b3fb\": rpc error: code = NotFound desc = could not find container \"65f7dafce2f3c72955be0ea3afad70457325efb7467959ff0172bcb3c979b3fb\": container with ID starting with 65f7dafce2f3c72955be0ea3afad70457325efb7467959ff0172bcb3c979b3fb not found: ID does not exist" Mar 10 10:20:32 crc kubenswrapper[4794]: I0310 10:20:32.033374 4794 scope.go:117] "RemoveContainer" containerID="39c1757c024c2e38bb82a32b385755071b99ed76df2d729df9855ba3d1c6d73a" Mar 10 10:20:32 crc kubenswrapper[4794]: E0310 10:20:32.033861 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39c1757c024c2e38bb82a32b385755071b99ed76df2d729df9855ba3d1c6d73a\": container with ID starting with 39c1757c024c2e38bb82a32b385755071b99ed76df2d729df9855ba3d1c6d73a not found: ID does not exist" containerID="39c1757c024c2e38bb82a32b385755071b99ed76df2d729df9855ba3d1c6d73a" Mar 10 10:20:32 crc kubenswrapper[4794]: I0310 10:20:32.033900 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39c1757c024c2e38bb82a32b385755071b99ed76df2d729df9855ba3d1c6d73a"} err="failed to get container status \"39c1757c024c2e38bb82a32b385755071b99ed76df2d729df9855ba3d1c6d73a\": rpc error: code = NotFound desc = could not find container \"39c1757c024c2e38bb82a32b385755071b99ed76df2d729df9855ba3d1c6d73a\": container with ID starting with 39c1757c024c2e38bb82a32b385755071b99ed76df2d729df9855ba3d1c6d73a not found: ID does not exist" Mar 10 10:20:32 crc kubenswrapper[4794]: I0310 
10:20:32.033928 4794 scope.go:117] "RemoveContainer" containerID="812859b9f97e79af083f6ffef71d413eb4a784369d73a346ffbb09568e4c9f19" Mar 10 10:20:32 crc kubenswrapper[4794]: E0310 10:20:32.034285 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"812859b9f97e79af083f6ffef71d413eb4a784369d73a346ffbb09568e4c9f19\": container with ID starting with 812859b9f97e79af083f6ffef71d413eb4a784369d73a346ffbb09568e4c9f19 not found: ID does not exist" containerID="812859b9f97e79af083f6ffef71d413eb4a784369d73a346ffbb09568e4c9f19" Mar 10 10:20:32 crc kubenswrapper[4794]: I0310 10:20:32.034304 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"812859b9f97e79af083f6ffef71d413eb4a784369d73a346ffbb09568e4c9f19"} err="failed to get container status \"812859b9f97e79af083f6ffef71d413eb4a784369d73a346ffbb09568e4c9f19\": rpc error: code = NotFound desc = could not find container \"812859b9f97e79af083f6ffef71d413eb4a784369d73a346ffbb09568e4c9f19\": container with ID starting with 812859b9f97e79af083f6ffef71d413eb4a784369d73a346ffbb09568e4c9f19 not found: ID does not exist" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.703354 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gmrcf"] Mar 10 10:20:37 crc kubenswrapper[4794]: E0310 10:20:37.704067 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed83901-fbcb-404c-8982-0408a1aa8427" containerName="extract-content" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.704082 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed83901-fbcb-404c-8982-0408a1aa8427" containerName="extract-content" Mar 10 10:20:37 crc kubenswrapper[4794]: E0310 10:20:37.704101 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed83901-fbcb-404c-8982-0408a1aa8427" containerName="extract-utilities" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.704110 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed83901-fbcb-404c-8982-0408a1aa8427" containerName="extract-utilities" Mar 10 10:20:37 crc kubenswrapper[4794]: E0310 10:20:37.704121 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed83901-fbcb-404c-8982-0408a1aa8427" containerName="registry-server" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.704129 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed83901-fbcb-404c-8982-0408a1aa8427" containerName="registry-server" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.704290 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed83901-fbcb-404c-8982-0408a1aa8427" containerName="registry-server" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.705438 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.722384 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gmrcf"] Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.856736 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c14cb6fe-4921-4c22-aa6b-5055831dc731-utilities\") pod \"certified-operators-gmrcf\" (UID: \"c14cb6fe-4921-4c22-aa6b-5055831dc731\") " pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.856815 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxf7z\" (UniqueName: \"kubernetes.io/projected/c14cb6fe-4921-4c22-aa6b-5055831dc731-kube-api-access-fxf7z\") pod \"certified-operators-gmrcf\" (UID: \"c14cb6fe-4921-4c22-aa6b-5055831dc731\") " pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.856875 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c14cb6fe-4921-4c22-aa6b-5055831dc731-catalog-content\") pod \"certified-operators-gmrcf\" (UID: \"c14cb6fe-4921-4c22-aa6b-5055831dc731\") " pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.958287 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxf7z\" (UniqueName: \"kubernetes.io/projected/c14cb6fe-4921-4c22-aa6b-5055831dc731-kube-api-access-fxf7z\") pod \"certified-operators-gmrcf\" (UID: \"c14cb6fe-4921-4c22-aa6b-5055831dc731\") " pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.958440 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c14cb6fe-4921-4c22-aa6b-5055831dc731-catalog-content\") pod \"certified-operators-gmrcf\" (UID: \"c14cb6fe-4921-4c22-aa6b-5055831dc731\") " pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.958510 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c14cb6fe-4921-4c22-aa6b-5055831dc731-utilities\") pod \"certified-operators-gmrcf\" (UID: \"c14cb6fe-4921-4c22-aa6b-5055831dc731\") " pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.959025 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c14cb6fe-4921-4c22-aa6b-5055831dc731-utilities\") pod \"certified-operators-gmrcf\" (UID: \"c14cb6fe-4921-4c22-aa6b-5055831dc731\") " pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.959066 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c14cb6fe-4921-4c22-aa6b-5055831dc731-catalog-content\") pod \"certified-operators-gmrcf\" (UID: \"c14cb6fe-4921-4c22-aa6b-5055831dc731\") " pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:37 crc kubenswrapper[4794]: I0310 10:20:37.981665 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fxf7z\" (UniqueName: \"kubernetes.io/projected/c14cb6fe-4921-4c22-aa6b-5055831dc731-kube-api-access-fxf7z\") pod \"certified-operators-gmrcf\" (UID: \"c14cb6fe-4921-4c22-aa6b-5055831dc731\") " pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:38 crc kubenswrapper[4794]: I0310 10:20:38.028076 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:38 crc kubenswrapper[4794]: I0310 10:20:38.313154 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gmrcf"] Mar 10 10:20:38 crc kubenswrapper[4794]: I0310 10:20:38.354777 4794 scope.go:117] "RemoveContainer" containerID="fefd5652bcce429465fd023412735698b02b5e345af55d22393e6b6e3bf6de74" Mar 10 10:20:38 crc kubenswrapper[4794]: I0310 10:20:38.976876 4794 generic.go:334] "Generic (PLEG): container finished" podID="c14cb6fe-4921-4c22-aa6b-5055831dc731" containerID="4abe03faee2d7e29860d3b7612fc106d23f7452d72f7155c3f033800082338ac" exitCode=0 Mar 10 10:20:38 crc kubenswrapper[4794]: I0310 10:20:38.976927 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmrcf" event={"ID":"c14cb6fe-4921-4c22-aa6b-5055831dc731","Type":"ContainerDied","Data":"4abe03faee2d7e29860d3b7612fc106d23f7452d72f7155c3f033800082338ac"} Mar 10 10:20:38 crc kubenswrapper[4794]: I0310 10:20:38.976948 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmrcf" event={"ID":"c14cb6fe-4921-4c22-aa6b-5055831dc731","Type":"ContainerStarted","Data":"c755a9e925cf63c0f86e02443f9eebb25d92755163387e33e2e5f791d882e5a9"} Mar 10 10:20:39 crc kubenswrapper[4794]: I0310 10:20:39.990542 4794 generic.go:334] "Generic (PLEG): container finished" podID="c14cb6fe-4921-4c22-aa6b-5055831dc731" containerID="6a3e7b2ef5d7363e992def68abd63bfda410120200ba6b8b4b602e65ee8e8d1f" exitCode=0 Mar 10 10:20:39 crc kubenswrapper[4794]: I0310 10:20:39.990612 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmrcf" event={"ID":"c14cb6fe-4921-4c22-aa6b-5055831dc731","Type":"ContainerDied","Data":"6a3e7b2ef5d7363e992def68abd63bfda410120200ba6b8b4b602e65ee8e8d1f"} Mar 10 10:20:41 crc kubenswrapper[4794]: I0310 10:20:41.002020 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmrcf" event={"ID":"c14cb6fe-4921-4c22-aa6b-5055831dc731","Type":"ContainerStarted","Data":"608c49a73963d81a8a2b0c9ff9c4393c66771325c7df685666b3bffbfe14d366"} Mar 10 10:20:41 crc kubenswrapper[4794]: I0310 10:20:41.022397 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gmrcf" podStartSLOduration=2.616533285 podStartE2EDuration="4.0223779s" podCreationTimestamp="2026-03-10 10:20:37 +0000 UTC" firstStartedPulling="2026-03-10 10:20:38.979143536 +0000 UTC m=+2187.735314354" lastFinishedPulling="2026-03-10 10:20:40.384988151 +0000 UTC m=+2189.141158969" observedRunningTime="2026-03-10 10:20:41.020815511 +0000 UTC m=+2189.776986399" watchObservedRunningTime="2026-03-10 10:20:41.0223779 +0000 UTC m=+2189.778548718" Mar 10 10:20:48 crc kubenswrapper[4794]: I0310 10:20:48.028215 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:48 crc kubenswrapper[4794]: I0310 10:20:48.028694 4794 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:48 crc kubenswrapper[4794]: I0310 10:20:48.085543 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:48 crc kubenswrapper[4794]: I0310 10:20:48.148811 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:48 crc kubenswrapper[4794]: I0310 10:20:48.332734 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gmrcf"] Mar 10 10:20:50 crc kubenswrapper[4794]: I0310 10:20:50.072533 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gmrcf" podUID="c14cb6fe-4921-4c22-aa6b-5055831dc731" containerName="registry-server" containerID="cri-o://608c49a73963d81a8a2b0c9ff9c4393c66771325c7df685666b3bffbfe14d366" gracePeriod=2 Mar 10 10:20:50 crc kubenswrapper[4794]: I0310 10:20:50.491856 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:50 crc kubenswrapper[4794]: I0310 10:20:50.598981 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c14cb6fe-4921-4c22-aa6b-5055831dc731-catalog-content\") pod \"c14cb6fe-4921-4c22-aa6b-5055831dc731\" (UID: \"c14cb6fe-4921-4c22-aa6b-5055831dc731\") " Mar 10 10:20:50 crc kubenswrapper[4794]: I0310 10:20:50.599437 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxf7z\" (UniqueName: \"kubernetes.io/projected/c14cb6fe-4921-4c22-aa6b-5055831dc731-kube-api-access-fxf7z\") pod \"c14cb6fe-4921-4c22-aa6b-5055831dc731\" (UID: \"c14cb6fe-4921-4c22-aa6b-5055831dc731\") " Mar 10 10:20:50 crc kubenswrapper[4794]: I0310 10:20:50.599680 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c14cb6fe-4921-4c22-aa6b-5055831dc731-utilities\") pod \"c14cb6fe-4921-4c22-aa6b-5055831dc731\" (UID: \"c14cb6fe-4921-4c22-aa6b-5055831dc731\") " Mar 10 10:20:50 crc kubenswrapper[4794]: I0310 10:20:50.601039 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c14cb6fe-4921-4c22-aa6b-5055831dc731-utilities" (OuterVolumeSpecName: "utilities") pod "c14cb6fe-4921-4c22-aa6b-5055831dc731" (UID: "c14cb6fe-4921-4c22-aa6b-5055831dc731"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:20:50 crc kubenswrapper[4794]: I0310 10:20:50.607422 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c14cb6fe-4921-4c22-aa6b-5055831dc731-kube-api-access-fxf7z" (OuterVolumeSpecName: "kube-api-access-fxf7z") pod "c14cb6fe-4921-4c22-aa6b-5055831dc731" (UID: "c14cb6fe-4921-4c22-aa6b-5055831dc731"). InnerVolumeSpecName "kube-api-access-fxf7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:20:50 crc kubenswrapper[4794]: I0310 10:20:50.701748 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxf7z\" (UniqueName: \"kubernetes.io/projected/c14cb6fe-4921-4c22-aa6b-5055831dc731-kube-api-access-fxf7z\") on node \"crc\" DevicePath \"\"" Mar 10 10:20:50 crc kubenswrapper[4794]: I0310 10:20:50.701811 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c14cb6fe-4921-4c22-aa6b-5055831dc731-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:20:50 crc kubenswrapper[4794]: I0310 10:20:50.866262 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c14cb6fe-4921-4c22-aa6b-5055831dc731-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c14cb6fe-4921-4c22-aa6b-5055831dc731" (UID: "c14cb6fe-4921-4c22-aa6b-5055831dc731"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:20:50 crc kubenswrapper[4794]: I0310 10:20:50.903822 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c14cb6fe-4921-4c22-aa6b-5055831dc731-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.082711 4794 generic.go:334] "Generic (PLEG): container finished" podID="c14cb6fe-4921-4c22-aa6b-5055831dc731" containerID="608c49a73963d81a8a2b0c9ff9c4393c66771325c7df685666b3bffbfe14d366" exitCode=0 Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.082812 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gmrcf" Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.082812 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmrcf" event={"ID":"c14cb6fe-4921-4c22-aa6b-5055831dc731","Type":"ContainerDied","Data":"608c49a73963d81a8a2b0c9ff9c4393c66771325c7df685666b3bffbfe14d366"} Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.083542 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmrcf" event={"ID":"c14cb6fe-4921-4c22-aa6b-5055831dc731","Type":"ContainerDied","Data":"c755a9e925cf63c0f86e02443f9eebb25d92755163387e33e2e5f791d882e5a9"} Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.083563 4794 scope.go:117] "RemoveContainer" containerID="608c49a73963d81a8a2b0c9ff9c4393c66771325c7df685666b3bffbfe14d366" Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.107117 4794 scope.go:117] "RemoveContainer" containerID="6a3e7b2ef5d7363e992def68abd63bfda410120200ba6b8b4b602e65ee8e8d1f" Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.134911 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gmrcf"] Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.142541 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gmrcf"] Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.145607 4794 scope.go:117] "RemoveContainer" containerID="4abe03faee2d7e29860d3b7612fc106d23f7452d72f7155c3f033800082338ac" Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.173486 4794 scope.go:117] "RemoveContainer" containerID="608c49a73963d81a8a2b0c9ff9c4393c66771325c7df685666b3bffbfe14d366" Mar 10 10:20:51 crc kubenswrapper[4794]: E0310 10:20:51.173852 4794 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"608c49a73963d81a8a2b0c9ff9c4393c66771325c7df685666b3bffbfe14d366\": container with ID starting with 608c49a73963d81a8a2b0c9ff9c4393c66771325c7df685666b3bffbfe14d366 not found: ID does not exist" containerID="608c49a73963d81a8a2b0c9ff9c4393c66771325c7df685666b3bffbfe14d366" Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.173889 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"608c49a73963d81a8a2b0c9ff9c4393c66771325c7df685666b3bffbfe14d366"} err="failed to get container status \"608c49a73963d81a8a2b0c9ff9c4393c66771325c7df685666b3bffbfe14d366\": rpc error: code = NotFound desc = could not find container \"608c49a73963d81a8a2b0c9ff9c4393c66771325c7df685666b3bffbfe14d366\": container with ID starting with 608c49a73963d81a8a2b0c9ff9c4393c66771325c7df685666b3bffbfe14d366 not found: ID does not exist" Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.173914 4794 scope.go:117] "RemoveContainer" containerID="6a3e7b2ef5d7363e992def68abd63bfda410120200ba6b8b4b602e65ee8e8d1f" Mar 10 10:20:51 crc kubenswrapper[4794]: E0310 10:20:51.174154 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a3e7b2ef5d7363e992def68abd63bfda410120200ba6b8b4b602e65ee8e8d1f\": container with ID starting with 6a3e7b2ef5d7363e992def68abd63bfda410120200ba6b8b4b602e65ee8e8d1f not found: ID does not exist" containerID="6a3e7b2ef5d7363e992def68abd63bfda410120200ba6b8b4b602e65ee8e8d1f" Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.174183 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a3e7b2ef5d7363e992def68abd63bfda410120200ba6b8b4b602e65ee8e8d1f"} err="failed to get container status \"6a3e7b2ef5d7363e992def68abd63bfda410120200ba6b8b4b602e65ee8e8d1f\": rpc error: code = NotFound desc = could not find container \"6a3e7b2ef5d7363e992def68abd63bfda410120200ba6b8b4b602e65ee8e8d1f\": container with ID starting with 6a3e7b2ef5d7363e992def68abd63bfda410120200ba6b8b4b602e65ee8e8d1f not found: ID does not exist" Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.174201 4794 scope.go:117] "RemoveContainer" containerID="4abe03faee2d7e29860d3b7612fc106d23f7452d72f7155c3f033800082338ac" Mar 10 10:20:51 crc kubenswrapper[4794]: E0310 10:20:51.174576 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4abe03faee2d7e29860d3b7612fc106d23f7452d72f7155c3f033800082338ac\": container with ID starting with 4abe03faee2d7e29860d3b7612fc106d23f7452d72f7155c3f033800082338ac not found: ID does not exist" containerID="4abe03faee2d7e29860d3b7612fc106d23f7452d72f7155c3f033800082338ac" Mar 10 10:20:51 crc kubenswrapper[4794]: I0310 10:20:51.174618 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4abe03faee2d7e29860d3b7612fc106d23f7452d72f7155c3f033800082338ac"} err="failed to get container status \"4abe03faee2d7e29860d3b7612fc106d23f7452d72f7155c3f033800082338ac\": rpc error: code = NotFound desc = could not find container \"4abe03faee2d7e29860d3b7612fc106d23f7452d72f7155c3f033800082338ac\": container with ID starting with 4abe03faee2d7e29860d3b7612fc106d23f7452d72f7155c3f033800082338ac not found: ID does not exist" Mar 10 10:20:52 crc kubenswrapper[4794]: I0310 10:20:52.015603 4794 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="c14cb6fe-4921-4c22-aa6b-5055831dc731" path="/var/lib/kubelet/pods/c14cb6fe-4921-4c22-aa6b-5055831dc731/volumes" Mar 10 10:21:22 crc kubenswrapper[4794]: I0310 10:21:22.967558 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:21:22 crc kubenswrapper[4794]: I0310 10:21:22.968033 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:21:52 crc kubenswrapper[4794]: I0310 10:21:52.968234 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:21:52 crc kubenswrapper[4794]: I0310 10:21:52.968894 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.390292 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9hbv2"] Mar 10 10:21:55 crc kubenswrapper[4794]: E0310 10:21:55.390807 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14cb6fe-4921-4c22-aa6b-5055831dc731" containerName="extract-utilities" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.390830 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14cb6fe-4921-4c22-aa6b-5055831dc731" containerName="extract-utilities" Mar 10 10:21:55 crc kubenswrapper[4794]: E0310 10:21:55.390882 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14cb6fe-4921-4c22-aa6b-5055831dc731" containerName="registry-server" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.390894 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14cb6fe-4921-4c22-aa6b-5055831dc731" containerName="registry-server" Mar 10 10:21:55 crc kubenswrapper[4794]: E0310 10:21:55.390911 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14cb6fe-4921-4c22-aa6b-5055831dc731" containerName="extract-content" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.390924 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14cb6fe-4921-4c22-aa6b-5055831dc731" containerName="extract-content" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.391168 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c14cb6fe-4921-4c22-aa6b-5055831dc731" containerName="registry-server" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.393133 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.404826 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9hbv2"] Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.465418 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb4q8\" (UniqueName: \"kubernetes.io/projected/580c775c-92d7-4066-a60a-e7b84b7cea5a-kube-api-access-fb4q8\") pod \"community-operators-9hbv2\" (UID: \"580c775c-92d7-4066-a60a-e7b84b7cea5a\") " pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.465946 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580c775c-92d7-4066-a60a-e7b84b7cea5a-catalog-content\") pod \"community-operators-9hbv2\" (UID: \"580c775c-92d7-4066-a60a-e7b84b7cea5a\") " pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.466044 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580c775c-92d7-4066-a60a-e7b84b7cea5a-utilities\") pod \"community-operators-9hbv2\" (UID: \"580c775c-92d7-4066-a60a-e7b84b7cea5a\") " pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.567841 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580c775c-92d7-4066-a60a-e7b84b7cea5a-catalog-content\") pod \"community-operators-9hbv2\" (UID: \"580c775c-92d7-4066-a60a-e7b84b7cea5a\") " pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.567903 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580c775c-92d7-4066-a60a-e7b84b7cea5a-utilities\") pod \"community-operators-9hbv2\" (UID: \"580c775c-92d7-4066-a60a-e7b84b7cea5a\") " pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.567934 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb4q8\" (UniqueName: \"kubernetes.io/projected/580c775c-92d7-4066-a60a-e7b84b7cea5a-kube-api-access-fb4q8\") pod \"community-operators-9hbv2\" (UID: \"580c775c-92d7-4066-a60a-e7b84b7cea5a\") " pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.568491 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580c775c-92d7-4066-a60a-e7b84b7cea5a-catalog-content\") pod \"community-operators-9hbv2\" (UID: \"580c775c-92d7-4066-a60a-e7b84b7cea5a\") " pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.568590 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580c775c-92d7-4066-a60a-e7b84b7cea5a-utilities\") pod \"community-operators-9hbv2\" (UID: \"580c775c-92d7-4066-a60a-e7b84b7cea5a\") " pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.589309 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fb4q8\" (UniqueName: \"kubernetes.io/projected/580c775c-92d7-4066-a60a-e7b84b7cea5a-kube-api-access-fb4q8\") pod \"community-operators-9hbv2\" (UID: \"580c775c-92d7-4066-a60a-e7b84b7cea5a\") " pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:21:55 crc kubenswrapper[4794]: I0310 10:21:55.731153 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:21:56 crc kubenswrapper[4794]: I0310 10:21:56.199727 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9hbv2"] Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.096403 4794 generic.go:334] "Generic (PLEG): container finished" podID="580c775c-92d7-4066-a60a-e7b84b7cea5a" containerID="c755515f7b250ab30888fe92e55eabffb5fc0b14f4049ee52d2e582f044135b4" exitCode=0 Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.096526 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hbv2" event={"ID":"580c775c-92d7-4066-a60a-e7b84b7cea5a","Type":"ContainerDied","Data":"c755515f7b250ab30888fe92e55eabffb5fc0b14f4049ee52d2e582f044135b4"} Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.097668 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hbv2" event={"ID":"580c775c-92d7-4066-a60a-e7b84b7cea5a","Type":"ContainerStarted","Data":"016fd2c225eaa263165eb4390159850e09a9662950e11a1a63648ecdf9423910"} Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.099424 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.172693 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l8c9n"] Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.175369 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.186981 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8c9n"] Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.297590 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-catalog-content\") pod \"redhat-marketplace-l8c9n\" (UID: \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\") " pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.298096 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn29q\" (UniqueName: \"kubernetes.io/projected/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-kube-api-access-wn29q\") pod \"redhat-marketplace-l8c9n\" (UID: \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\") " pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.298158 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-utilities\") pod \"redhat-marketplace-l8c9n\" (UID: \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\") " pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.399375 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn29q\" (UniqueName: \"kubernetes.io/projected/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-kube-api-access-wn29q\") pod \"redhat-marketplace-l8c9n\" (UID: \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\") " pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.399435 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-utilities\") pod \"redhat-marketplace-l8c9n\" (UID: \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\") " pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.399512 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-catalog-content\") pod \"redhat-marketplace-l8c9n\" (UID: \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\") " pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.399959 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-utilities\") pod \"redhat-marketplace-l8c9n\" (UID: \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\") " pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.400142 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-catalog-content\") pod \"redhat-marketplace-l8c9n\" (UID: \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\") " pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.424070 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wn29q\" (UniqueName: \"kubernetes.io/projected/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-kube-api-access-wn29q\") pod \"redhat-marketplace-l8c9n\" (UID: \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\") " pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.510590 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:21:57 crc kubenswrapper[4794]: I0310 10:21:57.960850 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8c9n"] Mar 10 10:21:57 crc kubenswrapper[4794]: W0310 10:21:57.974608 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1ed5a3a_ea3b_4467_bb4b_1060b0b302cd.slice/crio-1cff86aa85d6642959f65453041ed8c1fec8efd55bddb52f51aabc349de4bd94 WatchSource:0}: Error finding container 1cff86aa85d6642959f65453041ed8c1fec8efd55bddb52f51aabc349de4bd94: Status 404 returned error can't find the container with id 1cff86aa85d6642959f65453041ed8c1fec8efd55bddb52f51aabc349de4bd94 Mar 10 10:21:58 crc kubenswrapper[4794]: I0310 10:21:58.105533 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hbv2" event={"ID":"580c775c-92d7-4066-a60a-e7b84b7cea5a","Type":"ContainerStarted","Data":"53efee9058b7a7d1aab4f3d6c6d7645b00980bfbf6f7ffccf52047b70b1c6bcb"} Mar 10 10:21:58 crc kubenswrapper[4794]: I0310 10:21:58.106777 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8c9n" event={"ID":"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd","Type":"ContainerStarted","Data":"cf6558aa222e5bf618c00518009d648403a40539d56bea753eba2c42f483f37b"} Mar 10 10:21:58 crc kubenswrapper[4794]: I0310 10:21:58.106813 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8c9n" event={"ID":"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd","Type":"ContainerStarted","Data":"1cff86aa85d6642959f65453041ed8c1fec8efd55bddb52f51aabc349de4bd94"} Mar 10 10:21:59 crc kubenswrapper[4794]: I0310 10:21:59.118226 4794 generic.go:334] "Generic (PLEG): container finished" podID="580c775c-92d7-4066-a60a-e7b84b7cea5a" containerID="53efee9058b7a7d1aab4f3d6c6d7645b00980bfbf6f7ffccf52047b70b1c6bcb" exitCode=0 Mar 10 10:21:59 crc kubenswrapper[4794]: I0310 10:21:59.118283 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hbv2" event={"ID":"580c775c-92d7-4066-a60a-e7b84b7cea5a","Type":"ContainerDied","Data":"53efee9058b7a7d1aab4f3d6c6d7645b00980bfbf6f7ffccf52047b70b1c6bcb"} Mar 10 10:21:59 crc kubenswrapper[4794]: I0310 10:21:59.120083 4794 generic.go:334] "Generic (PLEG): container finished" podID="b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" containerID="cf6558aa222e5bf618c00518009d648403a40539d56bea753eba2c42f483f37b" exitCode=0 Mar 10 10:21:59 crc kubenswrapper[4794]: I0310 10:21:59.120111 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8c9n" event={"ID":"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd","Type":"ContainerDied","Data":"cf6558aa222e5bf618c00518009d648403a40539d56bea753eba2c42f483f37b"} Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.130569 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hbv2" 
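Note: the repeating three-step sequence above (operationExecutor.VerifyControllerAttachedVolume, then operationExecutor.MountVolume, then MountVolume.SetUp succeeded) is the reconciler's normal mount path, run once per volume of the pod. A minimal sketch, assuming the journal excerpt has been saved to a local file (the path kubelet.log and the helper name mount_progress are illustrative, not part of the log), that recovers this progression per UniqueName:

import re

# Phases in the order the reconciler logs them for a healthy mount.
PHASES = [
    "operationExecutor.VerifyControllerAttachedVolume started",
    "operationExecutor.MountVolume started",
    "MountVolume.SetUp succeeded",
]

def mount_progress(journal: str) -> dict[str, list[str]]:
    """Map each volume UniqueName to the mount phases seen for it."""
    progress: dict[str, list[str]] = {}
    for line in journal.splitlines():
        # The log escapes quotes, so UniqueName looks like \"kubernetes.io/...\".
        m = re.search(r'UniqueName: \\"([^"\\]+)\\"', line)
        if not m:
            continue
        for phase in PHASES:
            if phase in line:
                progress.setdefault(m.group(1), []).append(phase)
    return progress

if __name__ == "__main__":
    with open("kubelet.log") as f:  # hypothetical path to the saved excerpt
        for vol, phases in mount_progress(f.read()).items():
            print(vol, "->", " / ".join(phases))

A volume whose list stops before "MountVolume.SetUp succeeded" would be one the reconciler is still retrying; in the excerpt above, all three volumes of redhat-marketplace-l8c9n complete.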
event={"ID":"580c775c-92d7-4066-a60a-e7b84b7cea5a","Type":"ContainerStarted","Data":"0d0750813b8dddc3bb14cf26cee34b0646b1774b66ad5dee7b9c5415b4f87c22"} Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.133101 4794 generic.go:334] "Generic (PLEG): container finished" podID="b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" containerID="2f61c6d17fd7c8be334eb171ecaefaba754f4a199461b4811c349bcfa20d624f" exitCode=0 Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.133142 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8c9n" event={"ID":"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd","Type":"ContainerDied","Data":"2f61c6d17fd7c8be334eb171ecaefaba754f4a199461b4811c349bcfa20d624f"} Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.151668 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552302-5dk5d"] Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.152676 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552302-5dk5d" Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.154393 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.155305 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.156107 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.161668 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552302-5dk5d"] Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.165792 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9hbv2" podStartSLOduration=2.694426456 podStartE2EDuration="5.165773157s" podCreationTimestamp="2026-03-10 10:21:55 +0000 UTC" firstStartedPulling="2026-03-10 10:21:57.098842267 +0000 UTC m=+2265.855013115" lastFinishedPulling="2026-03-10 10:21:59.570188948 +0000 UTC m=+2268.326359816" observedRunningTime="2026-03-10 10:22:00.158292385 +0000 UTC m=+2268.914463223" watchObservedRunningTime="2026-03-10 10:22:00.165773157 +0000 UTC m=+2268.921943985" Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.248197 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jptf\" (UniqueName: \"kubernetes.io/projected/4913cb4e-029b-483b-8002-2d0647230b8b-kube-api-access-5jptf\") pod \"auto-csr-approver-29552302-5dk5d\" (UID: \"4913cb4e-029b-483b-8002-2d0647230b8b\") " pod="openshift-infra/auto-csr-approver-29552302-5dk5d" Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.349236 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jptf\" (UniqueName: \"kubernetes.io/projected/4913cb4e-029b-483b-8002-2d0647230b8b-kube-api-access-5jptf\") pod \"auto-csr-approver-29552302-5dk5d\" (UID: \"4913cb4e-029b-483b-8002-2d0647230b8b\") " pod="openshift-infra/auto-csr-approver-29552302-5dk5d" Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.371462 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jptf\" (UniqueName: \"kubernetes.io/projected/4913cb4e-029b-483b-8002-2d0647230b8b-kube-api-access-5jptf\") pod 
\"auto-csr-approver-29552302-5dk5d\" (UID: \"4913cb4e-029b-483b-8002-2d0647230b8b\") " pod="openshift-infra/auto-csr-approver-29552302-5dk5d" Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.471050 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552302-5dk5d" Mar 10 10:22:00 crc kubenswrapper[4794]: I0310 10:22:00.936618 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552302-5dk5d"] Mar 10 10:22:01 crc kubenswrapper[4794]: I0310 10:22:01.142560 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8c9n" event={"ID":"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd","Type":"ContainerStarted","Data":"8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c"} Mar 10 10:22:01 crc kubenswrapper[4794]: I0310 10:22:01.143721 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552302-5dk5d" event={"ID":"4913cb4e-029b-483b-8002-2d0647230b8b","Type":"ContainerStarted","Data":"9f1b33288097ca07d04a558fa7fbc961623d17049fec94960623bdb65a3a247d"} Mar 10 10:22:01 crc kubenswrapper[4794]: I0310 10:22:01.162637 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l8c9n" podStartSLOduration=2.736653349 podStartE2EDuration="4.16261853s" podCreationTimestamp="2026-03-10 10:21:57 +0000 UTC" firstStartedPulling="2026-03-10 10:21:59.123546145 +0000 UTC m=+2267.879716983" lastFinishedPulling="2026-03-10 10:22:00.549511306 +0000 UTC m=+2269.305682164" observedRunningTime="2026-03-10 10:22:01.161850736 +0000 UTC m=+2269.918021564" watchObservedRunningTime="2026-03-10 10:22:01.16261853 +0000 UTC m=+2269.918789348" Mar 10 10:22:02 crc kubenswrapper[4794]: I0310 10:22:02.151119 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552302-5dk5d" event={"ID":"4913cb4e-029b-483b-8002-2d0647230b8b","Type":"ContainerStarted","Data":"72928c91eb17f5b84b5fcd129be2f33bdf80f60e3f828b2cf5669559e800dd6a"} Mar 10 10:22:02 crc kubenswrapper[4794]: I0310 10:22:02.172275 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552302-5dk5d" podStartSLOduration=1.418543938 podStartE2EDuration="2.172258519s" podCreationTimestamp="2026-03-10 10:22:00 +0000 UTC" firstStartedPulling="2026-03-10 10:22:00.94145129 +0000 UTC m=+2269.697622118" lastFinishedPulling="2026-03-10 10:22:01.695165891 +0000 UTC m=+2270.451336699" observedRunningTime="2026-03-10 10:22:02.168050589 +0000 UTC m=+2270.924221407" watchObservedRunningTime="2026-03-10 10:22:02.172258519 +0000 UTC m=+2270.928429337" Mar 10 10:22:03 crc kubenswrapper[4794]: I0310 10:22:03.160751 4794 generic.go:334] "Generic (PLEG): container finished" podID="4913cb4e-029b-483b-8002-2d0647230b8b" containerID="72928c91eb17f5b84b5fcd129be2f33bdf80f60e3f828b2cf5669559e800dd6a" exitCode=0 Mar 10 10:22:03 crc kubenswrapper[4794]: I0310 10:22:03.160887 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552302-5dk5d" event={"ID":"4913cb4e-029b-483b-8002-2d0647230b8b","Type":"ContainerDied","Data":"72928c91eb17f5b84b5fcd129be2f33bdf80f60e3f828b2cf5669559e800dd6a"} Mar 10 10:22:04 crc kubenswrapper[4794]: I0310 10:22:04.487040 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552302-5dk5d" Mar 10 10:22:04 crc kubenswrapper[4794]: I0310 10:22:04.613013 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jptf\" (UniqueName: \"kubernetes.io/projected/4913cb4e-029b-483b-8002-2d0647230b8b-kube-api-access-5jptf\") pod \"4913cb4e-029b-483b-8002-2d0647230b8b\" (UID: \"4913cb4e-029b-483b-8002-2d0647230b8b\") " Mar 10 10:22:04 crc kubenswrapper[4794]: I0310 10:22:04.619486 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4913cb4e-029b-483b-8002-2d0647230b8b-kube-api-access-5jptf" (OuterVolumeSpecName: "kube-api-access-5jptf") pod "4913cb4e-029b-483b-8002-2d0647230b8b" (UID: "4913cb4e-029b-483b-8002-2d0647230b8b"). InnerVolumeSpecName "kube-api-access-5jptf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:22:04 crc kubenswrapper[4794]: I0310 10:22:04.717638 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jptf\" (UniqueName: \"kubernetes.io/projected/4913cb4e-029b-483b-8002-2d0647230b8b-kube-api-access-5jptf\") on node \"crc\" DevicePath \"\"" Mar 10 10:22:05 crc kubenswrapper[4794]: I0310 10:22:05.108551 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552296-s5sss"] Mar 10 10:22:05 crc kubenswrapper[4794]: I0310 10:22:05.116207 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552296-s5sss"] Mar 10 10:22:05 crc kubenswrapper[4794]: I0310 10:22:05.177729 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552302-5dk5d" event={"ID":"4913cb4e-029b-483b-8002-2d0647230b8b","Type":"ContainerDied","Data":"9f1b33288097ca07d04a558fa7fbc961623d17049fec94960623bdb65a3a247d"} Mar 10 10:22:05 crc kubenswrapper[4794]: I0310 10:22:05.177779 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f1b33288097ca07d04a558fa7fbc961623d17049fec94960623bdb65a3a247d" Mar 10 10:22:05 crc kubenswrapper[4794]: I0310 10:22:05.177788 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552302-5dk5d" Mar 10 10:22:05 crc kubenswrapper[4794]: I0310 10:22:05.731501 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:22:05 crc kubenswrapper[4794]: I0310 10:22:05.732349 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:22:05 crc kubenswrapper[4794]: I0310 10:22:05.793988 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:22:06 crc kubenswrapper[4794]: I0310 10:22:06.014422 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a10473d-463b-4d38-b940-07be581de25a" path="/var/lib/kubelet/pods/1a10473d-463b-4d38-b940-07be581de25a/volumes" Mar 10 10:22:06 crc kubenswrapper[4794]: I0310 10:22:06.255688 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:22:07 crc kubenswrapper[4794]: I0310 10:22:07.510973 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:22:07 crc kubenswrapper[4794]: I0310 10:22:07.512117 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:22:07 crc kubenswrapper[4794]: I0310 10:22:07.562581 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:22:08 crc kubenswrapper[4794]: I0310 10:22:08.266850 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:22:09 crc kubenswrapper[4794]: I0310 10:22:09.968254 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9hbv2"] Mar 10 10:22:09 crc kubenswrapper[4794]: I0310 10:22:09.969078 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9hbv2" podUID="580c775c-92d7-4066-a60a-e7b84b7cea5a" containerName="registry-server" containerID="cri-o://0d0750813b8dddc3bb14cf26cee34b0646b1774b66ad5dee7b9c5415b4f87c22" gracePeriod=2 Mar 10 10:22:10 crc kubenswrapper[4794]: I0310 10:22:10.241724 4794 generic.go:334] "Generic (PLEG): container finished" podID="580c775c-92d7-4066-a60a-e7b84b7cea5a" containerID="0d0750813b8dddc3bb14cf26cee34b0646b1774b66ad5dee7b9c5415b4f87c22" exitCode=0 Mar 10 10:22:10 crc kubenswrapper[4794]: I0310 10:22:10.241770 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hbv2" event={"ID":"580c775c-92d7-4066-a60a-e7b84b7cea5a","Type":"ContainerDied","Data":"0d0750813b8dddc3bb14cf26cee34b0646b1774b66ad5dee7b9c5415b4f87c22"} Mar 10 10:22:10 crc kubenswrapper[4794]: I0310 10:22:10.494201 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:22:10 crc kubenswrapper[4794]: I0310 10:22:10.638244 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580c775c-92d7-4066-a60a-e7b84b7cea5a-utilities\") pod \"580c775c-92d7-4066-a60a-e7b84b7cea5a\" (UID: \"580c775c-92d7-4066-a60a-e7b84b7cea5a\") " Mar 10 10:22:10 crc kubenswrapper[4794]: I0310 10:22:10.638698 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580c775c-92d7-4066-a60a-e7b84b7cea5a-catalog-content\") pod \"580c775c-92d7-4066-a60a-e7b84b7cea5a\" (UID: \"580c775c-92d7-4066-a60a-e7b84b7cea5a\") " Mar 10 10:22:10 crc kubenswrapper[4794]: I0310 10:22:10.638741 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb4q8\" (UniqueName: \"kubernetes.io/projected/580c775c-92d7-4066-a60a-e7b84b7cea5a-kube-api-access-fb4q8\") pod \"580c775c-92d7-4066-a60a-e7b84b7cea5a\" (UID: \"580c775c-92d7-4066-a60a-e7b84b7cea5a\") " Mar 10 10:22:10 crc kubenswrapper[4794]: I0310 10:22:10.639200 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/580c775c-92d7-4066-a60a-e7b84b7cea5a-utilities" (OuterVolumeSpecName: "utilities") pod "580c775c-92d7-4066-a60a-e7b84b7cea5a" (UID: "580c775c-92d7-4066-a60a-e7b84b7cea5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:22:10 crc kubenswrapper[4794]: I0310 10:22:10.652141 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/580c775c-92d7-4066-a60a-e7b84b7cea5a-kube-api-access-fb4q8" (OuterVolumeSpecName: "kube-api-access-fb4q8") pod "580c775c-92d7-4066-a60a-e7b84b7cea5a" (UID: "580c775c-92d7-4066-a60a-e7b84b7cea5a"). InnerVolumeSpecName "kube-api-access-fb4q8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:22:10 crc kubenswrapper[4794]: I0310 10:22:10.695403 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/580c775c-92d7-4066-a60a-e7b84b7cea5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "580c775c-92d7-4066-a60a-e7b84b7cea5a" (UID: "580c775c-92d7-4066-a60a-e7b84b7cea5a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:22:10 crc kubenswrapper[4794]: I0310 10:22:10.740096 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580c775c-92d7-4066-a60a-e7b84b7cea5a-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:22:10 crc kubenswrapper[4794]: I0310 10:22:10.740140 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580c775c-92d7-4066-a60a-e7b84b7cea5a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:22:10 crc kubenswrapper[4794]: I0310 10:22:10.740153 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb4q8\" (UniqueName: \"kubernetes.io/projected/580c775c-92d7-4066-a60a-e7b84b7cea5a-kube-api-access-fb4q8\") on node \"crc\" DevicePath \"\"" Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.158297 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8c9n"] Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.158639 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l8c9n" podUID="b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" containerName="registry-server" containerID="cri-o://8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c" gracePeriod=2 Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.255270 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hbv2" event={"ID":"580c775c-92d7-4066-a60a-e7b84b7cea5a","Type":"ContainerDied","Data":"016fd2c225eaa263165eb4390159850e09a9662950e11a1a63648ecdf9423910"} Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.255372 4794 scope.go:117] "RemoveContainer" containerID="0d0750813b8dddc3bb14cf26cee34b0646b1774b66ad5dee7b9c5415b4f87c22" Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.255547 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9hbv2" Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.293950 4794 scope.go:117] "RemoveContainer" containerID="53efee9058b7a7d1aab4f3d6c6d7645b00980bfbf6f7ffccf52047b70b1c6bcb" Mar 10 10:22:11 crc kubenswrapper[4794]: E0310 10:22:11.316557 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1ed5a3a_ea3b_4467_bb4b_1060b0b302cd.slice/crio-8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c.scope\": RecentStats: unable to find data in memory cache]" Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.335430 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9hbv2"] Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.341105 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9hbv2"] Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.345058 4794 scope.go:117] "RemoveContainer" containerID="c755515f7b250ab30888fe92e55eabffb5fc0b14f4049ee52d2e582f044135b4" Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.638722 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.753605 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-catalog-content\") pod \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\" (UID: \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\") " Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.753662 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn29q\" (UniqueName: \"kubernetes.io/projected/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-kube-api-access-wn29q\") pod \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\" (UID: \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\") " Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.753757 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-utilities\") pod \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\" (UID: \"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd\") " Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.755638 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-utilities" (OuterVolumeSpecName: "utilities") pod "b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" (UID: "b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.757422 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-kube-api-access-wn29q" (OuterVolumeSpecName: "kube-api-access-wn29q") pod "b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" (UID: "b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd"). InnerVolumeSpecName "kube-api-access-wn29q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.791158 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" (UID: "b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.855466 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.855513 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn29q\" (UniqueName: \"kubernetes.io/projected/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-kube-api-access-wn29q\") on node \"crc\" DevicePath \"\"" Mar 10 10:22:11 crc kubenswrapper[4794]: I0310 10:22:11.855531 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.016967 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="580c775c-92d7-4066-a60a-e7b84b7cea5a" path="/var/lib/kubelet/pods/580c775c-92d7-4066-a60a-e7b84b7cea5a/volumes" Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.266275 4794 generic.go:334] "Generic (PLEG): container finished" podID="b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" containerID="8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c" exitCode=0 Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.266375 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8c9n" Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.267408 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8c9n" event={"ID":"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd","Type":"ContainerDied","Data":"8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c"} Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.267569 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8c9n" event={"ID":"b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd","Type":"ContainerDied","Data":"1cff86aa85d6642959f65453041ed8c1fec8efd55bddb52f51aabc349de4bd94"} Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.267676 4794 scope.go:117] "RemoveContainer" containerID="8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c" Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.332543 4794 scope.go:117] "RemoveContainer" containerID="2f61c6d17fd7c8be334eb171ecaefaba754f4a199461b4811c349bcfa20d624f" Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.335001 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8c9n"] Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.339645 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8c9n"] Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.350740 4794 scope.go:117] "RemoveContainer" containerID="cf6558aa222e5bf618c00518009d648403a40539d56bea753eba2c42f483f37b" Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.389400 4794 scope.go:117] "RemoveContainer" containerID="8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c" Mar 10 10:22:12 crc kubenswrapper[4794]: E0310 10:22:12.389821 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c\": container with ID 
Mar 10 10:22:12 crc kubenswrapper[4794]: E0310 10:22:12.389821 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c\": container with ID starting with 8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c not found: ID does not exist" containerID="8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c"
Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.389852 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c"} err="failed to get container status \"8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c\": rpc error: code = NotFound desc = could not find container \"8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c\": container with ID starting with 8b7b51e65fca96115a8f04d8cb41e8e3214ad0c4bd48d49cbbbde196af89911c not found: ID does not exist"
Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.389873 4794 scope.go:117] "RemoveContainer" containerID="2f61c6d17fd7c8be334eb171ecaefaba754f4a199461b4811c349bcfa20d624f"
Mar 10 10:22:12 crc kubenswrapper[4794]: E0310 10:22:12.390291 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f61c6d17fd7c8be334eb171ecaefaba754f4a199461b4811c349bcfa20d624f\": container with ID starting with 2f61c6d17fd7c8be334eb171ecaefaba754f4a199461b4811c349bcfa20d624f not found: ID does not exist" containerID="2f61c6d17fd7c8be334eb171ecaefaba754f4a199461b4811c349bcfa20d624f"
Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.390313 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f61c6d17fd7c8be334eb171ecaefaba754f4a199461b4811c349bcfa20d624f"} err="failed to get container status \"2f61c6d17fd7c8be334eb171ecaefaba754f4a199461b4811c349bcfa20d624f\": rpc error: code = NotFound desc = could not find container \"2f61c6d17fd7c8be334eb171ecaefaba754f4a199461b4811c349bcfa20d624f\": container with ID starting with 2f61c6d17fd7c8be334eb171ecaefaba754f4a199461b4811c349bcfa20d624f not found: ID does not exist"
Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.390324 4794 scope.go:117] "RemoveContainer" containerID="cf6558aa222e5bf618c00518009d648403a40539d56bea753eba2c42f483f37b"
Mar 10 10:22:12 crc kubenswrapper[4794]: E0310 10:22:12.390617 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf6558aa222e5bf618c00518009d648403a40539d56bea753eba2c42f483f37b\": container with ID starting with cf6558aa222e5bf618c00518009d648403a40539d56bea753eba2c42f483f37b not found: ID does not exist" containerID="cf6558aa222e5bf618c00518009d648403a40539d56bea753eba2c42f483f37b"
Mar 10 10:22:12 crc kubenswrapper[4794]: I0310 10:22:12.390657 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf6558aa222e5bf618c00518009d648403a40539d56bea753eba2c42f483f37b"} err="failed to get container status \"cf6558aa222e5bf618c00518009d648403a40539d56bea753eba2c42f483f37b\": rpc error: code = NotFound desc = could not find container \"cf6558aa222e5bf618c00518009d648403a40539d56bea753eba2c42f483f37b\": container with ID starting with cf6558aa222e5bf618c00518009d648403a40539d56bea753eba2c42f483f37b not found: ID does not exist"
Mar 10 10:22:14 crc kubenswrapper[4794]: I0310 10:22:14.014412 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" path="/var/lib/kubelet/pods/b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd/volumes"
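Note: the NotFound errors above are consistent with duplicate deletions rather than runtime failures: RemoveContainer was already logged for 8b7b51..., 2f61c6..., and cf6558... earlier in the teardown, so the second attempt finds the IDs already gone and pod_container_deletor records "DeleteContainer returned error". A minimal sketch (names illustrative; assumes the excerpt as a string) that pairs each RemoveContainer attempt with a NotFound response to confirm nothing else failed:

import re

REMOVE = re.compile(r'"RemoveContainer" containerID="([0-9a-f]{64})"')
NOTFOUND = re.compile(r'"DeleteContainer returned error".*?"ID":"([0-9a-f]{64})"')

def benign_notfounds(journal: str) -> set[str]:
    """IDs whose only deletion 'error' was that they were already removed."""
    removed, notfound = set(), set()
    for line in journal.splitlines():
        if (m := REMOVE.search(line)):
            removed.add(m.group(1))
        if (m := NOTFOUND.search(line)):
            notfound.add(m.group(1))
    return removed & notfound

For this excerpt the intersection is exactly the three redhat-marketplace-l8c9n container IDs, and no NotFound appears without a preceding RemoveContainer.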
Mar 10 10:22:22 crc kubenswrapper[4794]: I0310 10:22:22.968414 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 10:22:22 crc kubenswrapper[4794]: I0310 10:22:22.969172 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 10:22:22 crc kubenswrapper[4794]: I0310 10:22:22.969243 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278"
Mar 10 10:22:22 crc kubenswrapper[4794]: I0310 10:22:22.970256 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f1e2ed57babac0296daa078712b218f38f6d04891fd0d4501ddc9eee4a38ca67"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 10:22:22 crc kubenswrapper[4794]: I0310 10:22:22.970394 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://f1e2ed57babac0296daa078712b218f38f6d04891fd0d4501ddc9eee4a38ca67" gracePeriod=600
Mar 10 10:22:23 crc kubenswrapper[4794]: I0310 10:22:23.375515 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="f1e2ed57babac0296daa078712b218f38f6d04891fd0d4501ddc9eee4a38ca67" exitCode=0
Mar 10 10:22:23 crc kubenswrapper[4794]: I0310 10:22:23.375582 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"f1e2ed57babac0296daa078712b218f38f6d04891fd0d4501ddc9eee4a38ca67"}
Mar 10 10:22:23 crc kubenswrapper[4794]: I0310 10:22:23.375983 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5"}
Mar 10 10:22:23 crc kubenswrapper[4794]: I0310 10:22:23.376028 4794 scope.go:117] "RemoveContainer" containerID="9729570461f7cfbf915875f9b959db1e77a9b0affc73f47b04e5f09f1c4d906d"
Mar 10 10:22:38 crc kubenswrapper[4794]: I0310 10:22:38.514672 4794 scope.go:117] "RemoveContainer" containerID="4d7c9c6915a24bd64369c4f60746e302d0bd841017387f151a97521d36b05e21"
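Note: the liveness probe above is a plain HTTP GET against http://127.0.0.1:8798/health, and the logged failure output (connection refused) means nothing was listening on that port. Once the probe's failure threshold is crossed, the kubelet kills the container with the pod's termination grace period (gracePeriod=600 here) and starts a replacement, which is the ContainerDied/ContainerStarted pair at 10:22:23. A minimal sketch of such a check, with the URL taken from the log and the timeout and helper name illustrative:

import urllib.request
import urllib.error

def http_liveness(url: str = "http://127.0.0.1:8798/health", timeout: float = 1.0) -> bool:
    """Approximation of an HTTP probe: any 2xx/3xx response counts as success."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (urllib.error.URLError, OSError):
        # connection refused / timeout surfaces here -> probe failure
        return False

if __name__ == "__main__":
    print("healthy" if http_liveness() else "unhealthy")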
podUID="580c775c-92d7-4066-a60a-e7b84b7cea5a" containerName="extract-content" Mar 10 10:24:00 crc kubenswrapper[4794]: E0310 10:24:00.200566 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" containerName="extract-content" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.200578 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" containerName="extract-content" Mar 10 10:24:00 crc kubenswrapper[4794]: E0310 10:24:00.200595 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580c775c-92d7-4066-a60a-e7b84b7cea5a" containerName="extract-utilities" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.200609 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="580c775c-92d7-4066-a60a-e7b84b7cea5a" containerName="extract-utilities" Mar 10 10:24:00 crc kubenswrapper[4794]: E0310 10:24:00.200629 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580c775c-92d7-4066-a60a-e7b84b7cea5a" containerName="registry-server" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.200641 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="580c775c-92d7-4066-a60a-e7b84b7cea5a" containerName="registry-server" Mar 10 10:24:00 crc kubenswrapper[4794]: E0310 10:24:00.200665 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" containerName="registry-server" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.200676 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" containerName="registry-server" Mar 10 10:24:00 crc kubenswrapper[4794]: E0310 10:24:00.200694 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" containerName="extract-utilities" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.200706 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" containerName="extract-utilities" Mar 10 10:24:00 crc kubenswrapper[4794]: E0310 10:24:00.200734 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4913cb4e-029b-483b-8002-2d0647230b8b" containerName="oc" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.200746 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4913cb4e-029b-483b-8002-2d0647230b8b" containerName="oc" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.200977 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1ed5a3a-ea3b-4467-bb4b-1060b0b302cd" containerName="registry-server" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.200994 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4913cb4e-029b-483b-8002-2d0647230b8b" containerName="oc" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.201035 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="580c775c-92d7-4066-a60a-e7b84b7cea5a" containerName="registry-server" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.201722 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552304-hq4rd" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.204954 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.205044 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.205402 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.216828 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552304-hq4rd"] Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.355552 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxc6g\" (UniqueName: \"kubernetes.io/projected/70c2796b-dac2-48f7-895e-29f131b8b388-kube-api-access-sxc6g\") pod \"auto-csr-approver-29552304-hq4rd\" (UID: \"70c2796b-dac2-48f7-895e-29f131b8b388\") " pod="openshift-infra/auto-csr-approver-29552304-hq4rd" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.457604 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxc6g\" (UniqueName: \"kubernetes.io/projected/70c2796b-dac2-48f7-895e-29f131b8b388-kube-api-access-sxc6g\") pod \"auto-csr-approver-29552304-hq4rd\" (UID: \"70c2796b-dac2-48f7-895e-29f131b8b388\") " pod="openshift-infra/auto-csr-approver-29552304-hq4rd" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.497374 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxc6g\" (UniqueName: \"kubernetes.io/projected/70c2796b-dac2-48f7-895e-29f131b8b388-kube-api-access-sxc6g\") pod \"auto-csr-approver-29552304-hq4rd\" (UID: \"70c2796b-dac2-48f7-895e-29f131b8b388\") " pod="openshift-infra/auto-csr-approver-29552304-hq4rd" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.527652 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552304-hq4rd" Mar 10 10:24:00 crc kubenswrapper[4794]: I0310 10:24:00.963193 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552304-hq4rd"] Mar 10 10:24:01 crc kubenswrapper[4794]: I0310 10:24:01.241170 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552304-hq4rd" event={"ID":"70c2796b-dac2-48f7-895e-29f131b8b388","Type":"ContainerStarted","Data":"8a81bcf3816e60f898897b77648cb35da743ba433e7d03c345582088b37b4fd2"} Mar 10 10:24:02 crc kubenswrapper[4794]: I0310 10:24:02.249612 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552304-hq4rd" event={"ID":"70c2796b-dac2-48f7-895e-29f131b8b388","Type":"ContainerStarted","Data":"e92ab2e4be97fbeec5e70ee47cae7af566db8a7ad10f751d14bf8b798e413f14"} Mar 10 10:24:02 crc kubenswrapper[4794]: I0310 10:24:02.264147 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552304-hq4rd" podStartSLOduration=1.394519266 podStartE2EDuration="2.264126455s" podCreationTimestamp="2026-03-10 10:24:00 +0000 UTC" firstStartedPulling="2026-03-10 10:24:00.983355864 +0000 UTC m=+2389.739526702" lastFinishedPulling="2026-03-10 10:24:01.852963053 +0000 UTC m=+2390.609133891" observedRunningTime="2026-03-10 10:24:02.260401939 +0000 UTC m=+2391.016572767" watchObservedRunningTime="2026-03-10 10:24:02.264126455 +0000 UTC m=+2391.020297273" Mar 10 10:24:03 crc kubenswrapper[4794]: I0310 10:24:03.263064 4794 generic.go:334] "Generic (PLEG): container finished" podID="70c2796b-dac2-48f7-895e-29f131b8b388" containerID="e92ab2e4be97fbeec5e70ee47cae7af566db8a7ad10f751d14bf8b798e413f14" exitCode=0 Mar 10 10:24:03 crc kubenswrapper[4794]: I0310 10:24:03.263146 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552304-hq4rd" event={"ID":"70c2796b-dac2-48f7-895e-29f131b8b388","Type":"ContainerDied","Data":"e92ab2e4be97fbeec5e70ee47cae7af566db8a7ad10f751d14bf8b798e413f14"} Mar 10 10:24:04 crc kubenswrapper[4794]: I0310 10:24:04.615813 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552304-hq4rd" Mar 10 10:24:04 crc kubenswrapper[4794]: I0310 10:24:04.655117 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxc6g\" (UniqueName: \"kubernetes.io/projected/70c2796b-dac2-48f7-895e-29f131b8b388-kube-api-access-sxc6g\") pod \"70c2796b-dac2-48f7-895e-29f131b8b388\" (UID: \"70c2796b-dac2-48f7-895e-29f131b8b388\") " Mar 10 10:24:04 crc kubenswrapper[4794]: I0310 10:24:04.660249 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c2796b-dac2-48f7-895e-29f131b8b388-kube-api-access-sxc6g" (OuterVolumeSpecName: "kube-api-access-sxc6g") pod "70c2796b-dac2-48f7-895e-29f131b8b388" (UID: "70c2796b-dac2-48f7-895e-29f131b8b388"). InnerVolumeSpecName "kube-api-access-sxc6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:24:04 crc kubenswrapper[4794]: I0310 10:24:04.756982 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxc6g\" (UniqueName: \"kubernetes.io/projected/70c2796b-dac2-48f7-895e-29f131b8b388-kube-api-access-sxc6g\") on node \"crc\" DevicePath \"\"" Mar 10 10:24:05 crc kubenswrapper[4794]: I0310 10:24:05.098812 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552298-ftk4m"] Mar 10 10:24:05 crc kubenswrapper[4794]: I0310 10:24:05.103193 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552298-ftk4m"] Mar 10 10:24:05 crc kubenswrapper[4794]: I0310 10:24:05.280557 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552304-hq4rd" event={"ID":"70c2796b-dac2-48f7-895e-29f131b8b388","Type":"ContainerDied","Data":"8a81bcf3816e60f898897b77648cb35da743ba433e7d03c345582088b37b4fd2"} Mar 10 10:24:05 crc kubenswrapper[4794]: I0310 10:24:05.280598 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a81bcf3816e60f898897b77648cb35da743ba433e7d03c345582088b37b4fd2" Mar 10 10:24:05 crc kubenswrapper[4794]: I0310 10:24:05.280647 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552304-hq4rd" Mar 10 10:24:06 crc kubenswrapper[4794]: I0310 10:24:06.010953 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e220ad51-de67-4b15-b34f-765bb6fd2a82" path="/var/lib/kubelet/pods/e220ad51-de67-4b15-b34f-765bb6fd2a82/volumes" Mar 10 10:24:38 crc kubenswrapper[4794]: I0310 10:24:38.658082 4794 scope.go:117] "RemoveContainer" containerID="93b918edb4bbe861ad47ece5dd9c8dca8a3943f42e398eb544f0d8a92a4f5843" Mar 10 10:24:52 crc kubenswrapper[4794]: I0310 10:24:52.968189 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:24:52 crc kubenswrapper[4794]: I0310 10:24:52.968998 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:25:22 crc kubenswrapper[4794]: I0310 10:25:22.967734 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:25:22 crc kubenswrapper[4794]: I0310 10:25:22.968398 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:25:52 crc kubenswrapper[4794]: I0310 10:25:52.968062 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon 
Mar 10 10:25:52 crc kubenswrapper[4794]: I0310 10:25:52.968062 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 10:25:52 crc kubenswrapper[4794]: I0310 10:25:52.969214 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 10:25:52 crc kubenswrapper[4794]: I0310 10:25:52.969353 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278"
Mar 10 10:25:52 crc kubenswrapper[4794]: I0310 10:25:52.970852 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 10:25:52 crc kubenswrapper[4794]: I0310 10:25:52.970965 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" gracePeriod=600
Mar 10 10:25:53 crc kubenswrapper[4794]: E0310 10:25:53.115757 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 10:25:53 crc kubenswrapper[4794]: I0310 10:25:53.157179 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" exitCode=0
Mar 10 10:25:53 crc kubenswrapper[4794]: I0310 10:25:53.157220 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5"}
Mar 10 10:25:53 crc kubenswrapper[4794]: I0310 10:25:53.157252 4794 scope.go:117] "RemoveContainer" containerID="f1e2ed57babac0296daa078712b218f38f6d04891fd0d4501ddc9eee4a38ca67"
Mar 10 10:25:53 crc kubenswrapper[4794]: I0310 10:25:53.157855 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5"
Mar 10 10:25:53 crc kubenswrapper[4794]: E0310 10:25:53.158057 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
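Note: the "back-off 5m0s" in the CrashLoopBackOff message above is the cap of the kubelet's container restart back-off, which (per the Kubernetes documentation) starts at 10s and doubles after each crash until it reaches five minutes, resetting only after a container has run cleanly for ten minutes. machine-config-daemon has been failing its liveness probe repeatedly since 10:22:22, so it has reached the cap. A small sketch of that schedule (parameter names illustrative):

def backoff_schedule(initial: float = 10.0, cap: float = 300.0, restarts: int = 8):
    """Delay before each restart attempt: 10s, 20s, 40s, ... capped at 5m0s."""
    delay = initial
    for _ in range(restarts):
        yield min(delay, cap)
        delay *= 2

print([f"{int(d)}s" for d in backoff_schedule()])
# ['10s', '20s', '40s', '80s', '160s', '300s', '300s', '300s']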
Mar 10 10:26:00 crc kubenswrapper[4794]: I0310 10:26:00.154207 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552306-27pgh"]
Mar 10 10:26:00 crc kubenswrapper[4794]: E0310 10:26:00.155102 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c2796b-dac2-48f7-895e-29f131b8b388" containerName="oc"
Mar 10 10:26:00 crc kubenswrapper[4794]: I0310 10:26:00.155117 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c2796b-dac2-48f7-895e-29f131b8b388" containerName="oc"
Mar 10 10:26:00 crc kubenswrapper[4794]: I0310 10:26:00.155280 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c2796b-dac2-48f7-895e-29f131b8b388" containerName="oc"
Mar 10 10:26:00 crc kubenswrapper[4794]: I0310 10:26:00.155820 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552306-27pgh"
Mar 10 10:26:00 crc kubenswrapper[4794]: I0310 10:26:00.158784 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 10:26:00 crc kubenswrapper[4794]: I0310 10:26:00.159375 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 10:26:00 crc kubenswrapper[4794]: I0310 10:26:00.162573 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc"
Mar 10 10:26:00 crc kubenswrapper[4794]: I0310 10:26:00.176016 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552306-27pgh"]
Mar 10 10:26:00 crc kubenswrapper[4794]: I0310 10:26:00.252994 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vglcc\" (UniqueName: \"kubernetes.io/projected/25f0a091-76f3-469d-9747-e7b64d723366-kube-api-access-vglcc\") pod \"auto-csr-approver-29552306-27pgh\" (UID: \"25f0a091-76f3-469d-9747-e7b64d723366\") " pod="openshift-infra/auto-csr-approver-29552306-27pgh"
Mar 10 10:26:00 crc kubenswrapper[4794]: I0310 10:26:00.354454 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vglcc\" (UniqueName: \"kubernetes.io/projected/25f0a091-76f3-469d-9747-e7b64d723366-kube-api-access-vglcc\") pod \"auto-csr-approver-29552306-27pgh\" (UID: \"25f0a091-76f3-469d-9747-e7b64d723366\") " pod="openshift-infra/auto-csr-approver-29552306-27pgh"
Mar 10 10:26:00 crc kubenswrapper[4794]: I0310 10:26:00.380194 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vglcc\" (UniqueName: \"kubernetes.io/projected/25f0a091-76f3-469d-9747-e7b64d723366-kube-api-access-vglcc\") pod \"auto-csr-approver-29552306-27pgh\" (UID: \"25f0a091-76f3-469d-9747-e7b64d723366\") " pod="openshift-infra/auto-csr-approver-29552306-27pgh"
Mar 10 10:26:00 crc kubenswrapper[4794]: I0310 10:26:00.480989 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552306-27pgh"
Mar 10 10:26:00 crc kubenswrapper[4794]: I0310 10:26:00.907652 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552306-27pgh"]
Mar 10 10:26:01 crc kubenswrapper[4794]: I0310 10:26:01.233744 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552306-27pgh" event={"ID":"25f0a091-76f3-469d-9747-e7b64d723366","Type":"ContainerStarted","Data":"68577bbba7768113dc5b0ed900c5347c6a7965fd45664495b993b7012388b60e"}
Mar 10 10:26:03 crc kubenswrapper[4794]: I0310 10:26:03.247566 4794 generic.go:334] "Generic (PLEG): container finished" podID="25f0a091-76f3-469d-9747-e7b64d723366" containerID="b0844cbefd80c3d054e9fa40c4fb4523d38d58f4044c04a2cc6115ab980269bd" exitCode=0
Mar 10 10:26:03 crc kubenswrapper[4794]: I0310 10:26:03.247632 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552306-27pgh" event={"ID":"25f0a091-76f3-469d-9747-e7b64d723366","Type":"ContainerDied","Data":"b0844cbefd80c3d054e9fa40c4fb4523d38d58f4044c04a2cc6115ab980269bd"}
Mar 10 10:26:04 crc kubenswrapper[4794]: I0310 10:26:04.554885 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552306-27pgh"
Mar 10 10:26:04 crc kubenswrapper[4794]: I0310 10:26:04.630093 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vglcc\" (UniqueName: \"kubernetes.io/projected/25f0a091-76f3-469d-9747-e7b64d723366-kube-api-access-vglcc\") pod \"25f0a091-76f3-469d-9747-e7b64d723366\" (UID: \"25f0a091-76f3-469d-9747-e7b64d723366\") "
Mar 10 10:26:04 crc kubenswrapper[4794]: I0310 10:26:04.636184 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f0a091-76f3-469d-9747-e7b64d723366-kube-api-access-vglcc" (OuterVolumeSpecName: "kube-api-access-vglcc") pod "25f0a091-76f3-469d-9747-e7b64d723366" (UID: "25f0a091-76f3-469d-9747-e7b64d723366"). InnerVolumeSpecName "kube-api-access-vglcc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:26:04 crc kubenswrapper[4794]: I0310 10:26:04.732248 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vglcc\" (UniqueName: \"kubernetes.io/projected/25f0a091-76f3-469d-9747-e7b64d723366-kube-api-access-vglcc\") on node \"crc\" DevicePath \"\""
Mar 10 10:26:05 crc kubenswrapper[4794]: I0310 10:26:05.265621 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552306-27pgh" event={"ID":"25f0a091-76f3-469d-9747-e7b64d723366","Type":"ContainerDied","Data":"68577bbba7768113dc5b0ed900c5347c6a7965fd45664495b993b7012388b60e"}
Mar 10 10:26:05 crc kubenswrapper[4794]: I0310 10:26:05.265920 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68577bbba7768113dc5b0ed900c5347c6a7965fd45664495b993b7012388b60e"
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552306-27pgh" Mar 10 10:26:05 crc kubenswrapper[4794]: I0310 10:26:05.631256 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552300-j49hg"] Mar 10 10:26:05 crc kubenswrapper[4794]: I0310 10:26:05.639414 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552300-j49hg"] Mar 10 10:26:06 crc kubenswrapper[4794]: I0310 10:26:06.008874 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e559a103-2981-408e-8c74-72067a88f425" path="/var/lib/kubelet/pods/e559a103-2981-408e-8c74-72067a88f425/volumes" Mar 10 10:26:06 crc kubenswrapper[4794]: I0310 10:26:06.998982 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:26:07 crc kubenswrapper[4794]: E0310 10:26:06.999435 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:26:22 crc kubenswrapper[4794]: I0310 10:26:22.009193 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:26:22 crc kubenswrapper[4794]: E0310 10:26:22.010168 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:26:34 crc kubenswrapper[4794]: I0310 10:26:34.999260 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:26:35 crc kubenswrapper[4794]: E0310 10:26:35.000275 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:26:38 crc kubenswrapper[4794]: I0310 10:26:38.789994 4794 scope.go:117] "RemoveContainer" containerID="39a4af9fc60313b55a9936560ffdf562d569ad6a3f55df49f5d292093409d7f8" Mar 10 10:26:50 crc kubenswrapper[4794]: I0310 10:26:49.999517 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:26:50 crc kubenswrapper[4794]: E0310 10:26:50.000601 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 
10:27:02 crc kubenswrapper[4794]: I0310 10:27:02.999199 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:27:03 crc kubenswrapper[4794]: E0310 10:27:03.000160 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:27:14 crc kubenswrapper[4794]: I0310 10:27:13.999906 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:27:14 crc kubenswrapper[4794]: E0310 10:27:14.000918 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:27:26 crc kubenswrapper[4794]: I0310 10:27:26.998959 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:27:27 crc kubenswrapper[4794]: E0310 10:27:26.999629 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:27:39 crc kubenswrapper[4794]: I0310 10:27:38.999572 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:27:39 crc kubenswrapper[4794]: E0310 10:27:39.000666 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:27:51 crc kubenswrapper[4794]: I0310 10:27:51.000660 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:27:51 crc kubenswrapper[4794]: E0310 10:27:51.001638 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:28:00 crc kubenswrapper[4794]: I0310 10:28:00.153147 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552308-ncwj7"] Mar 10 10:28:00 crc 
kubenswrapper[4794]: E0310 10:28:00.154454 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f0a091-76f3-469d-9747-e7b64d723366" containerName="oc" Mar 10 10:28:00 crc kubenswrapper[4794]: I0310 10:28:00.154487 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f0a091-76f3-469d-9747-e7b64d723366" containerName="oc" Mar 10 10:28:00 crc kubenswrapper[4794]: I0310 10:28:00.154849 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="25f0a091-76f3-469d-9747-e7b64d723366" containerName="oc" Mar 10 10:28:00 crc kubenswrapper[4794]: I0310 10:28:00.155863 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552308-ncwj7" Mar 10 10:28:00 crc kubenswrapper[4794]: I0310 10:28:00.159944 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:28:00 crc kubenswrapper[4794]: I0310 10:28:00.160417 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:28:00 crc kubenswrapper[4794]: I0310 10:28:00.160844 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:28:00 crc kubenswrapper[4794]: I0310 10:28:00.162907 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552308-ncwj7"] Mar 10 10:28:00 crc kubenswrapper[4794]: I0310 10:28:00.295778 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nnt9\" (UniqueName: \"kubernetes.io/projected/4ae16337-4b6f-4615-9ee9-3e25b2000903-kube-api-access-8nnt9\") pod \"auto-csr-approver-29552308-ncwj7\" (UID: \"4ae16337-4b6f-4615-9ee9-3e25b2000903\") " pod="openshift-infra/auto-csr-approver-29552308-ncwj7" Mar 10 10:28:00 crc kubenswrapper[4794]: I0310 10:28:00.397241 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nnt9\" (UniqueName: \"kubernetes.io/projected/4ae16337-4b6f-4615-9ee9-3e25b2000903-kube-api-access-8nnt9\") pod \"auto-csr-approver-29552308-ncwj7\" (UID: \"4ae16337-4b6f-4615-9ee9-3e25b2000903\") " pod="openshift-infra/auto-csr-approver-29552308-ncwj7" Mar 10 10:28:00 crc kubenswrapper[4794]: I0310 10:28:00.430534 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nnt9\" (UniqueName: \"kubernetes.io/projected/4ae16337-4b6f-4615-9ee9-3e25b2000903-kube-api-access-8nnt9\") pod \"auto-csr-approver-29552308-ncwj7\" (UID: \"4ae16337-4b6f-4615-9ee9-3e25b2000903\") " pod="openshift-infra/auto-csr-approver-29552308-ncwj7" Mar 10 10:28:00 crc kubenswrapper[4794]: I0310 10:28:00.479596 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552308-ncwj7" Mar 10 10:28:00 crc kubenswrapper[4794]: I0310 10:28:00.923383 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552308-ncwj7"] Mar 10 10:28:00 crc kubenswrapper[4794]: I0310 10:28:00.929013 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 10:28:01 crc kubenswrapper[4794]: I0310 10:28:01.263179 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552308-ncwj7" event={"ID":"4ae16337-4b6f-4615-9ee9-3e25b2000903","Type":"ContainerStarted","Data":"2ce252dd9b94bae55d08df45a5e43c732e3c57166525ccfa5be3642fa8f1a8c9"} Mar 10 10:28:02 crc kubenswrapper[4794]: I0310 10:28:02.270850 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552308-ncwj7" event={"ID":"4ae16337-4b6f-4615-9ee9-3e25b2000903","Type":"ContainerStarted","Data":"b908fcf46dad167b7a1c4364e3591fed4cd00db062410bfea7f4d3a4842112d3"} Mar 10 10:28:02 crc kubenswrapper[4794]: I0310 10:28:02.284540 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552308-ncwj7" podStartSLOduration=1.343989173 podStartE2EDuration="2.284521388s" podCreationTimestamp="2026-03-10 10:28:00 +0000 UTC" firstStartedPulling="2026-03-10 10:28:00.92871628 +0000 UTC m=+2629.684887098" lastFinishedPulling="2026-03-10 10:28:01.869248495 +0000 UTC m=+2630.625419313" observedRunningTime="2026-03-10 10:28:02.283350871 +0000 UTC m=+2631.039521699" watchObservedRunningTime="2026-03-10 10:28:02.284521388 +0000 UTC m=+2631.040692206" Mar 10 10:28:03 crc kubenswrapper[4794]: I0310 10:28:03.281139 4794 generic.go:334] "Generic (PLEG): container finished" podID="4ae16337-4b6f-4615-9ee9-3e25b2000903" containerID="b908fcf46dad167b7a1c4364e3591fed4cd00db062410bfea7f4d3a4842112d3" exitCode=0 Mar 10 10:28:03 crc kubenswrapper[4794]: I0310 10:28:03.281231 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552308-ncwj7" event={"ID":"4ae16337-4b6f-4615-9ee9-3e25b2000903","Type":"ContainerDied","Data":"b908fcf46dad167b7a1c4364e3591fed4cd00db062410bfea7f4d3a4842112d3"} Mar 10 10:28:03 crc kubenswrapper[4794]: I0310 10:28:03.998787 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:28:03 crc kubenswrapper[4794]: E0310 10:28:03.999066 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:28:04 crc kubenswrapper[4794]: I0310 10:28:04.597994 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552308-ncwj7" Mar 10 10:28:04 crc kubenswrapper[4794]: I0310 10:28:04.771622 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nnt9\" (UniqueName: \"kubernetes.io/projected/4ae16337-4b6f-4615-9ee9-3e25b2000903-kube-api-access-8nnt9\") pod \"4ae16337-4b6f-4615-9ee9-3e25b2000903\" (UID: \"4ae16337-4b6f-4615-9ee9-3e25b2000903\") " Mar 10 10:28:04 crc kubenswrapper[4794]: I0310 10:28:04.776681 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae16337-4b6f-4615-9ee9-3e25b2000903-kube-api-access-8nnt9" (OuterVolumeSpecName: "kube-api-access-8nnt9") pod "4ae16337-4b6f-4615-9ee9-3e25b2000903" (UID: "4ae16337-4b6f-4615-9ee9-3e25b2000903"). InnerVolumeSpecName "kube-api-access-8nnt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:28:04 crc kubenswrapper[4794]: I0310 10:28:04.874224 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nnt9\" (UniqueName: \"kubernetes.io/projected/4ae16337-4b6f-4615-9ee9-3e25b2000903-kube-api-access-8nnt9\") on node \"crc\" DevicePath \"\"" Mar 10 10:28:05 crc kubenswrapper[4794]: I0310 10:28:05.098113 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552302-5dk5d"] Mar 10 10:28:05 crc kubenswrapper[4794]: I0310 10:28:05.103613 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552302-5dk5d"] Mar 10 10:28:05 crc kubenswrapper[4794]: I0310 10:28:05.299075 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552308-ncwj7" event={"ID":"4ae16337-4b6f-4615-9ee9-3e25b2000903","Type":"ContainerDied","Data":"2ce252dd9b94bae55d08df45a5e43c732e3c57166525ccfa5be3642fa8f1a8c9"} Mar 10 10:28:05 crc kubenswrapper[4794]: I0310 10:28:05.299134 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ce252dd9b94bae55d08df45a5e43c732e3c57166525ccfa5be3642fa8f1a8c9" Mar 10 10:28:05 crc kubenswrapper[4794]: I0310 10:28:05.299211 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552308-ncwj7" Mar 10 10:28:06 crc kubenswrapper[4794]: I0310 10:28:06.015649 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4913cb4e-029b-483b-8002-2d0647230b8b" path="/var/lib/kubelet/pods/4913cb4e-029b-483b-8002-2d0647230b8b/volumes" Mar 10 10:28:16 crc kubenswrapper[4794]: I0310 10:28:16.000862 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:28:16 crc kubenswrapper[4794]: E0310 10:28:16.001847 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:28:26 crc kubenswrapper[4794]: I0310 10:28:26.999102 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:28:27 crc kubenswrapper[4794]: E0310 10:28:26.999852 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:28:38 crc kubenswrapper[4794]: I0310 10:28:38.890298 4794 scope.go:117] "RemoveContainer" containerID="72928c91eb17f5b84b5fcd129be2f33bdf80f60e3f828b2cf5669559e800dd6a" Mar 10 10:28:40 crc kubenswrapper[4794]: I0310 10:28:40.998526 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:28:40 crc kubenswrapper[4794]: E0310 10:28:40.999016 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:28:53 crc kubenswrapper[4794]: I0310 10:28:53.999061 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:28:54 crc kubenswrapper[4794]: E0310 10:28:53.999964 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:29:06 crc kubenswrapper[4794]: I0310 10:29:06.998697 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:29:07 crc kubenswrapper[4794]: E0310 10:29:06.999554 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:29:18 crc kubenswrapper[4794]: I0310 10:29:17.999974 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:29:18 crc kubenswrapper[4794]: E0310 10:29:18.001063 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:29:30 crc kubenswrapper[4794]: I0310 10:29:29.999693 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:29:30 crc kubenswrapper[4794]: E0310 10:29:30.000348 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:29:44 crc kubenswrapper[4794]: I0310 10:29:44.000185 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:29:44 crc kubenswrapper[4794]: E0310 10:29:44.001408 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:29:57 crc kubenswrapper[4794]: I0310 10:29:57.000414 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:29:57 crc kubenswrapper[4794]: E0310 10:29:57.001168 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.141488 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552310-bd5xh"] Mar 10 10:30:00 crc kubenswrapper[4794]: E0310 10:30:00.142060 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae16337-4b6f-4615-9ee9-3e25b2000903" containerName="oc" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.142072 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae16337-4b6f-4615-9ee9-3e25b2000903" 
containerName="oc" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.142219 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae16337-4b6f-4615-9ee9-3e25b2000903" containerName="oc" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.142653 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552310-bd5xh" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.145448 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.145554 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.151493 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.160428 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552310-bd5xh"] Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.176400 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5"] Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.177796 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.181102 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.181187 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.215352 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5"] Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.301750 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/decc335d-1858-4f2c-b1d4-880766c7b327-config-volume\") pod \"collect-profiles-29552310-7jcl5\" (UID: \"decc335d-1858-4f2c-b1d4-880766c7b327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.301813 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kkvp\" (UniqueName: \"kubernetes.io/projected/441d0540-deea-4dae-8803-ce9d141c2944-kube-api-access-2kkvp\") pod \"auto-csr-approver-29552310-bd5xh\" (UID: \"441d0540-deea-4dae-8803-ce9d141c2944\") " pod="openshift-infra/auto-csr-approver-29552310-bd5xh" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.301946 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/decc335d-1858-4f2c-b1d4-880766c7b327-secret-volume\") pod \"collect-profiles-29552310-7jcl5\" (UID: \"decc335d-1858-4f2c-b1d4-880766c7b327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.302078 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcppp\" (UniqueName: \"kubernetes.io/projected/decc335d-1858-4f2c-b1d4-880766c7b327-kube-api-access-dcppp\") pod \"collect-profiles-29552310-7jcl5\" (UID: \"decc335d-1858-4f2c-b1d4-880766c7b327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.403428 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcppp\" (UniqueName: \"kubernetes.io/projected/decc335d-1858-4f2c-b1d4-880766c7b327-kube-api-access-dcppp\") pod \"collect-profiles-29552310-7jcl5\" (UID: \"decc335d-1858-4f2c-b1d4-880766c7b327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.403634 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/decc335d-1858-4f2c-b1d4-880766c7b327-config-volume\") pod \"collect-profiles-29552310-7jcl5\" (UID: \"decc335d-1858-4f2c-b1d4-880766c7b327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.403682 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kkvp\" (UniqueName: \"kubernetes.io/projected/441d0540-deea-4dae-8803-ce9d141c2944-kube-api-access-2kkvp\") pod \"auto-csr-approver-29552310-bd5xh\" (UID: \"441d0540-deea-4dae-8803-ce9d141c2944\") " pod="openshift-infra/auto-csr-approver-29552310-bd5xh" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.403741 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/decc335d-1858-4f2c-b1d4-880766c7b327-secret-volume\") pod \"collect-profiles-29552310-7jcl5\" (UID: \"decc335d-1858-4f2c-b1d4-880766c7b327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.404955 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/decc335d-1858-4f2c-b1d4-880766c7b327-config-volume\") pod \"collect-profiles-29552310-7jcl5\" (UID: \"decc335d-1858-4f2c-b1d4-880766c7b327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.413751 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/decc335d-1858-4f2c-b1d4-880766c7b327-secret-volume\") pod \"collect-profiles-29552310-7jcl5\" (UID: \"decc335d-1858-4f2c-b1d4-880766c7b327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.419215 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcppp\" (UniqueName: \"kubernetes.io/projected/decc335d-1858-4f2c-b1d4-880766c7b327-kube-api-access-dcppp\") pod \"collect-profiles-29552310-7jcl5\" (UID: \"decc335d-1858-4f2c-b1d4-880766c7b327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.428100 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kkvp\" (UniqueName: 
\"kubernetes.io/projected/441d0540-deea-4dae-8803-ce9d141c2944-kube-api-access-2kkvp\") pod \"auto-csr-approver-29552310-bd5xh\" (UID: \"441d0540-deea-4dae-8803-ce9d141c2944\") " pod="openshift-infra/auto-csr-approver-29552310-bd5xh" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.469603 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552310-bd5xh" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.521475 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" Mar 10 10:30:00 crc kubenswrapper[4794]: I0310 10:30:00.740655 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552310-bd5xh"] Mar 10 10:30:01 crc kubenswrapper[4794]: I0310 10:30:01.008836 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5"] Mar 10 10:30:01 crc kubenswrapper[4794]: W0310 10:30:01.014684 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddecc335d_1858_4f2c_b1d4_880766c7b327.slice/crio-65a75c26a0f7f5e2f2b661c948797ee5821fbd421701baaaa45d7b757b6a33e1 WatchSource:0}: Error finding container 65a75c26a0f7f5e2f2b661c948797ee5821fbd421701baaaa45d7b757b6a33e1: Status 404 returned error can't find the container with id 65a75c26a0f7f5e2f2b661c948797ee5821fbd421701baaaa45d7b757b6a33e1 Mar 10 10:30:01 crc kubenswrapper[4794]: I0310 10:30:01.222359 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552310-bd5xh" event={"ID":"441d0540-deea-4dae-8803-ce9d141c2944","Type":"ContainerStarted","Data":"5a0f5da5afafb10af883c5ad073a2088e0fcf96107d9447c96879f3c5cdb522a"} Mar 10 10:30:01 crc kubenswrapper[4794]: I0310 10:30:01.224678 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" event={"ID":"decc335d-1858-4f2c-b1d4-880766c7b327","Type":"ContainerStarted","Data":"53e9eb9caa6ee4f53b43c42b40530a755d11265dd8e0eea792637b9e0b86997a"} Mar 10 10:30:01 crc kubenswrapper[4794]: I0310 10:30:01.224702 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" event={"ID":"decc335d-1858-4f2c-b1d4-880766c7b327","Type":"ContainerStarted","Data":"65a75c26a0f7f5e2f2b661c948797ee5821fbd421701baaaa45d7b757b6a33e1"} Mar 10 10:30:01 crc kubenswrapper[4794]: I0310 10:30:01.247191 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" podStartSLOduration=1.24715974 podStartE2EDuration="1.24715974s" podCreationTimestamp="2026-03-10 10:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:30:01.243567188 +0000 UTC m=+2749.999738006" watchObservedRunningTime="2026-03-10 10:30:01.24715974 +0000 UTC m=+2750.003330568" Mar 10 10:30:02 crc kubenswrapper[4794]: I0310 10:30:02.234988 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552310-bd5xh" event={"ID":"441d0540-deea-4dae-8803-ce9d141c2944","Type":"ContainerStarted","Data":"52b56d08dff0686f2d040169fda9eb3b35ef2500bf07e885d1d7d3b8ac17a5f4"} Mar 10 10:30:02 crc kubenswrapper[4794]: I0310 10:30:02.237086 4794 
generic.go:334] "Generic (PLEG): container finished" podID="decc335d-1858-4f2c-b1d4-880766c7b327" containerID="53e9eb9caa6ee4f53b43c42b40530a755d11265dd8e0eea792637b9e0b86997a" exitCode=0 Mar 10 10:30:02 crc kubenswrapper[4794]: I0310 10:30:02.237158 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" event={"ID":"decc335d-1858-4f2c-b1d4-880766c7b327","Type":"ContainerDied","Data":"53e9eb9caa6ee4f53b43c42b40530a755d11265dd8e0eea792637b9e0b86997a"} Mar 10 10:30:02 crc kubenswrapper[4794]: I0310 10:30:02.256116 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552310-bd5xh" podStartSLOduration=1.10513627 podStartE2EDuration="2.256088514s" podCreationTimestamp="2026-03-10 10:30:00 +0000 UTC" firstStartedPulling="2026-03-10 10:30:00.752193093 +0000 UTC m=+2749.508363901" lastFinishedPulling="2026-03-10 10:30:01.903145277 +0000 UTC m=+2750.659316145" observedRunningTime="2026-03-10 10:30:02.247523567 +0000 UTC m=+2751.003694405" watchObservedRunningTime="2026-03-10 10:30:02.256088514 +0000 UTC m=+2751.012259332" Mar 10 10:30:03 crc kubenswrapper[4794]: I0310 10:30:03.245988 4794 generic.go:334] "Generic (PLEG): container finished" podID="441d0540-deea-4dae-8803-ce9d141c2944" containerID="52b56d08dff0686f2d040169fda9eb3b35ef2500bf07e885d1d7d3b8ac17a5f4" exitCode=0 Mar 10 10:30:03 crc kubenswrapper[4794]: I0310 10:30:03.246075 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552310-bd5xh" event={"ID":"441d0540-deea-4dae-8803-ce9d141c2944","Type":"ContainerDied","Data":"52b56d08dff0686f2d040169fda9eb3b35ef2500bf07e885d1d7d3b8ac17a5f4"} Mar 10 10:30:03 crc kubenswrapper[4794]: I0310 10:30:03.532229 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" Mar 10 10:30:03 crc kubenswrapper[4794]: I0310 10:30:03.652778 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcppp\" (UniqueName: \"kubernetes.io/projected/decc335d-1858-4f2c-b1d4-880766c7b327-kube-api-access-dcppp\") pod \"decc335d-1858-4f2c-b1d4-880766c7b327\" (UID: \"decc335d-1858-4f2c-b1d4-880766c7b327\") " Mar 10 10:30:03 crc kubenswrapper[4794]: I0310 10:30:03.652867 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/decc335d-1858-4f2c-b1d4-880766c7b327-secret-volume\") pod \"decc335d-1858-4f2c-b1d4-880766c7b327\" (UID: \"decc335d-1858-4f2c-b1d4-880766c7b327\") " Mar 10 10:30:03 crc kubenswrapper[4794]: I0310 10:30:03.652946 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/decc335d-1858-4f2c-b1d4-880766c7b327-config-volume\") pod \"decc335d-1858-4f2c-b1d4-880766c7b327\" (UID: \"decc335d-1858-4f2c-b1d4-880766c7b327\") " Mar 10 10:30:03 crc kubenswrapper[4794]: I0310 10:30:03.653743 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/decc335d-1858-4f2c-b1d4-880766c7b327-config-volume" (OuterVolumeSpecName: "config-volume") pod "decc335d-1858-4f2c-b1d4-880766c7b327" (UID: "decc335d-1858-4f2c-b1d4-880766c7b327"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:30:03 crc kubenswrapper[4794]: I0310 10:30:03.657255 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/decc335d-1858-4f2c-b1d4-880766c7b327-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "decc335d-1858-4f2c-b1d4-880766c7b327" (UID: "decc335d-1858-4f2c-b1d4-880766c7b327"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:30:03 crc kubenswrapper[4794]: I0310 10:30:03.657642 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/decc335d-1858-4f2c-b1d4-880766c7b327-kube-api-access-dcppp" (OuterVolumeSpecName: "kube-api-access-dcppp") pod "decc335d-1858-4f2c-b1d4-880766c7b327" (UID: "decc335d-1858-4f2c-b1d4-880766c7b327"). InnerVolumeSpecName "kube-api-access-dcppp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:30:03 crc kubenswrapper[4794]: I0310 10:30:03.754509 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcppp\" (UniqueName: \"kubernetes.io/projected/decc335d-1858-4f2c-b1d4-880766c7b327-kube-api-access-dcppp\") on node \"crc\" DevicePath \"\"" Mar 10 10:30:03 crc kubenswrapper[4794]: I0310 10:30:03.754564 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/decc335d-1858-4f2c-b1d4-880766c7b327-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 10:30:03 crc kubenswrapper[4794]: I0310 10:30:03.754579 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/decc335d-1858-4f2c-b1d4-880766c7b327-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 10:30:04 crc kubenswrapper[4794]: I0310 10:30:04.254706 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" event={"ID":"decc335d-1858-4f2c-b1d4-880766c7b327","Type":"ContainerDied","Data":"65a75c26a0f7f5e2f2b661c948797ee5821fbd421701baaaa45d7b757b6a33e1"} Mar 10 10:30:04 crc kubenswrapper[4794]: I0310 10:30:04.254768 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65a75c26a0f7f5e2f2b661c948797ee5821fbd421701baaaa45d7b757b6a33e1" Mar 10 10:30:04 crc kubenswrapper[4794]: I0310 10:30:04.254804 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5" Mar 10 10:30:04 crc kubenswrapper[4794]: I0310 10:30:04.323511 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm"] Mar 10 10:30:04 crc kubenswrapper[4794]: I0310 10:30:04.328061 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552265-smnpm"] Mar 10 10:30:04 crc kubenswrapper[4794]: I0310 10:30:04.537878 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552310-bd5xh" Mar 10 10:30:04 crc kubenswrapper[4794]: I0310 10:30:04.667112 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kkvp\" (UniqueName: \"kubernetes.io/projected/441d0540-deea-4dae-8803-ce9d141c2944-kube-api-access-2kkvp\") pod \"441d0540-deea-4dae-8803-ce9d141c2944\" (UID: \"441d0540-deea-4dae-8803-ce9d141c2944\") " Mar 10 10:30:04 crc kubenswrapper[4794]: I0310 10:30:04.672767 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/441d0540-deea-4dae-8803-ce9d141c2944-kube-api-access-2kkvp" (OuterVolumeSpecName: "kube-api-access-2kkvp") pod "441d0540-deea-4dae-8803-ce9d141c2944" (UID: "441d0540-deea-4dae-8803-ce9d141c2944"). InnerVolumeSpecName "kube-api-access-2kkvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:30:04 crc kubenswrapper[4794]: I0310 10:30:04.769473 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kkvp\" (UniqueName: \"kubernetes.io/projected/441d0540-deea-4dae-8803-ce9d141c2944-kube-api-access-2kkvp\") on node \"crc\" DevicePath \"\"" Mar 10 10:30:05 crc kubenswrapper[4794]: I0310 10:30:05.078097 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552304-hq4rd"] Mar 10 10:30:05 crc kubenswrapper[4794]: I0310 10:30:05.083613 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552304-hq4rd"] Mar 10 10:30:05 crc kubenswrapper[4794]: I0310 10:30:05.262440 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552310-bd5xh" event={"ID":"441d0540-deea-4dae-8803-ce9d141c2944","Type":"ContainerDied","Data":"5a0f5da5afafb10af883c5ad073a2088e0fcf96107d9447c96879f3c5cdb522a"} Mar 10 10:30:05 crc kubenswrapper[4794]: I0310 10:30:05.262475 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a0f5da5afafb10af883c5ad073a2088e0fcf96107d9447c96879f3c5cdb522a" Mar 10 10:30:05 crc kubenswrapper[4794]: I0310 10:30:05.262524 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552310-bd5xh" Mar 10 10:30:06 crc kubenswrapper[4794]: I0310 10:30:06.008670 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c2796b-dac2-48f7-895e-29f131b8b388" path="/var/lib/kubelet/pods/70c2796b-dac2-48f7-895e-29f131b8b388/volumes" Mar 10 10:30:06 crc kubenswrapper[4794]: I0310 10:30:06.009545 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eece3bac-ab7c-4a16-82ae-35775eef8806" path="/var/lib/kubelet/pods/eece3bac-ab7c-4a16-82ae-35775eef8806/volumes" Mar 10 10:30:12 crc kubenswrapper[4794]: I0310 10:30:12.004097 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:30:12 crc kubenswrapper[4794]: E0310 10:30:12.004450 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:30:24 crc kubenswrapper[4794]: I0310 10:30:24.000076 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:30:24 crc kubenswrapper[4794]: E0310 10:30:24.000986 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:30:36 crc kubenswrapper[4794]: I0310 10:30:36.999068 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:30:37 crc kubenswrapper[4794]: E0310 10:30:36.999775 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:30:38 crc kubenswrapper[4794]: I0310 10:30:38.998284 4794 scope.go:117] "RemoveContainer" containerID="9359a9c9fbf9b27a25c133709f0cca4798f0af917d86e842a532f8a026f6b7c7" Mar 10 10:30:39 crc kubenswrapper[4794]: I0310 10:30:39.046257 4794 scope.go:117] "RemoveContainer" containerID="e92ab2e4be97fbeec5e70ee47cae7af566db8a7ad10f751d14bf8b798e413f14" Mar 10 10:30:52 crc kubenswrapper[4794]: I0310 10:30:52.003597 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:30:52 crc kubenswrapper[4794]: E0310 10:30:52.005092 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:31:07 crc kubenswrapper[4794]: I0310 10:31:06.999285 4794 scope.go:117] "RemoveContainer" containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:31:07 crc kubenswrapper[4794]: I0310 10:31:07.744653 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"ce542eaa03554afd6301ca429d8e5d5b288e9f73610638883d1131444c25e394"} Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.118096 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fd6s6"] Mar 10 10:31:15 crc kubenswrapper[4794]: E0310 10:31:15.120207 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="441d0540-deea-4dae-8803-ce9d141c2944" containerName="oc" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.120244 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="441d0540-deea-4dae-8803-ce9d141c2944" containerName="oc" Mar 10 10:31:15 crc kubenswrapper[4794]: E0310 10:31:15.120300 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="decc335d-1858-4f2c-b1d4-880766c7b327" containerName="collect-profiles" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.120317 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="decc335d-1858-4f2c-b1d4-880766c7b327" containerName="collect-profiles" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.120716 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="441d0540-deea-4dae-8803-ce9d141c2944" containerName="oc" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.120751 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="decc335d-1858-4f2c-b1d4-880766c7b327" containerName="collect-profiles" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.123209 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.139666 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fd6s6"] Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.218060 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02dd1e4e-c396-44f9-b19f-67a645e01de6-utilities\") pod \"redhat-operators-fd6s6\" (UID: \"02dd1e4e-c396-44f9-b19f-67a645e01de6\") " pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.218135 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxmqd\" (UniqueName: \"kubernetes.io/projected/02dd1e4e-c396-44f9-b19f-67a645e01de6-kube-api-access-lxmqd\") pod \"redhat-operators-fd6s6\" (UID: \"02dd1e4e-c396-44f9-b19f-67a645e01de6\") " pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.218162 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02dd1e4e-c396-44f9-b19f-67a645e01de6-catalog-content\") pod \"redhat-operators-fd6s6\" (UID: \"02dd1e4e-c396-44f9-b19f-67a645e01de6\") " pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.320142 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02dd1e4e-c396-44f9-b19f-67a645e01de6-utilities\") pod \"redhat-operators-fd6s6\" (UID: \"02dd1e4e-c396-44f9-b19f-67a645e01de6\") " pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.320215 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxmqd\" (UniqueName: \"kubernetes.io/projected/02dd1e4e-c396-44f9-b19f-67a645e01de6-kube-api-access-lxmqd\") pod \"redhat-operators-fd6s6\" (UID: \"02dd1e4e-c396-44f9-b19f-67a645e01de6\") " pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.320244 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02dd1e4e-c396-44f9-b19f-67a645e01de6-catalog-content\") pod \"redhat-operators-fd6s6\" (UID: \"02dd1e4e-c396-44f9-b19f-67a645e01de6\") " pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.320741 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02dd1e4e-c396-44f9-b19f-67a645e01de6-utilities\") pod \"redhat-operators-fd6s6\" (UID: \"02dd1e4e-c396-44f9-b19f-67a645e01de6\") " pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.320765 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02dd1e4e-c396-44f9-b19f-67a645e01de6-catalog-content\") pod \"redhat-operators-fd6s6\" (UID: \"02dd1e4e-c396-44f9-b19f-67a645e01de6\") " pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.344864 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lxmqd\" (UniqueName: \"kubernetes.io/projected/02dd1e4e-c396-44f9-b19f-67a645e01de6-kube-api-access-lxmqd\") pod \"redhat-operators-fd6s6\" (UID: \"02dd1e4e-c396-44f9-b19f-67a645e01de6\") " pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.459843 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:15 crc kubenswrapper[4794]: I0310 10:31:15.900565 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fd6s6"] Mar 10 10:31:16 crc kubenswrapper[4794]: I0310 10:31:16.823329 4794 generic.go:334] "Generic (PLEG): container finished" podID="02dd1e4e-c396-44f9-b19f-67a645e01de6" containerID="68b030a6f2f415b3d23cdd995769ebf4c786f98ff9a093caa0c1a6ad0baee299" exitCode=0 Mar 10 10:31:16 crc kubenswrapper[4794]: I0310 10:31:16.823565 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fd6s6" event={"ID":"02dd1e4e-c396-44f9-b19f-67a645e01de6","Type":"ContainerDied","Data":"68b030a6f2f415b3d23cdd995769ebf4c786f98ff9a093caa0c1a6ad0baee299"} Mar 10 10:31:16 crc kubenswrapper[4794]: I0310 10:31:16.823737 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fd6s6" event={"ID":"02dd1e4e-c396-44f9-b19f-67a645e01de6","Type":"ContainerStarted","Data":"fcabce62b5ae0e55847823949b921f305e7f0f62d5fe8d887f0b2adf430d9970"} Mar 10 10:31:17 crc kubenswrapper[4794]: I0310 10:31:17.831162 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fd6s6" event={"ID":"02dd1e4e-c396-44f9-b19f-67a645e01de6","Type":"ContainerStarted","Data":"f1d818b8b1fa2cacf492cf4427ad4f4d6fc4901a155342e5f9c12dba5271cb67"} Mar 10 10:31:18 crc kubenswrapper[4794]: I0310 10:31:18.838714 4794 generic.go:334] "Generic (PLEG): container finished" podID="02dd1e4e-c396-44f9-b19f-67a645e01de6" containerID="f1d818b8b1fa2cacf492cf4427ad4f4d6fc4901a155342e5f9c12dba5271cb67" exitCode=0 Mar 10 10:31:18 crc kubenswrapper[4794]: I0310 10:31:18.838761 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fd6s6" event={"ID":"02dd1e4e-c396-44f9-b19f-67a645e01de6","Type":"ContainerDied","Data":"f1d818b8b1fa2cacf492cf4427ad4f4d6fc4901a155342e5f9c12dba5271cb67"} Mar 10 10:31:19 crc kubenswrapper[4794]: I0310 10:31:19.851557 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fd6s6" event={"ID":"02dd1e4e-c396-44f9-b19f-67a645e01de6","Type":"ContainerStarted","Data":"fac8abc0a65fedd28f49898fc8231d258d3c4d6fe81db71a85903746ac889553"} Mar 10 10:31:19 crc kubenswrapper[4794]: I0310 10:31:19.869629 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fd6s6" podStartSLOduration=2.39570326 podStartE2EDuration="4.869602092s" podCreationTimestamp="2026-03-10 10:31:15 +0000 UTC" firstStartedPulling="2026-03-10 10:31:16.826741826 +0000 UTC m=+2825.582912654" lastFinishedPulling="2026-03-10 10:31:19.300640668 +0000 UTC m=+2828.056811486" observedRunningTime="2026-03-10 10:31:19.867545018 +0000 UTC m=+2828.623715826" watchObservedRunningTime="2026-03-10 10:31:19.869602092 +0000 UTC m=+2828.625772930" Mar 10 10:31:25 crc kubenswrapper[4794]: I0310 10:31:25.460766 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fd6s6" Mar 
10 10:31:25 crc kubenswrapper[4794]: I0310 10:31:25.460857 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:26 crc kubenswrapper[4794]: I0310 10:31:26.514270 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fd6s6" podUID="02dd1e4e-c396-44f9-b19f-67a645e01de6" containerName="registry-server" probeResult="failure" output=< Mar 10 10:31:26 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 10:31:26 crc kubenswrapper[4794]: > Mar 10 10:31:35 crc kubenswrapper[4794]: I0310 10:31:35.524761 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:35 crc kubenswrapper[4794]: I0310 10:31:35.571833 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:35 crc kubenswrapper[4794]: I0310 10:31:35.759078 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fd6s6"] Mar 10 10:31:36 crc kubenswrapper[4794]: I0310 10:31:36.995072 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fd6s6" podUID="02dd1e4e-c396-44f9-b19f-67a645e01de6" containerName="registry-server" containerID="cri-o://fac8abc0a65fedd28f49898fc8231d258d3c4d6fe81db71a85903746ac889553" gracePeriod=2 Mar 10 10:31:37 crc kubenswrapper[4794]: I0310 10:31:37.437290 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:37 crc kubenswrapper[4794]: I0310 10:31:37.569284 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02dd1e4e-c396-44f9-b19f-67a645e01de6-utilities\") pod \"02dd1e4e-c396-44f9-b19f-67a645e01de6\" (UID: \"02dd1e4e-c396-44f9-b19f-67a645e01de6\") " Mar 10 10:31:37 crc kubenswrapper[4794]: I0310 10:31:37.569414 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02dd1e4e-c396-44f9-b19f-67a645e01de6-catalog-content\") pod \"02dd1e4e-c396-44f9-b19f-67a645e01de6\" (UID: \"02dd1e4e-c396-44f9-b19f-67a645e01de6\") " Mar 10 10:31:37 crc kubenswrapper[4794]: I0310 10:31:37.569474 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxmqd\" (UniqueName: \"kubernetes.io/projected/02dd1e4e-c396-44f9-b19f-67a645e01de6-kube-api-access-lxmqd\") pod \"02dd1e4e-c396-44f9-b19f-67a645e01de6\" (UID: \"02dd1e4e-c396-44f9-b19f-67a645e01de6\") " Mar 10 10:31:37 crc kubenswrapper[4794]: I0310 10:31:37.570254 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02dd1e4e-c396-44f9-b19f-67a645e01de6-utilities" (OuterVolumeSpecName: "utilities") pod "02dd1e4e-c396-44f9-b19f-67a645e01de6" (UID: "02dd1e4e-c396-44f9-b19f-67a645e01de6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:31:37 crc kubenswrapper[4794]: I0310 10:31:37.575576 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02dd1e4e-c396-44f9-b19f-67a645e01de6-kube-api-access-lxmqd" (OuterVolumeSpecName: "kube-api-access-lxmqd") pod "02dd1e4e-c396-44f9-b19f-67a645e01de6" (UID: "02dd1e4e-c396-44f9-b19f-67a645e01de6"). InnerVolumeSpecName "kube-api-access-lxmqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:31:37 crc kubenswrapper[4794]: I0310 10:31:37.671570 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02dd1e4e-c396-44f9-b19f-67a645e01de6-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:31:37 crc kubenswrapper[4794]: I0310 10:31:37.671614 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxmqd\" (UniqueName: \"kubernetes.io/projected/02dd1e4e-c396-44f9-b19f-67a645e01de6-kube-api-access-lxmqd\") on node \"crc\" DevicePath \"\"" Mar 10 10:31:37 crc kubenswrapper[4794]: I0310 10:31:37.715972 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02dd1e4e-c396-44f9-b19f-67a645e01de6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02dd1e4e-c396-44f9-b19f-67a645e01de6" (UID: "02dd1e4e-c396-44f9-b19f-67a645e01de6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:31:37 crc kubenswrapper[4794]: I0310 10:31:37.773363 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02dd1e4e-c396-44f9-b19f-67a645e01de6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.008449 4794 generic.go:334] "Generic (PLEG): container finished" podID="02dd1e4e-c396-44f9-b19f-67a645e01de6" containerID="fac8abc0a65fedd28f49898fc8231d258d3c4d6fe81db71a85903746ac889553" exitCode=0 Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.008656 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fd6s6" Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.009040 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fd6s6" event={"ID":"02dd1e4e-c396-44f9-b19f-67a645e01de6","Type":"ContainerDied","Data":"fac8abc0a65fedd28f49898fc8231d258d3c4d6fe81db71a85903746ac889553"} Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.009065 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fd6s6" event={"ID":"02dd1e4e-c396-44f9-b19f-67a645e01de6","Type":"ContainerDied","Data":"fcabce62b5ae0e55847823949b921f305e7f0f62d5fe8d887f0b2adf430d9970"} Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.009121 4794 scope.go:117] "RemoveContainer" containerID="fac8abc0a65fedd28f49898fc8231d258d3c4d6fe81db71a85903746ac889553" Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.031644 4794 scope.go:117] "RemoveContainer" containerID="f1d818b8b1fa2cacf492cf4427ad4f4d6fc4901a155342e5f9c12dba5271cb67" Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.052626 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fd6s6"] Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.064307 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fd6s6"] Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.083659 4794 scope.go:117] "RemoveContainer" containerID="68b030a6f2f415b3d23cdd995769ebf4c786f98ff9a093caa0c1a6ad0baee299" Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.108208 4794 scope.go:117] "RemoveContainer" containerID="fac8abc0a65fedd28f49898fc8231d258d3c4d6fe81db71a85903746ac889553" Mar 10 10:31:38 crc kubenswrapper[4794]: E0310 10:31:38.108808 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac8abc0a65fedd28f49898fc8231d258d3c4d6fe81db71a85903746ac889553\": container with ID starting with fac8abc0a65fedd28f49898fc8231d258d3c4d6fe81db71a85903746ac889553 not found: ID does not exist" containerID="fac8abc0a65fedd28f49898fc8231d258d3c4d6fe81db71a85903746ac889553" Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.108858 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac8abc0a65fedd28f49898fc8231d258d3c4d6fe81db71a85903746ac889553"} err="failed to get container status \"fac8abc0a65fedd28f49898fc8231d258d3c4d6fe81db71a85903746ac889553\": rpc error: code = NotFound desc = could not find container \"fac8abc0a65fedd28f49898fc8231d258d3c4d6fe81db71a85903746ac889553\": container with ID starting with fac8abc0a65fedd28f49898fc8231d258d3c4d6fe81db71a85903746ac889553 not found: ID does not exist" Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.108892 4794 scope.go:117] "RemoveContainer" containerID="f1d818b8b1fa2cacf492cf4427ad4f4d6fc4901a155342e5f9c12dba5271cb67" Mar 10 10:31:38 crc kubenswrapper[4794]: E0310 10:31:38.109296 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1d818b8b1fa2cacf492cf4427ad4f4d6fc4901a155342e5f9c12dba5271cb67\": container with ID starting with f1d818b8b1fa2cacf492cf4427ad4f4d6fc4901a155342e5f9c12dba5271cb67 not found: ID does not exist" containerID="f1d818b8b1fa2cacf492cf4427ad4f4d6fc4901a155342e5f9c12dba5271cb67" Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.109383 4794 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1d818b8b1fa2cacf492cf4427ad4f4d6fc4901a155342e5f9c12dba5271cb67"} err="failed to get container status \"f1d818b8b1fa2cacf492cf4427ad4f4d6fc4901a155342e5f9c12dba5271cb67\": rpc error: code = NotFound desc = could not find container \"f1d818b8b1fa2cacf492cf4427ad4f4d6fc4901a155342e5f9c12dba5271cb67\": container with ID starting with f1d818b8b1fa2cacf492cf4427ad4f4d6fc4901a155342e5f9c12dba5271cb67 not found: ID does not exist" Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.109423 4794 scope.go:117] "RemoveContainer" containerID="68b030a6f2f415b3d23cdd995769ebf4c786f98ff9a093caa0c1a6ad0baee299" Mar 10 10:31:38 crc kubenswrapper[4794]: E0310 10:31:38.109924 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68b030a6f2f415b3d23cdd995769ebf4c786f98ff9a093caa0c1a6ad0baee299\": container with ID starting with 68b030a6f2f415b3d23cdd995769ebf4c786f98ff9a093caa0c1a6ad0baee299 not found: ID does not exist" containerID="68b030a6f2f415b3d23cdd995769ebf4c786f98ff9a093caa0c1a6ad0baee299" Mar 10 10:31:38 crc kubenswrapper[4794]: I0310 10:31:38.109972 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b030a6f2f415b3d23cdd995769ebf4c786f98ff9a093caa0c1a6ad0baee299"} err="failed to get container status \"68b030a6f2f415b3d23cdd995769ebf4c786f98ff9a093caa0c1a6ad0baee299\": rpc error: code = NotFound desc = could not find container \"68b030a6f2f415b3d23cdd995769ebf4c786f98ff9a093caa0c1a6ad0baee299\": container with ID starting with 68b030a6f2f415b3d23cdd995769ebf4c786f98ff9a093caa0c1a6ad0baee299 not found: ID does not exist" Mar 10 10:31:40 crc kubenswrapper[4794]: I0310 10:31:40.013524 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02dd1e4e-c396-44f9-b19f-67a645e01de6" path="/var/lib/kubelet/pods/02dd1e4e-c396-44f9-b19f-67a645e01de6/volumes" Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.159651 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552312-6zv4m"] Mar 10 10:32:00 crc kubenswrapper[4794]: E0310 10:32:00.160549 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02dd1e4e-c396-44f9-b19f-67a645e01de6" containerName="extract-content" Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.160564 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="02dd1e4e-c396-44f9-b19f-67a645e01de6" containerName="extract-content" Mar 10 10:32:00 crc kubenswrapper[4794]: E0310 10:32:00.160592 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02dd1e4e-c396-44f9-b19f-67a645e01de6" containerName="extract-utilities" Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.160600 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="02dd1e4e-c396-44f9-b19f-67a645e01de6" containerName="extract-utilities" Mar 10 10:32:00 crc kubenswrapper[4794]: E0310 10:32:00.160626 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02dd1e4e-c396-44f9-b19f-67a645e01de6" containerName="registry-server" Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.160636 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="02dd1e4e-c396-44f9-b19f-67a645e01de6" containerName="registry-server" Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.160813 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="02dd1e4e-c396-44f9-b19f-67a645e01de6" 
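Every entry in this journal shares one shape: the journald prefix ("Mar 10 ... crc kubenswrapper[4794]:"), then a klog header (severity letter I/W/E, an MMDD date, wall-clock time, the PID, and source file:line), then the message. For slicing the stream by severity or by source file, here is a minimal Go parser sketch, assuming the journal is fed one entry per line on stdin; the regex and the output layout are illustrative choices, not anything the kubelet ships:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    // Matches the klog header kubenswrapper emits after the journald prefix:
    // <severity><MMDD> <HH:MM:SS.micros> <pid> <file:line>] <message>
    var klogRe = regexp.MustCompile(
        `kubenswrapper\[\d+\]: ([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w./-]+:\d+)\] (.*)$`)

    func main() {
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // single entries can be very long
        for sc.Scan() {
            m := klogRe.FindStringSubmatch(sc.Text())
            if m == nil {
                continue // non-klog journald noise or a continuation fragment
            }
            severity, date, clock, src, msg := m[1], m[2], m[3], m[5], m[6]
            fmt.Printf("%s %s %s %-30s %s\n", severity, date, clock, src, msg)
        }
    }

Piping the journal through a filter like this makes it easy to pull, say, only the E-severity lines from log.go:32 that recur below.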
containerName="registry-server" Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.161375 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552312-6zv4m" Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.165978 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.165978 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.166058 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.171494 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552312-6zv4m"] Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.307256 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lh6s\" (UniqueName: \"kubernetes.io/projected/dfb4f35d-3ce4-469f-bcf4-6a18666d7a60-kube-api-access-9lh6s\") pod \"auto-csr-approver-29552312-6zv4m\" (UID: \"dfb4f35d-3ce4-469f-bcf4-6a18666d7a60\") " pod="openshift-infra/auto-csr-approver-29552312-6zv4m" Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.408840 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lh6s\" (UniqueName: \"kubernetes.io/projected/dfb4f35d-3ce4-469f-bcf4-6a18666d7a60-kube-api-access-9lh6s\") pod \"auto-csr-approver-29552312-6zv4m\" (UID: \"dfb4f35d-3ce4-469f-bcf4-6a18666d7a60\") " pod="openshift-infra/auto-csr-approver-29552312-6zv4m" Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.432946 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lh6s\" (UniqueName: \"kubernetes.io/projected/dfb4f35d-3ce4-469f-bcf4-6a18666d7a60-kube-api-access-9lh6s\") pod \"auto-csr-approver-29552312-6zv4m\" (UID: \"dfb4f35d-3ce4-469f-bcf4-6a18666d7a60\") " pod="openshift-infra/auto-csr-approver-29552312-6zv4m" Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.482686 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552312-6zv4m" Mar 10 10:32:00 crc kubenswrapper[4794]: I0310 10:32:00.940644 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552312-6zv4m"] Mar 10 10:32:01 crc kubenswrapper[4794]: I0310 10:32:01.205008 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552312-6zv4m" event={"ID":"dfb4f35d-3ce4-469f-bcf4-6a18666d7a60","Type":"ContainerStarted","Data":"d529735a848a3a95f614ef29217e664ca084e887ea4112064871f104585cc87f"} Mar 10 10:32:03 crc kubenswrapper[4794]: I0310 10:32:03.221144 4794 generic.go:334] "Generic (PLEG): container finished" podID="dfb4f35d-3ce4-469f-bcf4-6a18666d7a60" containerID="80b0a24179304b588a5e5f3c51302c950a38f80ef87a86f9808c344779a5f9b7" exitCode=0 Mar 10 10:32:03 crc kubenswrapper[4794]: I0310 10:32:03.221262 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552312-6zv4m" event={"ID":"dfb4f35d-3ce4-469f-bcf4-6a18666d7a60","Type":"ContainerDied","Data":"80b0a24179304b588a5e5f3c51302c950a38f80ef87a86f9808c344779a5f9b7"} Mar 10 10:32:04 crc kubenswrapper[4794]: I0310 10:32:04.525649 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552312-6zv4m" Mar 10 10:32:04 crc kubenswrapper[4794]: I0310 10:32:04.670061 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lh6s\" (UniqueName: \"kubernetes.io/projected/dfb4f35d-3ce4-469f-bcf4-6a18666d7a60-kube-api-access-9lh6s\") pod \"dfb4f35d-3ce4-469f-bcf4-6a18666d7a60\" (UID: \"dfb4f35d-3ce4-469f-bcf4-6a18666d7a60\") " Mar 10 10:32:04 crc kubenswrapper[4794]: I0310 10:32:04.679523 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb4f35d-3ce4-469f-bcf4-6a18666d7a60-kube-api-access-9lh6s" (OuterVolumeSpecName: "kube-api-access-9lh6s") pod "dfb4f35d-3ce4-469f-bcf4-6a18666d7a60" (UID: "dfb4f35d-3ce4-469f-bcf4-6a18666d7a60"). InnerVolumeSpecName "kube-api-access-9lh6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:32:04 crc kubenswrapper[4794]: I0310 10:32:04.771148 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lh6s\" (UniqueName: \"kubernetes.io/projected/dfb4f35d-3ce4-469f-bcf4-6a18666d7a60-kube-api-access-9lh6s\") on node \"crc\" DevicePath \"\"" Mar 10 10:32:05 crc kubenswrapper[4794]: I0310 10:32:05.241873 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552312-6zv4m" event={"ID":"dfb4f35d-3ce4-469f-bcf4-6a18666d7a60","Type":"ContainerDied","Data":"d529735a848a3a95f614ef29217e664ca084e887ea4112064871f104585cc87f"} Mar 10 10:32:05 crc kubenswrapper[4794]: I0310 10:32:05.241934 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d529735a848a3a95f614ef29217e664ca084e887ea4112064871f104585cc87f" Mar 10 10:32:05 crc kubenswrapper[4794]: I0310 10:32:05.241971 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552312-6zv4m" Mar 10 10:32:05 crc kubenswrapper[4794]: I0310 10:32:05.583528 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552306-27pgh"] Mar 10 10:32:05 crc kubenswrapper[4794]: I0310 10:32:05.590108 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552306-27pgh"] Mar 10 10:32:06 crc kubenswrapper[4794]: I0310 10:32:06.008657 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25f0a091-76f3-469d-9747-e7b64d723366" path="/var/lib/kubelet/pods/25f0a091-76f3-469d-9747-e7b64d723366/volumes" Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.312479 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rc82n"] Mar 10 10:32:18 crc kubenswrapper[4794]: E0310 10:32:18.313402 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb4f35d-3ce4-469f-bcf4-6a18666d7a60" containerName="oc" Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.313419 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb4f35d-3ce4-469f-bcf4-6a18666d7a60" containerName="oc" Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.313619 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb4f35d-3ce4-469f-bcf4-6a18666d7a60" containerName="oc" Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.314743 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rc82n"] Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.314836 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rc82n" Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.381463 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bp4j\" (UniqueName: \"kubernetes.io/projected/34e44dca-c437-4ea7-a11c-21f67120b518-kube-api-access-6bp4j\") pod \"redhat-marketplace-rc82n\" (UID: \"34e44dca-c437-4ea7-a11c-21f67120b518\") " pod="openshift-marketplace/redhat-marketplace-rc82n" Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.381524 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e44dca-c437-4ea7-a11c-21f67120b518-utilities\") pod \"redhat-marketplace-rc82n\" (UID: \"34e44dca-c437-4ea7-a11c-21f67120b518\") " pod="openshift-marketplace/redhat-marketplace-rc82n" Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.381582 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e44dca-c437-4ea7-a11c-21f67120b518-catalog-content\") pod \"redhat-marketplace-rc82n\" (UID: \"34e44dca-c437-4ea7-a11c-21f67120b518\") " pod="openshift-marketplace/redhat-marketplace-rc82n" Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.483690 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bp4j\" (UniqueName: \"kubernetes.io/projected/34e44dca-c437-4ea7-a11c-21f67120b518-kube-api-access-6bp4j\") pod \"redhat-marketplace-rc82n\" (UID: \"34e44dca-c437-4ea7-a11c-21f67120b518\") " pod="openshift-marketplace/redhat-marketplace-rc82n" Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.483754 4794 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e44dca-c437-4ea7-a11c-21f67120b518-utilities\") pod \"redhat-marketplace-rc82n\" (UID: \"34e44dca-c437-4ea7-a11c-21f67120b518\") " pod="openshift-marketplace/redhat-marketplace-rc82n" Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.483817 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e44dca-c437-4ea7-a11c-21f67120b518-catalog-content\") pod \"redhat-marketplace-rc82n\" (UID: \"34e44dca-c437-4ea7-a11c-21f67120b518\") " pod="openshift-marketplace/redhat-marketplace-rc82n" Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.484669 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e44dca-c437-4ea7-a11c-21f67120b518-catalog-content\") pod \"redhat-marketplace-rc82n\" (UID: \"34e44dca-c437-4ea7-a11c-21f67120b518\") " pod="openshift-marketplace/redhat-marketplace-rc82n" Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.484669 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e44dca-c437-4ea7-a11c-21f67120b518-utilities\") pod \"redhat-marketplace-rc82n\" (UID: \"34e44dca-c437-4ea7-a11c-21f67120b518\") " pod="openshift-marketplace/redhat-marketplace-rc82n" Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.519104 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bp4j\" (UniqueName: \"kubernetes.io/projected/34e44dca-c437-4ea7-a11c-21f67120b518-kube-api-access-6bp4j\") pod \"redhat-marketplace-rc82n\" (UID: \"34e44dca-c437-4ea7-a11c-21f67120b518\") " pod="openshift-marketplace/redhat-marketplace-rc82n" Mar 10 10:32:18 crc kubenswrapper[4794]: I0310 10:32:18.635210 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rc82n" Mar 10 10:32:19 crc kubenswrapper[4794]: I0310 10:32:19.109384 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rc82n"] Mar 10 10:32:19 crc kubenswrapper[4794]: W0310 10:32:19.118469 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34e44dca_c437_4ea7_a11c_21f67120b518.slice/crio-c1e89c1fd4136f107a133b2755c475bc758b3d4c6b9ec72c1f355708c7205a12 WatchSource:0}: Error finding container c1e89c1fd4136f107a133b2755c475bc758b3d4c6b9ec72c1f355708c7205a12: Status 404 returned error can't find the container with id c1e89c1fd4136f107a133b2755c475bc758b3d4c6b9ec72c1f355708c7205a12 Mar 10 10:32:19 crc kubenswrapper[4794]: I0310 10:32:19.352830 4794 generic.go:334] "Generic (PLEG): container finished" podID="34e44dca-c437-4ea7-a11c-21f67120b518" containerID="f449508ccc43ffe579392371a48d94e3210422e5004b8fc09151b71e92e67cf1" exitCode=0 Mar 10 10:32:19 crc kubenswrapper[4794]: I0310 10:32:19.352890 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rc82n" event={"ID":"34e44dca-c437-4ea7-a11c-21f67120b518","Type":"ContainerDied","Data":"f449508ccc43ffe579392371a48d94e3210422e5004b8fc09151b71e92e67cf1"} Mar 10 10:32:19 crc kubenswrapper[4794]: I0310 10:32:19.352934 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rc82n" event={"ID":"34e44dca-c437-4ea7-a11c-21f67120b518","Type":"ContainerStarted","Data":"c1e89c1fd4136f107a133b2755c475bc758b3d4c6b9ec72c1f355708c7205a12"} Mar 10 10:32:20 crc kubenswrapper[4794]: I0310 10:32:20.362059 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rc82n" event={"ID":"34e44dca-c437-4ea7-a11c-21f67120b518","Type":"ContainerStarted","Data":"4a054c9b25101c29e19802960a78dc2f0825d552191966421a1a7f50533b2074"} Mar 10 10:32:21 crc kubenswrapper[4794]: I0310 10:32:21.374632 4794 generic.go:334] "Generic (PLEG): container finished" podID="34e44dca-c437-4ea7-a11c-21f67120b518" containerID="4a054c9b25101c29e19802960a78dc2f0825d552191966421a1a7f50533b2074" exitCode=0 Mar 10 10:32:21 crc kubenswrapper[4794]: I0310 10:32:21.374717 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rc82n" event={"ID":"34e44dca-c437-4ea7-a11c-21f67120b518","Type":"ContainerDied","Data":"4a054c9b25101c29e19802960a78dc2f0825d552191966421a1a7f50533b2074"} Mar 10 10:32:22 crc kubenswrapper[4794]: I0310 10:32:22.384833 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rc82n" event={"ID":"34e44dca-c437-4ea7-a11c-21f67120b518","Type":"ContainerStarted","Data":"1acb856b2d2e866bc8c74284fad1422849f749ed5d69fd45ab9ff054d73d44b6"} Mar 10 10:32:22 crc kubenswrapper[4794]: I0310 10:32:22.410257 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rc82n" podStartSLOduration=1.989336329 podStartE2EDuration="4.410238258s" podCreationTimestamp="2026-03-10 10:32:18 +0000 UTC" firstStartedPulling="2026-03-10 10:32:19.354885333 +0000 UTC m=+2888.111056151" lastFinishedPulling="2026-03-10 10:32:21.775787262 +0000 UTC m=+2890.531958080" observedRunningTime="2026-03-10 10:32:22.4054913 +0000 UTC m=+2891.161662158" watchObservedRunningTime="2026-03-10 10:32:22.410238258 +0000 UTC m=+2891.166409066" Mar 10 
Mar 10 10:32:28 crc kubenswrapper[4794]: I0310 10:32:28.635570 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rc82n"
Mar 10 10:32:28 crc kubenswrapper[4794]: I0310 10:32:28.636194 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rc82n"
Mar 10 10:32:28 crc kubenswrapper[4794]: I0310 10:32:28.698234 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rc82n"
Mar 10 10:32:29 crc kubenswrapper[4794]: I0310 10:32:29.510142 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rc82n"
Mar 10 10:32:29 crc kubenswrapper[4794]: I0310 10:32:29.574146 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rc82n"]
Mar 10 10:32:31 crc kubenswrapper[4794]: I0310 10:32:31.460467 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rc82n" podUID="34e44dca-c437-4ea7-a11c-21f67120b518" containerName="registry-server" containerID="cri-o://1acb856b2d2e866bc8c74284fad1422849f749ed5d69fd45ab9ff054d73d44b6" gracePeriod=2
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.162227 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rc82n"
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.190267 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e44dca-c437-4ea7-a11c-21f67120b518-utilities\") pod \"34e44dca-c437-4ea7-a11c-21f67120b518\" (UID: \"34e44dca-c437-4ea7-a11c-21f67120b518\") "
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.190419 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bp4j\" (UniqueName: \"kubernetes.io/projected/34e44dca-c437-4ea7-a11c-21f67120b518-kube-api-access-6bp4j\") pod \"34e44dca-c437-4ea7-a11c-21f67120b518\" (UID: \"34e44dca-c437-4ea7-a11c-21f67120b518\") "
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.190496 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e44dca-c437-4ea7-a11c-21f67120b518-catalog-content\") pod \"34e44dca-c437-4ea7-a11c-21f67120b518\" (UID: \"34e44dca-c437-4ea7-a11c-21f67120b518\") "
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.191447 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34e44dca-c437-4ea7-a11c-21f67120b518-utilities" (OuterVolumeSpecName: "utilities") pod "34e44dca-c437-4ea7-a11c-21f67120b518" (UID: "34e44dca-c437-4ea7-a11c-21f67120b518"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.196139 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e44dca-c437-4ea7-a11c-21f67120b518-kube-api-access-6bp4j" (OuterVolumeSpecName: "kube-api-access-6bp4j") pod "34e44dca-c437-4ea7-a11c-21f67120b518" (UID: "34e44dca-c437-4ea7-a11c-21f67120b518"). InnerVolumeSpecName "kube-api-access-6bp4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.231353 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34e44dca-c437-4ea7-a11c-21f67120b518-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34e44dca-c437-4ea7-a11c-21f67120b518" (UID: "34e44dca-c437-4ea7-a11c-21f67120b518"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.292782 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e44dca-c437-4ea7-a11c-21f67120b518-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.292810 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e44dca-c437-4ea7-a11c-21f67120b518-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.292822 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bp4j\" (UniqueName: \"kubernetes.io/projected/34e44dca-c437-4ea7-a11c-21f67120b518-kube-api-access-6bp4j\") on node \"crc\" DevicePath \"\""
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.473823 4794 generic.go:334] "Generic (PLEG): container finished" podID="34e44dca-c437-4ea7-a11c-21f67120b518" containerID="1acb856b2d2e866bc8c74284fad1422849f749ed5d69fd45ab9ff054d73d44b6" exitCode=0
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.473859 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rc82n" event={"ID":"34e44dca-c437-4ea7-a11c-21f67120b518","Type":"ContainerDied","Data":"1acb856b2d2e866bc8c74284fad1422849f749ed5d69fd45ab9ff054d73d44b6"}
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.473888 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rc82n" event={"ID":"34e44dca-c437-4ea7-a11c-21f67120b518","Type":"ContainerDied","Data":"c1e89c1fd4136f107a133b2755c475bc758b3d4c6b9ec72c1f355708c7205a12"}
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.473908 4794 scope.go:117] "RemoveContainer" containerID="1acb856b2d2e866bc8c74284fad1422849f749ed5d69fd45ab9ff054d73d44b6"
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.473920 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rc82n"
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.501778 4794 scope.go:117] "RemoveContainer" containerID="4a054c9b25101c29e19802960a78dc2f0825d552191966421a1a7f50533b2074"
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.542740 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rc82n"]
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.552151 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rc82n"]
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.563227 4794 scope.go:117] "RemoveContainer" containerID="f449508ccc43ffe579392371a48d94e3210422e5004b8fc09151b71e92e67cf1"
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.590493 4794 scope.go:117] "RemoveContainer" containerID="1acb856b2d2e866bc8c74284fad1422849f749ed5d69fd45ab9ff054d73d44b6"
Mar 10 10:32:32 crc kubenswrapper[4794]: E0310 10:32:32.590972 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1acb856b2d2e866bc8c74284fad1422849f749ed5d69fd45ab9ff054d73d44b6\": container with ID starting with 1acb856b2d2e866bc8c74284fad1422849f749ed5d69fd45ab9ff054d73d44b6 not found: ID does not exist" containerID="1acb856b2d2e866bc8c74284fad1422849f749ed5d69fd45ab9ff054d73d44b6"
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.591050 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1acb856b2d2e866bc8c74284fad1422849f749ed5d69fd45ab9ff054d73d44b6"} err="failed to get container status \"1acb856b2d2e866bc8c74284fad1422849f749ed5d69fd45ab9ff054d73d44b6\": rpc error: code = NotFound desc = could not find container \"1acb856b2d2e866bc8c74284fad1422849f749ed5d69fd45ab9ff054d73d44b6\": container with ID starting with 1acb856b2d2e866bc8c74284fad1422849f749ed5d69fd45ab9ff054d73d44b6 not found: ID does not exist"
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.591100 4794 scope.go:117] "RemoveContainer" containerID="4a054c9b25101c29e19802960a78dc2f0825d552191966421a1a7f50533b2074"
Mar 10 10:32:32 crc kubenswrapper[4794]: E0310 10:32:32.591902 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a054c9b25101c29e19802960a78dc2f0825d552191966421a1a7f50533b2074\": container with ID starting with 4a054c9b25101c29e19802960a78dc2f0825d552191966421a1a7f50533b2074 not found: ID does not exist" containerID="4a054c9b25101c29e19802960a78dc2f0825d552191966421a1a7f50533b2074"
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.591948 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a054c9b25101c29e19802960a78dc2f0825d552191966421a1a7f50533b2074"} err="failed to get container status \"4a054c9b25101c29e19802960a78dc2f0825d552191966421a1a7f50533b2074\": rpc error: code = NotFound desc = could not find container \"4a054c9b25101c29e19802960a78dc2f0825d552191966421a1a7f50533b2074\": container with ID starting with 4a054c9b25101c29e19802960a78dc2f0825d552191966421a1a7f50533b2074 not found: ID does not exist"
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.591977 4794 scope.go:117] "RemoveContainer" containerID="f449508ccc43ffe579392371a48d94e3210422e5004b8fc09151b71e92e67cf1"
Mar 10 10:32:32 crc kubenswrapper[4794]: E0310 10:32:32.592857 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f449508ccc43ffe579392371a48d94e3210422e5004b8fc09151b71e92e67cf1\": container with ID starting with f449508ccc43ffe579392371a48d94e3210422e5004b8fc09151b71e92e67cf1 not found: ID does not exist" containerID="f449508ccc43ffe579392371a48d94e3210422e5004b8fc09151b71e92e67cf1"
Mar 10 10:32:32 crc kubenswrapper[4794]: I0310 10:32:32.592942 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f449508ccc43ffe579392371a48d94e3210422e5004b8fc09151b71e92e67cf1"} err="failed to get container status \"f449508ccc43ffe579392371a48d94e3210422e5004b8fc09151b71e92e67cf1\": rpc error: code = NotFound desc = could not find container \"f449508ccc43ffe579392371a48d94e3210422e5004b8fc09151b71e92e67cf1\": container with ID starting with f449508ccc43ffe579392371a48d94e3210422e5004b8fc09151b71e92e67cf1 not found: ID does not exist"
Mar 10 10:32:34 crc kubenswrapper[4794]: I0310 10:32:34.012782 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34e44dca-c437-4ea7-a11c-21f67120b518" path="/var/lib/kubelet/pods/34e44dca-c437-4ea7-a11c-21f67120b518/volumes"
Need to start a new one" pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:34 crc kubenswrapper[4794]: I0310 10:32:34.544750 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-phnb9"] Mar 10 10:32:34 crc kubenswrapper[4794]: I0310 10:32:34.627613 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjz8j\" (UniqueName: \"kubernetes.io/projected/057f019e-f049-41a8-abad-3ab801ad21c9-kube-api-access-wjz8j\") pod \"community-operators-phnb9\" (UID: \"057f019e-f049-41a8-abad-3ab801ad21c9\") " pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:34 crc kubenswrapper[4794]: I0310 10:32:34.627730 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057f019e-f049-41a8-abad-3ab801ad21c9-catalog-content\") pod \"community-operators-phnb9\" (UID: \"057f019e-f049-41a8-abad-3ab801ad21c9\") " pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:34 crc kubenswrapper[4794]: I0310 10:32:34.627749 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057f019e-f049-41a8-abad-3ab801ad21c9-utilities\") pod \"community-operators-phnb9\" (UID: \"057f019e-f049-41a8-abad-3ab801ad21c9\") " pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:34 crc kubenswrapper[4794]: I0310 10:32:34.729039 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057f019e-f049-41a8-abad-3ab801ad21c9-catalog-content\") pod \"community-operators-phnb9\" (UID: \"057f019e-f049-41a8-abad-3ab801ad21c9\") " pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:34 crc kubenswrapper[4794]: I0310 10:32:34.729089 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057f019e-f049-41a8-abad-3ab801ad21c9-utilities\") pod \"community-operators-phnb9\" (UID: \"057f019e-f049-41a8-abad-3ab801ad21c9\") " pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:34 crc kubenswrapper[4794]: I0310 10:32:34.729133 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjz8j\" (UniqueName: \"kubernetes.io/projected/057f019e-f049-41a8-abad-3ab801ad21c9-kube-api-access-wjz8j\") pod \"community-operators-phnb9\" (UID: \"057f019e-f049-41a8-abad-3ab801ad21c9\") " pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:34 crc kubenswrapper[4794]: I0310 10:32:34.729648 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057f019e-f049-41a8-abad-3ab801ad21c9-utilities\") pod \"community-operators-phnb9\" (UID: \"057f019e-f049-41a8-abad-3ab801ad21c9\") " pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:34 crc kubenswrapper[4794]: I0310 10:32:34.729783 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057f019e-f049-41a8-abad-3ab801ad21c9-catalog-content\") pod \"community-operators-phnb9\" (UID: \"057f019e-f049-41a8-abad-3ab801ad21c9\") " pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:34 crc kubenswrapper[4794]: I0310 10:32:34.751976 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wjz8j\" (UniqueName: \"kubernetes.io/projected/057f019e-f049-41a8-abad-3ab801ad21c9-kube-api-access-wjz8j\") pod \"community-operators-phnb9\" (UID: \"057f019e-f049-41a8-abad-3ab801ad21c9\") " pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:34 crc kubenswrapper[4794]: I0310 10:32:34.873774 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:35 crc kubenswrapper[4794]: I0310 10:32:35.402494 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-phnb9"] Mar 10 10:32:35 crc kubenswrapper[4794]: I0310 10:32:35.498780 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phnb9" event={"ID":"057f019e-f049-41a8-abad-3ab801ad21c9","Type":"ContainerStarted","Data":"860315f4b6bab25a779ecdddf51c419ee8e95c3698ad65f94d7f3b774e431456"} Mar 10 10:32:36 crc kubenswrapper[4794]: I0310 10:32:36.509788 4794 generic.go:334] "Generic (PLEG): container finished" podID="057f019e-f049-41a8-abad-3ab801ad21c9" containerID="08494a6e855d4c4ac58349cd0162ce9338c968871e67bd5cbf1684a6f90aef5d" exitCode=0 Mar 10 10:32:36 crc kubenswrapper[4794]: I0310 10:32:36.509905 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phnb9" event={"ID":"057f019e-f049-41a8-abad-3ab801ad21c9","Type":"ContainerDied","Data":"08494a6e855d4c4ac58349cd0162ce9338c968871e67bd5cbf1684a6f90aef5d"} Mar 10 10:32:37 crc kubenswrapper[4794]: I0310 10:32:37.520844 4794 generic.go:334] "Generic (PLEG): container finished" podID="057f019e-f049-41a8-abad-3ab801ad21c9" containerID="97885429fbda0eef17cb685abda8b2db1e4b0eba6024f47ab44bc911bdffbca1" exitCode=0 Mar 10 10:32:37 crc kubenswrapper[4794]: I0310 10:32:37.520929 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phnb9" event={"ID":"057f019e-f049-41a8-abad-3ab801ad21c9","Type":"ContainerDied","Data":"97885429fbda0eef17cb685abda8b2db1e4b0eba6024f47ab44bc911bdffbca1"} Mar 10 10:32:38 crc kubenswrapper[4794]: I0310 10:32:38.536512 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phnb9" event={"ID":"057f019e-f049-41a8-abad-3ab801ad21c9","Type":"ContainerStarted","Data":"a2bcb1cd74a93c3c91ff2b46848481e812c9e9462d4b043888761550314c5f7a"} Mar 10 10:32:38 crc kubenswrapper[4794]: I0310 10:32:38.567027 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-phnb9" podStartSLOduration=3.024554112 podStartE2EDuration="4.567005546s" podCreationTimestamp="2026-03-10 10:32:34 +0000 UTC" firstStartedPulling="2026-03-10 10:32:36.512598846 +0000 UTC m=+2905.268769654" lastFinishedPulling="2026-03-10 10:32:38.05505027 +0000 UTC m=+2906.811221088" observedRunningTime="2026-03-10 10:32:38.560312447 +0000 UTC m=+2907.316483295" watchObservedRunningTime="2026-03-10 10:32:38.567005546 +0000 UTC m=+2907.323176364" Mar 10 10:32:39 crc kubenswrapper[4794]: I0310 10:32:39.196963 4794 scope.go:117] "RemoveContainer" containerID="b0844cbefd80c3d054e9fa40c4fb4523d38d58f4044c04a2cc6115ab980269bd" Mar 10 10:32:44 crc kubenswrapper[4794]: I0310 10:32:44.875488 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:44 crc kubenswrapper[4794]: I0310 10:32:44.876481 4794 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:44 crc kubenswrapper[4794]: I0310 10:32:44.940441 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:45 crc kubenswrapper[4794]: I0310 10:32:45.636487 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:45 crc kubenswrapper[4794]: I0310 10:32:45.743620 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-phnb9"] Mar 10 10:32:47 crc kubenswrapper[4794]: I0310 10:32:47.599101 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-phnb9" podUID="057f019e-f049-41a8-abad-3ab801ad21c9" containerName="registry-server" containerID="cri-o://a2bcb1cd74a93c3c91ff2b46848481e812c9e9462d4b043888761550314c5f7a" gracePeriod=2 Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.029183 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.214223 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjz8j\" (UniqueName: \"kubernetes.io/projected/057f019e-f049-41a8-abad-3ab801ad21c9-kube-api-access-wjz8j\") pod \"057f019e-f049-41a8-abad-3ab801ad21c9\" (UID: \"057f019e-f049-41a8-abad-3ab801ad21c9\") " Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.214323 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057f019e-f049-41a8-abad-3ab801ad21c9-catalog-content\") pod \"057f019e-f049-41a8-abad-3ab801ad21c9\" (UID: \"057f019e-f049-41a8-abad-3ab801ad21c9\") " Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.214458 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057f019e-f049-41a8-abad-3ab801ad21c9-utilities\") pod \"057f019e-f049-41a8-abad-3ab801ad21c9\" (UID: \"057f019e-f049-41a8-abad-3ab801ad21c9\") " Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.215817 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/057f019e-f049-41a8-abad-3ab801ad21c9-utilities" (OuterVolumeSpecName: "utilities") pod "057f019e-f049-41a8-abad-3ab801ad21c9" (UID: "057f019e-f049-41a8-abad-3ab801ad21c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.222369 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/057f019e-f049-41a8-abad-3ab801ad21c9-kube-api-access-wjz8j" (OuterVolumeSpecName: "kube-api-access-wjz8j") pod "057f019e-f049-41a8-abad-3ab801ad21c9" (UID: "057f019e-f049-41a8-abad-3ab801ad21c9"). InnerVolumeSpecName "kube-api-access-wjz8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.288737 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/057f019e-f049-41a8-abad-3ab801ad21c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "057f019e-f049-41a8-abad-3ab801ad21c9" (UID: "057f019e-f049-41a8-abad-3ab801ad21c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.316592 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057f019e-f049-41a8-abad-3ab801ad21c9-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.316635 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjz8j\" (UniqueName: \"kubernetes.io/projected/057f019e-f049-41a8-abad-3ab801ad21c9-kube-api-access-wjz8j\") on node \"crc\" DevicePath \"\"" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.316650 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057f019e-f049-41a8-abad-3ab801ad21c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.611574 4794 generic.go:334] "Generic (PLEG): container finished" podID="057f019e-f049-41a8-abad-3ab801ad21c9" containerID="a2bcb1cd74a93c3c91ff2b46848481e812c9e9462d4b043888761550314c5f7a" exitCode=0 Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.611625 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phnb9" event={"ID":"057f019e-f049-41a8-abad-3ab801ad21c9","Type":"ContainerDied","Data":"a2bcb1cd74a93c3c91ff2b46848481e812c9e9462d4b043888761550314c5f7a"} Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.611662 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phnb9" event={"ID":"057f019e-f049-41a8-abad-3ab801ad21c9","Type":"ContainerDied","Data":"860315f4b6bab25a779ecdddf51c419ee8e95c3698ad65f94d7f3b774e431456"} Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.611681 4794 scope.go:117] "RemoveContainer" containerID="a2bcb1cd74a93c3c91ff2b46848481e812c9e9462d4b043888761550314c5f7a" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.611772 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-phnb9" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.643555 4794 scope.go:117] "RemoveContainer" containerID="97885429fbda0eef17cb685abda8b2db1e4b0eba6024f47ab44bc911bdffbca1" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.653702 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-phnb9"] Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.659194 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-phnb9"] Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.694372 4794 scope.go:117] "RemoveContainer" containerID="08494a6e855d4c4ac58349cd0162ce9338c968871e67bd5cbf1684a6f90aef5d" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.716475 4794 scope.go:117] "RemoveContainer" containerID="a2bcb1cd74a93c3c91ff2b46848481e812c9e9462d4b043888761550314c5f7a" Mar 10 10:32:48 crc kubenswrapper[4794]: E0310 10:32:48.717612 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2bcb1cd74a93c3c91ff2b46848481e812c9e9462d4b043888761550314c5f7a\": container with ID starting with a2bcb1cd74a93c3c91ff2b46848481e812c9e9462d4b043888761550314c5f7a not found: ID does not exist" containerID="a2bcb1cd74a93c3c91ff2b46848481e812c9e9462d4b043888761550314c5f7a" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.717697 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2bcb1cd74a93c3c91ff2b46848481e812c9e9462d4b043888761550314c5f7a"} err="failed to get container status \"a2bcb1cd74a93c3c91ff2b46848481e812c9e9462d4b043888761550314c5f7a\": rpc error: code = NotFound desc = could not find container \"a2bcb1cd74a93c3c91ff2b46848481e812c9e9462d4b043888761550314c5f7a\": container with ID starting with a2bcb1cd74a93c3c91ff2b46848481e812c9e9462d4b043888761550314c5f7a not found: ID does not exist" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.717755 4794 scope.go:117] "RemoveContainer" containerID="97885429fbda0eef17cb685abda8b2db1e4b0eba6024f47ab44bc911bdffbca1" Mar 10 10:32:48 crc kubenswrapper[4794]: E0310 10:32:48.718984 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97885429fbda0eef17cb685abda8b2db1e4b0eba6024f47ab44bc911bdffbca1\": container with ID starting with 97885429fbda0eef17cb685abda8b2db1e4b0eba6024f47ab44bc911bdffbca1 not found: ID does not exist" containerID="97885429fbda0eef17cb685abda8b2db1e4b0eba6024f47ab44bc911bdffbca1" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.719049 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97885429fbda0eef17cb685abda8b2db1e4b0eba6024f47ab44bc911bdffbca1"} err="failed to get container status \"97885429fbda0eef17cb685abda8b2db1e4b0eba6024f47ab44bc911bdffbca1\": rpc error: code = NotFound desc = could not find container \"97885429fbda0eef17cb685abda8b2db1e4b0eba6024f47ab44bc911bdffbca1\": container with ID starting with 97885429fbda0eef17cb685abda8b2db1e4b0eba6024f47ab44bc911bdffbca1 not found: ID does not exist" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.719105 4794 scope.go:117] "RemoveContainer" containerID="08494a6e855d4c4ac58349cd0162ce9338c968871e67bd5cbf1684a6f90aef5d" Mar 10 10:32:48 crc kubenswrapper[4794]: E0310 10:32:48.719839 4794 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"08494a6e855d4c4ac58349cd0162ce9338c968871e67bd5cbf1684a6f90aef5d\": container with ID starting with 08494a6e855d4c4ac58349cd0162ce9338c968871e67bd5cbf1684a6f90aef5d not found: ID does not exist" containerID="08494a6e855d4c4ac58349cd0162ce9338c968871e67bd5cbf1684a6f90aef5d" Mar 10 10:32:48 crc kubenswrapper[4794]: I0310 10:32:48.719898 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08494a6e855d4c4ac58349cd0162ce9338c968871e67bd5cbf1684a6f90aef5d"} err="failed to get container status \"08494a6e855d4c4ac58349cd0162ce9338c968871e67bd5cbf1684a6f90aef5d\": rpc error: code = NotFound desc = could not find container \"08494a6e855d4c4ac58349cd0162ce9338c968871e67bd5cbf1684a6f90aef5d\": container with ID starting with 08494a6e855d4c4ac58349cd0162ce9338c968871e67bd5cbf1684a6f90aef5d not found: ID does not exist" Mar 10 10:32:50 crc kubenswrapper[4794]: I0310 10:32:50.008492 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="057f019e-f049-41a8-abad-3ab801ad21c9" path="/var/lib/kubelet/pods/057f019e-f049-41a8-abad-3ab801ad21c9/volumes" Mar 10 10:33:22 crc kubenswrapper[4794]: I0310 10:33:22.967900 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:33:22 crc kubenswrapper[4794]: I0310 10:33:22.968720 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:33:52 crc kubenswrapper[4794]: I0310 10:33:52.967603 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:33:52 crc kubenswrapper[4794]: I0310 10:33:52.968370 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.146048 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552314-rcrgt"] Mar 10 10:34:00 crc kubenswrapper[4794]: E0310 10:34:00.146967 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057f019e-f049-41a8-abad-3ab801ad21c9" containerName="extract-utilities" Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.146984 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="057f019e-f049-41a8-abad-3ab801ad21c9" containerName="extract-utilities" Mar 10 10:34:00 crc kubenswrapper[4794]: E0310 10:34:00.147003 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057f019e-f049-41a8-abad-3ab801ad21c9" containerName="extract-content" Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.147012 4794 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="057f019e-f049-41a8-abad-3ab801ad21c9" containerName="extract-content" Mar 10 10:34:00 crc kubenswrapper[4794]: E0310 10:34:00.147034 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057f019e-f049-41a8-abad-3ab801ad21c9" containerName="registry-server" Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.147043 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="057f019e-f049-41a8-abad-3ab801ad21c9" containerName="registry-server" Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.147217 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="057f019e-f049-41a8-abad-3ab801ad21c9" containerName="registry-server" Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.147743 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552314-rcrgt" Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.149614 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.149674 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.149755 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.157039 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552314-rcrgt"] Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.209021 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sppwj\" (UniqueName: \"kubernetes.io/projected/0dc3852f-9ecc-4447-a363-55a794afede5-kube-api-access-sppwj\") pod \"auto-csr-approver-29552314-rcrgt\" (UID: \"0dc3852f-9ecc-4447-a363-55a794afede5\") " pod="openshift-infra/auto-csr-approver-29552314-rcrgt" Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.311122 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sppwj\" (UniqueName: \"kubernetes.io/projected/0dc3852f-9ecc-4447-a363-55a794afede5-kube-api-access-sppwj\") pod \"auto-csr-approver-29552314-rcrgt\" (UID: \"0dc3852f-9ecc-4447-a363-55a794afede5\") " pod="openshift-infra/auto-csr-approver-29552314-rcrgt" Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.334083 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sppwj\" (UniqueName: \"kubernetes.io/projected/0dc3852f-9ecc-4447-a363-55a794afede5-kube-api-access-sppwj\") pod \"auto-csr-approver-29552314-rcrgt\" (UID: \"0dc3852f-9ecc-4447-a363-55a794afede5\") " pod="openshift-infra/auto-csr-approver-29552314-rcrgt" Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.470630 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552314-rcrgt" Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.896280 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552314-rcrgt"] Mar 10 10:34:00 crc kubenswrapper[4794]: I0310 10:34:00.904179 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 10:34:01 crc kubenswrapper[4794]: I0310 10:34:01.181933 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552314-rcrgt" event={"ID":"0dc3852f-9ecc-4447-a363-55a794afede5","Type":"ContainerStarted","Data":"ce8c0bc30bb175346a17791f7365733b409a5f8d80361a80f238bd61ae019815"} Mar 10 10:34:03 crc kubenswrapper[4794]: I0310 10:34:03.199147 4794 generic.go:334] "Generic (PLEG): container finished" podID="0dc3852f-9ecc-4447-a363-55a794afede5" containerID="3a5543160f41e487856d39e504d831e10f2a8b99a1a3cb1fecc61ce29db527f5" exitCode=0 Mar 10 10:34:03 crc kubenswrapper[4794]: I0310 10:34:03.199202 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552314-rcrgt" event={"ID":"0dc3852f-9ecc-4447-a363-55a794afede5","Type":"ContainerDied","Data":"3a5543160f41e487856d39e504d831e10f2a8b99a1a3cb1fecc61ce29db527f5"} Mar 10 10:34:04 crc kubenswrapper[4794]: I0310 10:34:04.501100 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552314-rcrgt" Mar 10 10:34:04 crc kubenswrapper[4794]: I0310 10:34:04.683301 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sppwj\" (UniqueName: \"kubernetes.io/projected/0dc3852f-9ecc-4447-a363-55a794afede5-kube-api-access-sppwj\") pod \"0dc3852f-9ecc-4447-a363-55a794afede5\" (UID: \"0dc3852f-9ecc-4447-a363-55a794afede5\") " Mar 10 10:34:04 crc kubenswrapper[4794]: I0310 10:34:04.688423 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dc3852f-9ecc-4447-a363-55a794afede5-kube-api-access-sppwj" (OuterVolumeSpecName: "kube-api-access-sppwj") pod "0dc3852f-9ecc-4447-a363-55a794afede5" (UID: "0dc3852f-9ecc-4447-a363-55a794afede5"). InnerVolumeSpecName "kube-api-access-sppwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:34:04 crc kubenswrapper[4794]: I0310 10:34:04.785200 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sppwj\" (UniqueName: \"kubernetes.io/projected/0dc3852f-9ecc-4447-a363-55a794afede5-kube-api-access-sppwj\") on node \"crc\" DevicePath \"\"" Mar 10 10:34:05 crc kubenswrapper[4794]: I0310 10:34:05.221503 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552314-rcrgt" event={"ID":"0dc3852f-9ecc-4447-a363-55a794afede5","Type":"ContainerDied","Data":"ce8c0bc30bb175346a17791f7365733b409a5f8d80361a80f238bd61ae019815"} Mar 10 10:34:05 crc kubenswrapper[4794]: I0310 10:34:05.221855 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce8c0bc30bb175346a17791f7365733b409a5f8d80361a80f238bd61ae019815" Mar 10 10:34:05 crc kubenswrapper[4794]: I0310 10:34:05.221585 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552314-rcrgt" Mar 10 10:34:05 crc kubenswrapper[4794]: I0310 10:34:05.578899 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552308-ncwj7"] Mar 10 10:34:05 crc kubenswrapper[4794]: I0310 10:34:05.583823 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552308-ncwj7"] Mar 10 10:34:06 crc kubenswrapper[4794]: I0310 10:34:06.016058 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae16337-4b6f-4615-9ee9-3e25b2000903" path="/var/lib/kubelet/pods/4ae16337-4b6f-4615-9ee9-3e25b2000903/volumes" Mar 10 10:34:22 crc kubenswrapper[4794]: I0310 10:34:22.968074 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:34:22 crc kubenswrapper[4794]: I0310 10:34:22.968615 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:34:22 crc kubenswrapper[4794]: I0310 10:34:22.968655 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 10:34:22 crc kubenswrapper[4794]: I0310 10:34:22.969205 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce542eaa03554afd6301ca429d8e5d5b288e9f73610638883d1131444c25e394"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 10:34:22 crc kubenswrapper[4794]: I0310 10:34:22.969268 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://ce542eaa03554afd6301ca429d8e5d5b288e9f73610638883d1131444c25e394" gracePeriod=600 Mar 10 10:34:23 crc kubenswrapper[4794]: I0310 10:34:23.362923 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="ce542eaa03554afd6301ca429d8e5d5b288e9f73610638883d1131444c25e394" exitCode=0 Mar 10 10:34:23 crc kubenswrapper[4794]: I0310 10:34:23.363004 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"ce542eaa03554afd6301ca429d8e5d5b288e9f73610638883d1131444c25e394"} Mar 10 10:34:23 crc kubenswrapper[4794]: I0310 10:34:23.363288 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32"} Mar 10 10:34:23 crc kubenswrapper[4794]: I0310 10:34:23.363318 4794 scope.go:117] "RemoveContainer" 
containerID="c557b8a03c6ce2bd8bc84abc516eb62fd0c43ee1ffed47f8d0095002911fe7d5" Mar 10 10:34:39 crc kubenswrapper[4794]: I0310 10:34:39.305210 4794 scope.go:117] "RemoveContainer" containerID="b908fcf46dad167b7a1c4364e3591fed4cd00db062410bfea7f4d3a4842112d3" Mar 10 10:36:00 crc kubenswrapper[4794]: I0310 10:36:00.152310 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552316-s8psg"] Mar 10 10:36:00 crc kubenswrapper[4794]: E0310 10:36:00.153176 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc3852f-9ecc-4447-a363-55a794afede5" containerName="oc" Mar 10 10:36:00 crc kubenswrapper[4794]: I0310 10:36:00.153190 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc3852f-9ecc-4447-a363-55a794afede5" containerName="oc" Mar 10 10:36:00 crc kubenswrapper[4794]: I0310 10:36:00.153368 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dc3852f-9ecc-4447-a363-55a794afede5" containerName="oc" Mar 10 10:36:00 crc kubenswrapper[4794]: I0310 10:36:00.153894 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552316-s8psg" Mar 10 10:36:00 crc kubenswrapper[4794]: I0310 10:36:00.156155 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:36:00 crc kubenswrapper[4794]: I0310 10:36:00.157564 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:36:00 crc kubenswrapper[4794]: I0310 10:36:00.162888 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:36:00 crc kubenswrapper[4794]: I0310 10:36:00.166799 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552316-s8psg"] Mar 10 10:36:00 crc kubenswrapper[4794]: I0310 10:36:00.302462 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr627\" (UniqueName: \"kubernetes.io/projected/5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd-kube-api-access-wr627\") pod \"auto-csr-approver-29552316-s8psg\" (UID: \"5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd\") " pod="openshift-infra/auto-csr-approver-29552316-s8psg" Mar 10 10:36:00 crc kubenswrapper[4794]: I0310 10:36:00.403879 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr627\" (UniqueName: \"kubernetes.io/projected/5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd-kube-api-access-wr627\") pod \"auto-csr-approver-29552316-s8psg\" (UID: \"5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd\") " pod="openshift-infra/auto-csr-approver-29552316-s8psg" Mar 10 10:36:00 crc kubenswrapper[4794]: I0310 10:36:00.425934 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr627\" (UniqueName: \"kubernetes.io/projected/5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd-kube-api-access-wr627\") pod \"auto-csr-approver-29552316-s8psg\" (UID: \"5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd\") " pod="openshift-infra/auto-csr-approver-29552316-s8psg" Mar 10 10:36:00 crc kubenswrapper[4794]: I0310 10:36:00.471986 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552316-s8psg" Mar 10 10:36:00 crc kubenswrapper[4794]: I0310 10:36:00.917654 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552316-s8psg"] Mar 10 10:36:00 crc kubenswrapper[4794]: W0310 10:36:00.921423 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b8d5b53_7e89_463a_a2f5_9e97d0fa45cd.slice/crio-fd0551362c70704baa9c2e19b7e0852b83060dcdedff6362c5a0ddb191030648 WatchSource:0}: Error finding container fd0551362c70704baa9c2e19b7e0852b83060dcdedff6362c5a0ddb191030648: Status 404 returned error can't find the container with id fd0551362c70704baa9c2e19b7e0852b83060dcdedff6362c5a0ddb191030648 Mar 10 10:36:01 crc kubenswrapper[4794]: I0310 10:36:01.173673 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552316-s8psg" event={"ID":"5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd","Type":"ContainerStarted","Data":"fd0551362c70704baa9c2e19b7e0852b83060dcdedff6362c5a0ddb191030648"} Mar 10 10:36:02 crc kubenswrapper[4794]: I0310 10:36:02.182108 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552316-s8psg" event={"ID":"5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd","Type":"ContainerStarted","Data":"1fac3bbec1f13005c6b4aeb2bba749d8aeb745dd54c72b4049b8b31aa45a36c1"} Mar 10 10:36:02 crc kubenswrapper[4794]: I0310 10:36:02.195167 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552316-s8psg" podStartSLOduration=1.349073823 podStartE2EDuration="2.195148699s" podCreationTimestamp="2026-03-10 10:36:00 +0000 UTC" firstStartedPulling="2026-03-10 10:36:00.923380207 +0000 UTC m=+3109.679551035" lastFinishedPulling="2026-03-10 10:36:01.769455083 +0000 UTC m=+3110.525625911" observedRunningTime="2026-03-10 10:36:02.194131717 +0000 UTC m=+3110.950302555" watchObservedRunningTime="2026-03-10 10:36:02.195148699 +0000 UTC m=+3110.951319517" Mar 10 10:36:03 crc kubenswrapper[4794]: I0310 10:36:03.192951 4794 generic.go:334] "Generic (PLEG): container finished" podID="5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd" containerID="1fac3bbec1f13005c6b4aeb2bba749d8aeb745dd54c72b4049b8b31aa45a36c1" exitCode=0 Mar 10 10:36:03 crc kubenswrapper[4794]: I0310 10:36:03.193040 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552316-s8psg" event={"ID":"5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd","Type":"ContainerDied","Data":"1fac3bbec1f13005c6b4aeb2bba749d8aeb745dd54c72b4049b8b31aa45a36c1"} Mar 10 10:36:04 crc kubenswrapper[4794]: I0310 10:36:04.488730 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552316-s8psg" Mar 10 10:36:04 crc kubenswrapper[4794]: I0310 10:36:04.675599 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr627\" (UniqueName: \"kubernetes.io/projected/5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd-kube-api-access-wr627\") pod \"5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd\" (UID: \"5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd\") " Mar 10 10:36:04 crc kubenswrapper[4794]: I0310 10:36:04.683061 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd-kube-api-access-wr627" (OuterVolumeSpecName: "kube-api-access-wr627") pod "5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd" (UID: "5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd"). InnerVolumeSpecName "kube-api-access-wr627". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:36:04 crc kubenswrapper[4794]: I0310 10:36:04.777445 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr627\" (UniqueName: \"kubernetes.io/projected/5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd-kube-api-access-wr627\") on node \"crc\" DevicePath \"\"" Mar 10 10:36:05 crc kubenswrapper[4794]: I0310 10:36:05.099552 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552310-bd5xh"] Mar 10 10:36:05 crc kubenswrapper[4794]: I0310 10:36:05.106513 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552310-bd5xh"] Mar 10 10:36:05 crc kubenswrapper[4794]: I0310 10:36:05.207571 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552316-s8psg" event={"ID":"5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd","Type":"ContainerDied","Data":"fd0551362c70704baa9c2e19b7e0852b83060dcdedff6362c5a0ddb191030648"} Mar 10 10:36:05 crc kubenswrapper[4794]: I0310 10:36:05.207608 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd0551362c70704baa9c2e19b7e0852b83060dcdedff6362c5a0ddb191030648" Mar 10 10:36:05 crc kubenswrapper[4794]: I0310 10:36:05.207636 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552316-s8psg" Mar 10 10:36:06 crc kubenswrapper[4794]: I0310 10:36:06.011153 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="441d0540-deea-4dae-8803-ce9d141c2944" path="/var/lib/kubelet/pods/441d0540-deea-4dae-8803-ce9d141c2944/volumes" Mar 10 10:36:39 crc kubenswrapper[4794]: I0310 10:36:39.423579 4794 scope.go:117] "RemoveContainer" containerID="52b56d08dff0686f2d040169fda9eb3b35ef2500bf07e885d1d7d3b8ac17a5f4" Mar 10 10:36:52 crc kubenswrapper[4794]: I0310 10:36:52.968077 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:36:52 crc kubenswrapper[4794]: I0310 10:36:52.968863 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.523998 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m2m54"] Mar 10 10:37:17 crc kubenswrapper[4794]: E0310 10:37:17.525154 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd" containerName="oc" Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.525182 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd" containerName="oc" Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.525464 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd" containerName="oc" Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.527495 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.546021 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m2m54"] Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.641701 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8311c78d-858b-46db-8ba7-274fb1b7540c-catalog-content\") pod \"certified-operators-m2m54\" (UID: \"8311c78d-858b-46db-8ba7-274fb1b7540c\") " pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.641763 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w46wx\" (UniqueName: \"kubernetes.io/projected/8311c78d-858b-46db-8ba7-274fb1b7540c-kube-api-access-w46wx\") pod \"certified-operators-m2m54\" (UID: \"8311c78d-858b-46db-8ba7-274fb1b7540c\") " pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.641855 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8311c78d-858b-46db-8ba7-274fb1b7540c-utilities\") pod \"certified-operators-m2m54\" (UID: \"8311c78d-858b-46db-8ba7-274fb1b7540c\") " pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.743303 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8311c78d-858b-46db-8ba7-274fb1b7540c-utilities\") pod \"certified-operators-m2m54\" (UID: \"8311c78d-858b-46db-8ba7-274fb1b7540c\") " pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.743429 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8311c78d-858b-46db-8ba7-274fb1b7540c-catalog-content\") pod \"certified-operators-m2m54\" (UID: \"8311c78d-858b-46db-8ba7-274fb1b7540c\") " pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.743455 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w46wx\" (UniqueName: \"kubernetes.io/projected/8311c78d-858b-46db-8ba7-274fb1b7540c-kube-api-access-w46wx\") pod \"certified-operators-m2m54\" (UID: \"8311c78d-858b-46db-8ba7-274fb1b7540c\") " pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.743850 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8311c78d-858b-46db-8ba7-274fb1b7540c-utilities\") pod \"certified-operators-m2m54\" (UID: \"8311c78d-858b-46db-8ba7-274fb1b7540c\") " pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.744024 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8311c78d-858b-46db-8ba7-274fb1b7540c-catalog-content\") pod \"certified-operators-m2m54\" (UID: \"8311c78d-858b-46db-8ba7-274fb1b7540c\") " pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.769674 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w46wx\" (UniqueName: \"kubernetes.io/projected/8311c78d-858b-46db-8ba7-274fb1b7540c-kube-api-access-w46wx\") pod \"certified-operators-m2m54\" (UID: \"8311c78d-858b-46db-8ba7-274fb1b7540c\") " pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:17 crc kubenswrapper[4794]: I0310 10:37:17.872583 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:18 crc kubenswrapper[4794]: I0310 10:37:18.153260 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m2m54"] Mar 10 10:37:18 crc kubenswrapper[4794]: I0310 10:37:18.790658 4794 generic.go:334] "Generic (PLEG): container finished" podID="8311c78d-858b-46db-8ba7-274fb1b7540c" containerID="ff51b5bb2a3ce307a8671beac2c5dfb26331679e621baf8694cd40d9b02f9cf6" exitCode=0 Mar 10 10:37:18 crc kubenswrapper[4794]: I0310 10:37:18.790723 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2m54" event={"ID":"8311c78d-858b-46db-8ba7-274fb1b7540c","Type":"ContainerDied","Data":"ff51b5bb2a3ce307a8671beac2c5dfb26331679e621baf8694cd40d9b02f9cf6"} Mar 10 10:37:18 crc kubenswrapper[4794]: I0310 10:37:18.791035 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2m54" event={"ID":"8311c78d-858b-46db-8ba7-274fb1b7540c","Type":"ContainerStarted","Data":"3fcc22e1e8d1e760e67ed87798b72b8ab6dd494e80539c7256dd89933e3231eb"} Mar 10 10:37:19 crc kubenswrapper[4794]: I0310 10:37:19.801694 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2m54" event={"ID":"8311c78d-858b-46db-8ba7-274fb1b7540c","Type":"ContainerStarted","Data":"9e2ec2c2194a1af8a3e93a10fc63a356a361083ee58a4e2a2e311d8c7ae329e3"} Mar 10 10:37:20 crc kubenswrapper[4794]: I0310 10:37:20.816233 4794 generic.go:334] "Generic (PLEG): container finished" podID="8311c78d-858b-46db-8ba7-274fb1b7540c" containerID="9e2ec2c2194a1af8a3e93a10fc63a356a361083ee58a4e2a2e311d8c7ae329e3" exitCode=0 Mar 10 10:37:20 crc kubenswrapper[4794]: I0310 10:37:20.816283 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2m54" event={"ID":"8311c78d-858b-46db-8ba7-274fb1b7540c","Type":"ContainerDied","Data":"9e2ec2c2194a1af8a3e93a10fc63a356a361083ee58a4e2a2e311d8c7ae329e3"} Mar 10 10:37:21 crc kubenswrapper[4794]: I0310 10:37:21.829120 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2m54" event={"ID":"8311c78d-858b-46db-8ba7-274fb1b7540c","Type":"ContainerStarted","Data":"01c7e1c7a1e04d42b10dbf852b5a704977020fd02d0c606f89c10ee082f5dc6b"} Mar 10 10:37:21 crc kubenswrapper[4794]: I0310 10:37:21.859012 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m2m54" podStartSLOduration=2.3990749940000002 podStartE2EDuration="4.858995371s" podCreationTimestamp="2026-03-10 10:37:17 +0000 UTC" firstStartedPulling="2026-03-10 10:37:18.795101591 +0000 UTC m=+3187.551272439" lastFinishedPulling="2026-03-10 10:37:21.255021998 +0000 UTC m=+3190.011192816" observedRunningTime="2026-03-10 10:37:21.855381048 +0000 UTC m=+3190.611551886" watchObservedRunningTime="2026-03-10 10:37:21.858995371 +0000 UTC m=+3190.615166189" Mar 10 10:37:22 crc kubenswrapper[4794]: I0310 10:37:22.967848 4794 patch_prober.go:28] interesting 
pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:37:22 crc kubenswrapper[4794]: I0310 10:37:22.967911 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:37:27 crc kubenswrapper[4794]: I0310 10:37:27.872778 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:27 crc kubenswrapper[4794]: I0310 10:37:27.873144 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:27 crc kubenswrapper[4794]: I0310 10:37:27.939649 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:28 crc kubenswrapper[4794]: I0310 10:37:28.943750 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:34 crc kubenswrapper[4794]: I0310 10:37:34.313415 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m2m54"] Mar 10 10:37:34 crc kubenswrapper[4794]: I0310 10:37:34.314531 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m2m54" podUID="8311c78d-858b-46db-8ba7-274fb1b7540c" containerName="registry-server" containerID="cri-o://01c7e1c7a1e04d42b10dbf852b5a704977020fd02d0c606f89c10ee082f5dc6b" gracePeriod=2 Mar 10 10:37:34 crc kubenswrapper[4794]: I0310 10:37:34.935912 4794 generic.go:334] "Generic (PLEG): container finished" podID="8311c78d-858b-46db-8ba7-274fb1b7540c" containerID="01c7e1c7a1e04d42b10dbf852b5a704977020fd02d0c606f89c10ee082f5dc6b" exitCode=0 Mar 10 10:37:34 crc kubenswrapper[4794]: I0310 10:37:34.935970 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2m54" event={"ID":"8311c78d-858b-46db-8ba7-274fb1b7540c","Type":"ContainerDied","Data":"01c7e1c7a1e04d42b10dbf852b5a704977020fd02d0c606f89c10ee082f5dc6b"} Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.202362 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.225398 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8311c78d-858b-46db-8ba7-274fb1b7540c-catalog-content\") pod \"8311c78d-858b-46db-8ba7-274fb1b7540c\" (UID: \"8311c78d-858b-46db-8ba7-274fb1b7540c\") " Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.225476 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w46wx\" (UniqueName: \"kubernetes.io/projected/8311c78d-858b-46db-8ba7-274fb1b7540c-kube-api-access-w46wx\") pod \"8311c78d-858b-46db-8ba7-274fb1b7540c\" (UID: \"8311c78d-858b-46db-8ba7-274fb1b7540c\") " Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.225528 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8311c78d-858b-46db-8ba7-274fb1b7540c-utilities\") pod \"8311c78d-858b-46db-8ba7-274fb1b7540c\" (UID: \"8311c78d-858b-46db-8ba7-274fb1b7540c\") " Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.226816 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8311c78d-858b-46db-8ba7-274fb1b7540c-utilities" (OuterVolumeSpecName: "utilities") pod "8311c78d-858b-46db-8ba7-274fb1b7540c" (UID: "8311c78d-858b-46db-8ba7-274fb1b7540c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.240892 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8311c78d-858b-46db-8ba7-274fb1b7540c-kube-api-access-w46wx" (OuterVolumeSpecName: "kube-api-access-w46wx") pod "8311c78d-858b-46db-8ba7-274fb1b7540c" (UID: "8311c78d-858b-46db-8ba7-274fb1b7540c"). InnerVolumeSpecName "kube-api-access-w46wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.289748 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8311c78d-858b-46db-8ba7-274fb1b7540c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8311c78d-858b-46db-8ba7-274fb1b7540c" (UID: "8311c78d-858b-46db-8ba7-274fb1b7540c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.327117 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8311c78d-858b-46db-8ba7-274fb1b7540c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.327150 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w46wx\" (UniqueName: \"kubernetes.io/projected/8311c78d-858b-46db-8ba7-274fb1b7540c-kube-api-access-w46wx\") on node \"crc\" DevicePath \"\"" Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.327162 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8311c78d-858b-46db-8ba7-274fb1b7540c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.946966 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2m54" event={"ID":"8311c78d-858b-46db-8ba7-274fb1b7540c","Type":"ContainerDied","Data":"3fcc22e1e8d1e760e67ed87798b72b8ab6dd494e80539c7256dd89933e3231eb"} Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.947018 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m2m54" Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.947029 4794 scope.go:117] "RemoveContainer" containerID="01c7e1c7a1e04d42b10dbf852b5a704977020fd02d0c606f89c10ee082f5dc6b" Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.979764 4794 scope.go:117] "RemoveContainer" containerID="9e2ec2c2194a1af8a3e93a10fc63a356a361083ee58a4e2a2e311d8c7ae329e3" Mar 10 10:37:35 crc kubenswrapper[4794]: I0310 10:37:35.986085 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m2m54"] Mar 10 10:37:36 crc kubenswrapper[4794]: I0310 10:37:36.023465 4794 scope.go:117] "RemoveContainer" containerID="ff51b5bb2a3ce307a8671beac2c5dfb26331679e621baf8694cd40d9b02f9cf6" Mar 10 10:37:36 crc kubenswrapper[4794]: I0310 10:37:36.029999 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m2m54"] Mar 10 10:37:38 crc kubenswrapper[4794]: I0310 10:37:38.010448 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8311c78d-858b-46db-8ba7-274fb1b7540c" path="/var/lib/kubelet/pods/8311c78d-858b-46db-8ba7-274fb1b7540c/volumes" Mar 10 10:37:52 crc kubenswrapper[4794]: I0310 10:37:52.967623 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:37:52 crc kubenswrapper[4794]: I0310 10:37:52.968251 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:37:52 crc kubenswrapper[4794]: I0310 10:37:52.968305 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 10:37:52 crc kubenswrapper[4794]: I0310 10:37:52.969071 4794 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 10:37:52 crc kubenswrapper[4794]: I0310 10:37:52.969140 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" gracePeriod=600 Mar 10 10:37:53 crc kubenswrapper[4794]: E0310 10:37:53.090657 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:37:53 crc kubenswrapper[4794]: I0310 10:37:53.091773 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" exitCode=0 Mar 10 10:37:53 crc kubenswrapper[4794]: I0310 10:37:53.091816 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32"} Mar 10 10:37:53 crc kubenswrapper[4794]: I0310 10:37:53.091852 4794 scope.go:117] "RemoveContainer" containerID="ce542eaa03554afd6301ca429d8e5d5b288e9f73610638883d1131444c25e394" Mar 10 10:37:54 crc kubenswrapper[4794]: I0310 10:37:54.106145 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:37:54 crc kubenswrapper[4794]: E0310 10:37:54.106672 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.149454 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552318-42c2w"] Mar 10 10:38:00 crc kubenswrapper[4794]: E0310 10:38:00.150304 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8311c78d-858b-46db-8ba7-274fb1b7540c" containerName="extract-utilities" Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.150321 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8311c78d-858b-46db-8ba7-274fb1b7540c" containerName="extract-utilities" Mar 10 10:38:00 crc kubenswrapper[4794]: E0310 10:38:00.150364 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8311c78d-858b-46db-8ba7-274fb1b7540c" containerName="registry-server" Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.150372 4794 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8311c78d-858b-46db-8ba7-274fb1b7540c" containerName="registry-server" Mar 10 10:38:00 crc kubenswrapper[4794]: E0310 10:38:00.150390 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8311c78d-858b-46db-8ba7-274fb1b7540c" containerName="extract-content" Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.150398 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8311c78d-858b-46db-8ba7-274fb1b7540c" containerName="extract-content" Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.150598 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="8311c78d-858b-46db-8ba7-274fb1b7540c" containerName="registry-server" Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.151127 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552318-42c2w" Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.153765 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.153929 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.154202 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.157216 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552318-42c2w"] Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.294518 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc582\" (UniqueName: \"kubernetes.io/projected/e0cdc4e5-d940-4311-977d-ec96b95261f6-kube-api-access-lc582\") pod \"auto-csr-approver-29552318-42c2w\" (UID: \"e0cdc4e5-d940-4311-977d-ec96b95261f6\") " pod="openshift-infra/auto-csr-approver-29552318-42c2w" Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.395698 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc582\" (UniqueName: \"kubernetes.io/projected/e0cdc4e5-d940-4311-977d-ec96b95261f6-kube-api-access-lc582\") pod \"auto-csr-approver-29552318-42c2w\" (UID: \"e0cdc4e5-d940-4311-977d-ec96b95261f6\") " pod="openshift-infra/auto-csr-approver-29552318-42c2w" Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.414515 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc582\" (UniqueName: \"kubernetes.io/projected/e0cdc4e5-d940-4311-977d-ec96b95261f6-kube-api-access-lc582\") pod \"auto-csr-approver-29552318-42c2w\" (UID: \"e0cdc4e5-d940-4311-977d-ec96b95261f6\") " pod="openshift-infra/auto-csr-approver-29552318-42c2w" Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.469939 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552318-42c2w" Mar 10 10:38:00 crc kubenswrapper[4794]: I0310 10:38:00.877145 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552318-42c2w"] Mar 10 10:38:01 crc kubenswrapper[4794]: I0310 10:38:01.155626 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552318-42c2w" event={"ID":"e0cdc4e5-d940-4311-977d-ec96b95261f6","Type":"ContainerStarted","Data":"2a4d7e3e2d27ec02f6cd3bf24305d6dfb257c6e7682a49cd26b506d83c57f54b"} Mar 10 10:38:03 crc kubenswrapper[4794]: I0310 10:38:03.174415 4794 generic.go:334] "Generic (PLEG): container finished" podID="e0cdc4e5-d940-4311-977d-ec96b95261f6" containerID="17f4a4daf6fd27bbbefebcadef074aa869c760c39f98d2c81256b0a4aed65ac3" exitCode=0 Mar 10 10:38:03 crc kubenswrapper[4794]: I0310 10:38:03.174462 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552318-42c2w" event={"ID":"e0cdc4e5-d940-4311-977d-ec96b95261f6","Type":"ContainerDied","Data":"17f4a4daf6fd27bbbefebcadef074aa869c760c39f98d2c81256b0a4aed65ac3"} Mar 10 10:38:04 crc kubenswrapper[4794]: I0310 10:38:04.448553 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552318-42c2w" Mar 10 10:38:04 crc kubenswrapper[4794]: I0310 10:38:04.557206 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc582\" (UniqueName: \"kubernetes.io/projected/e0cdc4e5-d940-4311-977d-ec96b95261f6-kube-api-access-lc582\") pod \"e0cdc4e5-d940-4311-977d-ec96b95261f6\" (UID: \"e0cdc4e5-d940-4311-977d-ec96b95261f6\") " Mar 10 10:38:04 crc kubenswrapper[4794]: I0310 10:38:04.563223 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0cdc4e5-d940-4311-977d-ec96b95261f6-kube-api-access-lc582" (OuterVolumeSpecName: "kube-api-access-lc582") pod "e0cdc4e5-d940-4311-977d-ec96b95261f6" (UID: "e0cdc4e5-d940-4311-977d-ec96b95261f6"). InnerVolumeSpecName "kube-api-access-lc582". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:38:04 crc kubenswrapper[4794]: I0310 10:38:04.658641 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc582\" (UniqueName: \"kubernetes.io/projected/e0cdc4e5-d940-4311-977d-ec96b95261f6-kube-api-access-lc582\") on node \"crc\" DevicePath \"\"" Mar 10 10:38:05 crc kubenswrapper[4794]: I0310 10:38:05.187861 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552318-42c2w" event={"ID":"e0cdc4e5-d940-4311-977d-ec96b95261f6","Type":"ContainerDied","Data":"2a4d7e3e2d27ec02f6cd3bf24305d6dfb257c6e7682a49cd26b506d83c57f54b"} Mar 10 10:38:05 crc kubenswrapper[4794]: I0310 10:38:05.188182 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a4d7e3e2d27ec02f6cd3bf24305d6dfb257c6e7682a49cd26b506d83c57f54b" Mar 10 10:38:05 crc kubenswrapper[4794]: I0310 10:38:05.187902 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552318-42c2w" Mar 10 10:38:05 crc kubenswrapper[4794]: I0310 10:38:05.533991 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552312-6zv4m"] Mar 10 10:38:05 crc kubenswrapper[4794]: I0310 10:38:05.542286 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552312-6zv4m"] Mar 10 10:38:06 crc kubenswrapper[4794]: I0310 10:38:06.000636 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:38:06 crc kubenswrapper[4794]: E0310 10:38:06.001035 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:38:06 crc kubenswrapper[4794]: I0310 10:38:06.011382 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfb4f35d-3ce4-469f-bcf4-6a18666d7a60" path="/var/lib/kubelet/pods/dfb4f35d-3ce4-469f-bcf4-6a18666d7a60/volumes" Mar 10 10:38:18 crc kubenswrapper[4794]: I0310 10:38:18.999318 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:38:19 crc kubenswrapper[4794]: E0310 10:38:19.000189 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:38:32 crc kubenswrapper[4794]: I0310 10:38:32.004744 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:38:32 crc kubenswrapper[4794]: E0310 10:38:32.005943 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:38:39 crc kubenswrapper[4794]: I0310 10:38:39.524425 4794 scope.go:117] "RemoveContainer" containerID="80b0a24179304b588a5e5f3c51302c950a38f80ef87a86f9808c344779a5f9b7" Mar 10 10:38:43 crc kubenswrapper[4794]: I0310 10:38:43.999126 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:38:44 crc kubenswrapper[4794]: E0310 10:38:44.000932 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 
10:38:57 crc kubenswrapper[4794]: I0310 10:38:56.999457 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:38:57 crc kubenswrapper[4794]: E0310 10:38:57.000387 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:39:10 crc kubenswrapper[4794]: I0310 10:39:09.999871 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:39:10 crc kubenswrapper[4794]: E0310 10:39:10.000743 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:39:24 crc kubenswrapper[4794]: I0310 10:39:24.001393 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:39:24 crc kubenswrapper[4794]: E0310 10:39:24.002521 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:39:37 crc kubenswrapper[4794]: I0310 10:39:37.999204 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:39:38 crc kubenswrapper[4794]: E0310 10:39:38.000243 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:39:53 crc kubenswrapper[4794]: I0310 10:39:52.999532 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:39:53 crc kubenswrapper[4794]: E0310 10:39:53.000488 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:40:00 crc kubenswrapper[4794]: I0310 10:40:00.213498 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552320-jjftf"] Mar 10 10:40:00 crc 
kubenswrapper[4794]: E0310 10:40:00.214744 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0cdc4e5-d940-4311-977d-ec96b95261f6" containerName="oc" Mar 10 10:40:00 crc kubenswrapper[4794]: I0310 10:40:00.214769 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0cdc4e5-d940-4311-977d-ec96b95261f6" containerName="oc" Mar 10 10:40:00 crc kubenswrapper[4794]: I0310 10:40:00.215062 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0cdc4e5-d940-4311-977d-ec96b95261f6" containerName="oc" Mar 10 10:40:00 crc kubenswrapper[4794]: I0310 10:40:00.215858 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552320-jjftf" Mar 10 10:40:00 crc kubenswrapper[4794]: I0310 10:40:00.231456 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552320-jjftf"] Mar 10 10:40:00 crc kubenswrapper[4794]: I0310 10:40:00.239189 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:40:00 crc kubenswrapper[4794]: I0310 10:40:00.239577 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:40:00 crc kubenswrapper[4794]: I0310 10:40:00.239802 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:40:00 crc kubenswrapper[4794]: I0310 10:40:00.346474 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pw7h\" (UniqueName: \"kubernetes.io/projected/6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd-kube-api-access-8pw7h\") pod \"auto-csr-approver-29552320-jjftf\" (UID: \"6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd\") " pod="openshift-infra/auto-csr-approver-29552320-jjftf" Mar 10 10:40:00 crc kubenswrapper[4794]: I0310 10:40:00.448023 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pw7h\" (UniqueName: \"kubernetes.io/projected/6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd-kube-api-access-8pw7h\") pod \"auto-csr-approver-29552320-jjftf\" (UID: \"6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd\") " pod="openshift-infra/auto-csr-approver-29552320-jjftf" Mar 10 10:40:00 crc kubenswrapper[4794]: I0310 10:40:00.487091 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pw7h\" (UniqueName: \"kubernetes.io/projected/6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd-kube-api-access-8pw7h\") pod \"auto-csr-approver-29552320-jjftf\" (UID: \"6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd\") " pod="openshift-infra/auto-csr-approver-29552320-jjftf" Mar 10 10:40:00 crc kubenswrapper[4794]: I0310 10:40:00.540838 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552320-jjftf" Mar 10 10:40:01 crc kubenswrapper[4794]: I0310 10:40:00.761823 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552320-jjftf"] Mar 10 10:40:01 crc kubenswrapper[4794]: I0310 10:40:00.773418 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 10:40:01 crc kubenswrapper[4794]: I0310 10:40:00.807422 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552320-jjftf" event={"ID":"6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd","Type":"ContainerStarted","Data":"87693bbbed7ffbcaa614fb38763bb86b7cbb8ccdff367b1d74ac10880c18301b"} Mar 10 10:40:02 crc kubenswrapper[4794]: I0310 10:40:02.825394 4794 generic.go:334] "Generic (PLEG): container finished" podID="6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd" containerID="1c517b71d1731ac08bc05f740319bc6476a13c4e427d2c717ac44177dacb2268" exitCode=0 Mar 10 10:40:02 crc kubenswrapper[4794]: I0310 10:40:02.825523 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552320-jjftf" event={"ID":"6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd","Type":"ContainerDied","Data":"1c517b71d1731ac08bc05f740319bc6476a13c4e427d2c717ac44177dacb2268"} Mar 10 10:40:04 crc kubenswrapper[4794]: I0310 10:40:04.171884 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552320-jjftf" Mar 10 10:40:04 crc kubenswrapper[4794]: I0310 10:40:04.302298 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pw7h\" (UniqueName: \"kubernetes.io/projected/6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd-kube-api-access-8pw7h\") pod \"6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd\" (UID: \"6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd\") " Mar 10 10:40:04 crc kubenswrapper[4794]: I0310 10:40:04.310873 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd-kube-api-access-8pw7h" (OuterVolumeSpecName: "kube-api-access-8pw7h") pod "6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd" (UID: "6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd"). InnerVolumeSpecName "kube-api-access-8pw7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:40:04 crc kubenswrapper[4794]: I0310 10:40:04.404237 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pw7h\" (UniqueName: \"kubernetes.io/projected/6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd-kube-api-access-8pw7h\") on node \"crc\" DevicePath \"\"" Mar 10 10:40:04 crc kubenswrapper[4794]: I0310 10:40:04.858495 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552320-jjftf" event={"ID":"6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd","Type":"ContainerDied","Data":"87693bbbed7ffbcaa614fb38763bb86b7cbb8ccdff367b1d74ac10880c18301b"} Mar 10 10:40:04 crc kubenswrapper[4794]: I0310 10:40:04.858585 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552320-jjftf" Mar 10 10:40:04 crc kubenswrapper[4794]: I0310 10:40:04.858589 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87693bbbed7ffbcaa614fb38763bb86b7cbb8ccdff367b1d74ac10880c18301b" Mar 10 10:40:04 crc kubenswrapper[4794]: I0310 10:40:04.999411 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:40:04 crc kubenswrapper[4794]: E0310 10:40:04.999761 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:40:05 crc kubenswrapper[4794]: I0310 10:40:05.231988 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552314-rcrgt"] Mar 10 10:40:05 crc kubenswrapper[4794]: I0310 10:40:05.239625 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552314-rcrgt"] Mar 10 10:40:06 crc kubenswrapper[4794]: I0310 10:40:06.016798 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dc3852f-9ecc-4447-a363-55a794afede5" path="/var/lib/kubelet/pods/0dc3852f-9ecc-4447-a363-55a794afede5/volumes" Mar 10 10:40:20 crc kubenswrapper[4794]: I0310 10:40:19.999882 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:40:20 crc kubenswrapper[4794]: E0310 10:40:20.000765 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:40:31 crc kubenswrapper[4794]: I0310 10:40:31.001562 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:40:31 crc kubenswrapper[4794]: E0310 10:40:31.002530 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:40:39 crc kubenswrapper[4794]: I0310 10:40:39.626420 4794 scope.go:117] "RemoveContainer" containerID="3a5543160f41e487856d39e504d831e10f2a8b99a1a3cb1fecc61ce29db527f5" Mar 10 10:40:45 crc kubenswrapper[4794]: I0310 10:40:44.999438 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:40:45 crc kubenswrapper[4794]: E0310 10:40:45.000469 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 10:40:59 crc kubenswrapper[4794]: I0310 10:40:58.999634 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32"
Mar 10 10:40:59 crc kubenswrapper[4794]: E0310 10:40:59.000721 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 10:41:14 crc kubenswrapper[4794]: I0310 10:41:13.999544 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32"
Mar 10 10:41:14 crc kubenswrapper[4794]: E0310 10:41:14.000575 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 10:41:28 crc kubenswrapper[4794]: I0310 10:41:27.999870 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32"
Mar 10 10:41:28 crc kubenswrapper[4794]: E0310 10:41:28.000899 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 10:41:39 crc kubenswrapper[4794]: I0310 10:41:39.001015 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32"
Mar 10 10:41:39 crc kubenswrapper[4794]: E0310 10:41:39.001850 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 10:41:54 crc kubenswrapper[4794]: I0310 10:41:53.999146 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32"
Mar 10 10:41:54 crc kubenswrapper[4794]: E0310 10:41:54.000449 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 10:42:00 crc kubenswrapper[4794]: I0310 10:42:00.164776 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552322-kqw2w"]
Mar 10 10:42:00 crc kubenswrapper[4794]: E0310 10:42:00.173799 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd" containerName="oc"
Mar 10 10:42:00 crc kubenswrapper[4794]: I0310 10:42:00.173858 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd" containerName="oc"
Mar 10 10:42:00 crc kubenswrapper[4794]: I0310 10:42:00.174046 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd" containerName="oc"
Mar 10 10:42:00 crc kubenswrapper[4794]: I0310 10:42:00.174652 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552322-kqw2w"
Mar 10 10:42:00 crc kubenswrapper[4794]: I0310 10:42:00.175857 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552322-kqw2w"]
Mar 10 10:42:00 crc kubenswrapper[4794]: I0310 10:42:00.177003 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 10:42:00 crc kubenswrapper[4794]: I0310 10:42:00.177553 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 10:42:00 crc kubenswrapper[4794]: I0310 10:42:00.178839 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc"
Mar 10 10:42:00 crc kubenswrapper[4794]: I0310 10:42:00.313653 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wgtk\" (UniqueName: \"kubernetes.io/projected/085121c1-faea-4b9f-8296-a0535e7ee9b6-kube-api-access-8wgtk\") pod \"auto-csr-approver-29552322-kqw2w\" (UID: \"085121c1-faea-4b9f-8296-a0535e7ee9b6\") " pod="openshift-infra/auto-csr-approver-29552322-kqw2w"
Mar 10 10:42:00 crc kubenswrapper[4794]: I0310 10:42:00.414505 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wgtk\" (UniqueName: \"kubernetes.io/projected/085121c1-faea-4b9f-8296-a0535e7ee9b6-kube-api-access-8wgtk\") pod \"auto-csr-approver-29552322-kqw2w\" (UID: \"085121c1-faea-4b9f-8296-a0535e7ee9b6\") " pod="openshift-infra/auto-csr-approver-29552322-kqw2w"
Mar 10 10:42:00 crc kubenswrapper[4794]: I0310 10:42:00.446142 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wgtk\" (UniqueName: \"kubernetes.io/projected/085121c1-faea-4b9f-8296-a0535e7ee9b6-kube-api-access-8wgtk\") pod \"auto-csr-approver-29552322-kqw2w\" (UID: \"085121c1-faea-4b9f-8296-a0535e7ee9b6\") " pod="openshift-infra/auto-csr-approver-29552322-kqw2w"
Mar 10 10:42:00 crc kubenswrapper[4794]: I0310 10:42:00.504039 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552322-kqw2w"
Mar 10 10:42:00 crc kubenswrapper[4794]: I0310 10:42:00.968267 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552322-kqw2w"]
Mar 10 10:42:01 crc kubenswrapper[4794]: I0310 10:42:01.013998 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552322-kqw2w" event={"ID":"085121c1-faea-4b9f-8296-a0535e7ee9b6","Type":"ContainerStarted","Data":"e30b95e758ceb13d3ac219d5260885b534648d87853c365fec3e26a89a142a63"}
Mar 10 10:42:03 crc kubenswrapper[4794]: I0310 10:42:03.037971 4794 generic.go:334] "Generic (PLEG): container finished" podID="085121c1-faea-4b9f-8296-a0535e7ee9b6" containerID="f94d1c6b0a6c66a9932fdd7f9ccc3a9455b2016b58ab2716680dcda6949b942e" exitCode=0
Mar 10 10:42:03 crc kubenswrapper[4794]: I0310 10:42:03.038068 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552322-kqw2w" event={"ID":"085121c1-faea-4b9f-8296-a0535e7ee9b6","Type":"ContainerDied","Data":"f94d1c6b0a6c66a9932fdd7f9ccc3a9455b2016b58ab2716680dcda6949b942e"}
Mar 10 10:42:04 crc kubenswrapper[4794]: I0310 10:42:04.427512 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552322-kqw2w"
Mar 10 10:42:04 crc kubenswrapper[4794]: I0310 10:42:04.576320 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wgtk\" (UniqueName: \"kubernetes.io/projected/085121c1-faea-4b9f-8296-a0535e7ee9b6-kube-api-access-8wgtk\") pod \"085121c1-faea-4b9f-8296-a0535e7ee9b6\" (UID: \"085121c1-faea-4b9f-8296-a0535e7ee9b6\") "
Mar 10 10:42:04 crc kubenswrapper[4794]: I0310 10:42:04.585055 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/085121c1-faea-4b9f-8296-a0535e7ee9b6-kube-api-access-8wgtk" (OuterVolumeSpecName: "kube-api-access-8wgtk") pod "085121c1-faea-4b9f-8296-a0535e7ee9b6" (UID: "085121c1-faea-4b9f-8296-a0535e7ee9b6"). InnerVolumeSpecName "kube-api-access-8wgtk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:42:04 crc kubenswrapper[4794]: I0310 10:42:04.679174 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wgtk\" (UniqueName: \"kubernetes.io/projected/085121c1-faea-4b9f-8296-a0535e7ee9b6-kube-api-access-8wgtk\") on node \"crc\" DevicePath \"\""
Mar 10 10:42:05 crc kubenswrapper[4794]: I0310 10:42:05.062647 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552322-kqw2w" event={"ID":"085121c1-faea-4b9f-8296-a0535e7ee9b6","Type":"ContainerDied","Data":"e30b95e758ceb13d3ac219d5260885b534648d87853c365fec3e26a89a142a63"}
Mar 10 10:42:05 crc kubenswrapper[4794]: I0310 10:42:05.062693 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e30b95e758ceb13d3ac219d5260885b534648d87853c365fec3e26a89a142a63"
Mar 10 10:42:05 crc kubenswrapper[4794]: I0310 10:42:05.062756 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552322-kqw2w"
Mar 10 10:42:05 crc kubenswrapper[4794]: I0310 10:42:05.511016 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552316-s8psg"]
Mar 10 10:42:05 crc kubenswrapper[4794]: I0310 10:42:05.525419 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552316-s8psg"]
Mar 10 10:42:06 crc kubenswrapper[4794]: I0310 10:42:06.022186 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd" path="/var/lib/kubelet/pods/5b8d5b53-7e89-463a-a2f5-9e97d0fa45cd/volumes"
Mar 10 10:42:09 crc kubenswrapper[4794]: I0310 10:42:08.999857 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32"
Mar 10 10:42:09 crc kubenswrapper[4794]: E0310 10:42:09.000293 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 10:42:22 crc kubenswrapper[4794]: I0310 10:42:22.003627 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32"
Mar 10 10:42:22 crc kubenswrapper[4794]: E0310 10:42:22.004361 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 10:42:35 crc kubenswrapper[4794]: I0310 10:42:35.207965 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32"
Mar 10 10:42:35 crc kubenswrapper[4794]: E0310 10:42:35.209117 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 10:42:39 crc kubenswrapper[4794]: I0310 10:42:39.750472 4794 scope.go:117] "RemoveContainer" containerID="1fac3bbec1f13005c6b4aeb2bba749d8aeb745dd54c72b4049b8b31aa45a36c1"
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.175834 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xk8xd"]
Mar 10 10:42:45 crc kubenswrapper[4794]: E0310 10:42:45.176889 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085121c1-faea-4b9f-8296-a0535e7ee9b6" containerName="oc"
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.176911 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="085121c1-faea-4b9f-8296-a0535e7ee9b6" containerName="oc"
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.177187 4794 memory_manager.go:354] "RemoveStaleState removing state"
podUID="085121c1-faea-4b9f-8296-a0535e7ee9b6" containerName="oc"
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.179018 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.190146 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xk8xd"]
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.268213 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmwxr\" (UniqueName: \"kubernetes.io/projected/15dc9197-9300-4e45-81f2-5ce7e01994ef-kube-api-access-gmwxr\") pod \"redhat-operators-xk8xd\" (UID: \"15dc9197-9300-4e45-81f2-5ce7e01994ef\") " pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.268260 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dc9197-9300-4e45-81f2-5ce7e01994ef-utilities\") pod \"redhat-operators-xk8xd\" (UID: \"15dc9197-9300-4e45-81f2-5ce7e01994ef\") " pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.268312 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dc9197-9300-4e45-81f2-5ce7e01994ef-catalog-content\") pod \"redhat-operators-xk8xd\" (UID: \"15dc9197-9300-4e45-81f2-5ce7e01994ef\") " pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.370016 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmwxr\" (UniqueName: \"kubernetes.io/projected/15dc9197-9300-4e45-81f2-5ce7e01994ef-kube-api-access-gmwxr\") pod \"redhat-operators-xk8xd\" (UID: \"15dc9197-9300-4e45-81f2-5ce7e01994ef\") " pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.370071 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dc9197-9300-4e45-81f2-5ce7e01994ef-utilities\") pod \"redhat-operators-xk8xd\" (UID: \"15dc9197-9300-4e45-81f2-5ce7e01994ef\") " pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.370138 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dc9197-9300-4e45-81f2-5ce7e01994ef-catalog-content\") pod \"redhat-operators-xk8xd\" (UID: \"15dc9197-9300-4e45-81f2-5ce7e01994ef\") " pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.370689 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dc9197-9300-4e45-81f2-5ce7e01994ef-catalog-content\") pod \"redhat-operators-xk8xd\" (UID: \"15dc9197-9300-4e45-81f2-5ce7e01994ef\") " pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.370973 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dc9197-9300-4e45-81f2-5ce7e01994ef-utilities\") pod \"redhat-operators-xk8xd\" (UID: \"15dc9197-9300-4e45-81f2-5ce7e01994ef\") " pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.394355 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmwxr\" (UniqueName: \"kubernetes.io/projected/15dc9197-9300-4e45-81f2-5ce7e01994ef-kube-api-access-gmwxr\") pod \"redhat-operators-xk8xd\" (UID: \"15dc9197-9300-4e45-81f2-5ce7e01994ef\") " pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:42:45 crc kubenswrapper[4794]: I0310 10:42:45.514211 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:42:46 crc kubenswrapper[4794]: I0310 10:42:46.012523 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xk8xd"]
Mar 10 10:42:46 crc kubenswrapper[4794]: I0310 10:42:46.443632 4794 generic.go:334] "Generic (PLEG): container finished" podID="15dc9197-9300-4e45-81f2-5ce7e01994ef" containerID="b4fc31df0ac88582f8b81c6240d4849ddc2cbf59ea148246dd1b22a8589aba6d" exitCode=0
Mar 10 10:42:46 crc kubenswrapper[4794]: I0310 10:42:46.443715 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xk8xd" event={"ID":"15dc9197-9300-4e45-81f2-5ce7e01994ef","Type":"ContainerDied","Data":"b4fc31df0ac88582f8b81c6240d4849ddc2cbf59ea148246dd1b22a8589aba6d"}
Mar 10 10:42:46 crc kubenswrapper[4794]: I0310 10:42:46.443955 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xk8xd" event={"ID":"15dc9197-9300-4e45-81f2-5ce7e01994ef","Type":"ContainerStarted","Data":"52492366b8a632735eda4dfe19976da9be6ab0c30b617b709c944e6070432145"}
Mar 10 10:42:47 crc kubenswrapper[4794]: I0310 10:42:47.452133 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xk8xd" event={"ID":"15dc9197-9300-4e45-81f2-5ce7e01994ef","Type":"ContainerStarted","Data":"63bbc660200d28bfb074347a89fb653266e4c560ad7caf50f2fced00500b5339"}
Mar 10 10:42:47 crc kubenswrapper[4794]: I0310 10:42:47.999434 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32"
Mar 10 10:42:47 crc kubenswrapper[4794]: E0310 10:42:47.999662 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 10:42:48 crc kubenswrapper[4794]: I0310 10:42:48.467054 4794 generic.go:334] "Generic (PLEG): container finished" podID="15dc9197-9300-4e45-81f2-5ce7e01994ef" containerID="63bbc660200d28bfb074347a89fb653266e4c560ad7caf50f2fced00500b5339" exitCode=0
Mar 10 10:42:48 crc kubenswrapper[4794]: I0310 10:42:48.467093 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xk8xd" event={"ID":"15dc9197-9300-4e45-81f2-5ce7e01994ef","Type":"ContainerDied","Data":"63bbc660200d28bfb074347a89fb653266e4c560ad7caf50f2fced00500b5339"}
Mar 10 10:42:49 crc kubenswrapper[4794]: I0310 10:42:49.478378 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xk8xd" event={"ID":"15dc9197-9300-4e45-81f2-5ce7e01994ef","Type":"ContainerStarted","Data":"157d431348fe3171e503b2d31538652b536e3916427f4dd92e63c0c71d6c69f3"}
Mar 10 10:42:49 crc kubenswrapper[4794]: I0310 10:42:49.501493 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xk8xd" podStartSLOduration=2.079586221 podStartE2EDuration="4.501457662s" podCreationTimestamp="2026-03-10 10:42:45 +0000 UTC" firstStartedPulling="2026-03-10 10:42:46.445767557 +0000 UTC m=+3515.201938375" lastFinishedPulling="2026-03-10 10:42:48.867638958 +0000 UTC m=+3517.623809816" observedRunningTime="2026-03-10 10:42:49.49977934 +0000 UTC m=+3518.255950168" watchObservedRunningTime="2026-03-10 10:42:49.501457662 +0000 UTC m=+3518.257628500"
Mar 10 10:42:55 crc kubenswrapper[4794]: I0310 10:42:55.515734 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:42:55 crc kubenswrapper[4794]: I0310 10:42:55.517509 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:42:56 crc kubenswrapper[4794]: I0310 10:42:56.587689 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xk8xd" podUID="15dc9197-9300-4e45-81f2-5ce7e01994ef" containerName="registry-server" probeResult="failure" output=<
Mar 10 10:42:56 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s
Mar 10 10:42:56 crc kubenswrapper[4794]: >
Mar 10 10:42:59 crc kubenswrapper[4794]: I0310 10:42:59.999261 4794 scope.go:117] "RemoveContainer" containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32"
Mar 10 10:43:00 crc kubenswrapper[4794]: I0310 10:43:00.578727 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"1dfef3673c4e36ac904ce63dccbdb3ac9fb5dfb060917b9d39017aa8664a4458"}
Mar 10 10:43:05 crc kubenswrapper[4794]: I0310 10:43:05.570097 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:43:05 crc kubenswrapper[4794]: I0310 10:43:05.630608 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:43:05 crc kubenswrapper[4794]: I0310 10:43:05.805984 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xk8xd"]
Mar 10 10:43:06 crc kubenswrapper[4794]: I0310 10:43:06.630653 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xk8xd" podUID="15dc9197-9300-4e45-81f2-5ce7e01994ef" containerName="registry-server" containerID="cri-o://157d431348fe3171e503b2d31538652b536e3916427f4dd92e63c0c71d6c69f3" gracePeriod=2
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.053528 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.148603 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dc9197-9300-4e45-81f2-5ce7e01994ef-utilities\") pod \"15dc9197-9300-4e45-81f2-5ce7e01994ef\" (UID: \"15dc9197-9300-4e45-81f2-5ce7e01994ef\") "
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.148682 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dc9197-9300-4e45-81f2-5ce7e01994ef-catalog-content\") pod \"15dc9197-9300-4e45-81f2-5ce7e01994ef\" (UID: \"15dc9197-9300-4e45-81f2-5ce7e01994ef\") "
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.148720 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmwxr\" (UniqueName: \"kubernetes.io/projected/15dc9197-9300-4e45-81f2-5ce7e01994ef-kube-api-access-gmwxr\") pod \"15dc9197-9300-4e45-81f2-5ce7e01994ef\" (UID: \"15dc9197-9300-4e45-81f2-5ce7e01994ef\") "
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.150041 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15dc9197-9300-4e45-81f2-5ce7e01994ef-utilities" (OuterVolumeSpecName: "utilities") pod "15dc9197-9300-4e45-81f2-5ce7e01994ef" (UID: "15dc9197-9300-4e45-81f2-5ce7e01994ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.158819 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15dc9197-9300-4e45-81f2-5ce7e01994ef-kube-api-access-gmwxr" (OuterVolumeSpecName: "kube-api-access-gmwxr") pod "15dc9197-9300-4e45-81f2-5ce7e01994ef" (UID: "15dc9197-9300-4e45-81f2-5ce7e01994ef"). InnerVolumeSpecName "kube-api-access-gmwxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.250940 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmwxr\" (UniqueName: \"kubernetes.io/projected/15dc9197-9300-4e45-81f2-5ce7e01994ef-kube-api-access-gmwxr\") on node \"crc\" DevicePath \"\""
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.251167 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dc9197-9300-4e45-81f2-5ce7e01994ef-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.327745 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15dc9197-9300-4e45-81f2-5ce7e01994ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15dc9197-9300-4e45-81f2-5ce7e01994ef" (UID: "15dc9197-9300-4e45-81f2-5ce7e01994ef"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.352075 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dc9197-9300-4e45-81f2-5ce7e01994ef-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.638897 4794 generic.go:334] "Generic (PLEG): container finished" podID="15dc9197-9300-4e45-81f2-5ce7e01994ef" containerID="157d431348fe3171e503b2d31538652b536e3916427f4dd92e63c0c71d6c69f3" exitCode=0
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.639034 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xk8xd"
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.639970 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xk8xd" event={"ID":"15dc9197-9300-4e45-81f2-5ce7e01994ef","Type":"ContainerDied","Data":"157d431348fe3171e503b2d31538652b536e3916427f4dd92e63c0c71d6c69f3"}
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.640098 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xk8xd" event={"ID":"15dc9197-9300-4e45-81f2-5ce7e01994ef","Type":"ContainerDied","Data":"52492366b8a632735eda4dfe19976da9be6ab0c30b617b709c944e6070432145"}
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.640140 4794 scope.go:117] "RemoveContainer" containerID="157d431348fe3171e503b2d31538652b536e3916427f4dd92e63c0c71d6c69f3"
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.662514 4794 scope.go:117] "RemoveContainer" containerID="63bbc660200d28bfb074347a89fb653266e4c560ad7caf50f2fced00500b5339"
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.688667 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xk8xd"]
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.691076 4794 scope.go:117] "RemoveContainer" containerID="b4fc31df0ac88582f8b81c6240d4849ddc2cbf59ea148246dd1b22a8589aba6d"
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.698374 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xk8xd"]
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.721153 4794 scope.go:117] "RemoveContainer" containerID="157d431348fe3171e503b2d31538652b536e3916427f4dd92e63c0c71d6c69f3"
Mar 10 10:43:07 crc kubenswrapper[4794]: E0310 10:43:07.722361 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"157d431348fe3171e503b2d31538652b536e3916427f4dd92e63c0c71d6c69f3\": container with ID starting with 157d431348fe3171e503b2d31538652b536e3916427f4dd92e63c0c71d6c69f3 not found: ID does not exist" containerID="157d431348fe3171e503b2d31538652b536e3916427f4dd92e63c0c71d6c69f3"
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.722463 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157d431348fe3171e503b2d31538652b536e3916427f4dd92e63c0c71d6c69f3"} err="failed to get container status \"157d431348fe3171e503b2d31538652b536e3916427f4dd92e63c0c71d6c69f3\": rpc error: code = NotFound desc = could not find container \"157d431348fe3171e503b2d31538652b536e3916427f4dd92e63c0c71d6c69f3\": container with ID starting with 157d431348fe3171e503b2d31538652b536e3916427f4dd92e63c0c71d6c69f3 not found: ID does not exist"
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.722538 4794 scope.go:117] "RemoveContainer" containerID="63bbc660200d28bfb074347a89fb653266e4c560ad7caf50f2fced00500b5339"
Mar 10 10:43:07 crc kubenswrapper[4794]: E0310 10:43:07.723157 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63bbc660200d28bfb074347a89fb653266e4c560ad7caf50f2fced00500b5339\": container with ID starting with 63bbc660200d28bfb074347a89fb653266e4c560ad7caf50f2fced00500b5339 not found: ID does not exist" containerID="63bbc660200d28bfb074347a89fb653266e4c560ad7caf50f2fced00500b5339"
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.723227 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63bbc660200d28bfb074347a89fb653266e4c560ad7caf50f2fced00500b5339"} err="failed to get container status \"63bbc660200d28bfb074347a89fb653266e4c560ad7caf50f2fced00500b5339\": rpc error: code = NotFound desc = could not find container \"63bbc660200d28bfb074347a89fb653266e4c560ad7caf50f2fced00500b5339\": container with ID starting with 63bbc660200d28bfb074347a89fb653266e4c560ad7caf50f2fced00500b5339 not found: ID does not exist"
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.723292 4794 scope.go:117] "RemoveContainer" containerID="b4fc31df0ac88582f8b81c6240d4849ddc2cbf59ea148246dd1b22a8589aba6d"
Mar 10 10:43:07 crc kubenswrapper[4794]: E0310 10:43:07.723664 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4fc31df0ac88582f8b81c6240d4849ddc2cbf59ea148246dd1b22a8589aba6d\": container with ID starting with b4fc31df0ac88582f8b81c6240d4849ddc2cbf59ea148246dd1b22a8589aba6d not found: ID does not exist" containerID="b4fc31df0ac88582f8b81c6240d4849ddc2cbf59ea148246dd1b22a8589aba6d"
Mar 10 10:43:07 crc kubenswrapper[4794]: I0310 10:43:07.723739 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4fc31df0ac88582f8b81c6240d4849ddc2cbf59ea148246dd1b22a8589aba6d"} err="failed to get container status \"b4fc31df0ac88582f8b81c6240d4849ddc2cbf59ea148246dd1b22a8589aba6d\": rpc error: code = NotFound desc = could not find container \"b4fc31df0ac88582f8b81c6240d4849ddc2cbf59ea148246dd1b22a8589aba6d\": container with ID starting with b4fc31df0ac88582f8b81c6240d4849ddc2cbf59ea148246dd1b22a8589aba6d not found: ID does not exist"
Mar 10 10:43:08 crc kubenswrapper[4794]: I0310 10:43:08.009732 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15dc9197-9300-4e45-81f2-5ce7e01994ef" path="/var/lib/kubelet/pods/15dc9197-9300-4e45-81f2-5ce7e01994ef/volumes"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.343196 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nhbtf"]
Mar 10 10:43:17 crc kubenswrapper[4794]: E0310 10:43:17.343984 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15dc9197-9300-4e45-81f2-5ce7e01994ef" containerName="extract-utilities"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.343995 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="15dc9197-9300-4e45-81f2-5ce7e01994ef" containerName="extract-utilities"
Mar 10 10:43:17 crc kubenswrapper[4794]: E0310 10:43:17.344010 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15dc9197-9300-4e45-81f2-5ce7e01994ef" containerName="registry-server"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.344016 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="15dc9197-9300-4e45-81f2-5ce7e01994ef" containerName="registry-server"
Mar 10 10:43:17 crc kubenswrapper[4794]: E0310 10:43:17.344028 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15dc9197-9300-4e45-81f2-5ce7e01994ef" containerName="extract-content"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.344034 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="15dc9197-9300-4e45-81f2-5ce7e01994ef" containerName="extract-content"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.344171 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="15dc9197-9300-4e45-81f2-5ce7e01994ef" containerName="registry-server"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.345082 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.368923 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhbtf"]
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.409544 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65687f5-261b-49ac-ba1f-a35d8778a45d-utilities\") pod \"redhat-marketplace-nhbtf\" (UID: \"a65687f5-261b-49ac-ba1f-a35d8778a45d\") " pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.409602 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65687f5-261b-49ac-ba1f-a35d8778a45d-catalog-content\") pod \"redhat-marketplace-nhbtf\" (UID: \"a65687f5-261b-49ac-ba1f-a35d8778a45d\") " pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.409671 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrph7\" (UniqueName: \"kubernetes.io/projected/a65687f5-261b-49ac-ba1f-a35d8778a45d-kube-api-access-nrph7\") pod \"redhat-marketplace-nhbtf\" (UID: \"a65687f5-261b-49ac-ba1f-a35d8778a45d\") " pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.510870 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65687f5-261b-49ac-ba1f-a35d8778a45d-utilities\") pod \"redhat-marketplace-nhbtf\" (UID: \"a65687f5-261b-49ac-ba1f-a35d8778a45d\") " pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.510915 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65687f5-261b-49ac-ba1f-a35d8778a45d-catalog-content\") pod \"redhat-marketplace-nhbtf\" (UID: \"a65687f5-261b-49ac-ba1f-a35d8778a45d\") " pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.510965 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrph7\" (UniqueName: \"kubernetes.io/projected/a65687f5-261b-49ac-ba1f-a35d8778a45d-kube-api-access-nrph7\") pod \"redhat-marketplace-nhbtf\" (UID: \"a65687f5-261b-49ac-ba1f-a35d8778a45d\") " pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.511413 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65687f5-261b-49ac-ba1f-a35d8778a45d-utilities\") pod \"redhat-marketplace-nhbtf\" (UID: \"a65687f5-261b-49ac-ba1f-a35d8778a45d\") " pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.511441 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65687f5-261b-49ac-ba1f-a35d8778a45d-catalog-content\") pod \"redhat-marketplace-nhbtf\" (UID: \"a65687f5-261b-49ac-ba1f-a35d8778a45d\") " pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.538682 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrph7\" (UniqueName: \"kubernetes.io/projected/a65687f5-261b-49ac-ba1f-a35d8778a45d-kube-api-access-nrph7\") pod \"redhat-marketplace-nhbtf\" (UID: \"a65687f5-261b-49ac-ba1f-a35d8778a45d\") " pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:17 crc kubenswrapper[4794]: I0310 10:43:17.669597 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:18 crc kubenswrapper[4794]: I0310 10:43:18.127654 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhbtf"]
Mar 10 10:43:18 crc kubenswrapper[4794]: I0310 10:43:18.741319 4794 generic.go:334] "Generic (PLEG): container finished" podID="a65687f5-261b-49ac-ba1f-a35d8778a45d" containerID="6341f9869577a465a0421054e758d6521b7f39a2bb76cd850589488e03b5f78b" exitCode=0
Mar 10 10:43:18 crc kubenswrapper[4794]: I0310 10:43:18.741445 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhbtf" event={"ID":"a65687f5-261b-49ac-ba1f-a35d8778a45d","Type":"ContainerDied","Data":"6341f9869577a465a0421054e758d6521b7f39a2bb76cd850589488e03b5f78b"}
Mar 10 10:43:18 crc kubenswrapper[4794]: I0310 10:43:18.741618 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhbtf" event={"ID":"a65687f5-261b-49ac-ba1f-a35d8778a45d","Type":"ContainerStarted","Data":"5ca97d7439ffdee8f77599c55f498666db4fd46fafb149fba9bff0153989ee42"}
Mar 10 10:43:20 crc kubenswrapper[4794]: I0310 10:43:20.763411 4794 generic.go:334] "Generic (PLEG): container finished" podID="a65687f5-261b-49ac-ba1f-a35d8778a45d" containerID="97d19fe11c187971c950787c12db3993375eca64a5e3bc115d56fc2db3e977ec" exitCode=0
Mar 10 10:43:20 crc kubenswrapper[4794]: I0310 10:43:20.763500 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhbtf" event={"ID":"a65687f5-261b-49ac-ba1f-a35d8778a45d","Type":"ContainerDied","Data":"97d19fe11c187971c950787c12db3993375eca64a5e3bc115d56fc2db3e977ec"}
Mar 10 10:43:21 crc kubenswrapper[4794]: I0310 10:43:21.774827 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhbtf" event={"ID":"a65687f5-261b-49ac-ba1f-a35d8778a45d","Type":"ContainerStarted","Data":"56a64625abcf8fe92041a0d012be7a2623747fd31b362906a646dd4e9e561516"}
Mar 10 10:43:21 crc kubenswrapper[4794]: I0310 10:43:21.811674 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nhbtf" podStartSLOduration=2.016128188
podStartE2EDuration="4.811649911s" podCreationTimestamp="2026-03-10 10:43:17 +0000 UTC" firstStartedPulling="2026-03-10 10:43:18.742998221 +0000 UTC m=+3547.499169039" lastFinishedPulling="2026-03-10 10:43:21.538519914 +0000 UTC m=+3550.294690762" observedRunningTime="2026-03-10 10:43:21.805236551 +0000 UTC m=+3550.561407419" watchObservedRunningTime="2026-03-10 10:43:21.811649911 +0000 UTC m=+3550.567820769"
Mar 10 10:43:27 crc kubenswrapper[4794]: I0310 10:43:27.670165 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:27 crc kubenswrapper[4794]: I0310 10:43:27.670698 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:27 crc kubenswrapper[4794]: I0310 10:43:27.708055 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:27 crc kubenswrapper[4794]: I0310 10:43:27.865584 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:27 crc kubenswrapper[4794]: I0310 10:43:27.943103 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhbtf"]
Mar 10 10:43:29 crc kubenswrapper[4794]: I0310 10:43:29.831457 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nhbtf" podUID="a65687f5-261b-49ac-ba1f-a35d8778a45d" containerName="registry-server" containerID="cri-o://56a64625abcf8fe92041a0d012be7a2623747fd31b362906a646dd4e9e561516" gracePeriod=2
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.330365 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.400453 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65687f5-261b-49ac-ba1f-a35d8778a45d-catalog-content\") pod \"a65687f5-261b-49ac-ba1f-a35d8778a45d\" (UID: \"a65687f5-261b-49ac-ba1f-a35d8778a45d\") "
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.400525 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrph7\" (UniqueName: \"kubernetes.io/projected/a65687f5-261b-49ac-ba1f-a35d8778a45d-kube-api-access-nrph7\") pod \"a65687f5-261b-49ac-ba1f-a35d8778a45d\" (UID: \"a65687f5-261b-49ac-ba1f-a35d8778a45d\") "
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.400630 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65687f5-261b-49ac-ba1f-a35d8778a45d-utilities\") pod \"a65687f5-261b-49ac-ba1f-a35d8778a45d\" (UID: \"a65687f5-261b-49ac-ba1f-a35d8778a45d\") "
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.401786 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a65687f5-261b-49ac-ba1f-a35d8778a45d-utilities" (OuterVolumeSpecName: "utilities") pod "a65687f5-261b-49ac-ba1f-a35d8778a45d" (UID: "a65687f5-261b-49ac-ba1f-a35d8778a45d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.406639 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a65687f5-261b-49ac-ba1f-a35d8778a45d-kube-api-access-nrph7" (OuterVolumeSpecName: "kube-api-access-nrph7") pod "a65687f5-261b-49ac-ba1f-a35d8778a45d" (UID: "a65687f5-261b-49ac-ba1f-a35d8778a45d"). InnerVolumeSpecName "kube-api-access-nrph7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.439353 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a65687f5-261b-49ac-ba1f-a35d8778a45d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a65687f5-261b-49ac-ba1f-a35d8778a45d" (UID: "a65687f5-261b-49ac-ba1f-a35d8778a45d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.502035 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65687f5-261b-49ac-ba1f-a35d8778a45d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.502070 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrph7\" (UniqueName: \"kubernetes.io/projected/a65687f5-261b-49ac-ba1f-a35d8778a45d-kube-api-access-nrph7\") on node \"crc\" DevicePath \"\""
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.502081 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65687f5-261b-49ac-ba1f-a35d8778a45d-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.844081 4794 generic.go:334] "Generic (PLEG): container finished" podID="a65687f5-261b-49ac-ba1f-a35d8778a45d" containerID="56a64625abcf8fe92041a0d012be7a2623747fd31b362906a646dd4e9e561516" exitCode=0
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.844155 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhbtf" event={"ID":"a65687f5-261b-49ac-ba1f-a35d8778a45d","Type":"ContainerDied","Data":"56a64625abcf8fe92041a0d012be7a2623747fd31b362906a646dd4e9e561516"}
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.844249 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhbtf"
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.844266 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhbtf" event={"ID":"a65687f5-261b-49ac-ba1f-a35d8778a45d","Type":"ContainerDied","Data":"5ca97d7439ffdee8f77599c55f498666db4fd46fafb149fba9bff0153989ee42"}
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.844312 4794 scope.go:117] "RemoveContainer" containerID="56a64625abcf8fe92041a0d012be7a2623747fd31b362906a646dd4e9e561516"
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.870661 4794 scope.go:117] "RemoveContainer" containerID="97d19fe11c187971c950787c12db3993375eca64a5e3bc115d56fc2db3e977ec"
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.915309 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhbtf"]
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.928725 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhbtf"]
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.930467 4794 scope.go:117] "RemoveContainer" containerID="6341f9869577a465a0421054e758d6521b7f39a2bb76cd850589488e03b5f78b"
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.952990 4794 scope.go:117] "RemoveContainer" containerID="56a64625abcf8fe92041a0d012be7a2623747fd31b362906a646dd4e9e561516"
Mar 10 10:43:30 crc kubenswrapper[4794]: E0310 10:43:30.953316 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a64625abcf8fe92041a0d012be7a2623747fd31b362906a646dd4e9e561516\": container with ID starting with 56a64625abcf8fe92041a0d012be7a2623747fd31b362906a646dd4e9e561516 not found: ID does not exist" containerID="56a64625abcf8fe92041a0d012be7a2623747fd31b362906a646dd4e9e561516"
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.953367 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a64625abcf8fe92041a0d012be7a2623747fd31b362906a646dd4e9e561516"} err="failed to get container status \"56a64625abcf8fe92041a0d012be7a2623747fd31b362906a646dd4e9e561516\": rpc error: code = NotFound desc = could not find container \"56a64625abcf8fe92041a0d012be7a2623747fd31b362906a646dd4e9e561516\": container with ID starting with 56a64625abcf8fe92041a0d012be7a2623747fd31b362906a646dd4e9e561516 not found: ID does not exist"
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.953392 4794 scope.go:117] "RemoveContainer" containerID="97d19fe11c187971c950787c12db3993375eca64a5e3bc115d56fc2db3e977ec"
Mar 10 10:43:30 crc kubenswrapper[4794]: E0310 10:43:30.953613 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97d19fe11c187971c950787c12db3993375eca64a5e3bc115d56fc2db3e977ec\": container with ID starting with 97d19fe11c187971c950787c12db3993375eca64a5e3bc115d56fc2db3e977ec not found: ID does not exist" containerID="97d19fe11c187971c950787c12db3993375eca64a5e3bc115d56fc2db3e977ec"
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.953639 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97d19fe11c187971c950787c12db3993375eca64a5e3bc115d56fc2db3e977ec"} err="failed to get container status \"97d19fe11c187971c950787c12db3993375eca64a5e3bc115d56fc2db3e977ec\": rpc error: code = NotFound desc = could not find container \"97d19fe11c187971c950787c12db3993375eca64a5e3bc115d56fc2db3e977ec\": container with ID starting with 97d19fe11c187971c950787c12db3993375eca64a5e3bc115d56fc2db3e977ec not found: ID does not exist"
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.953655 4794 scope.go:117] "RemoveContainer" containerID="6341f9869577a465a0421054e758d6521b7f39a2bb76cd850589488e03b5f78b"
Mar 10 10:43:30 crc kubenswrapper[4794]: E0310 10:43:30.953852 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6341f9869577a465a0421054e758d6521b7f39a2bb76cd850589488e03b5f78b\": container with ID starting with 6341f9869577a465a0421054e758d6521b7f39a2bb76cd850589488e03b5f78b not found: ID does not exist" containerID="6341f9869577a465a0421054e758d6521b7f39a2bb76cd850589488e03b5f78b"
Mar 10 10:43:30 crc kubenswrapper[4794]: I0310 10:43:30.953877 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6341f9869577a465a0421054e758d6521b7f39a2bb76cd850589488e03b5f78b"} err="failed to get container status \"6341f9869577a465a0421054e758d6521b7f39a2bb76cd850589488e03b5f78b\": rpc error: code = NotFound desc = could not find container \"6341f9869577a465a0421054e758d6521b7f39a2bb76cd850589488e03b5f78b\": container with ID starting with 6341f9869577a465a0421054e758d6521b7f39a2bb76cd850589488e03b5f78b not found: ID does not exist"
Mar 10 10:43:32 crc kubenswrapper[4794]: I0310 10:43:32.012821 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a65687f5-261b-49ac-ba1f-a35d8778a45d" path="/var/lib/kubelet/pods/a65687f5-261b-49ac-ba1f-a35d8778a45d/volumes"
Mar 10 10:43:53 crc kubenswrapper[4794]: I0310 10:43:53.856426 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x6crl"]
Mar 10 10:43:53 crc kubenswrapper[4794]: E0310 10:43:53.857472 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65687f5-261b-49ac-ba1f-a35d8778a45d" containerName="registry-server"
Mar 10 10:43:53 crc kubenswrapper[4794]: I0310 10:43:53.857494 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65687f5-261b-49ac-ba1f-a35d8778a45d" containerName="registry-server"
Mar 10 10:43:53 crc kubenswrapper[4794]: E0310 10:43:53.857534 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65687f5-261b-49ac-ba1f-a35d8778a45d" containerName="extract-utilities"
Mar 10 10:43:53 crc kubenswrapper[4794]: I0310 10:43:53.857545 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65687f5-261b-49ac-ba1f-a35d8778a45d" containerName="extract-utilities"
Mar 10 10:43:53 crc kubenswrapper[4794]: E0310 10:43:53.857564 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65687f5-261b-49ac-ba1f-a35d8778a45d" containerName="extract-content"
Mar 10 10:43:53 crc kubenswrapper[4794]: I0310 10:43:53.857576 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65687f5-261b-49ac-ba1f-a35d8778a45d" containerName="extract-content"
Mar 10 10:43:53 crc kubenswrapper[4794]: I0310 10:43:53.857793 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a65687f5-261b-49ac-ba1f-a35d8778a45d" containerName="registry-server"
Mar 10 10:43:53 crc kubenswrapper[4794]: I0310 10:43:53.859518 4794 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-x6crl"
Mar 10 10:43:53 crc kubenswrapper[4794]: I0310 10:43:53.868816 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x6crl"]
Mar 10 10:43:53 crc kubenswrapper[4794]: I0310 10:43:53.969302 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-utilities\") pod \"community-operators-x6crl\" (UID: \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\") " pod="openshift-marketplace/community-operators-x6crl"
Mar 10 10:43:53 crc kubenswrapper[4794]: I0310 10:43:53.969409 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-catalog-content\") pod \"community-operators-x6crl\" (UID: \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\") " pod="openshift-marketplace/community-operators-x6crl"
Mar 10 10:43:53 crc kubenswrapper[4794]: I0310 10:43:53.969497 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwsqn\" (UniqueName: \"kubernetes.io/projected/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-kube-api-access-fwsqn\") pod \"community-operators-x6crl\" (UID: \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\") " pod="openshift-marketplace/community-operators-x6crl"
Mar 10 10:43:54 crc kubenswrapper[4794]: I0310 10:43:54.071344 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwsqn\" (UniqueName: \"kubernetes.io/projected/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-kube-api-access-fwsqn\") pod \"community-operators-x6crl\" (UID: \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\") " pod="openshift-marketplace/community-operators-x6crl"
Mar 10 10:43:54 crc kubenswrapper[4794]: I0310 10:43:54.071459 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-utilities\") pod \"community-operators-x6crl\" (UID: \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\") " pod="openshift-marketplace/community-operators-x6crl"
Mar 10 10:43:54 crc kubenswrapper[4794]: I0310 10:43:54.071518 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-catalog-content\") pod \"community-operators-x6crl\" (UID: \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\") " pod="openshift-marketplace/community-operators-x6crl"
Mar 10 10:43:54 crc kubenswrapper[4794]: I0310 10:43:54.072103 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-utilities\") pod \"community-operators-x6crl\" (UID: \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\") " pod="openshift-marketplace/community-operators-x6crl"
Mar 10 10:43:54 crc kubenswrapper[4794]: I0310 10:43:54.072180 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-catalog-content\") pod \"community-operators-x6crl\" (UID: \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\") " pod="openshift-marketplace/community-operators-x6crl"
Mar 10 10:43:54 crc kubenswrapper[4794]: I0310 10:43:54.089187 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwsqn\" (UniqueName: \"kubernetes.io/projected/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-kube-api-access-fwsqn\") pod \"community-operators-x6crl\" (UID: \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\") " pod="openshift-marketplace/community-operators-x6crl"
Mar 10 10:43:54 crc kubenswrapper[4794]: I0310 10:43:54.199785 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x6crl"
Mar 10 10:43:54 crc kubenswrapper[4794]: I0310 10:43:54.757920 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x6crl"]
Mar 10 10:43:55 crc kubenswrapper[4794]: I0310 10:43:55.044801 4794 generic.go:334] "Generic (PLEG): container finished" podID="fc9bea91-9b71-4d05-a50c-4dc23b8449a8" containerID="27e61296e932cbab2e1a2a8ff3aefb1216069b1b45e7a05456b9fd25d778fe2c" exitCode=0
Mar 10 10:43:55 crc kubenswrapper[4794]: I0310 10:43:55.044841 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6crl" event={"ID":"fc9bea91-9b71-4d05-a50c-4dc23b8449a8","Type":"ContainerDied","Data":"27e61296e932cbab2e1a2a8ff3aefb1216069b1b45e7a05456b9fd25d778fe2c"}
Mar 10 10:43:55 crc kubenswrapper[4794]: I0310 10:43:55.044867 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6crl" event={"ID":"fc9bea91-9b71-4d05-a50c-4dc23b8449a8","Type":"ContainerStarted","Data":"3df30bde76b176cc6fd8c262ecb99452562fc50fdfd9c4d62b9630fb2d0db081"}
Mar 10 10:43:56 crc kubenswrapper[4794]: I0310 10:43:56.052875 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6crl" event={"ID":"fc9bea91-9b71-4d05-a50c-4dc23b8449a8","Type":"ContainerStarted","Data":"ed8bee8aca51309b0216a1b8cf00321cf68e302f6f6c3a5087b52b22de080078"}
Mar 10 10:43:57 crc kubenswrapper[4794]: I0310 10:43:57.064480 4794 generic.go:334] "Generic (PLEG): container finished" podID="fc9bea91-9b71-4d05-a50c-4dc23b8449a8" containerID="ed8bee8aca51309b0216a1b8cf00321cf68e302f6f6c3a5087b52b22de080078" exitCode=0
Mar 10 10:43:57 crc kubenswrapper[4794]: I0310 10:43:57.064620 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6crl" event={"ID":"fc9bea91-9b71-4d05-a50c-4dc23b8449a8","Type":"ContainerDied","Data":"ed8bee8aca51309b0216a1b8cf00321cf68e302f6f6c3a5087b52b22de080078"}
Mar 10 10:43:58 crc kubenswrapper[4794]: I0310 10:43:58.079383 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6crl" event={"ID":"fc9bea91-9b71-4d05-a50c-4dc23b8449a8","Type":"ContainerStarted","Data":"d6cf37f987f9776895e73070196dd49253e7c3dab29ef7eb15fea5b1e9df8a99"}
Mar 10 10:43:58 crc kubenswrapper[4794]: I0310 10:43:58.115758 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x6crl" podStartSLOduration=2.670629475 podStartE2EDuration="5.115735921s" podCreationTimestamp="2026-03-10 10:43:53 +0000 UTC" firstStartedPulling="2026-03-10 10:43:55.046909436 +0000 UTC m=+3583.803080254" lastFinishedPulling="2026-03-10 10:43:57.492015872 +0000 UTC m=+3586.248186700" observedRunningTime="2026-03-10 10:43:58.109362022 +0000 UTC m=+3586.865532880" watchObservedRunningTime="2026-03-10 10:43:58.115735921 +0000 UTC m=+3586.871906749"
Mar 10 10:44:00 crc kubenswrapper[4794]: I0310 10:44:00.149287 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552324-5j6hl"]
Mar 10 10:44:00 crc kubenswrapper[4794]: I0310 10:44:00.150922 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552324-5j6hl"
Mar 10 10:44:00 crc kubenswrapper[4794]: I0310 10:44:00.157909 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552324-5j6hl"]
Mar 10 10:44:00 crc kubenswrapper[4794]: I0310 10:44:00.159431 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc"
Mar 10 10:44:00 crc kubenswrapper[4794]: I0310 10:44:00.159757 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 10:44:00 crc kubenswrapper[4794]: I0310 10:44:00.162258 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 10:44:00 crc kubenswrapper[4794]: I0310 10:44:00.260525 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvrzp\" (UniqueName: \"kubernetes.io/projected/6963f98c-6610-4db7-8cd0-3e45300c69b4-kube-api-access-pvrzp\") pod \"auto-csr-approver-29552324-5j6hl\" (UID: \"6963f98c-6610-4db7-8cd0-3e45300c69b4\") " pod="openshift-infra/auto-csr-approver-29552324-5j6hl"
Mar 10 10:44:00 crc kubenswrapper[4794]: I0310 10:44:00.361604 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvrzp\" (UniqueName: \"kubernetes.io/projected/6963f98c-6610-4db7-8cd0-3e45300c69b4-kube-api-access-pvrzp\") pod \"auto-csr-approver-29552324-5j6hl\" (UID: \"6963f98c-6610-4db7-8cd0-3e45300c69b4\") " pod="openshift-infra/auto-csr-approver-29552324-5j6hl"
Mar 10 10:44:00 crc kubenswrapper[4794]: I0310 10:44:00.389834 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvrzp\" (UniqueName: \"kubernetes.io/projected/6963f98c-6610-4db7-8cd0-3e45300c69b4-kube-api-access-pvrzp\") pod \"auto-csr-approver-29552324-5j6hl\" (UID: \"6963f98c-6610-4db7-8cd0-3e45300c69b4\") " pod="openshift-infra/auto-csr-approver-29552324-5j6hl"
Mar 10 10:44:00 crc kubenswrapper[4794]: I0310 10:44:00.470883 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552324-5j6hl"
Mar 10 10:44:00 crc kubenswrapper[4794]: I0310 10:44:00.896646 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552324-5j6hl"]
Mar 10 10:44:01 crc kubenswrapper[4794]: I0310 10:44:01.107538 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552324-5j6hl" event={"ID":"6963f98c-6610-4db7-8cd0-3e45300c69b4","Type":"ContainerStarted","Data":"c968557dcb9b31bf9cbbe74ba3da66a026d1304c0cc73694909d23a34bf084e5"}
Mar 10 10:44:03 crc kubenswrapper[4794]: I0310 10:44:03.127686 4794 generic.go:334] "Generic (PLEG): container finished" podID="6963f98c-6610-4db7-8cd0-3e45300c69b4" containerID="428c5ebeaf331b2a0ad6811af9ba01743719208f77e4c79d101da9c0e61c1d24" exitCode=0
Mar 10 10:44:03 crc kubenswrapper[4794]: I0310 10:44:03.127756 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552324-5j6hl" event={"ID":"6963f98c-6610-4db7-8cd0-3e45300c69b4","Type":"ContainerDied","Data":"428c5ebeaf331b2a0ad6811af9ba01743719208f77e4c79d101da9c0e61c1d24"}
Mar 10 10:44:04 crc kubenswrapper[4794]: I0310 10:44:04.200082 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x6crl"
Mar 10 10:44:04 crc kubenswrapper[4794]: I0310 10:44:04.200161 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x6crl"
Mar 10 10:44:04 crc kubenswrapper[4794]: I0310 10:44:04.280462 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x6crl"
Mar 10 10:44:04 crc kubenswrapper[4794]: I0310 10:44:04.428683 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552324-5j6hl"
Mar 10 10:44:04 crc kubenswrapper[4794]: I0310 10:44:04.522665 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvrzp\" (UniqueName: \"kubernetes.io/projected/6963f98c-6610-4db7-8cd0-3e45300c69b4-kube-api-access-pvrzp\") pod \"6963f98c-6610-4db7-8cd0-3e45300c69b4\" (UID: \"6963f98c-6610-4db7-8cd0-3e45300c69b4\") "
Mar 10 10:44:04 crc kubenswrapper[4794]: I0310 10:44:04.528373 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6963f98c-6610-4db7-8cd0-3e45300c69b4-kube-api-access-pvrzp" (OuterVolumeSpecName: "kube-api-access-pvrzp") pod "6963f98c-6610-4db7-8cd0-3e45300c69b4" (UID: "6963f98c-6610-4db7-8cd0-3e45300c69b4"). InnerVolumeSpecName "kube-api-access-pvrzp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:44:04 crc kubenswrapper[4794]: I0310 10:44:04.624638 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvrzp\" (UniqueName: \"kubernetes.io/projected/6963f98c-6610-4db7-8cd0-3e45300c69b4-kube-api-access-pvrzp\") on node \"crc\" DevicePath \"\""
Mar 10 10:44:05 crc kubenswrapper[4794]: I0310 10:44:05.146453 4794 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552324-5j6hl" Mar 10 10:44:05 crc kubenswrapper[4794]: I0310 10:44:05.146447 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552324-5j6hl" event={"ID":"6963f98c-6610-4db7-8cd0-3e45300c69b4","Type":"ContainerDied","Data":"c968557dcb9b31bf9cbbe74ba3da66a026d1304c0cc73694909d23a34bf084e5"} Mar 10 10:44:05 crc kubenswrapper[4794]: I0310 10:44:05.146625 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c968557dcb9b31bf9cbbe74ba3da66a026d1304c0cc73694909d23a34bf084e5" Mar 10 10:44:05 crc kubenswrapper[4794]: I0310 10:44:05.196294 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x6crl" Mar 10 10:44:05 crc kubenswrapper[4794]: I0310 10:44:05.237262 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x6crl"] Mar 10 10:44:05 crc kubenswrapper[4794]: I0310 10:44:05.498011 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552318-42c2w"] Mar 10 10:44:05 crc kubenswrapper[4794]: I0310 10:44:05.504407 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552318-42c2w"] Mar 10 10:44:06 crc kubenswrapper[4794]: I0310 10:44:06.014734 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0cdc4e5-d940-4311-977d-ec96b95261f6" path="/var/lib/kubelet/pods/e0cdc4e5-d940-4311-977d-ec96b95261f6/volumes" Mar 10 10:44:07 crc kubenswrapper[4794]: I0310 10:44:07.162537 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x6crl" podUID="fc9bea91-9b71-4d05-a50c-4dc23b8449a8" containerName="registry-server" containerID="cri-o://d6cf37f987f9776895e73070196dd49253e7c3dab29ef7eb15fea5b1e9df8a99" gracePeriod=2 Mar 10 10:44:07 crc kubenswrapper[4794]: I0310 10:44:07.615511 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x6crl" Mar 10 10:44:07 crc kubenswrapper[4794]: I0310 10:44:07.673074 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwsqn\" (UniqueName: \"kubernetes.io/projected/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-kube-api-access-fwsqn\") pod \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\" (UID: \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\") " Mar 10 10:44:07 crc kubenswrapper[4794]: I0310 10:44:07.673140 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-utilities\") pod \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\" (UID: \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\") " Mar 10 10:44:07 crc kubenswrapper[4794]: I0310 10:44:07.673276 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-catalog-content\") pod \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\" (UID: \"fc9bea91-9b71-4d05-a50c-4dc23b8449a8\") " Mar 10 10:44:07 crc kubenswrapper[4794]: I0310 10:44:07.674579 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-utilities" (OuterVolumeSpecName: "utilities") pod "fc9bea91-9b71-4d05-a50c-4dc23b8449a8" (UID: "fc9bea91-9b71-4d05-a50c-4dc23b8449a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:44:07 crc kubenswrapper[4794]: I0310 10:44:07.681315 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-kube-api-access-fwsqn" (OuterVolumeSpecName: "kube-api-access-fwsqn") pod "fc9bea91-9b71-4d05-a50c-4dc23b8449a8" (UID: "fc9bea91-9b71-4d05-a50c-4dc23b8449a8"). InnerVolumeSpecName "kube-api-access-fwsqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:44:07 crc kubenswrapper[4794]: I0310 10:44:07.726010 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc9bea91-9b71-4d05-a50c-4dc23b8449a8" (UID: "fc9bea91-9b71-4d05-a50c-4dc23b8449a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:44:07 crc kubenswrapper[4794]: I0310 10:44:07.774907 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:44:07 crc kubenswrapper[4794]: I0310 10:44:07.774951 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwsqn\" (UniqueName: \"kubernetes.io/projected/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-kube-api-access-fwsqn\") on node \"crc\" DevicePath \"\"" Mar 10 10:44:07 crc kubenswrapper[4794]: I0310 10:44:07.774966 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9bea91-9b71-4d05-a50c-4dc23b8449a8-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.174924 4794 generic.go:334] "Generic (PLEG): container finished" podID="fc9bea91-9b71-4d05-a50c-4dc23b8449a8" containerID="d6cf37f987f9776895e73070196dd49253e7c3dab29ef7eb15fea5b1e9df8a99" exitCode=0 Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.174981 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6crl" event={"ID":"fc9bea91-9b71-4d05-a50c-4dc23b8449a8","Type":"ContainerDied","Data":"d6cf37f987f9776895e73070196dd49253e7c3dab29ef7eb15fea5b1e9df8a99"} Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.175018 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6crl" event={"ID":"fc9bea91-9b71-4d05-a50c-4dc23b8449a8","Type":"ContainerDied","Data":"3df30bde76b176cc6fd8c262ecb99452562fc50fdfd9c4d62b9630fb2d0db081"} Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.175028 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x6crl" Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.175045 4794 scope.go:117] "RemoveContainer" containerID="d6cf37f987f9776895e73070196dd49253e7c3dab29ef7eb15fea5b1e9df8a99" Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.206170 4794 scope.go:117] "RemoveContainer" containerID="ed8bee8aca51309b0216a1b8cf00321cf68e302f6f6c3a5087b52b22de080078" Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.207423 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x6crl"] Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.215433 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x6crl"] Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.231554 4794 scope.go:117] "RemoveContainer" containerID="27e61296e932cbab2e1a2a8ff3aefb1216069b1b45e7a05456b9fd25d778fe2c" Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.265851 4794 scope.go:117] "RemoveContainer" containerID="d6cf37f987f9776895e73070196dd49253e7c3dab29ef7eb15fea5b1e9df8a99" Mar 10 10:44:08 crc kubenswrapper[4794]: E0310 10:44:08.266240 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6cf37f987f9776895e73070196dd49253e7c3dab29ef7eb15fea5b1e9df8a99\": container with ID starting with d6cf37f987f9776895e73070196dd49253e7c3dab29ef7eb15fea5b1e9df8a99 not found: ID does not exist" containerID="d6cf37f987f9776895e73070196dd49253e7c3dab29ef7eb15fea5b1e9df8a99" Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.266270 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6cf37f987f9776895e73070196dd49253e7c3dab29ef7eb15fea5b1e9df8a99"} err="failed to get container status \"d6cf37f987f9776895e73070196dd49253e7c3dab29ef7eb15fea5b1e9df8a99\": rpc error: code = NotFound desc = could not find container \"d6cf37f987f9776895e73070196dd49253e7c3dab29ef7eb15fea5b1e9df8a99\": container with ID starting with d6cf37f987f9776895e73070196dd49253e7c3dab29ef7eb15fea5b1e9df8a99 not found: ID does not exist" Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.266288 4794 scope.go:117] "RemoveContainer" containerID="ed8bee8aca51309b0216a1b8cf00321cf68e302f6f6c3a5087b52b22de080078" Mar 10 10:44:08 crc kubenswrapper[4794]: E0310 10:44:08.266525 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed8bee8aca51309b0216a1b8cf00321cf68e302f6f6c3a5087b52b22de080078\": container with ID starting with ed8bee8aca51309b0216a1b8cf00321cf68e302f6f6c3a5087b52b22de080078 not found: ID does not exist" containerID="ed8bee8aca51309b0216a1b8cf00321cf68e302f6f6c3a5087b52b22de080078" Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.266549 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed8bee8aca51309b0216a1b8cf00321cf68e302f6f6c3a5087b52b22de080078"} err="failed to get container status \"ed8bee8aca51309b0216a1b8cf00321cf68e302f6f6c3a5087b52b22de080078\": rpc error: code = NotFound desc = could not find container \"ed8bee8aca51309b0216a1b8cf00321cf68e302f6f6c3a5087b52b22de080078\": container with ID starting with ed8bee8aca51309b0216a1b8cf00321cf68e302f6f6c3a5087b52b22de080078 not found: ID does not exist" Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.266561 4794 scope.go:117] "RemoveContainer" 
containerID="27e61296e932cbab2e1a2a8ff3aefb1216069b1b45e7a05456b9fd25d778fe2c" Mar 10 10:44:08 crc kubenswrapper[4794]: E0310 10:44:08.266777 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e61296e932cbab2e1a2a8ff3aefb1216069b1b45e7a05456b9fd25d778fe2c\": container with ID starting with 27e61296e932cbab2e1a2a8ff3aefb1216069b1b45e7a05456b9fd25d778fe2c not found: ID does not exist" containerID="27e61296e932cbab2e1a2a8ff3aefb1216069b1b45e7a05456b9fd25d778fe2c" Mar 10 10:44:08 crc kubenswrapper[4794]: I0310 10:44:08.266800 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e61296e932cbab2e1a2a8ff3aefb1216069b1b45e7a05456b9fd25d778fe2c"} err="failed to get container status \"27e61296e932cbab2e1a2a8ff3aefb1216069b1b45e7a05456b9fd25d778fe2c\": rpc error: code = NotFound desc = could not find container \"27e61296e932cbab2e1a2a8ff3aefb1216069b1b45e7a05456b9fd25d778fe2c\": container with ID starting with 27e61296e932cbab2e1a2a8ff3aefb1216069b1b45e7a05456b9fd25d778fe2c not found: ID does not exist" Mar 10 10:44:10 crc kubenswrapper[4794]: I0310 10:44:10.013768 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc9bea91-9b71-4d05-a50c-4dc23b8449a8" path="/var/lib/kubelet/pods/fc9bea91-9b71-4d05-a50c-4dc23b8449a8/volumes" Mar 10 10:44:39 crc kubenswrapper[4794]: I0310 10:44:39.911921 4794 scope.go:117] "RemoveContainer" containerID="17f4a4daf6fd27bbbefebcadef074aa869c760c39f98d2c81256b0a4aed65ac3" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.169542 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7"] Mar 10 10:45:00 crc kubenswrapper[4794]: E0310 10:45:00.170869 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6963f98c-6610-4db7-8cd0-3e45300c69b4" containerName="oc" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.170897 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6963f98c-6610-4db7-8cd0-3e45300c69b4" containerName="oc" Mar 10 10:45:00 crc kubenswrapper[4794]: E0310 10:45:00.170916 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9bea91-9b71-4d05-a50c-4dc23b8449a8" containerName="extract-utilities" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.170929 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9bea91-9b71-4d05-a50c-4dc23b8449a8" containerName="extract-utilities" Mar 10 10:45:00 crc kubenswrapper[4794]: E0310 10:45:00.170958 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9bea91-9b71-4d05-a50c-4dc23b8449a8" containerName="registry-server" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.170973 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9bea91-9b71-4d05-a50c-4dc23b8449a8" containerName="registry-server" Mar 10 10:45:00 crc kubenswrapper[4794]: E0310 10:45:00.170994 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9bea91-9b71-4d05-a50c-4dc23b8449a8" containerName="extract-content" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.171006 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9bea91-9b71-4d05-a50c-4dc23b8449a8" containerName="extract-content" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.171413 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc9bea91-9b71-4d05-a50c-4dc23b8449a8" containerName="registry-server" Mar 10 10:45:00 crc 
kubenswrapper[4794]: I0310 10:45:00.171456 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6963f98c-6610-4db7-8cd0-3e45300c69b4" containerName="oc" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.172507 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.176137 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.178386 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.187145 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7"] Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.347598 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-config-volume\") pod \"collect-profiles-29552325-xx4p7\" (UID: \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.347725 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-secret-volume\") pod \"collect-profiles-29552325-xx4p7\" (UID: \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.347828 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fv9b\" (UniqueName: \"kubernetes.io/projected/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-kube-api-access-7fv9b\") pod \"collect-profiles-29552325-xx4p7\" (UID: \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.449797 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fv9b\" (UniqueName: \"kubernetes.io/projected/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-kube-api-access-7fv9b\") pod \"collect-profiles-29552325-xx4p7\" (UID: \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.449937 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-config-volume\") pod \"collect-profiles-29552325-xx4p7\" (UID: \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.449974 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-secret-volume\") pod \"collect-profiles-29552325-xx4p7\" (UID: \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.451645 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-config-volume\") pod \"collect-profiles-29552325-xx4p7\" (UID: \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.463773 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-secret-volume\") pod \"collect-profiles-29552325-xx4p7\" (UID: \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.487130 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fv9b\" (UniqueName: \"kubernetes.io/projected/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-kube-api-access-7fv9b\") pod \"collect-profiles-29552325-xx4p7\" (UID: \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.496531 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" Mar 10 10:45:00 crc kubenswrapper[4794]: I0310 10:45:00.961668 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7"] Mar 10 10:45:01 crc kubenswrapper[4794]: I0310 10:45:01.648416 4794 generic.go:334] "Generic (PLEG): container finished" podID="bc5f5e33-ec54-41b5-b46d-69f07fac6d87" containerID="1b58417f5c7b9a2e3a08274cc6025c4ba19bca341dff9c5ddab3b3e64e531cdf" exitCode=0 Mar 10 10:45:01 crc kubenswrapper[4794]: I0310 10:45:01.648545 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" event={"ID":"bc5f5e33-ec54-41b5-b46d-69f07fac6d87","Type":"ContainerDied","Data":"1b58417f5c7b9a2e3a08274cc6025c4ba19bca341dff9c5ddab3b3e64e531cdf"} Mar 10 10:45:01 crc kubenswrapper[4794]: I0310 10:45:01.648758 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" event={"ID":"bc5f5e33-ec54-41b5-b46d-69f07fac6d87","Type":"ContainerStarted","Data":"afe79bd9442d18e94bd9cd8404424e02905b75accfb975a1bd555e409c8393c8"} Mar 10 10:45:03 crc kubenswrapper[4794]: I0310 10:45:03.065751 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" Mar 10 10:45:03 crc kubenswrapper[4794]: I0310 10:45:03.197830 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fv9b\" (UniqueName: \"kubernetes.io/projected/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-kube-api-access-7fv9b\") pod \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\" (UID: \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\") " Mar 10 10:45:03 crc kubenswrapper[4794]: I0310 10:45:03.197932 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-secret-volume\") pod \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\" (UID: \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\") " Mar 10 10:45:03 crc kubenswrapper[4794]: I0310 10:45:03.198049 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-config-volume\") pod \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\" (UID: \"bc5f5e33-ec54-41b5-b46d-69f07fac6d87\") " Mar 10 10:45:03 crc kubenswrapper[4794]: I0310 10:45:03.199210 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-config-volume" (OuterVolumeSpecName: "config-volume") pod "bc5f5e33-ec54-41b5-b46d-69f07fac6d87" (UID: "bc5f5e33-ec54-41b5-b46d-69f07fac6d87"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 10:45:03 crc kubenswrapper[4794]: I0310 10:45:03.205799 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bc5f5e33-ec54-41b5-b46d-69f07fac6d87" (UID: "bc5f5e33-ec54-41b5-b46d-69f07fac6d87"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:45:03 crc kubenswrapper[4794]: I0310 10:45:03.205870 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-kube-api-access-7fv9b" (OuterVolumeSpecName: "kube-api-access-7fv9b") pod "bc5f5e33-ec54-41b5-b46d-69f07fac6d87" (UID: "bc5f5e33-ec54-41b5-b46d-69f07fac6d87"). InnerVolumeSpecName "kube-api-access-7fv9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:45:03 crc kubenswrapper[4794]: I0310 10:45:03.300735 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fv9b\" (UniqueName: \"kubernetes.io/projected/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-kube-api-access-7fv9b\") on node \"crc\" DevicePath \"\"" Mar 10 10:45:03 crc kubenswrapper[4794]: I0310 10:45:03.300992 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 10:45:03 crc kubenswrapper[4794]: I0310 10:45:03.301114 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc5f5e33-ec54-41b5-b46d-69f07fac6d87-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 10:45:03 crc kubenswrapper[4794]: I0310 10:45:03.681313 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" event={"ID":"bc5f5e33-ec54-41b5-b46d-69f07fac6d87","Type":"ContainerDied","Data":"afe79bd9442d18e94bd9cd8404424e02905b75accfb975a1bd555e409c8393c8"} Mar 10 10:45:03 crc kubenswrapper[4794]: I0310 10:45:03.681450 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afe79bd9442d18e94bd9cd8404424e02905b75accfb975a1bd555e409c8393c8" Mar 10 10:45:03 crc kubenswrapper[4794]: I0310 10:45:03.681648 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7" Mar 10 10:45:04 crc kubenswrapper[4794]: I0310 10:45:04.169719 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"] Mar 10 10:45:04 crc kubenswrapper[4794]: I0310 10:45:04.174078 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552280-t9n75"] Mar 10 10:45:06 crc kubenswrapper[4794]: I0310 10:45:06.006458 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9be7aae6-30f2-4a0f-8aa3-c88cc81603d7" path="/var/lib/kubelet/pods/9be7aae6-30f2-4a0f-8aa3-c88cc81603d7/volumes" Mar 10 10:45:22 crc kubenswrapper[4794]: I0310 10:45:22.968278 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:45:22 crc kubenswrapper[4794]: I0310 10:45:22.969147 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:45:40 crc kubenswrapper[4794]: I0310 10:45:40.027539 4794 scope.go:117] "RemoveContainer" containerID="2e16baaeb24214f08df54071f10bd8f70b65ec2a0a85f920ccc8fbe70ee66c9b" Mar 10 10:45:52 crc kubenswrapper[4794]: I0310 10:45:52.968038 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 10 10:45:52 crc kubenswrapper[4794]: I0310 10:45:52.968810 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:46:00 crc kubenswrapper[4794]: I0310 10:46:00.160887 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552326-8cbdt"] Mar 10 10:46:00 crc kubenswrapper[4794]: E0310 10:46:00.162311 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5f5e33-ec54-41b5-b46d-69f07fac6d87" containerName="collect-profiles" Mar 10 10:46:00 crc kubenswrapper[4794]: I0310 10:46:00.162393 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5f5e33-ec54-41b5-b46d-69f07fac6d87" containerName="collect-profiles" Mar 10 10:46:00 crc kubenswrapper[4794]: I0310 10:46:00.162699 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc5f5e33-ec54-41b5-b46d-69f07fac6d87" containerName="collect-profiles" Mar 10 10:46:00 crc kubenswrapper[4794]: I0310 10:46:00.163779 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552326-8cbdt" Mar 10 10:46:00 crc kubenswrapper[4794]: I0310 10:46:00.166974 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552326-8cbdt"] Mar 10 10:46:00 crc kubenswrapper[4794]: I0310 10:46:00.167081 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:46:00 crc kubenswrapper[4794]: I0310 10:46:00.169432 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:46:00 crc kubenswrapper[4794]: I0310 10:46:00.169585 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:46:00 crc kubenswrapper[4794]: I0310 10:46:00.242998 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2pkp\" (UniqueName: \"kubernetes.io/projected/f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9-kube-api-access-g2pkp\") pod \"auto-csr-approver-29552326-8cbdt\" (UID: \"f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9\") " pod="openshift-infra/auto-csr-approver-29552326-8cbdt" Mar 10 10:46:00 crc kubenswrapper[4794]: I0310 10:46:00.345856 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2pkp\" (UniqueName: \"kubernetes.io/projected/f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9-kube-api-access-g2pkp\") pod \"auto-csr-approver-29552326-8cbdt\" (UID: \"f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9\") " pod="openshift-infra/auto-csr-approver-29552326-8cbdt" Mar 10 10:46:00 crc kubenswrapper[4794]: I0310 10:46:00.379899 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2pkp\" (UniqueName: \"kubernetes.io/projected/f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9-kube-api-access-g2pkp\") pod \"auto-csr-approver-29552326-8cbdt\" (UID: \"f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9\") " pod="openshift-infra/auto-csr-approver-29552326-8cbdt" Mar 10 10:46:00 crc kubenswrapper[4794]: I0310 10:46:00.495741 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552326-8cbdt" Mar 10 10:46:00 crc kubenswrapper[4794]: I0310 10:46:00.922196 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552326-8cbdt"] Mar 10 10:46:00 crc kubenswrapper[4794]: I0310 10:46:00.928598 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 10:46:01 crc kubenswrapper[4794]: I0310 10:46:01.215560 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552326-8cbdt" event={"ID":"f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9","Type":"ContainerStarted","Data":"bff4d6fbe00ad950390f73a4e971d10dd384cc5c86c94e54153f1eb9bdc11546"} Mar 10 10:46:03 crc kubenswrapper[4794]: I0310 10:46:03.238022 4794 generic.go:334] "Generic (PLEG): container finished" podID="f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9" containerID="217355a90848adc334ea39e860a5d04940bbef32280925df256a99d0792e14c3" exitCode=0 Mar 10 10:46:03 crc kubenswrapper[4794]: I0310 10:46:03.238101 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552326-8cbdt" event={"ID":"f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9","Type":"ContainerDied","Data":"217355a90848adc334ea39e860a5d04940bbef32280925df256a99d0792e14c3"} Mar 10 10:46:04 crc kubenswrapper[4794]: I0310 10:46:04.620304 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552326-8cbdt" Mar 10 10:46:04 crc kubenswrapper[4794]: I0310 10:46:04.812849 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2pkp\" (UniqueName: \"kubernetes.io/projected/f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9-kube-api-access-g2pkp\") pod \"f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9\" (UID: \"f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9\") " Mar 10 10:46:04 crc kubenswrapper[4794]: I0310 10:46:04.821133 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9-kube-api-access-g2pkp" (OuterVolumeSpecName: "kube-api-access-g2pkp") pod "f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9" (UID: "f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9"). InnerVolumeSpecName "kube-api-access-g2pkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:46:04 crc kubenswrapper[4794]: I0310 10:46:04.914828 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2pkp\" (UniqueName: \"kubernetes.io/projected/f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9-kube-api-access-g2pkp\") on node \"crc\" DevicePath \"\"" Mar 10 10:46:05 crc kubenswrapper[4794]: I0310 10:46:05.258250 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552326-8cbdt" event={"ID":"f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9","Type":"ContainerDied","Data":"bff4d6fbe00ad950390f73a4e971d10dd384cc5c86c94e54153f1eb9bdc11546"} Mar 10 10:46:05 crc kubenswrapper[4794]: I0310 10:46:05.258639 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bff4d6fbe00ad950390f73a4e971d10dd384cc5c86c94e54153f1eb9bdc11546" Mar 10 10:46:05 crc kubenswrapper[4794]: I0310 10:46:05.258711 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552326-8cbdt" Mar 10 10:46:05 crc kubenswrapper[4794]: I0310 10:46:05.727103 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552320-jjftf"] Mar 10 10:46:05 crc kubenswrapper[4794]: I0310 10:46:05.737618 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552320-jjftf"] Mar 10 10:46:06 crc kubenswrapper[4794]: I0310 10:46:06.013426 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd" path="/var/lib/kubelet/pods/6f9819fe-c4f4-49d0-80ac-02fbd8ed66cd/volumes" Mar 10 10:46:22 crc kubenswrapper[4794]: I0310 10:46:22.967946 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:46:22 crc kubenswrapper[4794]: I0310 10:46:22.968778 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:46:22 crc kubenswrapper[4794]: I0310 10:46:22.968857 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 10:46:22 crc kubenswrapper[4794]: I0310 10:46:22.969902 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1dfef3673c4e36ac904ce63dccbdb3ac9fb5dfb060917b9d39017aa8664a4458"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 10:46:22 crc kubenswrapper[4794]: I0310 10:46:22.970008 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://1dfef3673c4e36ac904ce63dccbdb3ac9fb5dfb060917b9d39017aa8664a4458" gracePeriod=600 Mar 10 10:46:23 crc kubenswrapper[4794]: I0310 10:46:23.416352 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="1dfef3673c4e36ac904ce63dccbdb3ac9fb5dfb060917b9d39017aa8664a4458" exitCode=0 Mar 10 10:46:23 crc kubenswrapper[4794]: I0310 10:46:23.416464 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"1dfef3673c4e36ac904ce63dccbdb3ac9fb5dfb060917b9d39017aa8664a4458"} Mar 10 10:46:23 crc kubenswrapper[4794]: I0310 10:46:23.416737 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a"} Mar 10 10:46:23 crc kubenswrapper[4794]: I0310 10:46:23.416776 4794 scope.go:117] "RemoveContainer" 
containerID="c5dcfa3c828a23512ad7e7c27068e581300a9afface667fdaf2fc0cd211b8f32" Mar 10 10:46:40 crc kubenswrapper[4794]: I0310 10:46:40.095045 4794 scope.go:117] "RemoveContainer" containerID="1c517b71d1731ac08bc05f740319bc6476a13c4e427d2c717ac44177dacb2268" Mar 10 10:47:47 crc kubenswrapper[4794]: I0310 10:47:47.958107 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dt5l9"] Mar 10 10:47:47 crc kubenswrapper[4794]: E0310 10:47:47.958892 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9" containerName="oc" Mar 10 10:47:47 crc kubenswrapper[4794]: I0310 10:47:47.958910 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9" containerName="oc" Mar 10 10:47:47 crc kubenswrapper[4794]: I0310 10:47:47.959104 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9" containerName="oc" Mar 10 10:47:47 crc kubenswrapper[4794]: I0310 10:47:47.960238 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:48 crc kubenswrapper[4794]: I0310 10:47:48.025562 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dt5l9"] Mar 10 10:47:48 crc kubenswrapper[4794]: I0310 10:47:48.123497 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a4f7405-509e-4e78-bddb-ecb20b22eb85-catalog-content\") pod \"certified-operators-dt5l9\" (UID: \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\") " pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:48 crc kubenswrapper[4794]: I0310 10:47:48.124142 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a4f7405-509e-4e78-bddb-ecb20b22eb85-utilities\") pod \"certified-operators-dt5l9\" (UID: \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\") " pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:48 crc kubenswrapper[4794]: I0310 10:47:48.124319 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfv7l\" (UniqueName: \"kubernetes.io/projected/5a4f7405-509e-4e78-bddb-ecb20b22eb85-kube-api-access-pfv7l\") pod \"certified-operators-dt5l9\" (UID: \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\") " pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:48 crc kubenswrapper[4794]: I0310 10:47:48.226154 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a4f7405-509e-4e78-bddb-ecb20b22eb85-catalog-content\") pod \"certified-operators-dt5l9\" (UID: \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\") " pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:48 crc kubenswrapper[4794]: I0310 10:47:48.226248 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a4f7405-509e-4e78-bddb-ecb20b22eb85-utilities\") pod \"certified-operators-dt5l9\" (UID: \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\") " pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:48 crc kubenswrapper[4794]: I0310 10:47:48.226324 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfv7l\" 
(UniqueName: \"kubernetes.io/projected/5a4f7405-509e-4e78-bddb-ecb20b22eb85-kube-api-access-pfv7l\") pod \"certified-operators-dt5l9\" (UID: \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\") " pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:48 crc kubenswrapper[4794]: I0310 10:47:48.227117 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a4f7405-509e-4e78-bddb-ecb20b22eb85-utilities\") pod \"certified-operators-dt5l9\" (UID: \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\") " pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:48 crc kubenswrapper[4794]: I0310 10:47:48.227647 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a4f7405-509e-4e78-bddb-ecb20b22eb85-catalog-content\") pod \"certified-operators-dt5l9\" (UID: \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\") " pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:48 crc kubenswrapper[4794]: I0310 10:47:48.247151 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfv7l\" (UniqueName: \"kubernetes.io/projected/5a4f7405-509e-4e78-bddb-ecb20b22eb85-kube-api-access-pfv7l\") pod \"certified-operators-dt5l9\" (UID: \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\") " pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:48 crc kubenswrapper[4794]: I0310 10:47:48.289493 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:48 crc kubenswrapper[4794]: I0310 10:47:48.534237 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dt5l9"] Mar 10 10:47:49 crc kubenswrapper[4794]: I0310 10:47:49.199545 4794 generic.go:334] "Generic (PLEG): container finished" podID="5a4f7405-509e-4e78-bddb-ecb20b22eb85" containerID="65066d5a5879e2742f9255d534cde6e2016844e2a652ea27453423c7fb6c9c30" exitCode=0 Mar 10 10:47:49 crc kubenswrapper[4794]: I0310 10:47:49.199616 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dt5l9" event={"ID":"5a4f7405-509e-4e78-bddb-ecb20b22eb85","Type":"ContainerDied","Data":"65066d5a5879e2742f9255d534cde6e2016844e2a652ea27453423c7fb6c9c30"} Mar 10 10:47:49 crc kubenswrapper[4794]: I0310 10:47:49.199910 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dt5l9" event={"ID":"5a4f7405-509e-4e78-bddb-ecb20b22eb85","Type":"ContainerStarted","Data":"27a609a5394d88b45e38b298276bbb28782d045e31ada18d8d9f062fe28edfe8"} Mar 10 10:47:51 crc kubenswrapper[4794]: I0310 10:47:51.218503 4794 generic.go:334] "Generic (PLEG): container finished" podID="5a4f7405-509e-4e78-bddb-ecb20b22eb85" containerID="1585926bdf29de9afec8f0076ebb325f7375daa366f71abf92c52ae2dbd059cb" exitCode=0 Mar 10 10:47:51 crc kubenswrapper[4794]: I0310 10:47:51.218561 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dt5l9" event={"ID":"5a4f7405-509e-4e78-bddb-ecb20b22eb85","Type":"ContainerDied","Data":"1585926bdf29de9afec8f0076ebb325f7375daa366f71abf92c52ae2dbd059cb"} Mar 10 10:47:52 crc kubenswrapper[4794]: I0310 10:47:52.229278 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dt5l9" 
event={"ID":"5a4f7405-509e-4e78-bddb-ecb20b22eb85","Type":"ContainerStarted","Data":"3fe275ce756cba86ccd1c1ff6172fd2030287b37a6d4149941c0c23289e560bd"} Mar 10 10:47:52 crc kubenswrapper[4794]: I0310 10:47:52.257743 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dt5l9" podStartSLOduration=2.843416389 podStartE2EDuration="5.257723204s" podCreationTimestamp="2026-03-10 10:47:47 +0000 UTC" firstStartedPulling="2026-03-10 10:47:49.201637093 +0000 UTC m=+3817.957807921" lastFinishedPulling="2026-03-10 10:47:51.615943908 +0000 UTC m=+3820.372114736" observedRunningTime="2026-03-10 10:47:52.257203648 +0000 UTC m=+3821.013374476" watchObservedRunningTime="2026-03-10 10:47:52.257723204 +0000 UTC m=+3821.013894042" Mar 10 10:47:58 crc kubenswrapper[4794]: I0310 10:47:58.290438 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:58 crc kubenswrapper[4794]: I0310 10:47:58.291094 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:58 crc kubenswrapper[4794]: I0310 10:47:58.366849 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:59 crc kubenswrapper[4794]: I0310 10:47:59.352169 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:47:59 crc kubenswrapper[4794]: I0310 10:47:59.416575 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dt5l9"] Mar 10 10:48:00 crc kubenswrapper[4794]: I0310 10:48:00.152314 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552328-ftx2h"] Mar 10 10:48:00 crc kubenswrapper[4794]: I0310 10:48:00.153082 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552328-ftx2h" Mar 10 10:48:00 crc kubenswrapper[4794]: I0310 10:48:00.155445 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:48:00 crc kubenswrapper[4794]: I0310 10:48:00.155639 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:48:00 crc kubenswrapper[4794]: I0310 10:48:00.163695 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:48:00 crc kubenswrapper[4794]: I0310 10:48:00.173986 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552328-ftx2h"] Mar 10 10:48:00 crc kubenswrapper[4794]: I0310 10:48:00.310522 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jc4j\" (UniqueName: \"kubernetes.io/projected/6836a559-5ecc-4386-be4d-028701d84ee4-kube-api-access-8jc4j\") pod \"auto-csr-approver-29552328-ftx2h\" (UID: \"6836a559-5ecc-4386-be4d-028701d84ee4\") " pod="openshift-infra/auto-csr-approver-29552328-ftx2h" Mar 10 10:48:00 crc kubenswrapper[4794]: I0310 10:48:00.411893 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jc4j\" (UniqueName: \"kubernetes.io/projected/6836a559-5ecc-4386-be4d-028701d84ee4-kube-api-access-8jc4j\") pod \"auto-csr-approver-29552328-ftx2h\" (UID: \"6836a559-5ecc-4386-be4d-028701d84ee4\") " pod="openshift-infra/auto-csr-approver-29552328-ftx2h" Mar 10 10:48:00 crc kubenswrapper[4794]: I0310 10:48:00.442216 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jc4j\" (UniqueName: \"kubernetes.io/projected/6836a559-5ecc-4386-be4d-028701d84ee4-kube-api-access-8jc4j\") pod \"auto-csr-approver-29552328-ftx2h\" (UID: \"6836a559-5ecc-4386-be4d-028701d84ee4\") " pod="openshift-infra/auto-csr-approver-29552328-ftx2h" Mar 10 10:48:00 crc kubenswrapper[4794]: I0310 10:48:00.483960 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552328-ftx2h" Mar 10 10:48:00 crc kubenswrapper[4794]: I0310 10:48:00.952818 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552328-ftx2h"] Mar 10 10:48:00 crc kubenswrapper[4794]: W0310 10:48:00.966605 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6836a559_5ecc_4386_be4d_028701d84ee4.slice/crio-b9cc5fd5a68ae1adb9e0a4caa0aa7beba3089bc660911e10c1d222dadbbf4771 WatchSource:0}: Error finding container b9cc5fd5a68ae1adb9e0a4caa0aa7beba3089bc660911e10c1d222dadbbf4771: Status 404 returned error can't find the container with id b9cc5fd5a68ae1adb9e0a4caa0aa7beba3089bc660911e10c1d222dadbbf4771 Mar 10 10:48:01 crc kubenswrapper[4794]: I0310 10:48:01.310687 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552328-ftx2h" event={"ID":"6836a559-5ecc-4386-be4d-028701d84ee4","Type":"ContainerStarted","Data":"b9cc5fd5a68ae1adb9e0a4caa0aa7beba3089bc660911e10c1d222dadbbf4771"} Mar 10 10:48:01 crc kubenswrapper[4794]: I0310 10:48:01.310863 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dt5l9" podUID="5a4f7405-509e-4e78-bddb-ecb20b22eb85" containerName="registry-server" containerID="cri-o://3fe275ce756cba86ccd1c1ff6172fd2030287b37a6d4149941c0c23289e560bd" gracePeriod=2 Mar 10 10:48:01 crc kubenswrapper[4794]: I0310 10:48:01.685870 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:48:01 crc kubenswrapper[4794]: I0310 10:48:01.856937 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfv7l\" (UniqueName: \"kubernetes.io/projected/5a4f7405-509e-4e78-bddb-ecb20b22eb85-kube-api-access-pfv7l\") pod \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\" (UID: \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\") " Mar 10 10:48:01 crc kubenswrapper[4794]: I0310 10:48:01.857009 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a4f7405-509e-4e78-bddb-ecb20b22eb85-utilities\") pod \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\" (UID: \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\") " Mar 10 10:48:01 crc kubenswrapper[4794]: I0310 10:48:01.857055 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a4f7405-509e-4e78-bddb-ecb20b22eb85-catalog-content\") pod \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\" (UID: \"5a4f7405-509e-4e78-bddb-ecb20b22eb85\") " Mar 10 10:48:01 crc kubenswrapper[4794]: I0310 10:48:01.858203 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a4f7405-509e-4e78-bddb-ecb20b22eb85-utilities" (OuterVolumeSpecName: "utilities") pod "5a4f7405-509e-4e78-bddb-ecb20b22eb85" (UID: "5a4f7405-509e-4e78-bddb-ecb20b22eb85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:48:01 crc kubenswrapper[4794]: I0310 10:48:01.868925 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a4f7405-509e-4e78-bddb-ecb20b22eb85-kube-api-access-pfv7l" (OuterVolumeSpecName: "kube-api-access-pfv7l") pod "5a4f7405-509e-4e78-bddb-ecb20b22eb85" (UID: "5a4f7405-509e-4e78-bddb-ecb20b22eb85"). 
InnerVolumeSpecName "kube-api-access-pfv7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:48:01 crc kubenswrapper[4794]: I0310 10:48:01.917240 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a4f7405-509e-4e78-bddb-ecb20b22eb85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a4f7405-509e-4e78-bddb-ecb20b22eb85" (UID: "5a4f7405-509e-4e78-bddb-ecb20b22eb85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:48:01 crc kubenswrapper[4794]: I0310 10:48:01.959391 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfv7l\" (UniqueName: \"kubernetes.io/projected/5a4f7405-509e-4e78-bddb-ecb20b22eb85-kube-api-access-pfv7l\") on node \"crc\" DevicePath \"\"" Mar 10 10:48:01 crc kubenswrapper[4794]: I0310 10:48:01.959434 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a4f7405-509e-4e78-bddb-ecb20b22eb85-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:48:01 crc kubenswrapper[4794]: I0310 10:48:01.959447 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a4f7405-509e-4e78-bddb-ecb20b22eb85-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.322764 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552328-ftx2h" event={"ID":"6836a559-5ecc-4386-be4d-028701d84ee4","Type":"ContainerStarted","Data":"075af582db3b8b3b4357074eda167fa7d12e595ba702d67044ac1d0ebf135246"} Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.327303 4794 generic.go:334] "Generic (PLEG): container finished" podID="5a4f7405-509e-4e78-bddb-ecb20b22eb85" containerID="3fe275ce756cba86ccd1c1ff6172fd2030287b37a6d4149941c0c23289e560bd" exitCode=0 Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.327358 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dt5l9" event={"ID":"5a4f7405-509e-4e78-bddb-ecb20b22eb85","Type":"ContainerDied","Data":"3fe275ce756cba86ccd1c1ff6172fd2030287b37a6d4149941c0c23289e560bd"} Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.327405 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dt5l9" event={"ID":"5a4f7405-509e-4e78-bddb-ecb20b22eb85","Type":"ContainerDied","Data":"27a609a5394d88b45e38b298276bbb28782d045e31ada18d8d9f062fe28edfe8"} Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.327426 4794 scope.go:117] "RemoveContainer" containerID="3fe275ce756cba86ccd1c1ff6172fd2030287b37a6d4149941c0c23289e560bd" Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.327768 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dt5l9" Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.337215 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552328-ftx2h" podStartSLOduration=1.4143668169999999 podStartE2EDuration="2.337201105s" podCreationTimestamp="2026-03-10 10:48:00 +0000 UTC" firstStartedPulling="2026-03-10 10:48:00.971221697 +0000 UTC m=+3829.727392525" lastFinishedPulling="2026-03-10 10:48:01.894055995 +0000 UTC m=+3830.650226813" observedRunningTime="2026-03-10 10:48:02.33639141 +0000 UTC m=+3831.092562228" watchObservedRunningTime="2026-03-10 10:48:02.337201105 +0000 UTC m=+3831.093371913" Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.371514 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dt5l9"] Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.378692 4794 scope.go:117] "RemoveContainer" containerID="1585926bdf29de9afec8f0076ebb325f7375daa366f71abf92c52ae2dbd059cb" Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.383907 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dt5l9"] Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.409474 4794 scope.go:117] "RemoveContainer" containerID="65066d5a5879e2742f9255d534cde6e2016844e2a652ea27453423c7fb6c9c30" Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.444735 4794 scope.go:117] "RemoveContainer" containerID="3fe275ce756cba86ccd1c1ff6172fd2030287b37a6d4149941c0c23289e560bd" Mar 10 10:48:02 crc kubenswrapper[4794]: E0310 10:48:02.445189 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe275ce756cba86ccd1c1ff6172fd2030287b37a6d4149941c0c23289e560bd\": container with ID starting with 3fe275ce756cba86ccd1c1ff6172fd2030287b37a6d4149941c0c23289e560bd not found: ID does not exist" containerID="3fe275ce756cba86ccd1c1ff6172fd2030287b37a6d4149941c0c23289e560bd" Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.445233 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe275ce756cba86ccd1c1ff6172fd2030287b37a6d4149941c0c23289e560bd"} err="failed to get container status \"3fe275ce756cba86ccd1c1ff6172fd2030287b37a6d4149941c0c23289e560bd\": rpc error: code = NotFound desc = could not find container \"3fe275ce756cba86ccd1c1ff6172fd2030287b37a6d4149941c0c23289e560bd\": container with ID starting with 3fe275ce756cba86ccd1c1ff6172fd2030287b37a6d4149941c0c23289e560bd not found: ID does not exist" Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.445258 4794 scope.go:117] "RemoveContainer" containerID="1585926bdf29de9afec8f0076ebb325f7375daa366f71abf92c52ae2dbd059cb" Mar 10 10:48:02 crc kubenswrapper[4794]: E0310 10:48:02.445705 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1585926bdf29de9afec8f0076ebb325f7375daa366f71abf92c52ae2dbd059cb\": container with ID starting with 1585926bdf29de9afec8f0076ebb325f7375daa366f71abf92c52ae2dbd059cb not found: ID does not exist" containerID="1585926bdf29de9afec8f0076ebb325f7375daa366f71abf92c52ae2dbd059cb" Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.446040 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1585926bdf29de9afec8f0076ebb325f7375daa366f71abf92c52ae2dbd059cb"} err="failed to 
get container status \"1585926bdf29de9afec8f0076ebb325f7375daa366f71abf92c52ae2dbd059cb\": rpc error: code = NotFound desc = could not find container \"1585926bdf29de9afec8f0076ebb325f7375daa366f71abf92c52ae2dbd059cb\": container with ID starting with 1585926bdf29de9afec8f0076ebb325f7375daa366f71abf92c52ae2dbd059cb not found: ID does not exist" Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.446184 4794 scope.go:117] "RemoveContainer" containerID="65066d5a5879e2742f9255d534cde6e2016844e2a652ea27453423c7fb6c9c30" Mar 10 10:48:02 crc kubenswrapper[4794]: E0310 10:48:02.446752 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65066d5a5879e2742f9255d534cde6e2016844e2a652ea27453423c7fb6c9c30\": container with ID starting with 65066d5a5879e2742f9255d534cde6e2016844e2a652ea27453423c7fb6c9c30 not found: ID does not exist" containerID="65066d5a5879e2742f9255d534cde6e2016844e2a652ea27453423c7fb6c9c30" Mar 10 10:48:02 crc kubenswrapper[4794]: I0310 10:48:02.446772 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65066d5a5879e2742f9255d534cde6e2016844e2a652ea27453423c7fb6c9c30"} err="failed to get container status \"65066d5a5879e2742f9255d534cde6e2016844e2a652ea27453423c7fb6c9c30\": rpc error: code = NotFound desc = could not find container \"65066d5a5879e2742f9255d534cde6e2016844e2a652ea27453423c7fb6c9c30\": container with ID starting with 65066d5a5879e2742f9255d534cde6e2016844e2a652ea27453423c7fb6c9c30 not found: ID does not exist" Mar 10 10:48:03 crc kubenswrapper[4794]: I0310 10:48:03.340156 4794 generic.go:334] "Generic (PLEG): container finished" podID="6836a559-5ecc-4386-be4d-028701d84ee4" containerID="075af582db3b8b3b4357074eda167fa7d12e595ba702d67044ac1d0ebf135246" exitCode=0 Mar 10 10:48:03 crc kubenswrapper[4794]: I0310 10:48:03.340222 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552328-ftx2h" event={"ID":"6836a559-5ecc-4386-be4d-028701d84ee4","Type":"ContainerDied","Data":"075af582db3b8b3b4357074eda167fa7d12e595ba702d67044ac1d0ebf135246"} Mar 10 10:48:04 crc kubenswrapper[4794]: I0310 10:48:04.015520 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a4f7405-509e-4e78-bddb-ecb20b22eb85" path="/var/lib/kubelet/pods/5a4f7405-509e-4e78-bddb-ecb20b22eb85/volumes" Mar 10 10:48:04 crc kubenswrapper[4794]: I0310 10:48:04.728450 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552328-ftx2h" Mar 10 10:48:04 crc kubenswrapper[4794]: I0310 10:48:04.906319 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jc4j\" (UniqueName: \"kubernetes.io/projected/6836a559-5ecc-4386-be4d-028701d84ee4-kube-api-access-8jc4j\") pod \"6836a559-5ecc-4386-be4d-028701d84ee4\" (UID: \"6836a559-5ecc-4386-be4d-028701d84ee4\") " Mar 10 10:48:04 crc kubenswrapper[4794]: I0310 10:48:04.911245 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6836a559-5ecc-4386-be4d-028701d84ee4-kube-api-access-8jc4j" (OuterVolumeSpecName: "kube-api-access-8jc4j") pod "6836a559-5ecc-4386-be4d-028701d84ee4" (UID: "6836a559-5ecc-4386-be4d-028701d84ee4"). InnerVolumeSpecName "kube-api-access-8jc4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:48:05 crc kubenswrapper[4794]: I0310 10:48:05.008471 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jc4j\" (UniqueName: \"kubernetes.io/projected/6836a559-5ecc-4386-be4d-028701d84ee4-kube-api-access-8jc4j\") on node \"crc\" DevicePath \"\"" Mar 10 10:48:05 crc kubenswrapper[4794]: I0310 10:48:05.099163 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552322-kqw2w"] Mar 10 10:48:05 crc kubenswrapper[4794]: I0310 10:48:05.109539 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552322-kqw2w"] Mar 10 10:48:05 crc kubenswrapper[4794]: I0310 10:48:05.365983 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552328-ftx2h" event={"ID":"6836a559-5ecc-4386-be4d-028701d84ee4","Type":"ContainerDied","Data":"b9cc5fd5a68ae1adb9e0a4caa0aa7beba3089bc660911e10c1d222dadbbf4771"} Mar 10 10:48:05 crc kubenswrapper[4794]: I0310 10:48:05.366043 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9cc5fd5a68ae1adb9e0a4caa0aa7beba3089bc660911e10c1d222dadbbf4771" Mar 10 10:48:05 crc kubenswrapper[4794]: I0310 10:48:05.366106 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552328-ftx2h" Mar 10 10:48:06 crc kubenswrapper[4794]: I0310 10:48:06.010499 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="085121c1-faea-4b9f-8296-a0535e7ee9b6" path="/var/lib/kubelet/pods/085121c1-faea-4b9f-8296-a0535e7ee9b6/volumes" Mar 10 10:48:40 crc kubenswrapper[4794]: I0310 10:48:40.214409 4794 scope.go:117] "RemoveContainer" containerID="f94d1c6b0a6c66a9932fdd7f9ccc3a9455b2016b58ab2716680dcda6949b942e" Mar 10 10:48:52 crc kubenswrapper[4794]: I0310 10:48:52.967621 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:48:52 crc kubenswrapper[4794]: I0310 10:48:52.968203 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:49:22 crc kubenswrapper[4794]: I0310 10:49:22.967476 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:49:22 crc kubenswrapper[4794]: I0310 10:49:22.969090 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:49:52 crc kubenswrapper[4794]: I0310 10:49:52.967596 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 10:49:52 crc kubenswrapper[4794]: I0310 10:49:52.968613 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 10:49:52 crc kubenswrapper[4794]: I0310 10:49:52.968685 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278"
Mar 10 10:49:52 crc kubenswrapper[4794]: I0310 10:49:52.969665 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 10:49:52 crc kubenswrapper[4794]: I0310 10:49:52.969858 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" gracePeriod=600
Mar 10 10:49:53 crc kubenswrapper[4794]: E0310 10:49:53.095542 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 10:49:53 crc kubenswrapper[4794]: I0310 10:49:53.332547 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" exitCode=0
Mar 10 10:49:53 crc kubenswrapper[4794]: I0310 10:49:53.332606 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a"}
Mar 10 10:49:53 crc kubenswrapper[4794]: I0310 10:49:53.332651 4794 scope.go:117] "RemoveContainer" containerID="1dfef3673c4e36ac904ce63dccbdb3ac9fb5dfb060917b9d39017aa8664a4458"
Mar 10 10:49:53 crc kubenswrapper[4794]: I0310 10:49:53.333203 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a"
Mar 10 10:49:53 crc kubenswrapper[4794]: E0310 10:49:53.333508 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
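The run above is the complete liveness-failure path for machine-config-daemon-69278: probe failures 30 seconds apart at 10:48:52, 10:49:22 and 10:49:52, the "SyncLoop (probe)" unhealthy transition, the kill with gracePeriod=600, and then the CrashLoopBackOff errors that repeat for the rest of this log. The "back-off 5m0s" is the kubelet's restart backoff already at its cap. A minimal Go sketch of that schedule, assuming the upstream kubelet defaults of a 10s initial delay doubling per restart (only the 5m cap is actually visible in this log):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const initial = 10 * time.Second // assumed upstream kubelet default, not shown in this log
        const maxDelay = 5 * time.Minute // the "back-off 5m0s" cap that the log does show

        delay := initial
        for i := 1; i <= 8; i++ {
            fmt.Printf("restart %d: wait %v\n", i, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay // from about the sixth restart on, every retry waits the full 5m
            }
        }
    }

Once the cap is reached, each retry produces another RemoveContainer / "Error syncing pod" pair like the ones that follow, until the container either stays up long enough for the kubelet to reset the backoff (upstream documents a roughly ten-minute window) or, as here, keeps failing.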
Mar 10 10:49:54 crc kubenswrapper[4794]: I0310 10:49:54.345494 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:49:54 crc kubenswrapper[4794]: E0310 10:49:54.346199 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.168666 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552330-mhllm"] Mar 10 10:50:00 crc kubenswrapper[4794]: E0310 10:50:00.169455 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6836a559-5ecc-4386-be4d-028701d84ee4" containerName="oc" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.169470 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6836a559-5ecc-4386-be4d-028701d84ee4" containerName="oc" Mar 10 10:50:00 crc kubenswrapper[4794]: E0310 10:50:00.169485 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a4f7405-509e-4e78-bddb-ecb20b22eb85" containerName="registry-server" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.169493 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4f7405-509e-4e78-bddb-ecb20b22eb85" containerName="registry-server" Mar 10 10:50:00 crc kubenswrapper[4794]: E0310 10:50:00.169517 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a4f7405-509e-4e78-bddb-ecb20b22eb85" containerName="extract-utilities" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.169525 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4f7405-509e-4e78-bddb-ecb20b22eb85" containerName="extract-utilities" Mar 10 10:50:00 crc kubenswrapper[4794]: E0310 10:50:00.169537 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a4f7405-509e-4e78-bddb-ecb20b22eb85" containerName="extract-content" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.169545 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4f7405-509e-4e78-bddb-ecb20b22eb85" containerName="extract-content" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.169721 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a4f7405-509e-4e78-bddb-ecb20b22eb85" containerName="registry-server" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.169743 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6836a559-5ecc-4386-be4d-028701d84ee4" containerName="oc" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.170227 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552330-mhllm" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.178656 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.178878 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.179053 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.190772 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552330-mhllm"] Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.208440 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs22h\" (UniqueName: \"kubernetes.io/projected/d021b0b4-f8b7-4a9f-8a38-214bb3b25e21-kube-api-access-zs22h\") pod \"auto-csr-approver-29552330-mhllm\" (UID: \"d021b0b4-f8b7-4a9f-8a38-214bb3b25e21\") " pod="openshift-infra/auto-csr-approver-29552330-mhllm" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.309449 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs22h\" (UniqueName: \"kubernetes.io/projected/d021b0b4-f8b7-4a9f-8a38-214bb3b25e21-kube-api-access-zs22h\") pod \"auto-csr-approver-29552330-mhllm\" (UID: \"d021b0b4-f8b7-4a9f-8a38-214bb3b25e21\") " pod="openshift-infra/auto-csr-approver-29552330-mhllm" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.341184 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs22h\" (UniqueName: \"kubernetes.io/projected/d021b0b4-f8b7-4a9f-8a38-214bb3b25e21-kube-api-access-zs22h\") pod \"auto-csr-approver-29552330-mhllm\" (UID: \"d021b0b4-f8b7-4a9f-8a38-214bb3b25e21\") " pod="openshift-infra/auto-csr-approver-29552330-mhllm" Mar 10 10:50:00 crc kubenswrapper[4794]: I0310 10:50:00.499033 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552330-mhllm" Mar 10 10:50:01 crc kubenswrapper[4794]: I0310 10:50:00.998562 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552330-mhllm"] Mar 10 10:50:01 crc kubenswrapper[4794]: I0310 10:50:01.417349 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552330-mhllm" event={"ID":"d021b0b4-f8b7-4a9f-8a38-214bb3b25e21","Type":"ContainerStarted","Data":"5e5f366b16fbf7c308d4df485d616d42fb3f5c3fe230a3b81eebc26f4f76ee15"} Mar 10 10:50:03 crc kubenswrapper[4794]: I0310 10:50:03.434900 4794 generic.go:334] "Generic (PLEG): container finished" podID="d021b0b4-f8b7-4a9f-8a38-214bb3b25e21" containerID="0bbe115eaa6ea5850ecc3bcd96285e17e568e7d9ca8e693a297e7422b55c7498" exitCode=0 Mar 10 10:50:03 crc kubenswrapper[4794]: I0310 10:50:03.434965 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552330-mhllm" event={"ID":"d021b0b4-f8b7-4a9f-8a38-214bb3b25e21","Type":"ContainerDied","Data":"0bbe115eaa6ea5850ecc3bcd96285e17e568e7d9ca8e693a297e7422b55c7498"} Mar 10 10:50:04 crc kubenswrapper[4794]: I0310 10:50:04.877376 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552330-mhllm" Mar 10 10:50:05 crc kubenswrapper[4794]: I0310 10:50:05.075705 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs22h\" (UniqueName: \"kubernetes.io/projected/d021b0b4-f8b7-4a9f-8a38-214bb3b25e21-kube-api-access-zs22h\") pod \"d021b0b4-f8b7-4a9f-8a38-214bb3b25e21\" (UID: \"d021b0b4-f8b7-4a9f-8a38-214bb3b25e21\") " Mar 10 10:50:05 crc kubenswrapper[4794]: I0310 10:50:05.085258 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d021b0b4-f8b7-4a9f-8a38-214bb3b25e21-kube-api-access-zs22h" (OuterVolumeSpecName: "kube-api-access-zs22h") pod "d021b0b4-f8b7-4a9f-8a38-214bb3b25e21" (UID: "d021b0b4-f8b7-4a9f-8a38-214bb3b25e21"). InnerVolumeSpecName "kube-api-access-zs22h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:50:05 crc kubenswrapper[4794]: I0310 10:50:05.178456 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs22h\" (UniqueName: \"kubernetes.io/projected/d021b0b4-f8b7-4a9f-8a38-214bb3b25e21-kube-api-access-zs22h\") on node \"crc\" DevicePath \"\"" Mar 10 10:50:05 crc kubenswrapper[4794]: I0310 10:50:05.458036 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552330-mhllm" event={"ID":"d021b0b4-f8b7-4a9f-8a38-214bb3b25e21","Type":"ContainerDied","Data":"5e5f366b16fbf7c308d4df485d616d42fb3f5c3fe230a3b81eebc26f4f76ee15"} Mar 10 10:50:05 crc kubenswrapper[4794]: I0310 10:50:05.458096 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e5f366b16fbf7c308d4df485d616d42fb3f5c3fe230a3b81eebc26f4f76ee15" Mar 10 10:50:05 crc kubenswrapper[4794]: I0310 10:50:05.458110 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552330-mhllm" Mar 10 10:50:05 crc kubenswrapper[4794]: I0310 10:50:05.969728 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552324-5j6hl"] Mar 10 10:50:05 crc kubenswrapper[4794]: I0310 10:50:05.981184 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552324-5j6hl"] Mar 10 10:50:05 crc kubenswrapper[4794]: I0310 10:50:05.999219 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:50:06 crc kubenswrapper[4794]: E0310 10:50:05.999968 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:50:06 crc kubenswrapper[4794]: I0310 10:50:06.012924 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6963f98c-6610-4db7-8cd0-3e45300c69b4" path="/var/lib/kubelet/pods/6963f98c-6610-4db7-8cd0-3e45300c69b4/volumes" Mar 10 10:50:19 crc kubenswrapper[4794]: I0310 10:50:19.999084 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:50:20 crc kubenswrapper[4794]: E0310 10:50:20.000011 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:50:32 crc kubenswrapper[4794]: I0310 10:50:32.003180 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:50:32 crc kubenswrapper[4794]: E0310 10:50:32.003897 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:50:40 crc kubenswrapper[4794]: I0310 10:50:40.330500 4794 scope.go:117] "RemoveContainer" containerID="428c5ebeaf331b2a0ad6811af9ba01743719208f77e4c79d101da9c0e61c1d24" Mar 10 10:50:47 crc kubenswrapper[4794]: I0310 10:50:46.999364 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:50:47 crc kubenswrapper[4794]: E0310 10:50:47.000188 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 
10:51:00 crc kubenswrapper[4794]: I0310 10:51:00.998910 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:51:01 crc kubenswrapper[4794]: E0310 10:51:00.999767 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:51:12 crc kubenswrapper[4794]: I0310 10:51:12.028264 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:51:12 crc kubenswrapper[4794]: E0310 10:51:12.030375 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:51:26 crc kubenswrapper[4794]: I0310 10:51:25.999734 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:51:26 crc kubenswrapper[4794]: E0310 10:51:26.000635 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:51:41 crc kubenswrapper[4794]: I0310 10:51:40.999952 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:51:41 crc kubenswrapper[4794]: E0310 10:51:41.001125 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:51:53 crc kubenswrapper[4794]: I0310 10:51:52.999759 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:51:53 crc kubenswrapper[4794]: E0310 10:51:53.000625 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:52:00 crc kubenswrapper[4794]: I0310 10:52:00.161162 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552332-dfttx"] Mar 10 10:52:00 crc 
kubenswrapper[4794]: E0310 10:52:00.162590 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021b0b4-f8b7-4a9f-8a38-214bb3b25e21" containerName="oc" Mar 10 10:52:00 crc kubenswrapper[4794]: I0310 10:52:00.162656 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021b0b4-f8b7-4a9f-8a38-214bb3b25e21" containerName="oc" Mar 10 10:52:00 crc kubenswrapper[4794]: I0310 10:52:00.162943 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d021b0b4-f8b7-4a9f-8a38-214bb3b25e21" containerName="oc" Mar 10 10:52:00 crc kubenswrapper[4794]: I0310 10:52:00.163781 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552332-dfttx" Mar 10 10:52:00 crc kubenswrapper[4794]: I0310 10:52:00.166887 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:52:00 crc kubenswrapper[4794]: I0310 10:52:00.166920 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:52:00 crc kubenswrapper[4794]: I0310 10:52:00.167676 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:52:00 crc kubenswrapper[4794]: I0310 10:52:00.174309 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552332-dfttx"] Mar 10 10:52:00 crc kubenswrapper[4794]: I0310 10:52:00.359108 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf6wd\" (UniqueName: \"kubernetes.io/projected/4f4c5563-e486-475b-a5d2-286cf7acde74-kube-api-access-mf6wd\") pod \"auto-csr-approver-29552332-dfttx\" (UID: \"4f4c5563-e486-475b-a5d2-286cf7acde74\") " pod="openshift-infra/auto-csr-approver-29552332-dfttx" Mar 10 10:52:00 crc kubenswrapper[4794]: I0310 10:52:00.460656 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf6wd\" (UniqueName: \"kubernetes.io/projected/4f4c5563-e486-475b-a5d2-286cf7acde74-kube-api-access-mf6wd\") pod \"auto-csr-approver-29552332-dfttx\" (UID: \"4f4c5563-e486-475b-a5d2-286cf7acde74\") " pod="openshift-infra/auto-csr-approver-29552332-dfttx" Mar 10 10:52:00 crc kubenswrapper[4794]: I0310 10:52:00.499196 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf6wd\" (UniqueName: \"kubernetes.io/projected/4f4c5563-e486-475b-a5d2-286cf7acde74-kube-api-access-mf6wd\") pod \"auto-csr-approver-29552332-dfttx\" (UID: \"4f4c5563-e486-475b-a5d2-286cf7acde74\") " pod="openshift-infra/auto-csr-approver-29552332-dfttx" Mar 10 10:52:00 crc kubenswrapper[4794]: I0310 10:52:00.798382 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552332-dfttx" Mar 10 10:52:01 crc kubenswrapper[4794]: I0310 10:52:01.315067 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552332-dfttx"] Mar 10 10:52:01 crc kubenswrapper[4794]: W0310 10:52:01.320970 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f4c5563_e486_475b_a5d2_286cf7acde74.slice/crio-029b83f0de1ba2493b38eebb4e9b58a2e5ea430e999ca20c15013d672c980bc3 WatchSource:0}: Error finding container 029b83f0de1ba2493b38eebb4e9b58a2e5ea430e999ca20c15013d672c980bc3: Status 404 returned error can't find the container with id 029b83f0de1ba2493b38eebb4e9b58a2e5ea430e999ca20c15013d672c980bc3 Mar 10 10:52:01 crc kubenswrapper[4794]: I0310 10:52:01.323505 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 10:52:01 crc kubenswrapper[4794]: I0310 10:52:01.457757 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552332-dfttx" event={"ID":"4f4c5563-e486-475b-a5d2-286cf7acde74","Type":"ContainerStarted","Data":"029b83f0de1ba2493b38eebb4e9b58a2e5ea430e999ca20c15013d672c980bc3"} Mar 10 10:52:03 crc kubenswrapper[4794]: I0310 10:52:03.476783 4794 generic.go:334] "Generic (PLEG): container finished" podID="4f4c5563-e486-475b-a5d2-286cf7acde74" containerID="c5eae514c508f3b325afe94b722875a933a59e4e6019a98793799f7677a8ead3" exitCode=0 Mar 10 10:52:03 crc kubenswrapper[4794]: I0310 10:52:03.476903 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552332-dfttx" event={"ID":"4f4c5563-e486-475b-a5d2-286cf7acde74","Type":"ContainerDied","Data":"c5eae514c508f3b325afe94b722875a933a59e4e6019a98793799f7677a8ead3"} Mar 10 10:52:03 crc kubenswrapper[4794]: I0310 10:52:03.998860 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:52:03 crc kubenswrapper[4794]: E0310 10:52:03.999075 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:52:04 crc kubenswrapper[4794]: I0310 10:52:04.775295 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552332-dfttx" Mar 10 10:52:04 crc kubenswrapper[4794]: I0310 10:52:04.822828 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf6wd\" (UniqueName: \"kubernetes.io/projected/4f4c5563-e486-475b-a5d2-286cf7acde74-kube-api-access-mf6wd\") pod \"4f4c5563-e486-475b-a5d2-286cf7acde74\" (UID: \"4f4c5563-e486-475b-a5d2-286cf7acde74\") " Mar 10 10:52:04 crc kubenswrapper[4794]: I0310 10:52:04.828030 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f4c5563-e486-475b-a5d2-286cf7acde74-kube-api-access-mf6wd" (OuterVolumeSpecName: "kube-api-access-mf6wd") pod "4f4c5563-e486-475b-a5d2-286cf7acde74" (UID: "4f4c5563-e486-475b-a5d2-286cf7acde74"). InnerVolumeSpecName "kube-api-access-mf6wd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:52:04 crc kubenswrapper[4794]: I0310 10:52:04.924455 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf6wd\" (UniqueName: \"kubernetes.io/projected/4f4c5563-e486-475b-a5d2-286cf7acde74-kube-api-access-mf6wd\") on node \"crc\" DevicePath \"\"" Mar 10 10:52:05 crc kubenswrapper[4794]: I0310 10:52:05.498375 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552332-dfttx" event={"ID":"4f4c5563-e486-475b-a5d2-286cf7acde74","Type":"ContainerDied","Data":"029b83f0de1ba2493b38eebb4e9b58a2e5ea430e999ca20c15013d672c980bc3"} Mar 10 10:52:05 crc kubenswrapper[4794]: I0310 10:52:05.498429 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552332-dfttx" Mar 10 10:52:05 crc kubenswrapper[4794]: I0310 10:52:05.498443 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="029b83f0de1ba2493b38eebb4e9b58a2e5ea430e999ca20c15013d672c980bc3" Mar 10 10:52:05 crc kubenswrapper[4794]: I0310 10:52:05.848822 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552326-8cbdt"] Mar 10 10:52:05 crc kubenswrapper[4794]: I0310 10:52:05.859360 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552326-8cbdt"] Mar 10 10:52:06 crc kubenswrapper[4794]: I0310 10:52:06.009801 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9" path="/var/lib/kubelet/pods/f7dc16bb-b8cc-4252-a4c3-5d0b3d3542e9/volumes" Mar 10 10:52:14 crc kubenswrapper[4794]: I0310 10:52:14.999107 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:52:15 crc kubenswrapper[4794]: E0310 10:52:15.000187 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:52:27 crc kubenswrapper[4794]: I0310 10:52:26.999405 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:52:27 crc kubenswrapper[4794]: E0310 10:52:27.000748 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:52:39 crc kubenswrapper[4794]: I0310 10:52:38.999576 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:52:39 crc kubenswrapper[4794]: E0310 10:52:39.000498 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:52:40 crc kubenswrapper[4794]: I0310 10:52:40.482617 4794 scope.go:117] "RemoveContainer" containerID="217355a90848adc334ea39e860a5d04940bbef32280925df256a99d0792e14c3" Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.471041 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qrnhh"] Mar 10 10:52:48 crc kubenswrapper[4794]: E0310 10:52:48.472565 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4c5563-e486-475b-a5d2-286cf7acde74" containerName="oc" Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.472589 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4c5563-e486-475b-a5d2-286cf7acde74" containerName="oc" Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.472817 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4c5563-e486-475b-a5d2-286cf7acde74" containerName="oc" Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.474539 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.490615 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrnhh"] Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.496132 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abf3070-907c-4615-8a54-3f4932d76475-catalog-content\") pod \"redhat-operators-qrnhh\" (UID: \"4abf3070-907c-4615-8a54-3f4932d76475\") " pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.496267 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abf3070-907c-4615-8a54-3f4932d76475-utilities\") pod \"redhat-operators-qrnhh\" (UID: \"4abf3070-907c-4615-8a54-3f4932d76475\") " pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.496464 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm77k\" (UniqueName: \"kubernetes.io/projected/4abf3070-907c-4615-8a54-3f4932d76475-kube-api-access-bm77k\") pod \"redhat-operators-qrnhh\" (UID: \"4abf3070-907c-4615-8a54-3f4932d76475\") " pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.597092 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abf3070-907c-4615-8a54-3f4932d76475-catalog-content\") pod \"redhat-operators-qrnhh\" (UID: \"4abf3070-907c-4615-8a54-3f4932d76475\") " pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.597436 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abf3070-907c-4615-8a54-3f4932d76475-utilities\") pod \"redhat-operators-qrnhh\" (UID: \"4abf3070-907c-4615-8a54-3f4932d76475\") " pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.597598 
4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm77k\" (UniqueName: \"kubernetes.io/projected/4abf3070-907c-4615-8a54-3f4932d76475-kube-api-access-bm77k\") pod \"redhat-operators-qrnhh\" (UID: \"4abf3070-907c-4615-8a54-3f4932d76475\") " pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.597993 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abf3070-907c-4615-8a54-3f4932d76475-catalog-content\") pod \"redhat-operators-qrnhh\" (UID: \"4abf3070-907c-4615-8a54-3f4932d76475\") " pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.598001 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abf3070-907c-4615-8a54-3f4932d76475-utilities\") pod \"redhat-operators-qrnhh\" (UID: \"4abf3070-907c-4615-8a54-3f4932d76475\") " pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.624321 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm77k\" (UniqueName: \"kubernetes.io/projected/4abf3070-907c-4615-8a54-3f4932d76475-kube-api-access-bm77k\") pod \"redhat-operators-qrnhh\" (UID: \"4abf3070-907c-4615-8a54-3f4932d76475\") " pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:52:48 crc kubenswrapper[4794]: I0310 10:52:48.806778 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:52:49 crc kubenswrapper[4794]: I0310 10:52:49.296375 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrnhh"] Mar 10 10:52:49 crc kubenswrapper[4794]: I0310 10:52:49.931625 4794 generic.go:334] "Generic (PLEG): container finished" podID="4abf3070-907c-4615-8a54-3f4932d76475" containerID="00e8b83c5f8a3f703f278027232ad09955e28d6f5776a292e6bb2348beae9e52" exitCode=0 Mar 10 10:52:49 crc kubenswrapper[4794]: I0310 10:52:49.931851 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrnhh" event={"ID":"4abf3070-907c-4615-8a54-3f4932d76475","Type":"ContainerDied","Data":"00e8b83c5f8a3f703f278027232ad09955e28d6f5776a292e6bb2348beae9e52"} Mar 10 10:52:49 crc kubenswrapper[4794]: I0310 10:52:49.931992 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrnhh" event={"ID":"4abf3070-907c-4615-8a54-3f4932d76475","Type":"ContainerStarted","Data":"4be0c1a1e828f7589bc83de8ef842971b238b5d971400f19c90516b787ec9f0d"} Mar 10 10:52:49 crc kubenswrapper[4794]: I0310 10:52:49.998746 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:52:49 crc kubenswrapper[4794]: E0310 10:52:49.998950 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:52:50 crc kubenswrapper[4794]: I0310 10:52:50.949203 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-qrnhh" event={"ID":"4abf3070-907c-4615-8a54-3f4932d76475","Type":"ContainerStarted","Data":"c8f972f70d863d2f450ad17717bae97aa79712538ee3a2e592ad07e848cb40ef"} Mar 10 10:52:51 crc kubenswrapper[4794]: I0310 10:52:51.961471 4794 generic.go:334] "Generic (PLEG): container finished" podID="4abf3070-907c-4615-8a54-3f4932d76475" containerID="c8f972f70d863d2f450ad17717bae97aa79712538ee3a2e592ad07e848cb40ef" exitCode=0 Mar 10 10:52:51 crc kubenswrapper[4794]: I0310 10:52:51.961521 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrnhh" event={"ID":"4abf3070-907c-4615-8a54-3f4932d76475","Type":"ContainerDied","Data":"c8f972f70d863d2f450ad17717bae97aa79712538ee3a2e592ad07e848cb40ef"} Mar 10 10:52:52 crc kubenswrapper[4794]: I0310 10:52:52.976630 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrnhh" event={"ID":"4abf3070-907c-4615-8a54-3f4932d76475","Type":"ContainerStarted","Data":"d03704463a824e06b9a25190074e7fcea306fc7be57ba05b896e5732ffe8165d"} Mar 10 10:52:58 crc kubenswrapper[4794]: I0310 10:52:58.807297 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:52:58 crc kubenswrapper[4794]: I0310 10:52:58.808079 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:52:59 crc kubenswrapper[4794]: I0310 10:52:59.895694 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qrnhh" podUID="4abf3070-907c-4615-8a54-3f4932d76475" containerName="registry-server" probeResult="failure" output=< Mar 10 10:52:59 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 10:52:59 crc kubenswrapper[4794]: > Mar 10 10:53:03 crc kubenswrapper[4794]: I0310 10:53:03.000047 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:53:03 crc kubenswrapper[4794]: E0310 10:53:03.000733 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:53:08 crc kubenswrapper[4794]: I0310 10:53:08.894307 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:53:08 crc kubenswrapper[4794]: I0310 10:53:08.919279 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qrnhh" podStartSLOduration=18.392593638 podStartE2EDuration="20.919243452s" podCreationTimestamp="2026-03-10 10:52:48 +0000 UTC" firstStartedPulling="2026-03-10 10:52:49.93395443 +0000 UTC m=+4118.690125248" lastFinishedPulling="2026-03-10 10:52:52.460604214 +0000 UTC m=+4121.216775062" observedRunningTime="2026-03-10 10:52:53.003046547 +0000 UTC m=+4121.759217375" watchObservedRunningTime="2026-03-10 10:53:08.919243452 +0000 UTC m=+4137.675414320" Mar 10 10:53:08 crc kubenswrapper[4794]: I0310 10:53:08.978257 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:53:09 crc kubenswrapper[4794]: I0310 10:53:09.137107 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrnhh"] Mar 10 10:53:10 crc kubenswrapper[4794]: I0310 10:53:10.138954 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qrnhh" podUID="4abf3070-907c-4615-8a54-3f4932d76475" containerName="registry-server" containerID="cri-o://d03704463a824e06b9a25190074e7fcea306fc7be57ba05b896e5732ffe8165d" gracePeriod=2 Mar 10 10:53:10 crc kubenswrapper[4794]: I0310 10:53:10.602134 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:53:10 crc kubenswrapper[4794]: I0310 10:53:10.751410 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm77k\" (UniqueName: \"kubernetes.io/projected/4abf3070-907c-4615-8a54-3f4932d76475-kube-api-access-bm77k\") pod \"4abf3070-907c-4615-8a54-3f4932d76475\" (UID: \"4abf3070-907c-4615-8a54-3f4932d76475\") " Mar 10 10:53:10 crc kubenswrapper[4794]: I0310 10:53:10.751625 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abf3070-907c-4615-8a54-3f4932d76475-catalog-content\") pod \"4abf3070-907c-4615-8a54-3f4932d76475\" (UID: \"4abf3070-907c-4615-8a54-3f4932d76475\") " Mar 10 10:53:10 crc kubenswrapper[4794]: I0310 10:53:10.751724 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abf3070-907c-4615-8a54-3f4932d76475-utilities\") pod \"4abf3070-907c-4615-8a54-3f4932d76475\" (UID: \"4abf3070-907c-4615-8a54-3f4932d76475\") " Mar 10 10:53:10 crc kubenswrapper[4794]: I0310 10:53:10.753660 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4abf3070-907c-4615-8a54-3f4932d76475-utilities" (OuterVolumeSpecName: "utilities") pod "4abf3070-907c-4615-8a54-3f4932d76475" (UID: "4abf3070-907c-4615-8a54-3f4932d76475"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:53:10 crc kubenswrapper[4794]: I0310 10:53:10.764554 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4abf3070-907c-4615-8a54-3f4932d76475-kube-api-access-bm77k" (OuterVolumeSpecName: "kube-api-access-bm77k") pod "4abf3070-907c-4615-8a54-3f4932d76475" (UID: "4abf3070-907c-4615-8a54-3f4932d76475"). InnerVolumeSpecName "kube-api-access-bm77k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:53:10 crc kubenswrapper[4794]: I0310 10:53:10.853974 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm77k\" (UniqueName: \"kubernetes.io/projected/4abf3070-907c-4615-8a54-3f4932d76475-kube-api-access-bm77k\") on node \"crc\" DevicePath \"\"" Mar 10 10:53:10 crc kubenswrapper[4794]: I0310 10:53:10.854024 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abf3070-907c-4615-8a54-3f4932d76475-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:53:10 crc kubenswrapper[4794]: I0310 10:53:10.911958 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4abf3070-907c-4615-8a54-3f4932d76475-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4abf3070-907c-4615-8a54-3f4932d76475" (UID: "4abf3070-907c-4615-8a54-3f4932d76475"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:53:10 crc kubenswrapper[4794]: I0310 10:53:10.955940 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abf3070-907c-4615-8a54-3f4932d76475-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.153391 4794 generic.go:334] "Generic (PLEG): container finished" podID="4abf3070-907c-4615-8a54-3f4932d76475" containerID="d03704463a824e06b9a25190074e7fcea306fc7be57ba05b896e5732ffe8165d" exitCode=0 Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.153436 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrnhh" event={"ID":"4abf3070-907c-4615-8a54-3f4932d76475","Type":"ContainerDied","Data":"d03704463a824e06b9a25190074e7fcea306fc7be57ba05b896e5732ffe8165d"} Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.153466 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrnhh" event={"ID":"4abf3070-907c-4615-8a54-3f4932d76475","Type":"ContainerDied","Data":"4be0c1a1e828f7589bc83de8ef842971b238b5d971400f19c90516b787ec9f0d"} Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.153489 4794 scope.go:117] "RemoveContainer" containerID="d03704463a824e06b9a25190074e7fcea306fc7be57ba05b896e5732ffe8165d" Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.153541 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qrnhh" Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.196296 4794 scope.go:117] "RemoveContainer" containerID="c8f972f70d863d2f450ad17717bae97aa79712538ee3a2e592ad07e848cb40ef" Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.197974 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrnhh"] Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.209279 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qrnhh"] Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.215217 4794 scope.go:117] "RemoveContainer" containerID="00e8b83c5f8a3f703f278027232ad09955e28d6f5776a292e6bb2348beae9e52" Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.261575 4794 scope.go:117] "RemoveContainer" containerID="d03704463a824e06b9a25190074e7fcea306fc7be57ba05b896e5732ffe8165d" Mar 10 10:53:11 crc kubenswrapper[4794]: E0310 10:53:11.262012 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d03704463a824e06b9a25190074e7fcea306fc7be57ba05b896e5732ffe8165d\": container with ID starting with d03704463a824e06b9a25190074e7fcea306fc7be57ba05b896e5732ffe8165d not found: ID does not exist" containerID="d03704463a824e06b9a25190074e7fcea306fc7be57ba05b896e5732ffe8165d" Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.262049 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d03704463a824e06b9a25190074e7fcea306fc7be57ba05b896e5732ffe8165d"} err="failed to get container status \"d03704463a824e06b9a25190074e7fcea306fc7be57ba05b896e5732ffe8165d\": rpc error: code = NotFound desc = could not find container \"d03704463a824e06b9a25190074e7fcea306fc7be57ba05b896e5732ffe8165d\": container with ID starting with d03704463a824e06b9a25190074e7fcea306fc7be57ba05b896e5732ffe8165d not found: ID does not exist" Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.262074 4794 scope.go:117] "RemoveContainer" containerID="c8f972f70d863d2f450ad17717bae97aa79712538ee3a2e592ad07e848cb40ef" Mar 10 10:53:11 crc kubenswrapper[4794]: E0310 10:53:11.262446 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8f972f70d863d2f450ad17717bae97aa79712538ee3a2e592ad07e848cb40ef\": container with ID starting with c8f972f70d863d2f450ad17717bae97aa79712538ee3a2e592ad07e848cb40ef not found: ID does not exist" containerID="c8f972f70d863d2f450ad17717bae97aa79712538ee3a2e592ad07e848cb40ef" Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.262469 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f972f70d863d2f450ad17717bae97aa79712538ee3a2e592ad07e848cb40ef"} err="failed to get container status \"c8f972f70d863d2f450ad17717bae97aa79712538ee3a2e592ad07e848cb40ef\": rpc error: code = NotFound desc = could not find container \"c8f972f70d863d2f450ad17717bae97aa79712538ee3a2e592ad07e848cb40ef\": container with ID starting with c8f972f70d863d2f450ad17717bae97aa79712538ee3a2e592ad07e848cb40ef not found: ID does not exist" Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.262486 4794 scope.go:117] "RemoveContainer" containerID="00e8b83c5f8a3f703f278027232ad09955e28d6f5776a292e6bb2348beae9e52" Mar 10 10:53:11 crc kubenswrapper[4794]: E0310 10:53:11.263273 4794 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"00e8b83c5f8a3f703f278027232ad09955e28d6f5776a292e6bb2348beae9e52\": container with ID starting with 00e8b83c5f8a3f703f278027232ad09955e28d6f5776a292e6bb2348beae9e52 not found: ID does not exist" containerID="00e8b83c5f8a3f703f278027232ad09955e28d6f5776a292e6bb2348beae9e52" Mar 10 10:53:11 crc kubenswrapper[4794]: I0310 10:53:11.263383 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e8b83c5f8a3f703f278027232ad09955e28d6f5776a292e6bb2348beae9e52"} err="failed to get container status \"00e8b83c5f8a3f703f278027232ad09955e28d6f5776a292e6bb2348beae9e52\": rpc error: code = NotFound desc = could not find container \"00e8b83c5f8a3f703f278027232ad09955e28d6f5776a292e6bb2348beae9e52\": container with ID starting with 00e8b83c5f8a3f703f278027232ad09955e28d6f5776a292e6bb2348beae9e52 not found: ID does not exist" Mar 10 10:53:12 crc kubenswrapper[4794]: I0310 10:53:12.025425 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4abf3070-907c-4615-8a54-3f4932d76475" path="/var/lib/kubelet/pods/4abf3070-907c-4615-8a54-3f4932d76475/volumes" Mar 10 10:53:17 crc kubenswrapper[4794]: I0310 10:53:17.998972 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:53:18 crc kubenswrapper[4794]: E0310 10:53:17.999916 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:53:30 crc kubenswrapper[4794]: I0310 10:53:29.999710 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:53:30 crc kubenswrapper[4794]: E0310 10:53:30.001082 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:53:42 crc kubenswrapper[4794]: I0310 10:53:42.009207 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:53:42 crc kubenswrapper[4794]: E0310 10:53:42.010254 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:53:47 crc kubenswrapper[4794]: I0310 10:53:47.940492 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bqxmm"] Mar 10 10:53:47 crc kubenswrapper[4794]: E0310 10:53:47.941618 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abf3070-907c-4615-8a54-3f4932d76475" 
containerName="registry-server" Mar 10 10:53:47 crc kubenswrapper[4794]: I0310 10:53:47.941648 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abf3070-907c-4615-8a54-3f4932d76475" containerName="registry-server" Mar 10 10:53:47 crc kubenswrapper[4794]: E0310 10:53:47.941689 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abf3070-907c-4615-8a54-3f4932d76475" containerName="extract-content" Mar 10 10:53:47 crc kubenswrapper[4794]: I0310 10:53:47.941707 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abf3070-907c-4615-8a54-3f4932d76475" containerName="extract-content" Mar 10 10:53:47 crc kubenswrapper[4794]: E0310 10:53:47.941764 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abf3070-907c-4615-8a54-3f4932d76475" containerName="extract-utilities" Mar 10 10:53:47 crc kubenswrapper[4794]: I0310 10:53:47.941781 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abf3070-907c-4615-8a54-3f4932d76475" containerName="extract-utilities" Mar 10 10:53:47 crc kubenswrapper[4794]: I0310 10:53:47.942105 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4abf3070-907c-4615-8a54-3f4932d76475" containerName="registry-server" Mar 10 10:53:47 crc kubenswrapper[4794]: I0310 10:53:47.944735 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:47 crc kubenswrapper[4794]: I0310 10:53:47.947377 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqxmm"] Mar 10 10:53:47 crc kubenswrapper[4794]: I0310 10:53:47.963755 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f40ee78-1d3b-479b-8826-46867c3b5f71-utilities\") pod \"redhat-marketplace-bqxmm\" (UID: \"4f40ee78-1d3b-479b-8826-46867c3b5f71\") " pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:47 crc kubenswrapper[4794]: I0310 10:53:47.964220 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkmp7\" (UniqueName: \"kubernetes.io/projected/4f40ee78-1d3b-479b-8826-46867c3b5f71-kube-api-access-lkmp7\") pod \"redhat-marketplace-bqxmm\" (UID: \"4f40ee78-1d3b-479b-8826-46867c3b5f71\") " pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:47 crc kubenswrapper[4794]: I0310 10:53:47.964474 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f40ee78-1d3b-479b-8826-46867c3b5f71-catalog-content\") pod \"redhat-marketplace-bqxmm\" (UID: \"4f40ee78-1d3b-479b-8826-46867c3b5f71\") " pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:48 crc kubenswrapper[4794]: I0310 10:53:48.065397 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkmp7\" (UniqueName: \"kubernetes.io/projected/4f40ee78-1d3b-479b-8826-46867c3b5f71-kube-api-access-lkmp7\") pod \"redhat-marketplace-bqxmm\" (UID: \"4f40ee78-1d3b-479b-8826-46867c3b5f71\") " pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:48 crc kubenswrapper[4794]: I0310 10:53:48.065487 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f40ee78-1d3b-479b-8826-46867c3b5f71-catalog-content\") pod \"redhat-marketplace-bqxmm\" (UID: 
\"4f40ee78-1d3b-479b-8826-46867c3b5f71\") " pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:48 crc kubenswrapper[4794]: I0310 10:53:48.065572 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f40ee78-1d3b-479b-8826-46867c3b5f71-utilities\") pod \"redhat-marketplace-bqxmm\" (UID: \"4f40ee78-1d3b-479b-8826-46867c3b5f71\") " pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:48 crc kubenswrapper[4794]: I0310 10:53:48.066316 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f40ee78-1d3b-479b-8826-46867c3b5f71-utilities\") pod \"redhat-marketplace-bqxmm\" (UID: \"4f40ee78-1d3b-479b-8826-46867c3b5f71\") " pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:48 crc kubenswrapper[4794]: I0310 10:53:48.066370 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f40ee78-1d3b-479b-8826-46867c3b5f71-catalog-content\") pod \"redhat-marketplace-bqxmm\" (UID: \"4f40ee78-1d3b-479b-8826-46867c3b5f71\") " pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:48 crc kubenswrapper[4794]: I0310 10:53:48.100802 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkmp7\" (UniqueName: \"kubernetes.io/projected/4f40ee78-1d3b-479b-8826-46867c3b5f71-kube-api-access-lkmp7\") pod \"redhat-marketplace-bqxmm\" (UID: \"4f40ee78-1d3b-479b-8826-46867c3b5f71\") " pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:48 crc kubenswrapper[4794]: I0310 10:53:48.290918 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:48 crc kubenswrapper[4794]: I0310 10:53:48.763921 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqxmm"] Mar 10 10:53:49 crc kubenswrapper[4794]: I0310 10:53:49.495834 4794 generic.go:334] "Generic (PLEG): container finished" podID="4f40ee78-1d3b-479b-8826-46867c3b5f71" containerID="a80e71237cb44f6946187bc49ec49b9a4574633f8bd0dcb8b4e9dc8278575305" exitCode=0 Mar 10 10:53:49 crc kubenswrapper[4794]: I0310 10:53:49.496091 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqxmm" event={"ID":"4f40ee78-1d3b-479b-8826-46867c3b5f71","Type":"ContainerDied","Data":"a80e71237cb44f6946187bc49ec49b9a4574633f8bd0dcb8b4e9dc8278575305"} Mar 10 10:53:49 crc kubenswrapper[4794]: I0310 10:53:49.496360 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqxmm" event={"ID":"4f40ee78-1d3b-479b-8826-46867c3b5f71","Type":"ContainerStarted","Data":"5522e7e5c472c4120b80911b8c7eb19b8d6efd250951ad8ff66f5d853bfd0232"} Mar 10 10:53:51 crc kubenswrapper[4794]: I0310 10:53:51.515859 4794 generic.go:334] "Generic (PLEG): container finished" podID="4f40ee78-1d3b-479b-8826-46867c3b5f71" containerID="8be1bade6dac799a949d3aded42dc5be05b6e5259a9a8962bb557c3137cfb38c" exitCode=0 Mar 10 10:53:51 crc kubenswrapper[4794]: I0310 10:53:51.515958 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqxmm" event={"ID":"4f40ee78-1d3b-479b-8826-46867c3b5f71","Type":"ContainerDied","Data":"8be1bade6dac799a949d3aded42dc5be05b6e5259a9a8962bb557c3137cfb38c"} Mar 10 10:53:52 crc kubenswrapper[4794]: I0310 10:53:52.999533 4794 
scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:53:53 crc kubenswrapper[4794]: E0310 10:53:53.000155 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:53:53 crc kubenswrapper[4794]: I0310 10:53:53.539445 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqxmm" event={"ID":"4f40ee78-1d3b-479b-8826-46867c3b5f71","Type":"ContainerStarted","Data":"4be857e84981f3302c9277287f7d63bce6617c538d5432590ad3aa8f3808fc71"} Mar 10 10:53:53 crc kubenswrapper[4794]: I0310 10:53:53.572631 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bqxmm" podStartSLOduration=4.13915928 podStartE2EDuration="6.572610381s" podCreationTimestamp="2026-03-10 10:53:47 +0000 UTC" firstStartedPulling="2026-03-10 10:53:49.498702534 +0000 UTC m=+4178.254873362" lastFinishedPulling="2026-03-10 10:53:51.932153645 +0000 UTC m=+4180.688324463" observedRunningTime="2026-03-10 10:53:53.564712126 +0000 UTC m=+4182.320882954" watchObservedRunningTime="2026-03-10 10:53:53.572610381 +0000 UTC m=+4182.328781209" Mar 10 10:53:58 crc kubenswrapper[4794]: I0310 10:53:58.291132 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:58 crc kubenswrapper[4794]: I0310 10:53:58.292114 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:58 crc kubenswrapper[4794]: I0310 10:53:58.358355 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:58 crc kubenswrapper[4794]: I0310 10:53:58.616575 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:53:58 crc kubenswrapper[4794]: I0310 10:53:58.666954 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqxmm"] Mar 10 10:54:00 crc kubenswrapper[4794]: I0310 10:54:00.153143 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552334-4r886"] Mar 10 10:54:00 crc kubenswrapper[4794]: I0310 10:54:00.154070 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552334-4r886" Mar 10 10:54:00 crc kubenswrapper[4794]: I0310 10:54:00.157244 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:54:00 crc kubenswrapper[4794]: I0310 10:54:00.161830 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:54:00 crc kubenswrapper[4794]: I0310 10:54:00.162383 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:54:00 crc kubenswrapper[4794]: I0310 10:54:00.188914 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552334-4r886"] Mar 10 10:54:00 crc kubenswrapper[4794]: I0310 10:54:00.259171 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8xzg\" (UniqueName: \"kubernetes.io/projected/6485889d-9687-4a87-a93d-0c7182ef14d4-kube-api-access-f8xzg\") pod \"auto-csr-approver-29552334-4r886\" (UID: \"6485889d-9687-4a87-a93d-0c7182ef14d4\") " pod="openshift-infra/auto-csr-approver-29552334-4r886" Mar 10 10:54:00 crc kubenswrapper[4794]: I0310 10:54:00.361788 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8xzg\" (UniqueName: \"kubernetes.io/projected/6485889d-9687-4a87-a93d-0c7182ef14d4-kube-api-access-f8xzg\") pod \"auto-csr-approver-29552334-4r886\" (UID: \"6485889d-9687-4a87-a93d-0c7182ef14d4\") " pod="openshift-infra/auto-csr-approver-29552334-4r886" Mar 10 10:54:00 crc kubenswrapper[4794]: I0310 10:54:00.399599 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8xzg\" (UniqueName: \"kubernetes.io/projected/6485889d-9687-4a87-a93d-0c7182ef14d4-kube-api-access-f8xzg\") pod \"auto-csr-approver-29552334-4r886\" (UID: \"6485889d-9687-4a87-a93d-0c7182ef14d4\") " pod="openshift-infra/auto-csr-approver-29552334-4r886" Mar 10 10:54:00 crc kubenswrapper[4794]: I0310 10:54:00.495830 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552334-4r886" Mar 10 10:54:00 crc kubenswrapper[4794]: I0310 10:54:00.600321 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bqxmm" podUID="4f40ee78-1d3b-479b-8826-46867c3b5f71" containerName="registry-server" containerID="cri-o://4be857e84981f3302c9277287f7d63bce6617c538d5432590ad3aa8f3808fc71" gracePeriod=2 Mar 10 10:54:00 crc kubenswrapper[4794]: I0310 10:54:00.781320 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552334-4r886"] Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.021103 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.174091 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f40ee78-1d3b-479b-8826-46867c3b5f71-catalog-content\") pod \"4f40ee78-1d3b-479b-8826-46867c3b5f71\" (UID: \"4f40ee78-1d3b-479b-8826-46867c3b5f71\") " Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.174222 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f40ee78-1d3b-479b-8826-46867c3b5f71-utilities\") pod \"4f40ee78-1d3b-479b-8826-46867c3b5f71\" (UID: \"4f40ee78-1d3b-479b-8826-46867c3b5f71\") " Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.174365 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkmp7\" (UniqueName: \"kubernetes.io/projected/4f40ee78-1d3b-479b-8826-46867c3b5f71-kube-api-access-lkmp7\") pod \"4f40ee78-1d3b-479b-8826-46867c3b5f71\" (UID: \"4f40ee78-1d3b-479b-8826-46867c3b5f71\") " Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.175000 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f40ee78-1d3b-479b-8826-46867c3b5f71-utilities" (OuterVolumeSpecName: "utilities") pod "4f40ee78-1d3b-479b-8826-46867c3b5f71" (UID: "4f40ee78-1d3b-479b-8826-46867c3b5f71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.175983 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f40ee78-1d3b-479b-8826-46867c3b5f71-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.181755 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f40ee78-1d3b-479b-8826-46867c3b5f71-kube-api-access-lkmp7" (OuterVolumeSpecName: "kube-api-access-lkmp7") pod "4f40ee78-1d3b-479b-8826-46867c3b5f71" (UID: "4f40ee78-1d3b-479b-8826-46867c3b5f71"). InnerVolumeSpecName "kube-api-access-lkmp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.222494 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f40ee78-1d3b-479b-8826-46867c3b5f71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f40ee78-1d3b-479b-8826-46867c3b5f71" (UID: "4f40ee78-1d3b-479b-8826-46867c3b5f71"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.277526 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f40ee78-1d3b-479b-8826-46867c3b5f71-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.277589 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkmp7\" (UniqueName: \"kubernetes.io/projected/4f40ee78-1d3b-479b-8826-46867c3b5f71-kube-api-access-lkmp7\") on node \"crc\" DevicePath \"\"" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.609845 4794 generic.go:334] "Generic (PLEG): container finished" podID="4f40ee78-1d3b-479b-8826-46867c3b5f71" containerID="4be857e84981f3302c9277287f7d63bce6617c538d5432590ad3aa8f3808fc71" exitCode=0 Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.609943 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqxmm" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.610003 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqxmm" event={"ID":"4f40ee78-1d3b-479b-8826-46867c3b5f71","Type":"ContainerDied","Data":"4be857e84981f3302c9277287f7d63bce6617c538d5432590ad3aa8f3808fc71"} Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.610308 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqxmm" event={"ID":"4f40ee78-1d3b-479b-8826-46867c3b5f71","Type":"ContainerDied","Data":"5522e7e5c472c4120b80911b8c7eb19b8d6efd250951ad8ff66f5d853bfd0232"} Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.610385 4794 scope.go:117] "RemoveContainer" containerID="4be857e84981f3302c9277287f7d63bce6617c538d5432590ad3aa8f3808fc71" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.612159 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552334-4r886" event={"ID":"6485889d-9687-4a87-a93d-0c7182ef14d4","Type":"ContainerStarted","Data":"372c3033e182fb8ff59e3d32957f3f5e122cbc20e427a4f4b4ccb1b736063ba5"} Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.640909 4794 scope.go:117] "RemoveContainer" containerID="8be1bade6dac799a949d3aded42dc5be05b6e5259a9a8962bb557c3137cfb38c" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.688416 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqxmm"] Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.736418 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqxmm"] Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.739488 4794 scope.go:117] "RemoveContainer" containerID="a80e71237cb44f6946187bc49ec49b9a4574633f8bd0dcb8b4e9dc8278575305" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.779492 4794 scope.go:117] "RemoveContainer" containerID="4be857e84981f3302c9277287f7d63bce6617c538d5432590ad3aa8f3808fc71" Mar 10 10:54:01 crc kubenswrapper[4794]: E0310 10:54:01.786981 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4be857e84981f3302c9277287f7d63bce6617c538d5432590ad3aa8f3808fc71\": container with ID starting with 4be857e84981f3302c9277287f7d63bce6617c538d5432590ad3aa8f3808fc71 not found: ID does not exist" containerID="4be857e84981f3302c9277287f7d63bce6617c538d5432590ad3aa8f3808fc71" 
Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.787030 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be857e84981f3302c9277287f7d63bce6617c538d5432590ad3aa8f3808fc71"} err="failed to get container status \"4be857e84981f3302c9277287f7d63bce6617c538d5432590ad3aa8f3808fc71\": rpc error: code = NotFound desc = could not find container \"4be857e84981f3302c9277287f7d63bce6617c538d5432590ad3aa8f3808fc71\": container with ID starting with 4be857e84981f3302c9277287f7d63bce6617c538d5432590ad3aa8f3808fc71 not found: ID does not exist" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.787060 4794 scope.go:117] "RemoveContainer" containerID="8be1bade6dac799a949d3aded42dc5be05b6e5259a9a8962bb557c3137cfb38c" Mar 10 10:54:01 crc kubenswrapper[4794]: E0310 10:54:01.789752 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be1bade6dac799a949d3aded42dc5be05b6e5259a9a8962bb557c3137cfb38c\": container with ID starting with 8be1bade6dac799a949d3aded42dc5be05b6e5259a9a8962bb557c3137cfb38c not found: ID does not exist" containerID="8be1bade6dac799a949d3aded42dc5be05b6e5259a9a8962bb557c3137cfb38c" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.789785 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be1bade6dac799a949d3aded42dc5be05b6e5259a9a8962bb557c3137cfb38c"} err="failed to get container status \"8be1bade6dac799a949d3aded42dc5be05b6e5259a9a8962bb557c3137cfb38c\": rpc error: code = NotFound desc = could not find container \"8be1bade6dac799a949d3aded42dc5be05b6e5259a9a8962bb557c3137cfb38c\": container with ID starting with 8be1bade6dac799a949d3aded42dc5be05b6e5259a9a8962bb557c3137cfb38c not found: ID does not exist" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.789807 4794 scope.go:117] "RemoveContainer" containerID="a80e71237cb44f6946187bc49ec49b9a4574633f8bd0dcb8b4e9dc8278575305" Mar 10 10:54:01 crc kubenswrapper[4794]: E0310 10:54:01.794182 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a80e71237cb44f6946187bc49ec49b9a4574633f8bd0dcb8b4e9dc8278575305\": container with ID starting with a80e71237cb44f6946187bc49ec49b9a4574633f8bd0dcb8b4e9dc8278575305 not found: ID does not exist" containerID="a80e71237cb44f6946187bc49ec49b9a4574633f8bd0dcb8b4e9dc8278575305" Mar 10 10:54:01 crc kubenswrapper[4794]: I0310 10:54:01.794222 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a80e71237cb44f6946187bc49ec49b9a4574633f8bd0dcb8b4e9dc8278575305"} err="failed to get container status \"a80e71237cb44f6946187bc49ec49b9a4574633f8bd0dcb8b4e9dc8278575305\": rpc error: code = NotFound desc = could not find container \"a80e71237cb44f6946187bc49ec49b9a4574633f8bd0dcb8b4e9dc8278575305\": container with ID starting with a80e71237cb44f6946187bc49ec49b9a4574633f8bd0dcb8b4e9dc8278575305 not found: ID does not exist" Mar 10 10:54:02 crc kubenswrapper[4794]: I0310 10:54:02.021857 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f40ee78-1d3b-479b-8826-46867c3b5f71" path="/var/lib/kubelet/pods/4f40ee78-1d3b-479b-8826-46867c3b5f71/volumes" Mar 10 10:54:02 crc kubenswrapper[4794]: I0310 10:54:02.623079 4794 generic.go:334] "Generic (PLEG): container finished" podID="6485889d-9687-4a87-a93d-0c7182ef14d4" 
containerID="143647dab814ae4c589916f76df89e89bac8e2ca76498f42f8dd7ea69deebed1" exitCode=0 Mar 10 10:54:02 crc kubenswrapper[4794]: I0310 10:54:02.623349 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552334-4r886" event={"ID":"6485889d-9687-4a87-a93d-0c7182ef14d4","Type":"ContainerDied","Data":"143647dab814ae4c589916f76df89e89bac8e2ca76498f42f8dd7ea69deebed1"} Mar 10 10:54:04 crc kubenswrapper[4794]: I0310 10:54:04.044288 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552334-4r886" Mar 10 10:54:04 crc kubenswrapper[4794]: I0310 10:54:04.226437 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8xzg\" (UniqueName: \"kubernetes.io/projected/6485889d-9687-4a87-a93d-0c7182ef14d4-kube-api-access-f8xzg\") pod \"6485889d-9687-4a87-a93d-0c7182ef14d4\" (UID: \"6485889d-9687-4a87-a93d-0c7182ef14d4\") " Mar 10 10:54:04 crc kubenswrapper[4794]: I0310 10:54:04.658040 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552334-4r886" event={"ID":"6485889d-9687-4a87-a93d-0c7182ef14d4","Type":"ContainerDied","Data":"372c3033e182fb8ff59e3d32957f3f5e122cbc20e427a4f4b4ccb1b736063ba5"} Mar 10 10:54:04 crc kubenswrapper[4794]: I0310 10:54:04.658111 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="372c3033e182fb8ff59e3d32957f3f5e122cbc20e427a4f4b4ccb1b736063ba5" Mar 10 10:54:04 crc kubenswrapper[4794]: I0310 10:54:04.658198 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552334-4r886" Mar 10 10:54:04 crc kubenswrapper[4794]: I0310 10:54:04.677906 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6485889d-9687-4a87-a93d-0c7182ef14d4-kube-api-access-f8xzg" (OuterVolumeSpecName: "kube-api-access-f8xzg") pod "6485889d-9687-4a87-a93d-0c7182ef14d4" (UID: "6485889d-9687-4a87-a93d-0c7182ef14d4"). InnerVolumeSpecName "kube-api-access-f8xzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:54:04 crc kubenswrapper[4794]: I0310 10:54:04.735289 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8xzg\" (UniqueName: \"kubernetes.io/projected/6485889d-9687-4a87-a93d-0c7182ef14d4-kube-api-access-f8xzg\") on node \"crc\" DevicePath \"\"" Mar 10 10:54:05 crc kubenswrapper[4794]: I0310 10:54:05.135634 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552328-ftx2h"] Mar 10 10:54:05 crc kubenswrapper[4794]: I0310 10:54:05.144582 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552328-ftx2h"] Mar 10 10:54:05 crc kubenswrapper[4794]: I0310 10:54:05.999184 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:54:06 crc kubenswrapper[4794]: E0310 10:54:05.999931 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:54:06 crc kubenswrapper[4794]: I0310 10:54:06.013724 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6836a559-5ecc-4386-be4d-028701d84ee4" path="/var/lib/kubelet/pods/6836a559-5ecc-4386-be4d-028701d84ee4/volumes" Mar 10 10:54:17 crc kubenswrapper[4794]: I0310 10:54:16.999835 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:54:17 crc kubenswrapper[4794]: E0310 10:54:17.000745 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:54:28 crc kubenswrapper[4794]: I0310 10:54:27.999943 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:54:28 crc kubenswrapper[4794]: E0310 10:54:28.001031 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:54:38 crc kubenswrapper[4794]: I0310 10:54:38.999033 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:54:39 crc kubenswrapper[4794]: E0310 10:54:38.999996 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:54:40 crc kubenswrapper[4794]: I0310 10:54:40.643956 4794 scope.go:117] "RemoveContainer" containerID="075af582db3b8b3b4357074eda167fa7d12e595ba702d67044ac1d0ebf135246" Mar 10 10:54:52 crc kubenswrapper[4794]: I0310 10:54:52.007571 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:54:52 crc kubenswrapper[4794]: E0310 10:54:52.008725 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 10:55:06 crc kubenswrapper[4794]: I0310 10:55:06.003704 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:55:07 crc kubenswrapper[4794]: I0310 10:55:07.228064 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"b31747265e59a55e2a362d4d03121df29fa57741e08ecc5e62740ddf3aa58ff9"} Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.169766 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552336-kxkn8"] Mar 10 10:56:00 crc kubenswrapper[4794]: E0310 10:56:00.170854 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f40ee78-1d3b-479b-8826-46867c3b5f71" containerName="registry-server" Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.170877 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f40ee78-1d3b-479b-8826-46867c3b5f71" containerName="registry-server" Mar 10 10:56:00 crc kubenswrapper[4794]: E0310 10:56:00.170901 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f40ee78-1d3b-479b-8826-46867c3b5f71" containerName="extract-utilities" Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.170913 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f40ee78-1d3b-479b-8826-46867c3b5f71" containerName="extract-utilities" Mar 10 10:56:00 crc kubenswrapper[4794]: E0310 10:56:00.170941 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f40ee78-1d3b-479b-8826-46867c3b5f71" containerName="extract-content" Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.170955 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f40ee78-1d3b-479b-8826-46867c3b5f71" containerName="extract-content" Mar 10 10:56:00 crc kubenswrapper[4794]: E0310 10:56:00.170970 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6485889d-9687-4a87-a93d-0c7182ef14d4" containerName="oc" Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.170984 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6485889d-9687-4a87-a93d-0c7182ef14d4" containerName="oc" Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.171240 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6485889d-9687-4a87-a93d-0c7182ef14d4" containerName="oc" Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.171274 4794 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4f40ee78-1d3b-479b-8826-46867c3b5f71" containerName="registry-server" Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.172062 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552336-kxkn8" Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.177222 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.177615 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.177911 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.187906 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plz9f\" (UniqueName: \"kubernetes.io/projected/9b5a95b8-232e-4423-9f6b-c4fa4cc36f55-kube-api-access-plz9f\") pod \"auto-csr-approver-29552336-kxkn8\" (UID: \"9b5a95b8-232e-4423-9f6b-c4fa4cc36f55\") " pod="openshift-infra/auto-csr-approver-29552336-kxkn8" Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.195595 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552336-kxkn8"] Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.289618 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plz9f\" (UniqueName: \"kubernetes.io/projected/9b5a95b8-232e-4423-9f6b-c4fa4cc36f55-kube-api-access-plz9f\") pod \"auto-csr-approver-29552336-kxkn8\" (UID: \"9b5a95b8-232e-4423-9f6b-c4fa4cc36f55\") " pod="openshift-infra/auto-csr-approver-29552336-kxkn8" Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.327886 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plz9f\" (UniqueName: \"kubernetes.io/projected/9b5a95b8-232e-4423-9f6b-c4fa4cc36f55-kube-api-access-plz9f\") pod \"auto-csr-approver-29552336-kxkn8\" (UID: \"9b5a95b8-232e-4423-9f6b-c4fa4cc36f55\") " pod="openshift-infra/auto-csr-approver-29552336-kxkn8" Mar 10 10:56:00 crc kubenswrapper[4794]: I0310 10:56:00.510029 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552336-kxkn8" Mar 10 10:56:01 crc kubenswrapper[4794]: I0310 10:56:01.019675 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552336-kxkn8"] Mar 10 10:56:01 crc kubenswrapper[4794]: I0310 10:56:01.708595 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552336-kxkn8" event={"ID":"9b5a95b8-232e-4423-9f6b-c4fa4cc36f55","Type":"ContainerStarted","Data":"09009c266fe55bcbdf547b6c7fa775eb6894f56e193f46f06b698fc198fb22e6"} Mar 10 10:56:02 crc kubenswrapper[4794]: I0310 10:56:02.719977 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552336-kxkn8" event={"ID":"9b5a95b8-232e-4423-9f6b-c4fa4cc36f55","Type":"ContainerStarted","Data":"7068b37217b482b7dff0f5c6b3c9e41d0a4133e421b1b48225cea8ce4ab48968"} Mar 10 10:56:02 crc kubenswrapper[4794]: I0310 10:56:02.741915 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552336-kxkn8" podStartSLOduration=1.853767747 podStartE2EDuration="2.741888781s" podCreationTimestamp="2026-03-10 10:56:00 +0000 UTC" firstStartedPulling="2026-03-10 10:56:01.190453955 +0000 UTC m=+4309.946624793" lastFinishedPulling="2026-03-10 10:56:02.078574989 +0000 UTC m=+4310.834745827" observedRunningTime="2026-03-10 10:56:02.736922477 +0000 UTC m=+4311.493093335" watchObservedRunningTime="2026-03-10 10:56:02.741888781 +0000 UTC m=+4311.498059629" Mar 10 10:56:03 crc kubenswrapper[4794]: I0310 10:56:03.730552 4794 generic.go:334] "Generic (PLEG): container finished" podID="9b5a95b8-232e-4423-9f6b-c4fa4cc36f55" containerID="7068b37217b482b7dff0f5c6b3c9e41d0a4133e421b1b48225cea8ce4ab48968" exitCode=0 Mar 10 10:56:03 crc kubenswrapper[4794]: I0310 10:56:03.730616 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552336-kxkn8" event={"ID":"9b5a95b8-232e-4423-9f6b-c4fa4cc36f55","Type":"ContainerDied","Data":"7068b37217b482b7dff0f5c6b3c9e41d0a4133e421b1b48225cea8ce4ab48968"} Mar 10 10:56:05 crc kubenswrapper[4794]: I0310 10:56:05.103063 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552336-kxkn8" Mar 10 10:56:05 crc kubenswrapper[4794]: I0310 10:56:05.163245 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plz9f\" (UniqueName: \"kubernetes.io/projected/9b5a95b8-232e-4423-9f6b-c4fa4cc36f55-kube-api-access-plz9f\") pod \"9b5a95b8-232e-4423-9f6b-c4fa4cc36f55\" (UID: \"9b5a95b8-232e-4423-9f6b-c4fa4cc36f55\") " Mar 10 10:56:05 crc kubenswrapper[4794]: I0310 10:56:05.171873 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b5a95b8-232e-4423-9f6b-c4fa4cc36f55-kube-api-access-plz9f" (OuterVolumeSpecName: "kube-api-access-plz9f") pod "9b5a95b8-232e-4423-9f6b-c4fa4cc36f55" (UID: "9b5a95b8-232e-4423-9f6b-c4fa4cc36f55"). InnerVolumeSpecName "kube-api-access-plz9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:56:05 crc kubenswrapper[4794]: I0310 10:56:05.264998 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plz9f\" (UniqueName: \"kubernetes.io/projected/9b5a95b8-232e-4423-9f6b-c4fa4cc36f55-kube-api-access-plz9f\") on node \"crc\" DevicePath \"\"" Mar 10 10:56:05 crc kubenswrapper[4794]: I0310 10:56:05.754905 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552336-kxkn8" event={"ID":"9b5a95b8-232e-4423-9f6b-c4fa4cc36f55","Type":"ContainerDied","Data":"09009c266fe55bcbdf547b6c7fa775eb6894f56e193f46f06b698fc198fb22e6"} Mar 10 10:56:05 crc kubenswrapper[4794]: I0310 10:56:05.754978 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09009c266fe55bcbdf547b6c7fa775eb6894f56e193f46f06b698fc198fb22e6" Mar 10 10:56:05 crc kubenswrapper[4794]: I0310 10:56:05.754983 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552336-kxkn8" Mar 10 10:56:06 crc kubenswrapper[4794]: I0310 10:56:06.209744 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552330-mhllm"] Mar 10 10:56:06 crc kubenswrapper[4794]: I0310 10:56:06.221923 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552330-mhllm"] Mar 10 10:56:08 crc kubenswrapper[4794]: I0310 10:56:08.013844 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d021b0b4-f8b7-4a9f-8a38-214bb3b25e21" path="/var/lib/kubelet/pods/d021b0b4-f8b7-4a9f-8a38-214bb3b25e21/volumes" Mar 10 10:56:40 crc kubenswrapper[4794]: I0310 10:56:40.791215 4794 scope.go:117] "RemoveContainer" containerID="0bbe115eaa6ea5850ecc3bcd96285e17e568e7d9ca8e693a297e7422b55c7498" Mar 10 10:57:22 crc kubenswrapper[4794]: I0310 10:57:22.967813 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:57:22 crc kubenswrapper[4794]: I0310 10:57:22.968507 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.223098 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mbrp9"] Mar 10 10:57:51 crc kubenswrapper[4794]: E0310 10:57:51.224189 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5a95b8-232e-4423-9f6b-c4fa4cc36f55" containerName="oc" Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.224209 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5a95b8-232e-4423-9f6b-c4fa4cc36f55" containerName="oc" Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.224478 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b5a95b8-232e-4423-9f6b-c4fa4cc36f55" containerName="oc" Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.229491 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.251134 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mbrp9"] Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.290277 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzvjg\" (UniqueName: \"kubernetes.io/projected/569f7801-841a-4ccf-bf8a-a929abb94405-kube-api-access-qzvjg\") pod \"certified-operators-mbrp9\" (UID: \"569f7801-841a-4ccf-bf8a-a929abb94405\") " pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.290393 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/569f7801-841a-4ccf-bf8a-a929abb94405-catalog-content\") pod \"certified-operators-mbrp9\" (UID: \"569f7801-841a-4ccf-bf8a-a929abb94405\") " pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.290549 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/569f7801-841a-4ccf-bf8a-a929abb94405-utilities\") pod \"certified-operators-mbrp9\" (UID: \"569f7801-841a-4ccf-bf8a-a929abb94405\") " pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.392439 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzvjg\" (UniqueName: \"kubernetes.io/projected/569f7801-841a-4ccf-bf8a-a929abb94405-kube-api-access-qzvjg\") pod \"certified-operators-mbrp9\" (UID: \"569f7801-841a-4ccf-bf8a-a929abb94405\") " pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.392918 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/569f7801-841a-4ccf-bf8a-a929abb94405-catalog-content\") pod \"certified-operators-mbrp9\" (UID: \"569f7801-841a-4ccf-bf8a-a929abb94405\") " pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.393302 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/569f7801-841a-4ccf-bf8a-a929abb94405-utilities\") pod \"certified-operators-mbrp9\" (UID: \"569f7801-841a-4ccf-bf8a-a929abb94405\") " pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.393429 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/569f7801-841a-4ccf-bf8a-a929abb94405-catalog-content\") pod \"certified-operators-mbrp9\" (UID: \"569f7801-841a-4ccf-bf8a-a929abb94405\") " pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.393720 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/569f7801-841a-4ccf-bf8a-a929abb94405-utilities\") pod \"certified-operators-mbrp9\" (UID: \"569f7801-841a-4ccf-bf8a-a929abb94405\") " pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.424851 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qzvjg\" (UniqueName: \"kubernetes.io/projected/569f7801-841a-4ccf-bf8a-a929abb94405-kube-api-access-qzvjg\") pod \"certified-operators-mbrp9\" (UID: \"569f7801-841a-4ccf-bf8a-a929abb94405\") " pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:57:51 crc kubenswrapper[4794]: I0310 10:57:51.559920 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:57:52 crc kubenswrapper[4794]: I0310 10:57:52.071038 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mbrp9"] Mar 10 10:57:52 crc kubenswrapper[4794]: I0310 10:57:52.751018 4794 generic.go:334] "Generic (PLEG): container finished" podID="569f7801-841a-4ccf-bf8a-a929abb94405" containerID="efd5ee9e107a3241655ffdcdfc8a4ee640af8b8c2d6db137f315baadfd9a30f2" exitCode=0 Mar 10 10:57:52 crc kubenswrapper[4794]: I0310 10:57:52.751098 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbrp9" event={"ID":"569f7801-841a-4ccf-bf8a-a929abb94405","Type":"ContainerDied","Data":"efd5ee9e107a3241655ffdcdfc8a4ee640af8b8c2d6db137f315baadfd9a30f2"} Mar 10 10:57:52 crc kubenswrapper[4794]: I0310 10:57:52.751319 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbrp9" event={"ID":"569f7801-841a-4ccf-bf8a-a929abb94405","Type":"ContainerStarted","Data":"fad457ca33bd0c8a6ba82073b2158d5edc1f2ef5b7fb296dda506f9ea018f0b2"} Mar 10 10:57:52 crc kubenswrapper[4794]: I0310 10:57:52.754364 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 10:57:52 crc kubenswrapper[4794]: I0310 10:57:52.968529 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:57:52 crc kubenswrapper[4794]: I0310 10:57:52.968605 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:57:53 crc kubenswrapper[4794]: I0310 10:57:53.763800 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbrp9" event={"ID":"569f7801-841a-4ccf-bf8a-a929abb94405","Type":"ContainerStarted","Data":"fad9400f496ccd2c327be111c0c6f927994b701d7537c0ba3ec1b1a350a1c9ba"} Mar 10 10:57:54 crc kubenswrapper[4794]: I0310 10:57:54.773078 4794 generic.go:334] "Generic (PLEG): container finished" podID="569f7801-841a-4ccf-bf8a-a929abb94405" containerID="fad9400f496ccd2c327be111c0c6f927994b701d7537c0ba3ec1b1a350a1c9ba" exitCode=0 Mar 10 10:57:54 crc kubenswrapper[4794]: I0310 10:57:54.773126 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbrp9" event={"ID":"569f7801-841a-4ccf-bf8a-a929abb94405","Type":"ContainerDied","Data":"fad9400f496ccd2c327be111c0c6f927994b701d7537c0ba3ec1b1a350a1c9ba"} Mar 10 10:57:55 crc kubenswrapper[4794]: I0310 10:57:55.791143 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-mbrp9" event={"ID":"569f7801-841a-4ccf-bf8a-a929abb94405","Type":"ContainerStarted","Data":"8f7cc813e5042b4eabdcad411fd939b24dd84c21b5294aeca89c1d86f67a5a2e"} Mar 10 10:57:55 crc kubenswrapper[4794]: I0310 10:57:55.831152 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mbrp9" podStartSLOduration=2.10523993 podStartE2EDuration="4.83112454s" podCreationTimestamp="2026-03-10 10:57:51 +0000 UTC" firstStartedPulling="2026-03-10 10:57:52.754007466 +0000 UTC m=+4421.510178294" lastFinishedPulling="2026-03-10 10:57:55.479892056 +0000 UTC m=+4424.236062904" observedRunningTime="2026-03-10 10:57:55.823779332 +0000 UTC m=+4424.579950200" watchObservedRunningTime="2026-03-10 10:57:55.83112454 +0000 UTC m=+4424.587295388" Mar 10 10:58:00 crc kubenswrapper[4794]: I0310 10:58:00.183580 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552338-c5llz"] Mar 10 10:58:00 crc kubenswrapper[4794]: I0310 10:58:00.186050 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552338-c5llz" Mar 10 10:58:00 crc kubenswrapper[4794]: I0310 10:58:00.188304 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:58:00 crc kubenswrapper[4794]: I0310 10:58:00.188847 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:58:00 crc kubenswrapper[4794]: I0310 10:58:00.189048 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 10:58:00 crc kubenswrapper[4794]: I0310 10:58:00.212631 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552338-c5llz"] Mar 10 10:58:00 crc kubenswrapper[4794]: I0310 10:58:00.362217 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c76xf\" (UniqueName: \"kubernetes.io/projected/49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e-kube-api-access-c76xf\") pod \"auto-csr-approver-29552338-c5llz\" (UID: \"49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e\") " pod="openshift-infra/auto-csr-approver-29552338-c5llz" Mar 10 10:58:00 crc kubenswrapper[4794]: I0310 10:58:00.463499 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c76xf\" (UniqueName: \"kubernetes.io/projected/49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e-kube-api-access-c76xf\") pod \"auto-csr-approver-29552338-c5llz\" (UID: \"49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e\") " pod="openshift-infra/auto-csr-approver-29552338-c5llz" Mar 10 10:58:00 crc kubenswrapper[4794]: I0310 10:58:00.483290 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c76xf\" (UniqueName: \"kubernetes.io/projected/49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e-kube-api-access-c76xf\") pod \"auto-csr-approver-29552338-c5llz\" (UID: \"49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e\") " pod="openshift-infra/auto-csr-approver-29552338-c5llz" Mar 10 10:58:00 crc kubenswrapper[4794]: I0310 10:58:00.503044 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552338-c5llz" Mar 10 10:58:00 crc kubenswrapper[4794]: I0310 10:58:00.977794 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552338-c5llz"] Mar 10 10:58:01 crc kubenswrapper[4794]: I0310 10:58:01.561216 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:58:01 crc kubenswrapper[4794]: I0310 10:58:01.561264 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:58:01 crc kubenswrapper[4794]: I0310 10:58:01.616368 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:58:01 crc kubenswrapper[4794]: I0310 10:58:01.869929 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552338-c5llz" event={"ID":"49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e","Type":"ContainerStarted","Data":"f7b58f2f3d3073182a646afc16820049a76d0887ffa8fcf0da26ed69b4612289"} Mar 10 10:58:01 crc kubenswrapper[4794]: I0310 10:58:01.920236 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:58:01 crc kubenswrapper[4794]: I0310 10:58:01.979524 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mbrp9"] Mar 10 10:58:02 crc kubenswrapper[4794]: I0310 10:58:02.879156 4794 generic.go:334] "Generic (PLEG): container finished" podID="49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e" containerID="67c831ca716684d87c42bfa8a89dba497d232cab396336570e3399db0faee93d" exitCode=0 Mar 10 10:58:02 crc kubenswrapper[4794]: I0310 10:58:02.879237 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552338-c5llz" event={"ID":"49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e","Type":"ContainerDied","Data":"67c831ca716684d87c42bfa8a89dba497d232cab396336570e3399db0faee93d"} Mar 10 10:58:03 crc kubenswrapper[4794]: I0310 10:58:03.884407 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mbrp9" podUID="569f7801-841a-4ccf-bf8a-a929abb94405" containerName="registry-server" containerID="cri-o://8f7cc813e5042b4eabdcad411fd939b24dd84c21b5294aeca89c1d86f67a5a2e" gracePeriod=2 Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.332322 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552338-c5llz" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.337312 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.432532 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c76xf\" (UniqueName: \"kubernetes.io/projected/49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e-kube-api-access-c76xf\") pod \"49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e\" (UID: \"49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e\") " Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.433037 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/569f7801-841a-4ccf-bf8a-a929abb94405-utilities\") pod \"569f7801-841a-4ccf-bf8a-a929abb94405\" (UID: \"569f7801-841a-4ccf-bf8a-a929abb94405\") " Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.433526 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzvjg\" (UniqueName: \"kubernetes.io/projected/569f7801-841a-4ccf-bf8a-a929abb94405-kube-api-access-qzvjg\") pod \"569f7801-841a-4ccf-bf8a-a929abb94405\" (UID: \"569f7801-841a-4ccf-bf8a-a929abb94405\") " Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.433688 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/569f7801-841a-4ccf-bf8a-a929abb94405-catalog-content\") pod \"569f7801-841a-4ccf-bf8a-a929abb94405\" (UID: \"569f7801-841a-4ccf-bf8a-a929abb94405\") " Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.434735 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/569f7801-841a-4ccf-bf8a-a929abb94405-utilities" (OuterVolumeSpecName: "utilities") pod "569f7801-841a-4ccf-bf8a-a929abb94405" (UID: "569f7801-841a-4ccf-bf8a-a929abb94405"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.437316 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/569f7801-841a-4ccf-bf8a-a929abb94405-kube-api-access-qzvjg" (OuterVolumeSpecName: "kube-api-access-qzvjg") pod "569f7801-841a-4ccf-bf8a-a929abb94405" (UID: "569f7801-841a-4ccf-bf8a-a929abb94405"). InnerVolumeSpecName "kube-api-access-qzvjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.437602 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e-kube-api-access-c76xf" (OuterVolumeSpecName: "kube-api-access-c76xf") pod "49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e" (UID: "49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e"). InnerVolumeSpecName "kube-api-access-c76xf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.536315 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c76xf\" (UniqueName: \"kubernetes.io/projected/49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e-kube-api-access-c76xf\") on node \"crc\" DevicePath \"\"" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.536369 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/569f7801-841a-4ccf-bf8a-a929abb94405-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.536395 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzvjg\" (UniqueName: \"kubernetes.io/projected/569f7801-841a-4ccf-bf8a-a929abb94405-kube-api-access-qzvjg\") on node \"crc\" DevicePath \"\"" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.679594 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/569f7801-841a-4ccf-bf8a-a929abb94405-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "569f7801-841a-4ccf-bf8a-a929abb94405" (UID: "569f7801-841a-4ccf-bf8a-a929abb94405"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.739187 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/569f7801-841a-4ccf-bf8a-a929abb94405-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.898400 4794 generic.go:334] "Generic (PLEG): container finished" podID="569f7801-841a-4ccf-bf8a-a929abb94405" containerID="8f7cc813e5042b4eabdcad411fd939b24dd84c21b5294aeca89c1d86f67a5a2e" exitCode=0 Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.898466 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbrp9" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.898487 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbrp9" event={"ID":"569f7801-841a-4ccf-bf8a-a929abb94405","Type":"ContainerDied","Data":"8f7cc813e5042b4eabdcad411fd939b24dd84c21b5294aeca89c1d86f67a5a2e"} Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.898530 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbrp9" event={"ID":"569f7801-841a-4ccf-bf8a-a929abb94405","Type":"ContainerDied","Data":"fad457ca33bd0c8a6ba82073b2158d5edc1f2ef5b7fb296dda506f9ea018f0b2"} Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.898548 4794 scope.go:117] "RemoveContainer" containerID="8f7cc813e5042b4eabdcad411fd939b24dd84c21b5294aeca89c1d86f67a5a2e" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.902856 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552338-c5llz" event={"ID":"49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e","Type":"ContainerDied","Data":"f7b58f2f3d3073182a646afc16820049a76d0887ffa8fcf0da26ed69b4612289"} Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.902890 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7b58f2f3d3073182a646afc16820049a76d0887ffa8fcf0da26ed69b4612289" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.902936 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552338-c5llz" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.934572 4794 scope.go:117] "RemoveContainer" containerID="fad9400f496ccd2c327be111c0c6f927994b701d7537c0ba3ec1b1a350a1c9ba" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.943873 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mbrp9"] Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.951358 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mbrp9"] Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.978462 4794 scope.go:117] "RemoveContainer" containerID="efd5ee9e107a3241655ffdcdfc8a4ee640af8b8c2d6db137f315baadfd9a30f2" Mar 10 10:58:04 crc kubenswrapper[4794]: I0310 10:58:04.999571 4794 scope.go:117] "RemoveContainer" containerID="8f7cc813e5042b4eabdcad411fd939b24dd84c21b5294aeca89c1d86f67a5a2e" Mar 10 10:58:05 crc kubenswrapper[4794]: E0310 10:58:05.000025 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f7cc813e5042b4eabdcad411fd939b24dd84c21b5294aeca89c1d86f67a5a2e\": container with ID starting with 8f7cc813e5042b4eabdcad411fd939b24dd84c21b5294aeca89c1d86f67a5a2e not found: ID does not exist" containerID="8f7cc813e5042b4eabdcad411fd939b24dd84c21b5294aeca89c1d86f67a5a2e" Mar 10 10:58:05 crc kubenswrapper[4794]: I0310 10:58:05.000089 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7cc813e5042b4eabdcad411fd939b24dd84c21b5294aeca89c1d86f67a5a2e"} err="failed to get container status \"8f7cc813e5042b4eabdcad411fd939b24dd84c21b5294aeca89c1d86f67a5a2e\": rpc error: code = NotFound desc = could not find container \"8f7cc813e5042b4eabdcad411fd939b24dd84c21b5294aeca89c1d86f67a5a2e\": container with ID starting with 8f7cc813e5042b4eabdcad411fd939b24dd84c21b5294aeca89c1d86f67a5a2e not found: ID does not exist" Mar 10 10:58:05 crc kubenswrapper[4794]: I0310 10:58:05.000135 4794 scope.go:117] "RemoveContainer" containerID="fad9400f496ccd2c327be111c0c6f927994b701d7537c0ba3ec1b1a350a1c9ba" Mar 10 10:58:05 crc kubenswrapper[4794]: E0310 10:58:05.000518 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fad9400f496ccd2c327be111c0c6f927994b701d7537c0ba3ec1b1a350a1c9ba\": container with ID starting with fad9400f496ccd2c327be111c0c6f927994b701d7537c0ba3ec1b1a350a1c9ba not found: ID does not exist" containerID="fad9400f496ccd2c327be111c0c6f927994b701d7537c0ba3ec1b1a350a1c9ba" Mar 10 10:58:05 crc kubenswrapper[4794]: I0310 10:58:05.000573 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fad9400f496ccd2c327be111c0c6f927994b701d7537c0ba3ec1b1a350a1c9ba"} err="failed to get container status \"fad9400f496ccd2c327be111c0c6f927994b701d7537c0ba3ec1b1a350a1c9ba\": rpc error: code = NotFound desc = could not find container \"fad9400f496ccd2c327be111c0c6f927994b701d7537c0ba3ec1b1a350a1c9ba\": container with ID starting with fad9400f496ccd2c327be111c0c6f927994b701d7537c0ba3ec1b1a350a1c9ba not found: ID does not exist" Mar 10 10:58:05 crc kubenswrapper[4794]: I0310 10:58:05.000605 4794 scope.go:117] "RemoveContainer" containerID="efd5ee9e107a3241655ffdcdfc8a4ee640af8b8c2d6db137f315baadfd9a30f2" Mar 10 10:58:05 crc kubenswrapper[4794]: E0310 10:58:05.000957 4794 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"efd5ee9e107a3241655ffdcdfc8a4ee640af8b8c2d6db137f315baadfd9a30f2\": container with ID starting with efd5ee9e107a3241655ffdcdfc8a4ee640af8b8c2d6db137f315baadfd9a30f2 not found: ID does not exist" containerID="efd5ee9e107a3241655ffdcdfc8a4ee640af8b8c2d6db137f315baadfd9a30f2" Mar 10 10:58:05 crc kubenswrapper[4794]: I0310 10:58:05.001010 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd5ee9e107a3241655ffdcdfc8a4ee640af8b8c2d6db137f315baadfd9a30f2"} err="failed to get container status \"efd5ee9e107a3241655ffdcdfc8a4ee640af8b8c2d6db137f315baadfd9a30f2\": rpc error: code = NotFound desc = could not find container \"efd5ee9e107a3241655ffdcdfc8a4ee640af8b8c2d6db137f315baadfd9a30f2\": container with ID starting with efd5ee9e107a3241655ffdcdfc8a4ee640af8b8c2d6db137f315baadfd9a30f2 not found: ID does not exist" Mar 10 10:58:05 crc kubenswrapper[4794]: I0310 10:58:05.392643 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552332-dfttx"] Mar 10 10:58:05 crc kubenswrapper[4794]: I0310 10:58:05.396959 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552332-dfttx"] Mar 10 10:58:06 crc kubenswrapper[4794]: I0310 10:58:06.017360 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f4c5563-e486-475b-a5d2-286cf7acde74" path="/var/lib/kubelet/pods/4f4c5563-e486-475b-a5d2-286cf7acde74/volumes" Mar 10 10:58:06 crc kubenswrapper[4794]: I0310 10:58:06.018547 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="569f7801-841a-4ccf-bf8a-a929abb94405" path="/var/lib/kubelet/pods/569f7801-841a-4ccf-bf8a-a929abb94405/volumes" Mar 10 10:58:22 crc kubenswrapper[4794]: I0310 10:58:22.968067 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:58:22 crc kubenswrapper[4794]: I0310 10:58:22.968808 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:58:22 crc kubenswrapper[4794]: I0310 10:58:22.968869 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 10:58:22 crc kubenswrapper[4794]: I0310 10:58:22.969747 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b31747265e59a55e2a362d4d03121df29fa57741e08ecc5e62740ddf3aa58ff9"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 10:58:22 crc kubenswrapper[4794]: I0310 10:58:22.969821 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://b31747265e59a55e2a362d4d03121df29fa57741e08ecc5e62740ddf3aa58ff9" 
gracePeriod=600 Mar 10 10:58:24 crc kubenswrapper[4794]: I0310 10:58:24.081468 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="b31747265e59a55e2a362d4d03121df29fa57741e08ecc5e62740ddf3aa58ff9" exitCode=0 Mar 10 10:58:24 crc kubenswrapper[4794]: I0310 10:58:24.082236 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"b31747265e59a55e2a362d4d03121df29fa57741e08ecc5e62740ddf3aa58ff9"} Mar 10 10:58:24 crc kubenswrapper[4794]: I0310 10:58:24.082279 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02"} Mar 10 10:58:24 crc kubenswrapper[4794]: I0310 10:58:24.082308 4794 scope.go:117] "RemoveContainer" containerID="a4170662b41f7f6ea8f48d36067f333890b45c5e5457bca77f3d71c2d215cd5a" Mar 10 10:58:40 crc kubenswrapper[4794]: I0310 10:58:40.896350 4794 scope.go:117] "RemoveContainer" containerID="c5eae514c508f3b325afe94b722875a933a59e4e6019a98793799f7677a8ead3" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.103837 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-9vh2h"] Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.130532 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-9vh2h"] Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.245398 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-46476"] Mar 10 10:59:56 crc kubenswrapper[4794]: E0310 10:59:56.245812 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e" containerName="oc" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.245835 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e" containerName="oc" Mar 10 10:59:56 crc kubenswrapper[4794]: E0310 10:59:56.245877 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="569f7801-841a-4ccf-bf8a-a929abb94405" containerName="registry-server" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.245887 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="569f7801-841a-4ccf-bf8a-a929abb94405" containerName="registry-server" Mar 10 10:59:56 crc kubenswrapper[4794]: E0310 10:59:56.245903 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="569f7801-841a-4ccf-bf8a-a929abb94405" containerName="extract-utilities" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.245911 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="569f7801-841a-4ccf-bf8a-a929abb94405" containerName="extract-utilities" Mar 10 10:59:56 crc kubenswrapper[4794]: E0310 10:59:56.245930 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="569f7801-841a-4ccf-bf8a-a929abb94405" containerName="extract-content" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.245937 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="569f7801-841a-4ccf-bf8a-a929abb94405" containerName="extract-content" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.246123 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="569f7801-841a-4ccf-bf8a-a929abb94405" containerName="registry-server" Mar 10 
10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.246144 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e" containerName="oc" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.246795 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-46476" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.248527 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.249181 4794 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-76wvz" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.256986 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.257643 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.264010 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-46476"] Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.411427 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57znh\" (UniqueName: \"kubernetes.io/projected/d5522caf-5153-473e-9d16-145320c36a7c-kube-api-access-57znh\") pod \"crc-storage-crc-46476\" (UID: \"d5522caf-5153-473e-9d16-145320c36a7c\") " pod="crc-storage/crc-storage-crc-46476" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.411477 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d5522caf-5153-473e-9d16-145320c36a7c-crc-storage\") pod \"crc-storage-crc-46476\" (UID: \"d5522caf-5153-473e-9d16-145320c36a7c\") " pod="crc-storage/crc-storage-crc-46476" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.411548 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d5522caf-5153-473e-9d16-145320c36a7c-node-mnt\") pod \"crc-storage-crc-46476\" (UID: \"d5522caf-5153-473e-9d16-145320c36a7c\") " pod="crc-storage/crc-storage-crc-46476" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.512850 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57znh\" (UniqueName: \"kubernetes.io/projected/d5522caf-5153-473e-9d16-145320c36a7c-kube-api-access-57znh\") pod \"crc-storage-crc-46476\" (UID: \"d5522caf-5153-473e-9d16-145320c36a7c\") " pod="crc-storage/crc-storage-crc-46476" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.513207 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d5522caf-5153-473e-9d16-145320c36a7c-crc-storage\") pod \"crc-storage-crc-46476\" (UID: \"d5522caf-5153-473e-9d16-145320c36a7c\") " pod="crc-storage/crc-storage-crc-46476" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.513286 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d5522caf-5153-473e-9d16-145320c36a7c-node-mnt\") pod \"crc-storage-crc-46476\" (UID: \"d5522caf-5153-473e-9d16-145320c36a7c\") " pod="crc-storage/crc-storage-crc-46476" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 
10:59:56.513751 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d5522caf-5153-473e-9d16-145320c36a7c-node-mnt\") pod \"crc-storage-crc-46476\" (UID: \"d5522caf-5153-473e-9d16-145320c36a7c\") " pod="crc-storage/crc-storage-crc-46476" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.515299 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d5522caf-5153-473e-9d16-145320c36a7c-crc-storage\") pod \"crc-storage-crc-46476\" (UID: \"d5522caf-5153-473e-9d16-145320c36a7c\") " pod="crc-storage/crc-storage-crc-46476" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.545793 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57znh\" (UniqueName: \"kubernetes.io/projected/d5522caf-5153-473e-9d16-145320c36a7c-kube-api-access-57znh\") pod \"crc-storage-crc-46476\" (UID: \"d5522caf-5153-473e-9d16-145320c36a7c\") " pod="crc-storage/crc-storage-crc-46476" Mar 10 10:59:56 crc kubenswrapper[4794]: I0310 10:59:56.580353 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-46476" Mar 10 10:59:57 crc kubenswrapper[4794]: I0310 10:59:57.035094 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-46476"] Mar 10 10:59:57 crc kubenswrapper[4794]: I0310 10:59:57.958700 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-46476" event={"ID":"d5522caf-5153-473e-9d16-145320c36a7c","Type":"ContainerStarted","Data":"8e921242eb8bf483693b8e8aa4b940aae81dd090fbbcb3fdc5f3d00ba9d12870"} Mar 10 10:59:57 crc kubenswrapper[4794]: I0310 10:59:57.959411 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-46476" event={"ID":"d5522caf-5153-473e-9d16-145320c36a7c","Type":"ContainerStarted","Data":"29a24d5edb6bc5864f8a6cbac53d0048a51e4391b8f283a449503025f32ae07e"} Mar 10 10:59:57 crc kubenswrapper[4794]: I0310 10:59:57.986548 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-46476" podStartSLOduration=1.424184146 podStartE2EDuration="1.986506557s" podCreationTimestamp="2026-03-10 10:59:56 +0000 UTC" firstStartedPulling="2026-03-10 10:59:57.044683211 +0000 UTC m=+4545.800854069" lastFinishedPulling="2026-03-10 10:59:57.607005632 +0000 UTC m=+4546.363176480" observedRunningTime="2026-03-10 10:59:57.974648968 +0000 UTC m=+4546.730819796" watchObservedRunningTime="2026-03-10 10:59:57.986506557 +0000 UTC m=+4546.742677415" Mar 10 10:59:58 crc kubenswrapper[4794]: I0310 10:59:58.013059 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cdb631e-df45-47a7-bcfe-d659cbd1fd1e" path="/var/lib/kubelet/pods/2cdb631e-df45-47a7-bcfe-d659cbd1fd1e/volumes" Mar 10 10:59:58 crc kubenswrapper[4794]: I0310 10:59:58.974478 4794 generic.go:334] "Generic (PLEG): container finished" podID="d5522caf-5153-473e-9d16-145320c36a7c" containerID="8e921242eb8bf483693b8e8aa4b940aae81dd090fbbcb3fdc5f3d00ba9d12870" exitCode=0 Mar 10 10:59:58 crc kubenswrapper[4794]: I0310 10:59:58.974593 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-46476" event={"ID":"d5522caf-5153-473e-9d16-145320c36a7c","Type":"ContainerDied","Data":"8e921242eb8bf483693b8e8aa4b940aae81dd090fbbcb3fdc5f3d00ba9d12870"} Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.173653 4794 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29552340-2skgf"] Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.175201 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552340-2skgf" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.179478 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.179559 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.179773 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.190589 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f"] Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.191887 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.194478 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.195730 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.201487 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552340-2skgf"] Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.209857 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f"] Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.289282 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcxwd\" (UniqueName: \"kubernetes.io/projected/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-kube-api-access-vcxwd\") pod \"collect-profiles-29552340-94k6f\" (UID: \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.289457 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrtjl\" (UniqueName: \"kubernetes.io/projected/a69e0088-724c-47d3-b1fc-89279d3195ed-kube-api-access-zrtjl\") pod \"auto-csr-approver-29552340-2skgf\" (UID: \"a69e0088-724c-47d3-b1fc-89279d3195ed\") " pod="openshift-infra/auto-csr-approver-29552340-2skgf" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.289560 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-config-volume\") pod \"collect-profiles-29552340-94k6f\" (UID: \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.289611 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-secret-volume\") pod \"collect-profiles-29552340-94k6f\" (UID: \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.392007 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-config-volume\") pod \"collect-profiles-29552340-94k6f\" (UID: \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.392088 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-secret-volume\") pod \"collect-profiles-29552340-94k6f\" (UID: \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.392148 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcxwd\" (UniqueName: \"kubernetes.io/projected/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-kube-api-access-vcxwd\") pod \"collect-profiles-29552340-94k6f\" (UID: \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.392258 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtjl\" (UniqueName: \"kubernetes.io/projected/a69e0088-724c-47d3-b1fc-89279d3195ed-kube-api-access-zrtjl\") pod \"auto-csr-approver-29552340-2skgf\" (UID: \"a69e0088-724c-47d3-b1fc-89279d3195ed\") " pod="openshift-infra/auto-csr-approver-29552340-2skgf" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.393089 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-config-volume\") pod \"collect-profiles-29552340-94k6f\" (UID: \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.402581 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-secret-volume\") pod \"collect-profiles-29552340-94k6f\" (UID: \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.409641 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrtjl\" (UniqueName: \"kubernetes.io/projected/a69e0088-724c-47d3-b1fc-89279d3195ed-kube-api-access-zrtjl\") pod \"auto-csr-approver-29552340-2skgf\" (UID: \"a69e0088-724c-47d3-b1fc-89279d3195ed\") " pod="openshift-infra/auto-csr-approver-29552340-2skgf" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.410081 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcxwd\" (UniqueName: \"kubernetes.io/projected/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-kube-api-access-vcxwd\") pod \"collect-profiles-29552340-94k6f\" (UID: \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.475166 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-46476" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.493521 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d5522caf-5153-473e-9d16-145320c36a7c-crc-storage\") pod \"d5522caf-5153-473e-9d16-145320c36a7c\" (UID: \"d5522caf-5153-473e-9d16-145320c36a7c\") " Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.493600 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d5522caf-5153-473e-9d16-145320c36a7c-node-mnt\") pod \"d5522caf-5153-473e-9d16-145320c36a7c\" (UID: \"d5522caf-5153-473e-9d16-145320c36a7c\") " Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.493630 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57znh\" (UniqueName: \"kubernetes.io/projected/d5522caf-5153-473e-9d16-145320c36a7c-kube-api-access-57znh\") pod \"d5522caf-5153-473e-9d16-145320c36a7c\" (UID: \"d5522caf-5153-473e-9d16-145320c36a7c\") " Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.494131 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5522caf-5153-473e-9d16-145320c36a7c-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "d5522caf-5153-473e-9d16-145320c36a7c" (UID: "d5522caf-5153-473e-9d16-145320c36a7c"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.498310 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5522caf-5153-473e-9d16-145320c36a7c-kube-api-access-57znh" (OuterVolumeSpecName: "kube-api-access-57znh") pod "d5522caf-5153-473e-9d16-145320c36a7c" (UID: "d5522caf-5153-473e-9d16-145320c36a7c"). InnerVolumeSpecName "kube-api-access-57znh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.512421 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552340-2skgf" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.526350 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.547660 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5522caf-5153-473e-9d16-145320c36a7c-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "d5522caf-5153-473e-9d16-145320c36a7c" (UID: "d5522caf-5153-473e-9d16-145320c36a7c"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.595723 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57znh\" (UniqueName: \"kubernetes.io/projected/d5522caf-5153-473e-9d16-145320c36a7c-kube-api-access-57znh\") on node \"crc\" DevicePath \"\"" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.595794 4794 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d5522caf-5153-473e-9d16-145320c36a7c-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.595823 4794 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d5522caf-5153-473e-9d16-145320c36a7c-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.778828 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f"] Mar 10 11:00:00 crc kubenswrapper[4794]: W0310 11:00:00.784313 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84abb13a_cdb3_4eb8_8845_fa8e7db8ef02.slice/crio-9c7e77bbe06236a2350bc156242b848984ac978122647aa8d7567a6378381307 WatchSource:0}: Error finding container 9c7e77bbe06236a2350bc156242b848984ac978122647aa8d7567a6378381307: Status 404 returned error can't find the container with id 9c7e77bbe06236a2350bc156242b848984ac978122647aa8d7567a6378381307 Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.996107 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-46476" event={"ID":"d5522caf-5153-473e-9d16-145320c36a7c","Type":"ContainerDied","Data":"29a24d5edb6bc5864f8a6cbac53d0048a51e4391b8f283a449503025f32ae07e"} Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.996137 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-46476" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.996159 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29a24d5edb6bc5864f8a6cbac53d0048a51e4391b8f283a449503025f32ae07e" Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.997710 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" event={"ID":"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02","Type":"ContainerStarted","Data":"21fad9483da30ee067fd585ceb3f41eb4f0448942f59671109057931e96c4a41"} Mar 10 11:00:00 crc kubenswrapper[4794]: I0310 11:00:00.997783 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" event={"ID":"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02","Type":"ContainerStarted","Data":"9c7e77bbe06236a2350bc156242b848984ac978122647aa8d7567a6378381307"} Mar 10 11:00:01 crc kubenswrapper[4794]: I0310 11:00:01.026162 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" podStartSLOduration=1.026141284 podStartE2EDuration="1.026141284s" podCreationTimestamp="2026-03-10 11:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:00:01.017815805 +0000 UTC m=+4549.773986673" watchObservedRunningTime="2026-03-10 11:00:01.026141284 +0000 UTC m=+4549.782312112" Mar 10 11:00:01 crc kubenswrapper[4794]: I0310 11:00:01.053654 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552340-2skgf"] Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.012545 4794 generic.go:334] "Generic (PLEG): container finished" podID="84abb13a-cdb3-4eb8-8845-fa8e7db8ef02" containerID="21fad9483da30ee067fd585ceb3f41eb4f0448942f59671109057931e96c4a41" exitCode=0 Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.012639 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" event={"ID":"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02","Type":"ContainerDied","Data":"21fad9483da30ee067fd585ceb3f41eb4f0448942f59671109057931e96c4a41"} Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.014317 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552340-2skgf" event={"ID":"a69e0088-724c-47d3-b1fc-89279d3195ed","Type":"ContainerStarted","Data":"22233106b2006a9cb9cafcb0deb2095d23f24efaa08308c271168b037ab2177c"} Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.415372 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-46476"] Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.428523 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-46476"] Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.569308 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-49qft"] Mar 10 11:00:02 crc kubenswrapper[4794]: E0310 11:00:02.569776 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5522caf-5153-473e-9d16-145320c36a7c" containerName="storage" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.569799 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5522caf-5153-473e-9d16-145320c36a7c" containerName="storage" Mar 10 11:00:02 crc 
kubenswrapper[4794]: I0310 11:00:02.570051 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5522caf-5153-473e-9d16-145320c36a7c" containerName="storage" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.570873 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-49qft" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.574085 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.574237 4794 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-76wvz" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.574453 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.582112 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.588876 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-49qft"] Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.626757 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f9106c0b-af66-4602-b04e-457241dd8865-node-mnt\") pod \"crc-storage-crc-49qft\" (UID: \"f9106c0b-af66-4602-b04e-457241dd8865\") " pod="crc-storage/crc-storage-crc-49qft" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.626911 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f9106c0b-af66-4602-b04e-457241dd8865-crc-storage\") pod \"crc-storage-crc-49qft\" (UID: \"f9106c0b-af66-4602-b04e-457241dd8865\") " pod="crc-storage/crc-storage-crc-49qft" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.627008 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srj8d\" (UniqueName: \"kubernetes.io/projected/f9106c0b-af66-4602-b04e-457241dd8865-kube-api-access-srj8d\") pod \"crc-storage-crc-49qft\" (UID: \"f9106c0b-af66-4602-b04e-457241dd8865\") " pod="crc-storage/crc-storage-crc-49qft" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.728750 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f9106c0b-af66-4602-b04e-457241dd8865-crc-storage\") pod \"crc-storage-crc-49qft\" (UID: \"f9106c0b-af66-4602-b04e-457241dd8865\") " pod="crc-storage/crc-storage-crc-49qft" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.728919 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srj8d\" (UniqueName: \"kubernetes.io/projected/f9106c0b-af66-4602-b04e-457241dd8865-kube-api-access-srj8d\") pod \"crc-storage-crc-49qft\" (UID: \"f9106c0b-af66-4602-b04e-457241dd8865\") " pod="crc-storage/crc-storage-crc-49qft" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.729050 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f9106c0b-af66-4602-b04e-457241dd8865-node-mnt\") pod \"crc-storage-crc-49qft\" (UID: \"f9106c0b-af66-4602-b04e-457241dd8865\") " pod="crc-storage/crc-storage-crc-49qft" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 
11:00:02.729573 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f9106c0b-af66-4602-b04e-457241dd8865-node-mnt\") pod \"crc-storage-crc-49qft\" (UID: \"f9106c0b-af66-4602-b04e-457241dd8865\") " pod="crc-storage/crc-storage-crc-49qft" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.730306 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f9106c0b-af66-4602-b04e-457241dd8865-crc-storage\") pod \"crc-storage-crc-49qft\" (UID: \"f9106c0b-af66-4602-b04e-457241dd8865\") " pod="crc-storage/crc-storage-crc-49qft" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.762495 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srj8d\" (UniqueName: \"kubernetes.io/projected/f9106c0b-af66-4602-b04e-457241dd8865-kube-api-access-srj8d\") pod \"crc-storage-crc-49qft\" (UID: \"f9106c0b-af66-4602-b04e-457241dd8865\") " pod="crc-storage/crc-storage-crc-49qft" Mar 10 11:00:02 crc kubenswrapper[4794]: I0310 11:00:02.924753 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-49qft" Mar 10 11:00:03 crc kubenswrapper[4794]: I0310 11:00:03.385519 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" Mar 10 11:00:03 crc kubenswrapper[4794]: I0310 11:00:03.420635 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-49qft"] Mar 10 11:00:03 crc kubenswrapper[4794]: W0310 11:00:03.421788 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9106c0b_af66_4602_b04e_457241dd8865.slice/crio-6f9e8cf2fbb0907a2b58d65e182970d29ccd42d373c2e2436fc374d9dfd7322c WatchSource:0}: Error finding container 6f9e8cf2fbb0907a2b58d65e182970d29ccd42d373c2e2436fc374d9dfd7322c: Status 404 returned error can't find the container with id 6f9e8cf2fbb0907a2b58d65e182970d29ccd42d373c2e2436fc374d9dfd7322c Mar 10 11:00:03 crc kubenswrapper[4794]: I0310 11:00:03.437463 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcxwd\" (UniqueName: \"kubernetes.io/projected/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-kube-api-access-vcxwd\") pod \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\" (UID: \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\") " Mar 10 11:00:03 crc kubenswrapper[4794]: I0310 11:00:03.437516 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-secret-volume\") pod \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\" (UID: \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\") " Mar 10 11:00:03 crc kubenswrapper[4794]: I0310 11:00:03.437775 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-config-volume\") pod \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\" (UID: \"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02\") " Mar 10 11:00:03 crc kubenswrapper[4794]: I0310 11:00:03.439035 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-config-volume" (OuterVolumeSpecName: "config-volume") pod "84abb13a-cdb3-4eb8-8845-fa8e7db8ef02" (UID: 
"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:00:03 crc kubenswrapper[4794]: I0310 11:00:03.443244 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "84abb13a-cdb3-4eb8-8845-fa8e7db8ef02" (UID: "84abb13a-cdb3-4eb8-8845-fa8e7db8ef02"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:00:03 crc kubenswrapper[4794]: I0310 11:00:03.443409 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-kube-api-access-vcxwd" (OuterVolumeSpecName: "kube-api-access-vcxwd") pod "84abb13a-cdb3-4eb8-8845-fa8e7db8ef02" (UID: "84abb13a-cdb3-4eb8-8845-fa8e7db8ef02"). InnerVolumeSpecName "kube-api-access-vcxwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:00:03 crc kubenswrapper[4794]: I0310 11:00:03.538750 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcxwd\" (UniqueName: \"kubernetes.io/projected/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-kube-api-access-vcxwd\") on node \"crc\" DevicePath \"\"" Mar 10 11:00:03 crc kubenswrapper[4794]: I0310 11:00:03.538895 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 11:00:03 crc kubenswrapper[4794]: I0310 11:00:03.538971 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 11:00:04 crc kubenswrapper[4794]: I0310 11:00:04.013932 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5522caf-5153-473e-9d16-145320c36a7c" path="/var/lib/kubelet/pods/d5522caf-5153-473e-9d16-145320c36a7c/volumes" Mar 10 11:00:04 crc kubenswrapper[4794]: I0310 11:00:04.034054 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-49qft" event={"ID":"f9106c0b-af66-4602-b04e-457241dd8865","Type":"ContainerStarted","Data":"6f9e8cf2fbb0907a2b58d65e182970d29ccd42d373c2e2436fc374d9dfd7322c"} Mar 10 11:00:04 crc kubenswrapper[4794]: I0310 11:00:04.036281 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" event={"ID":"84abb13a-cdb3-4eb8-8845-fa8e7db8ef02","Type":"ContainerDied","Data":"9c7e77bbe06236a2350bc156242b848984ac978122647aa8d7567a6378381307"} Mar 10 11:00:04 crc kubenswrapper[4794]: I0310 11:00:04.036316 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c7e77bbe06236a2350bc156242b848984ac978122647aa8d7567a6378381307" Mar 10 11:00:04 crc kubenswrapper[4794]: I0310 11:00:04.036412 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f" Mar 10 11:00:04 crc kubenswrapper[4794]: I0310 11:00:04.466745 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg"] Mar 10 11:00:04 crc kubenswrapper[4794]: I0310 11:00:04.477051 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552295-mkdsg"] Mar 10 11:00:05 crc kubenswrapper[4794]: I0310 11:00:05.050251 4794 generic.go:334] "Generic (PLEG): container finished" podID="f9106c0b-af66-4602-b04e-457241dd8865" containerID="7433ab70ed01821a0d10cd02d675161cca180ae1496de6aed5d67410ff653df6" exitCode=0 Mar 10 11:00:05 crc kubenswrapper[4794]: I0310 11:00:05.050327 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-49qft" event={"ID":"f9106c0b-af66-4602-b04e-457241dd8865","Type":"ContainerDied","Data":"7433ab70ed01821a0d10cd02d675161cca180ae1496de6aed5d67410ff653df6"} Mar 10 11:00:06 crc kubenswrapper[4794]: I0310 11:00:06.016075 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0afae991-6837-47dd-948b-cb1a298f8ce3" path="/var/lib/kubelet/pods/0afae991-6837-47dd-948b-cb1a298f8ce3/volumes" Mar 10 11:00:06 crc kubenswrapper[4794]: I0310 11:00:06.410157 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-49qft" Mar 10 11:00:06 crc kubenswrapper[4794]: I0310 11:00:06.500580 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f9106c0b-af66-4602-b04e-457241dd8865-crc-storage\") pod \"f9106c0b-af66-4602-b04e-457241dd8865\" (UID: \"f9106c0b-af66-4602-b04e-457241dd8865\") " Mar 10 11:00:06 crc kubenswrapper[4794]: I0310 11:00:06.500669 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srj8d\" (UniqueName: \"kubernetes.io/projected/f9106c0b-af66-4602-b04e-457241dd8865-kube-api-access-srj8d\") pod \"f9106c0b-af66-4602-b04e-457241dd8865\" (UID: \"f9106c0b-af66-4602-b04e-457241dd8865\") " Mar 10 11:00:06 crc kubenswrapper[4794]: I0310 11:00:06.500728 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f9106c0b-af66-4602-b04e-457241dd8865-node-mnt\") pod \"f9106c0b-af66-4602-b04e-457241dd8865\" (UID: \"f9106c0b-af66-4602-b04e-457241dd8865\") " Mar 10 11:00:06 crc kubenswrapper[4794]: I0310 11:00:06.500905 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9106c0b-af66-4602-b04e-457241dd8865-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "f9106c0b-af66-4602-b04e-457241dd8865" (UID: "f9106c0b-af66-4602-b04e-457241dd8865"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 11:00:06 crc kubenswrapper[4794]: I0310 11:00:06.507544 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9106c0b-af66-4602-b04e-457241dd8865-kube-api-access-srj8d" (OuterVolumeSpecName: "kube-api-access-srj8d") pod "f9106c0b-af66-4602-b04e-457241dd8865" (UID: "f9106c0b-af66-4602-b04e-457241dd8865"). InnerVolumeSpecName "kube-api-access-srj8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:00:06 crc kubenswrapper[4794]: I0310 11:00:06.524068 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9106c0b-af66-4602-b04e-457241dd8865-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "f9106c0b-af66-4602-b04e-457241dd8865" (UID: "f9106c0b-af66-4602-b04e-457241dd8865"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:00:06 crc kubenswrapper[4794]: I0310 11:00:06.602479 4794 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f9106c0b-af66-4602-b04e-457241dd8865-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 10 11:00:06 crc kubenswrapper[4794]: I0310 11:00:06.602530 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srj8d\" (UniqueName: \"kubernetes.io/projected/f9106c0b-af66-4602-b04e-457241dd8865-kube-api-access-srj8d\") on node \"crc\" DevicePath \"\"" Mar 10 11:00:06 crc kubenswrapper[4794]: I0310 11:00:06.602555 4794 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f9106c0b-af66-4602-b04e-457241dd8865-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 10 11:00:07 crc kubenswrapper[4794]: I0310 11:00:07.073423 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-49qft" event={"ID":"f9106c0b-af66-4602-b04e-457241dd8865","Type":"ContainerDied","Data":"6f9e8cf2fbb0907a2b58d65e182970d29ccd42d373c2e2436fc374d9dfd7322c"} Mar 10 11:00:07 crc kubenswrapper[4794]: I0310 11:00:07.073499 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f9e8cf2fbb0907a2b58d65e182970d29ccd42d373c2e2436fc374d9dfd7322c" Mar 10 11:00:07 crc kubenswrapper[4794]: I0310 11:00:07.073523 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-49qft" Mar 10 11:00:23 crc kubenswrapper[4794]: I0310 11:00:23.208035 4794 generic.go:334] "Generic (PLEG): container finished" podID="a69e0088-724c-47d3-b1fc-89279d3195ed" containerID="6bba33afae504fb232f42fc6a6fd9f0a82b3d56cfb82e02962e93f6c279d986a" exitCode=0 Mar 10 11:00:23 crc kubenswrapper[4794]: I0310 11:00:23.208168 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552340-2skgf" event={"ID":"a69e0088-724c-47d3-b1fc-89279d3195ed","Type":"ContainerDied","Data":"6bba33afae504fb232f42fc6a6fd9f0a82b3d56cfb82e02962e93f6c279d986a"} Mar 10 11:00:24 crc kubenswrapper[4794]: I0310 11:00:24.600902 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552340-2skgf" Mar 10 11:00:24 crc kubenswrapper[4794]: I0310 11:00:24.803387 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrtjl\" (UniqueName: \"kubernetes.io/projected/a69e0088-724c-47d3-b1fc-89279d3195ed-kube-api-access-zrtjl\") pod \"a69e0088-724c-47d3-b1fc-89279d3195ed\" (UID: \"a69e0088-724c-47d3-b1fc-89279d3195ed\") " Mar 10 11:00:24 crc kubenswrapper[4794]: I0310 11:00:24.811584 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69e0088-724c-47d3-b1fc-89279d3195ed-kube-api-access-zrtjl" (OuterVolumeSpecName: "kube-api-access-zrtjl") pod "a69e0088-724c-47d3-b1fc-89279d3195ed" (UID: "a69e0088-724c-47d3-b1fc-89279d3195ed"). InnerVolumeSpecName "kube-api-access-zrtjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:00:24 crc kubenswrapper[4794]: I0310 11:00:24.905136 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrtjl\" (UniqueName: \"kubernetes.io/projected/a69e0088-724c-47d3-b1fc-89279d3195ed-kube-api-access-zrtjl\") on node \"crc\" DevicePath \"\"" Mar 10 11:00:25 crc kubenswrapper[4794]: I0310 11:00:25.229732 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552340-2skgf" event={"ID":"a69e0088-724c-47d3-b1fc-89279d3195ed","Type":"ContainerDied","Data":"22233106b2006a9cb9cafcb0deb2095d23f24efaa08308c271168b037ab2177c"} Mar 10 11:00:25 crc kubenswrapper[4794]: I0310 11:00:25.229776 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22233106b2006a9cb9cafcb0deb2095d23f24efaa08308c271168b037ab2177c" Mar 10 11:00:25 crc kubenswrapper[4794]: I0310 11:00:25.229815 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552340-2skgf" Mar 10 11:00:25 crc kubenswrapper[4794]: I0310 11:00:25.724095 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552334-4r886"] Mar 10 11:00:25 crc kubenswrapper[4794]: I0310 11:00:25.735787 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552334-4r886"] Mar 10 11:00:26 crc kubenswrapper[4794]: I0310 11:00:26.015028 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6485889d-9687-4a87-a93d-0c7182ef14d4" path="/var/lib/kubelet/pods/6485889d-9687-4a87-a93d-0c7182ef14d4/volumes" Mar 10 11:00:41 crc kubenswrapper[4794]: I0310 11:00:41.056152 4794 scope.go:117] "RemoveContainer" containerID="800525ae1fae6bbec2e45f78754b0cad9cf595eaf5b4cd7024b07fe4c8a121ef" Mar 10 11:00:41 crc kubenswrapper[4794]: I0310 11:00:41.100865 4794 scope.go:117] "RemoveContainer" containerID="6aed1b4e3e53a82bcb59402d0112ac3efd038a185687b506ba60c28add5a99c7" Mar 10 11:00:41 crc kubenswrapper[4794]: I0310 11:00:41.133191 4794 scope.go:117] "RemoveContainer" containerID="143647dab814ae4c589916f76df89e89bac8e2ca76498f42f8dd7ea69deebed1" Mar 10 11:00:52 crc kubenswrapper[4794]: I0310 11:00:52.967302 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:00:52 crc kubenswrapper[4794]: I0310 11:00:52.968050 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:01:22 crc kubenswrapper[4794]: I0310 11:01:22.968042 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:01:22 crc kubenswrapper[4794]: I0310 11:01:22.968909 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:01:52 crc kubenswrapper[4794]: I0310 11:01:52.967958 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:01:52 crc kubenswrapper[4794]: I0310 11:01:52.968530 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:01:52 crc kubenswrapper[4794]: I0310 11:01:52.968593 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 11:01:52 crc kubenswrapper[4794]: I0310 11:01:52.969354 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 11:01:52 crc kubenswrapper[4794]: I0310 11:01:52.969462 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" gracePeriod=600 Mar 10 11:01:53 crc kubenswrapper[4794]: E0310 11:01:53.111837 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:01:54 crc kubenswrapper[4794]: I0310 11:01:54.070855 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" exitCode=0 Mar 10 11:01:54 crc kubenswrapper[4794]: I0310 11:01:54.070920 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02"} Mar 10 11:01:54 crc kubenswrapper[4794]: I0310 11:01:54.070998 4794 scope.go:117] "RemoveContainer" containerID="b31747265e59a55e2a362d4d03121df29fa57741e08ecc5e62740ddf3aa58ff9" Mar 10 11:01:54 crc kubenswrapper[4794]: I0310 11:01:54.072112 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:01:54 crc kubenswrapper[4794]: E0310 11:01:54.072545 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.161973 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552342-msf78"] Mar 10 11:02:00 crc kubenswrapper[4794]: E0310 11:02:00.163063 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84abb13a-cdb3-4eb8-8845-fa8e7db8ef02" containerName="collect-profiles" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.163084 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="84abb13a-cdb3-4eb8-8845-fa8e7db8ef02" containerName="collect-profiles" Mar 10 11:02:00 crc kubenswrapper[4794]: E0310 11:02:00.163116 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69e0088-724c-47d3-b1fc-89279d3195ed" containerName="oc" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.163128 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69e0088-724c-47d3-b1fc-89279d3195ed" containerName="oc" Mar 10 11:02:00 crc kubenswrapper[4794]: E0310 11:02:00.163148 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9106c0b-af66-4602-b04e-457241dd8865" containerName="storage" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.163160 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9106c0b-af66-4602-b04e-457241dd8865" containerName="storage" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.163440 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9106c0b-af66-4602-b04e-457241dd8865" containerName="storage" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.163461 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="84abb13a-cdb3-4eb8-8845-fa8e7db8ef02" containerName="collect-profiles" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.163489 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69e0088-724c-47d3-b1fc-89279d3195ed" containerName="oc" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.164250 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552342-msf78" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.168077 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.182801 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.182821 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.200146 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552342-msf78"] Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.321627 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqrdd\" (UniqueName: \"kubernetes.io/projected/863a1c75-0aca-4cc7-b986-5d6e3814f721-kube-api-access-mqrdd\") pod \"auto-csr-approver-29552342-msf78\" (UID: \"863a1c75-0aca-4cc7-b986-5d6e3814f721\") " pod="openshift-infra/auto-csr-approver-29552342-msf78" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.423679 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqrdd\" (UniqueName: \"kubernetes.io/projected/863a1c75-0aca-4cc7-b986-5d6e3814f721-kube-api-access-mqrdd\") pod \"auto-csr-approver-29552342-msf78\" (UID: \"863a1c75-0aca-4cc7-b986-5d6e3814f721\") " pod="openshift-infra/auto-csr-approver-29552342-msf78" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.449988 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqrdd\" (UniqueName: \"kubernetes.io/projected/863a1c75-0aca-4cc7-b986-5d6e3814f721-kube-api-access-mqrdd\") pod \"auto-csr-approver-29552342-msf78\" (UID: \"863a1c75-0aca-4cc7-b986-5d6e3814f721\") " pod="openshift-infra/auto-csr-approver-29552342-msf78" Mar 10 11:02:00 crc kubenswrapper[4794]: I0310 11:02:00.510560 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552342-msf78" Mar 10 11:02:01 crc kubenswrapper[4794]: I0310 11:02:01.022710 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552342-msf78"] Mar 10 11:02:01 crc kubenswrapper[4794]: I0310 11:02:01.161180 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552342-msf78" event={"ID":"863a1c75-0aca-4cc7-b986-5d6e3814f721","Type":"ContainerStarted","Data":"ce1aa474326500eb76c7d23127061b14e0ee1806a5d09ffeb93a8316000e093b"} Mar 10 11:02:03 crc kubenswrapper[4794]: I0310 11:02:03.183421 4794 generic.go:334] "Generic (PLEG): container finished" podID="863a1c75-0aca-4cc7-b986-5d6e3814f721" containerID="df357288bd1762198d32a3df55b0077fb9466bc51d819d5a6ca9d6d73642a5bb" exitCode=0 Mar 10 11:02:03 crc kubenswrapper[4794]: I0310 11:02:03.183493 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552342-msf78" event={"ID":"863a1c75-0aca-4cc7-b986-5d6e3814f721","Type":"ContainerDied","Data":"df357288bd1762198d32a3df55b0077fb9466bc51d819d5a6ca9d6d73642a5bb"} Mar 10 11:02:04 crc kubenswrapper[4794]: I0310 11:02:04.597264 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552342-msf78" Mar 10 11:02:04 crc kubenswrapper[4794]: I0310 11:02:04.689587 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqrdd\" (UniqueName: \"kubernetes.io/projected/863a1c75-0aca-4cc7-b986-5d6e3814f721-kube-api-access-mqrdd\") pod \"863a1c75-0aca-4cc7-b986-5d6e3814f721\" (UID: \"863a1c75-0aca-4cc7-b986-5d6e3814f721\") " Mar 10 11:02:04 crc kubenswrapper[4794]: I0310 11:02:04.698099 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/863a1c75-0aca-4cc7-b986-5d6e3814f721-kube-api-access-mqrdd" (OuterVolumeSpecName: "kube-api-access-mqrdd") pod "863a1c75-0aca-4cc7-b986-5d6e3814f721" (UID: "863a1c75-0aca-4cc7-b986-5d6e3814f721"). InnerVolumeSpecName "kube-api-access-mqrdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:02:04 crc kubenswrapper[4794]: I0310 11:02:04.790905 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqrdd\" (UniqueName: \"kubernetes.io/projected/863a1c75-0aca-4cc7-b986-5d6e3814f721-kube-api-access-mqrdd\") on node \"crc\" DevicePath \"\"" Mar 10 11:02:05 crc kubenswrapper[4794]: I0310 11:02:05.204668 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552342-msf78" event={"ID":"863a1c75-0aca-4cc7-b986-5d6e3814f721","Type":"ContainerDied","Data":"ce1aa474326500eb76c7d23127061b14e0ee1806a5d09ffeb93a8316000e093b"} Mar 10 11:02:05 crc kubenswrapper[4794]: I0310 11:02:05.204733 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce1aa474326500eb76c7d23127061b14e0ee1806a5d09ffeb93a8316000e093b" Mar 10 11:02:05 crc kubenswrapper[4794]: I0310 11:02:05.204741 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552342-msf78" Mar 10 11:02:05 crc kubenswrapper[4794]: I0310 11:02:05.687109 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552336-kxkn8"] Mar 10 11:02:05 crc kubenswrapper[4794]: I0310 11:02:05.698492 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552336-kxkn8"] Mar 10 11:02:06 crc kubenswrapper[4794]: I0310 11:02:06.014551 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b5a95b8-232e-4423-9f6b-c4fa4cc36f55" path="/var/lib/kubelet/pods/9b5a95b8-232e-4423-9f6b-c4fa4cc36f55/volumes" Mar 10 11:02:08 crc kubenswrapper[4794]: I0310 11:02:08.999446 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:02:09 crc kubenswrapper[4794]: E0310 11:02:09.000127 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:02:23 crc kubenswrapper[4794]: I0310 11:02:22.999932 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:02:23 crc kubenswrapper[4794]: E0310 11:02:23.002572 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:02:35 crc kubenswrapper[4794]: I0310 11:02:34.999537 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:02:35 crc kubenswrapper[4794]: E0310 11:02:35.000526 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:02:41 crc kubenswrapper[4794]: I0310 11:02:41.289562 4794 scope.go:117] "RemoveContainer" containerID="7068b37217b482b7dff0f5c6b3c9e41d0a4133e421b1b48225cea8ce4ab48968" Mar 10 11:02:45 crc kubenswrapper[4794]: I0310 11:02:45.999385 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:02:46 crc kubenswrapper[4794]: E0310 11:02:46.001927 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 
11:03:00 crc kubenswrapper[4794]: I0310 11:02:59.999954 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:03:00 crc kubenswrapper[4794]: E0310 11:03:00.000671 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:03:14 crc kubenswrapper[4794]: I0310 11:03:14.777101 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b89xg"] Mar 10 11:03:14 crc kubenswrapper[4794]: E0310 11:03:14.778247 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863a1c75-0aca-4cc7-b986-5d6e3814f721" containerName="oc" Mar 10 11:03:14 crc kubenswrapper[4794]: I0310 11:03:14.778269 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="863a1c75-0aca-4cc7-b986-5d6e3814f721" containerName="oc" Mar 10 11:03:14 crc kubenswrapper[4794]: I0310 11:03:14.778558 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="863a1c75-0aca-4cc7-b986-5d6e3814f721" containerName="oc" Mar 10 11:03:14 crc kubenswrapper[4794]: I0310 11:03:14.780501 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:14 crc kubenswrapper[4794]: I0310 11:03:14.787161 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b89xg"] Mar 10 11:03:14 crc kubenswrapper[4794]: I0310 11:03:14.933974 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmc8s\" (UniqueName: \"kubernetes.io/projected/80da5faf-3e24-412d-b8dc-7dbc30d930fe-kube-api-access-qmc8s\") pod \"community-operators-b89xg\" (UID: \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\") " pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:14 crc kubenswrapper[4794]: I0310 11:03:14.934297 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80da5faf-3e24-412d-b8dc-7dbc30d930fe-utilities\") pod \"community-operators-b89xg\" (UID: \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\") " pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:14 crc kubenswrapper[4794]: I0310 11:03:14.934349 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80da5faf-3e24-412d-b8dc-7dbc30d930fe-catalog-content\") pod \"community-operators-b89xg\" (UID: \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\") " pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:15 crc kubenswrapper[4794]: I0310 11:03:15.000031 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:03:15 crc kubenswrapper[4794]: E0310 11:03:15.000566 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:03:15 crc kubenswrapper[4794]: I0310 11:03:15.035574 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80da5faf-3e24-412d-b8dc-7dbc30d930fe-utilities\") pod \"community-operators-b89xg\" (UID: \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\") " pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:15 crc kubenswrapper[4794]: I0310 11:03:15.035641 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80da5faf-3e24-412d-b8dc-7dbc30d930fe-catalog-content\") pod \"community-operators-b89xg\" (UID: \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\") " pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:15 crc kubenswrapper[4794]: I0310 11:03:15.035707 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmc8s\" (UniqueName: \"kubernetes.io/projected/80da5faf-3e24-412d-b8dc-7dbc30d930fe-kube-api-access-qmc8s\") pod \"community-operators-b89xg\" (UID: \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\") " pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:15 crc kubenswrapper[4794]: I0310 11:03:15.036251 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80da5faf-3e24-412d-b8dc-7dbc30d930fe-utilities\") pod \"community-operators-b89xg\" (UID: \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\") " pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:15 crc kubenswrapper[4794]: I0310 11:03:15.036277 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80da5faf-3e24-412d-b8dc-7dbc30d930fe-catalog-content\") pod \"community-operators-b89xg\" (UID: \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\") " pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:15 crc kubenswrapper[4794]: I0310 11:03:15.058297 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmc8s\" (UniqueName: \"kubernetes.io/projected/80da5faf-3e24-412d-b8dc-7dbc30d930fe-kube-api-access-qmc8s\") pod \"community-operators-b89xg\" (UID: \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\") " pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:15 crc kubenswrapper[4794]: I0310 11:03:15.113931 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:15 crc kubenswrapper[4794]: I0310 11:03:15.426639 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b89xg"] Mar 10 11:03:15 crc kubenswrapper[4794]: I0310 11:03:15.885702 4794 generic.go:334] "Generic (PLEG): container finished" podID="80da5faf-3e24-412d-b8dc-7dbc30d930fe" containerID="1688ead9c119db09fd06969df137452ad19edeb2cdbc128a30828b5a254958ef" exitCode=0 Mar 10 11:03:15 crc kubenswrapper[4794]: I0310 11:03:15.885746 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b89xg" event={"ID":"80da5faf-3e24-412d-b8dc-7dbc30d930fe","Type":"ContainerDied","Data":"1688ead9c119db09fd06969df137452ad19edeb2cdbc128a30828b5a254958ef"} Mar 10 11:03:15 crc kubenswrapper[4794]: I0310 11:03:15.885772 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b89xg" event={"ID":"80da5faf-3e24-412d-b8dc-7dbc30d930fe","Type":"ContainerStarted","Data":"1b03a8d4db5c10d51735ae82fce2c31e496d7a06553f6776036994e7ede66da0"} Mar 10 11:03:15 crc kubenswrapper[4794]: I0310 11:03:15.887535 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 11:03:17 crc kubenswrapper[4794]: I0310 11:03:17.905431 4794 generic.go:334] "Generic (PLEG): container finished" podID="80da5faf-3e24-412d-b8dc-7dbc30d930fe" containerID="cf97f03feb752e7d2e1d6f7f1f8a04f264f5b3de5dbcb44ce159a1201ed2d023" exitCode=0 Mar 10 11:03:17 crc kubenswrapper[4794]: I0310 11:03:17.905546 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b89xg" event={"ID":"80da5faf-3e24-412d-b8dc-7dbc30d930fe","Type":"ContainerDied","Data":"cf97f03feb752e7d2e1d6f7f1f8a04f264f5b3de5dbcb44ce159a1201ed2d023"} Mar 10 11:03:18 crc kubenswrapper[4794]: I0310 11:03:18.917659 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b89xg" event={"ID":"80da5faf-3e24-412d-b8dc-7dbc30d930fe","Type":"ContainerStarted","Data":"c44d79a5498967a6b7bab20907f0efe2f2a2d77469d92c5711c3ad3542f66175"} Mar 10 11:03:18 crc kubenswrapper[4794]: I0310 11:03:18.943080 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b89xg" podStartSLOduration=2.408883689 podStartE2EDuration="4.943062535s" podCreationTimestamp="2026-03-10 11:03:14 +0000 UTC" firstStartedPulling="2026-03-10 11:03:15.887318226 +0000 UTC m=+4744.643489044" lastFinishedPulling="2026-03-10 11:03:18.421497072 +0000 UTC m=+4747.177667890" observedRunningTime="2026-03-10 11:03:18.940499875 +0000 UTC m=+4747.696670703" watchObservedRunningTime="2026-03-10 11:03:18.943062535 +0000 UTC m=+4747.699233373" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.055344 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c44667757-jwqh5"] Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.056324 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-88j8m"] Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.057154 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.057570 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-jwqh5" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.059760 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rq8zq" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.059969 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.060123 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.060434 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.060598 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.069802 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c44667757-jwqh5"] Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.085645 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-88j8m"] Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.126628 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-jwqh5"] Mar 10 11:03:20 crc kubenswrapper[4794]: E0310 11:03:20.127010 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config kube-api-access-c6njg], unattached volumes=[], failed to process volumes=[config kube-api-access-c6njg]: context canceled" pod="openstack/dnsmasq-dns-c44667757-jwqh5" podUID="f095ff67-32dc-4f41-adea-62bcc1a5520b" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.157727 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-6b6f7"] Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.158922 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.166149 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-6b6f7"] Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.207032 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f095ff67-32dc-4f41-adea-62bcc1a5520b-config\") pod \"dnsmasq-dns-c44667757-jwqh5\" (UID: \"f095ff67-32dc-4f41-adea-62bcc1a5520b\") " pod="openstack/dnsmasq-dns-c44667757-jwqh5" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.207083 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee8bb33-0bd7-4b71-9032-193f164cebe3-config\") pod \"dnsmasq-dns-55c76fd6b7-88j8m\" (UID: \"aee8bb33-0bd7-4b71-9032-193f164cebe3\") " pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.207122 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aee8bb33-0bd7-4b71-9032-193f164cebe3-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-88j8m\" (UID: \"aee8bb33-0bd7-4b71-9032-193f164cebe3\") " pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.207147 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f54l5\" (UniqueName: \"kubernetes.io/projected/aee8bb33-0bd7-4b71-9032-193f164cebe3-kube-api-access-f54l5\") pod \"dnsmasq-dns-55c76fd6b7-88j8m\" (UID: \"aee8bb33-0bd7-4b71-9032-193f164cebe3\") " pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.207263 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6njg\" (UniqueName: \"kubernetes.io/projected/f095ff67-32dc-4f41-adea-62bcc1a5520b-kube-api-access-c6njg\") pod \"dnsmasq-dns-c44667757-jwqh5\" (UID: \"f095ff67-32dc-4f41-adea-62bcc1a5520b\") " pod="openstack/dnsmasq-dns-c44667757-jwqh5" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.309244 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6njg\" (UniqueName: \"kubernetes.io/projected/f095ff67-32dc-4f41-adea-62bcc1a5520b-kube-api-access-c6njg\") pod \"dnsmasq-dns-c44667757-jwqh5\" (UID: \"f095ff67-32dc-4f41-adea-62bcc1a5520b\") " pod="openstack/dnsmasq-dns-c44667757-jwqh5" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.309914 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv2rp\" (UniqueName: \"kubernetes.io/projected/a520b016-b09d-430c-a64b-83f450dab2ef-kube-api-access-lv2rp\") pod \"dnsmasq-dns-5fb77f9685-6b6f7\" (UID: \"a520b016-b09d-430c-a64b-83f450dab2ef\") " pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.310045 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a520b016-b09d-430c-a64b-83f450dab2ef-config\") pod \"dnsmasq-dns-5fb77f9685-6b6f7\" (UID: \"a520b016-b09d-430c-a64b-83f450dab2ef\") " pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.310168 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f095ff67-32dc-4f41-adea-62bcc1a5520b-config\") pod \"dnsmasq-dns-c44667757-jwqh5\" (UID: \"f095ff67-32dc-4f41-adea-62bcc1a5520b\") " pod="openstack/dnsmasq-dns-c44667757-jwqh5" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.310298 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee8bb33-0bd7-4b71-9032-193f164cebe3-config\") pod \"dnsmasq-dns-55c76fd6b7-88j8m\" (UID: \"aee8bb33-0bd7-4b71-9032-193f164cebe3\") " pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.310498 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aee8bb33-0bd7-4b71-9032-193f164cebe3-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-88j8m\" (UID: \"aee8bb33-0bd7-4b71-9032-193f164cebe3\") " pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.311485 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f095ff67-32dc-4f41-adea-62bcc1a5520b-config\") pod \"dnsmasq-dns-c44667757-jwqh5\" (UID: \"f095ff67-32dc-4f41-adea-62bcc1a5520b\") " pod="openstack/dnsmasq-dns-c44667757-jwqh5" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.311625 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a520b016-b09d-430c-a64b-83f450dab2ef-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-6b6f7\" (UID: \"a520b016-b09d-430c-a64b-83f450dab2ef\") " pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.311793 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee8bb33-0bd7-4b71-9032-193f164cebe3-config\") pod \"dnsmasq-dns-55c76fd6b7-88j8m\" (UID: \"aee8bb33-0bd7-4b71-9032-193f164cebe3\") " pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.312393 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aee8bb33-0bd7-4b71-9032-193f164cebe3-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-88j8m\" (UID: \"aee8bb33-0bd7-4b71-9032-193f164cebe3\") " pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.312459 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f54l5\" (UniqueName: \"kubernetes.io/projected/aee8bb33-0bd7-4b71-9032-193f164cebe3-kube-api-access-f54l5\") pod \"dnsmasq-dns-55c76fd6b7-88j8m\" (UID: \"aee8bb33-0bd7-4b71-9032-193f164cebe3\") " pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.334507 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6njg\" (UniqueName: \"kubernetes.io/projected/f095ff67-32dc-4f41-adea-62bcc1a5520b-kube-api-access-c6njg\") pod \"dnsmasq-dns-c44667757-jwqh5\" (UID: \"f095ff67-32dc-4f41-adea-62bcc1a5520b\") " pod="openstack/dnsmasq-dns-c44667757-jwqh5" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.334525 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f54l5\" (UniqueName: 
\"kubernetes.io/projected/aee8bb33-0bd7-4b71-9032-193f164cebe3-kube-api-access-f54l5\") pod \"dnsmasq-dns-55c76fd6b7-88j8m\" (UID: \"aee8bb33-0bd7-4b71-9032-193f164cebe3\") " pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.379878 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-88j8m"] Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.380324 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.399564 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-xmvxn"] Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.400686 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.413839 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a520b016-b09d-430c-a64b-83f450dab2ef-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-6b6f7\" (UID: \"a520b016-b09d-430c-a64b-83f450dab2ef\") " pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.413956 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv2rp\" (UniqueName: \"kubernetes.io/projected/a520b016-b09d-430c-a64b-83f450dab2ef-kube-api-access-lv2rp\") pod \"dnsmasq-dns-5fb77f9685-6b6f7\" (UID: \"a520b016-b09d-430c-a64b-83f450dab2ef\") " pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.413981 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a520b016-b09d-430c-a64b-83f450dab2ef-config\") pod \"dnsmasq-dns-5fb77f9685-6b6f7\" (UID: \"a520b016-b09d-430c-a64b-83f450dab2ef\") " pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.414847 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a520b016-b09d-430c-a64b-83f450dab2ef-config\") pod \"dnsmasq-dns-5fb77f9685-6b6f7\" (UID: \"a520b016-b09d-430c-a64b-83f450dab2ef\") " pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.415349 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a520b016-b09d-430c-a64b-83f450dab2ef-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-6b6f7\" (UID: \"a520b016-b09d-430c-a64b-83f450dab2ef\") " pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.416071 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-xmvxn"] Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.433794 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv2rp\" (UniqueName: \"kubernetes.io/projected/a520b016-b09d-430c-a64b-83f450dab2ef-kube-api-access-lv2rp\") pod \"dnsmasq-dns-5fb77f9685-6b6f7\" (UID: \"a520b016-b09d-430c-a64b-83f450dab2ef\") " pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.491065 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.514980 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/000e40eb-9cb3-4535-ab24-652ebaf83d42-dns-svc\") pod \"dnsmasq-dns-ff89b6977-xmvxn\" (UID: \"000e40eb-9cb3-4535-ab24-652ebaf83d42\") " pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.515047 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/000e40eb-9cb3-4535-ab24-652ebaf83d42-config\") pod \"dnsmasq-dns-ff89b6977-xmvxn\" (UID: \"000e40eb-9cb3-4535-ab24-652ebaf83d42\") " pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.515076 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2lpk\" (UniqueName: \"kubernetes.io/projected/000e40eb-9cb3-4535-ab24-652ebaf83d42-kube-api-access-p2lpk\") pod \"dnsmasq-dns-ff89b6977-xmvxn\" (UID: \"000e40eb-9cb3-4535-ab24-652ebaf83d42\") " pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.616668 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/000e40eb-9cb3-4535-ab24-652ebaf83d42-config\") pod \"dnsmasq-dns-ff89b6977-xmvxn\" (UID: \"000e40eb-9cb3-4535-ab24-652ebaf83d42\") " pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.617064 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2lpk\" (UniqueName: \"kubernetes.io/projected/000e40eb-9cb3-4535-ab24-652ebaf83d42-kube-api-access-p2lpk\") pod \"dnsmasq-dns-ff89b6977-xmvxn\" (UID: \"000e40eb-9cb3-4535-ab24-652ebaf83d42\") " pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.617181 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/000e40eb-9cb3-4535-ab24-652ebaf83d42-dns-svc\") pod \"dnsmasq-dns-ff89b6977-xmvxn\" (UID: \"000e40eb-9cb3-4535-ab24-652ebaf83d42\") " pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.618202 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/000e40eb-9cb3-4535-ab24-652ebaf83d42-dns-svc\") pod \"dnsmasq-dns-ff89b6977-xmvxn\" (UID: \"000e40eb-9cb3-4535-ab24-652ebaf83d42\") " pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.618474 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/000e40eb-9cb3-4535-ab24-652ebaf83d42-config\") pod \"dnsmasq-dns-ff89b6977-xmvxn\" (UID: \"000e40eb-9cb3-4535-ab24-652ebaf83d42\") " pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.637104 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2lpk\" (UniqueName: \"kubernetes.io/projected/000e40eb-9cb3-4535-ab24-652ebaf83d42-kube-api-access-p2lpk\") pod \"dnsmasq-dns-ff89b6977-xmvxn\" (UID: \"000e40eb-9cb3-4535-ab24-652ebaf83d42\") " pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:03:20 
crc kubenswrapper[4794]: I0310 11:03:20.779805 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.899896 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-88j8m"] Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.929445 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-jwqh5" Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.929425 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" event={"ID":"aee8bb33-0bd7-4b71-9032-193f164cebe3","Type":"ContainerStarted","Data":"b32cb56acf9e82971a512e3f5d04cf989dae23b63364090d022ea549f62fa524"} Mar 10 11:03:20 crc kubenswrapper[4794]: I0310 11:03:20.943622 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-jwqh5" Mar 10 11:03:21 crc kubenswrapper[4794]: W0310 11:03:21.017553 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda520b016_b09d_430c_a64b_83f450dab2ef.slice/crio-9e87a39d1e38764a954abbc5cac989d7a7527a29da4ebb0a4ef8123c4f7e92f4 WatchSource:0}: Error finding container 9e87a39d1e38764a954abbc5cac989d7a7527a29da4ebb0a4ef8123c4f7e92f4: Status 404 returned error can't find the container with id 9e87a39d1e38764a954abbc5cac989d7a7527a29da4ebb0a4ef8123c4f7e92f4 Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.023298 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6njg\" (UniqueName: \"kubernetes.io/projected/f095ff67-32dc-4f41-adea-62bcc1a5520b-kube-api-access-c6njg\") pod \"f095ff67-32dc-4f41-adea-62bcc1a5520b\" (UID: \"f095ff67-32dc-4f41-adea-62bcc1a5520b\") " Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.025265 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f095ff67-32dc-4f41-adea-62bcc1a5520b-config\") pod \"f095ff67-32dc-4f41-adea-62bcc1a5520b\" (UID: \"f095ff67-32dc-4f41-adea-62bcc1a5520b\") " Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.026238 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f095ff67-32dc-4f41-adea-62bcc1a5520b-config" (OuterVolumeSpecName: "config") pod "f095ff67-32dc-4f41-adea-62bcc1a5520b" (UID: "f095ff67-32dc-4f41-adea-62bcc1a5520b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.026464 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f095ff67-32dc-4f41-adea-62bcc1a5520b-kube-api-access-c6njg" (OuterVolumeSpecName: "kube-api-access-c6njg") pod "f095ff67-32dc-4f41-adea-62bcc1a5520b" (UID: "f095ff67-32dc-4f41-adea-62bcc1a5520b"). InnerVolumeSpecName "kube-api-access-c6njg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.029642 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-6b6f7"] Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.126725 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f095ff67-32dc-4f41-adea-62bcc1a5520b-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.126755 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6njg\" (UniqueName: \"kubernetes.io/projected/f095ff67-32dc-4f41-adea-62bcc1a5520b-kube-api-access-c6njg\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.243068 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-xmvxn"] Mar 10 11:03:21 crc kubenswrapper[4794]: W0310 11:03:21.259801 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod000e40eb_9cb3_4535_ab24_652ebaf83d42.slice/crio-81047964819e8faed2e5a169055c3fc3ffce9083322fc3a8a3b3be91278c1c9e WatchSource:0}: Error finding container 81047964819e8faed2e5a169055c3fc3ffce9083322fc3a8a3b3be91278c1c9e: Status 404 returned error can't find the container with id 81047964819e8faed2e5a169055c3fc3ffce9083322fc3a8a3b3be91278c1c9e Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.265930 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.268056 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.270318 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.270728 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.271409 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.271491 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-fsm2v" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.271988 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.338534 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.429767 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.429853 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.429877 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.429895 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.429910 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.429934 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp545\" (UniqueName: \"kubernetes.io/projected/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-kube-api-access-bp545\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.429955 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.429974 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dc822ad3-d386-47fb-a341-58b137079423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc822ad3-d386-47fb-a341-58b137079423\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.430003 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.530648 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.531379 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.531484 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.532260 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.532347 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.532364 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.532385 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.532431 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp545\" (UniqueName: \"kubernetes.io/projected/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-kube-api-access-bp545\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.532466 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.532489 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dc822ad3-d386-47fb-a341-58b137079423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc822ad3-d386-47fb-a341-58b137079423\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.532895 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.533292 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.534994 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.536106 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.536364 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.536510 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.536909 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.537688 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.537709 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dc822ad3-d386-47fb-a341-58b137079423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc822ad3-d386-47fb-a341-58b137079423\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f374e62a9af5e12b6ef694a4e09178c1818cc6901a6785cefe8026d0920279ff/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.538585 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.539121 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.540654 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.540819 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.540901 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-62tth" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.545041 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.546729 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.556501 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp545\" (UniqueName: \"kubernetes.io/projected/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-kube-api-access-bp545\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.602282 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dc822ad3-d386-47fb-a341-58b137079423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc822ad3-d386-47fb-a341-58b137079423\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.624275 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.633704 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.633775 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.633801 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-server-conf\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.633834 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkcwt\" (UniqueName: \"kubernetes.io/projected/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-kube-api-access-jkcwt\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.633868 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.633903 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.633926 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-pod-info\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.633944 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.634000 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.736202 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.736280 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-server-conf\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.736326 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkcwt\" (UniqueName: \"kubernetes.io/projected/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-kube-api-access-jkcwt\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.736393 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.736439 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.736473 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-pod-info\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.738096 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.738122 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.739102 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.739135 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/27739c6f70fd848ff8e7803db65322a299e8c241afa4bc43c94133a97797b4f8/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.739975 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.741452 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.741506 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.742724 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-pod-info\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.743773 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.743790 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-server-conf\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.746771 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.747186 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.752793 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkcwt\" (UniqueName: \"kubernetes.io/projected/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-kube-api-access-jkcwt\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.785584 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\") pod \"rabbitmq-server-0\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " pod="openstack/rabbitmq-server-0" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.939430 4794 generic.go:334] "Generic (PLEG): container finished" podID="a520b016-b09d-430c-a64b-83f450dab2ef" containerID="546a4fa4591c362f8981e9467637777c345b1ff0579e6b41f43217e55dac7217" exitCode=0 Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.939502 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" event={"ID":"a520b016-b09d-430c-a64b-83f450dab2ef","Type":"ContainerDied","Data":"546a4fa4591c362f8981e9467637777c345b1ff0579e6b41f43217e55dac7217"} Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.939890 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" event={"ID":"a520b016-b09d-430c-a64b-83f450dab2ef","Type":"ContainerStarted","Data":"9e87a39d1e38764a954abbc5cac989d7a7527a29da4ebb0a4ef8123c4f7e92f4"} Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.941434 4794 generic.go:334] "Generic (PLEG): container finished" podID="000e40eb-9cb3-4535-ab24-652ebaf83d42" containerID="2964c180f508d51a70babef19278b7ac73427a795532236fa29fe156d95051d5" exitCode=0 Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.941547 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" event={"ID":"000e40eb-9cb3-4535-ab24-652ebaf83d42","Type":"ContainerDied","Data":"2964c180f508d51a70babef19278b7ac73427a795532236fa29fe156d95051d5"} Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.941606 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" event={"ID":"000e40eb-9cb3-4535-ab24-652ebaf83d42","Type":"ContainerStarted","Data":"81047964819e8faed2e5a169055c3fc3ffce9083322fc3a8a3b3be91278c1c9e"} Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.944873 4794 generic.go:334] "Generic (PLEG): container finished" podID="aee8bb33-0bd7-4b71-9032-193f164cebe3" containerID="e4bfa6f406f0f35fd51f76eda17ae10923162d95e5abe2ac4ec4d609e68e908d" exitCode=0 Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.944972 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-jwqh5" Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.945020 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" event={"ID":"aee8bb33-0bd7-4b71-9032-193f164cebe3","Type":"ContainerDied","Data":"e4bfa6f406f0f35fd51f76eda17ae10923162d95e5abe2ac4ec4d609e68e908d"} Mar 10 11:03:21 crc kubenswrapper[4794]: I0310 11:03:21.986803 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 11:03:22 crc kubenswrapper[4794]: W0310 11:03:22.076024 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ab2d0d3_42ae_4af0_a0c8_518a7bb9a69b.slice/crio-c7358810302df97f6c85085a925e56344657bcfa5db29e46af7a0edb14987a24 WatchSource:0}: Error finding container c7358810302df97f6c85085a925e56344657bcfa5db29e46af7a0edb14987a24: Status 404 returned error can't find the container with id c7358810302df97f6c85085a925e56344657bcfa5db29e46af7a0edb14987a24 Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.082875 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-jwqh5"] Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.089561 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c44667757-jwqh5"] Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.096065 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.275762 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.347690 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aee8bb33-0bd7-4b71-9032-193f164cebe3-dns-svc\") pod \"aee8bb33-0bd7-4b71-9032-193f164cebe3\" (UID: \"aee8bb33-0bd7-4b71-9032-193f164cebe3\") " Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.347759 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f54l5\" (UniqueName: \"kubernetes.io/projected/aee8bb33-0bd7-4b71-9032-193f164cebe3-kube-api-access-f54l5\") pod \"aee8bb33-0bd7-4b71-9032-193f164cebe3\" (UID: \"aee8bb33-0bd7-4b71-9032-193f164cebe3\") " Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.347824 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee8bb33-0bd7-4b71-9032-193f164cebe3-config\") pod \"aee8bb33-0bd7-4b71-9032-193f164cebe3\" (UID: \"aee8bb33-0bd7-4b71-9032-193f164cebe3\") " Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.351265 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee8bb33-0bd7-4b71-9032-193f164cebe3-kube-api-access-f54l5" (OuterVolumeSpecName: "kube-api-access-f54l5") pod "aee8bb33-0bd7-4b71-9032-193f164cebe3" (UID: "aee8bb33-0bd7-4b71-9032-193f164cebe3"). InnerVolumeSpecName "kube-api-access-f54l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.362913 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee8bb33-0bd7-4b71-9032-193f164cebe3-config" (OuterVolumeSpecName: "config") pod "aee8bb33-0bd7-4b71-9032-193f164cebe3" (UID: "aee8bb33-0bd7-4b71-9032-193f164cebe3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.376223 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee8bb33-0bd7-4b71-9032-193f164cebe3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aee8bb33-0bd7-4b71-9032-193f164cebe3" (UID: "aee8bb33-0bd7-4b71-9032-193f164cebe3"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.449432 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aee8bb33-0bd7-4b71-9032-193f164cebe3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.449466 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f54l5\" (UniqueName: \"kubernetes.io/projected/aee8bb33-0bd7-4b71-9032-193f164cebe3-kube-api-access-f54l5\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.449477 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee8bb33-0bd7-4b71-9032-193f164cebe3-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.565586 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 11:03:22 crc kubenswrapper[4794]: W0310 11:03:22.575723 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod234dcbfa_1ba5_4960_952c_1ee50cfb2c23.slice/crio-676f53a52438fb8a5949313738b0fa164610706bc32a6b5a3ebb3e3863773d60 WatchSource:0}: Error finding container 676f53a52438fb8a5949313738b0fa164610706bc32a6b5a3ebb3e3863773d60: Status 404 returned error can't find the container with id 676f53a52438fb8a5949313738b0fa164610706bc32a6b5a3ebb3e3863773d60 Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.733047 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 10 11:03:22 crc kubenswrapper[4794]: E0310 11:03:22.733808 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee8bb33-0bd7-4b71-9032-193f164cebe3" containerName="init" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.733833 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee8bb33-0bd7-4b71-9032-193f164cebe3" containerName="init" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.734028 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee8bb33-0bd7-4b71-9032-193f164cebe3" containerName="init" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.734911 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.737269 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.738830 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-gmspr" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.739106 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.739828 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.745682 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.760236 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.857327 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c100d9a4-5eb2-48f1-a419-24f8758331e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.857495 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c100d9a4-5eb2-48f1-a419-24f8758331e3-config-data-default\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.857535 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvv95\" (UniqueName: \"kubernetes.io/projected/c100d9a4-5eb2-48f1-a419-24f8758331e3-kube-api-access-fvv95\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.857612 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c100d9a4-5eb2-48f1-a419-24f8758331e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.857723 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5c143a90-39df-4d59-b003-47e7d0db67f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5c143a90-39df-4d59-b003-47e7d0db67f0\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.857822 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c100d9a4-5eb2-48f1-a419-24f8758331e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.857862 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c100d9a4-5eb2-48f1-a419-24f8758331e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.858007 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c100d9a4-5eb2-48f1-a419-24f8758331e3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.954919 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"234dcbfa-1ba5-4960-952c-1ee50cfb2c23","Type":"ContainerStarted","Data":"676f53a52438fb8a5949313738b0fa164610706bc32a6b5a3ebb3e3863773d60"} Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.956905 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" event={"ID":"aee8bb33-0bd7-4b71-9032-193f164cebe3","Type":"ContainerDied","Data":"b32cb56acf9e82971a512e3f5d04cf989dae23b63364090d022ea549f62fa524"} Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.956968 4794 scope.go:117] "RemoveContainer" containerID="e4bfa6f406f0f35fd51f76eda17ae10923162d95e5abe2ac4ec4d609e68e908d" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.956918 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-88j8m" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.958958 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c100d9a4-5eb2-48f1-a419-24f8758331e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.959025 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5c143a90-39df-4d59-b003-47e7d0db67f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5c143a90-39df-4d59-b003-47e7d0db67f0\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.959075 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c100d9a4-5eb2-48f1-a419-24f8758331e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.959099 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c100d9a4-5eb2-48f1-a419-24f8758331e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.959185 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c100d9a4-5eb2-48f1-a419-24f8758331e3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") 
" pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.959219 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c100d9a4-5eb2-48f1-a419-24f8758331e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.959250 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c100d9a4-5eb2-48f1-a419-24f8758331e3-config-data-default\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.959278 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvv95\" (UniqueName: \"kubernetes.io/projected/c100d9a4-5eb2-48f1-a419-24f8758331e3-kube-api-access-fvv95\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.959905 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b","Type":"ContainerStarted","Data":"c7358810302df97f6c85085a925e56344657bcfa5db29e46af7a0edb14987a24"} Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.960845 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c100d9a4-5eb2-48f1-a419-24f8758331e3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.961232 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c100d9a4-5eb2-48f1-a419-24f8758331e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.961691 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c100d9a4-5eb2-48f1-a419-24f8758331e3-config-data-default\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.962184 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c100d9a4-5eb2-48f1-a419-24f8758331e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.967527 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" event={"ID":"a520b016-b09d-430c-a64b-83f450dab2ef","Type":"ContainerStarted","Data":"e4e24e1ee07f4f3538b3d229b7b75f57e636295a525e347621294bd6e5a17772"} Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.968003 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.968985 4794 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.969041 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5c143a90-39df-4d59-b003-47e7d0db67f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5c143a90-39df-4d59-b003-47e7d0db67f0\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8a210f98616c9ad580bda075448e256f21b0a6aaf6eb8a55dd79c14d3aa5717c/globalmount\"" pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.975268 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" event={"ID":"000e40eb-9cb3-4535-ab24-652ebaf83d42","Type":"ContainerStarted","Data":"7c28c5cc71d21c2b87a96f04ac62381eecc94256e79308f1abc7b4e599db4f7f"} Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.976270 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.985493 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c100d9a4-5eb2-48f1-a419-24f8758331e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.991203 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c100d9a4-5eb2-48f1-a419-24f8758331e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.994083 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" podStartSLOduration=2.9940678419999998 podStartE2EDuration="2.994067842s" podCreationTimestamp="2026-03-10 11:03:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:03:22.993391972 +0000 UTC m=+4751.749562810" watchObservedRunningTime="2026-03-10 11:03:22.994067842 +0000 UTC m=+4751.750238670" Mar 10 11:03:22 crc kubenswrapper[4794]: I0310 11:03:22.997692 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvv95\" (UniqueName: \"kubernetes.io/projected/c100d9a4-5eb2-48f1-a419-24f8758331e3-kube-api-access-fvv95\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.048909 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5c143a90-39df-4d59-b003-47e7d0db67f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5c143a90-39df-4d59-b003-47e7d0db67f0\") pod \"openstack-galera-0\" (UID: \"c100d9a4-5eb2-48f1-a419-24f8758331e3\") " pod="openstack/openstack-galera-0" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.052855 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-88j8m"] Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.055691 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.072584 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-88j8m"] Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.073094 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" podStartSLOduration=3.07307146 podStartE2EDuration="3.07307146s" podCreationTimestamp="2026-03-10 11:03:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:03:23.055400601 +0000 UTC m=+4751.811571429" watchObservedRunningTime="2026-03-10 11:03:23.07307146 +0000 UTC m=+4751.829242278" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.136183 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.140486 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.146716 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.152995 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-pp8n2" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.158637 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.265538 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bec2b4b2-9935-4ce8-b1e3-4e999d18a441-config-data\") pod \"memcached-0\" (UID: \"bec2b4b2-9935-4ce8-b1e3-4e999d18a441\") " pod="openstack/memcached-0" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.265935 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bec2b4b2-9935-4ce8-b1e3-4e999d18a441-kolla-config\") pod \"memcached-0\" (UID: \"bec2b4b2-9935-4ce8-b1e3-4e999d18a441\") " pod="openstack/memcached-0" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.265960 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78m8r\" (UniqueName: \"kubernetes.io/projected/bec2b4b2-9935-4ce8-b1e3-4e999d18a441-kube-api-access-78m8r\") pod \"memcached-0\" (UID: \"bec2b4b2-9935-4ce8-b1e3-4e999d18a441\") " pod="openstack/memcached-0" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.367636 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bec2b4b2-9935-4ce8-b1e3-4e999d18a441-config-data\") pod \"memcached-0\" (UID: \"bec2b4b2-9935-4ce8-b1e3-4e999d18a441\") " pod="openstack/memcached-0" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.367713 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bec2b4b2-9935-4ce8-b1e3-4e999d18a441-kolla-config\") pod \"memcached-0\" (UID: \"bec2b4b2-9935-4ce8-b1e3-4e999d18a441\") " pod="openstack/memcached-0" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.367737 4794 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-78m8r\" (UniqueName: \"kubernetes.io/projected/bec2b4b2-9935-4ce8-b1e3-4e999d18a441-kube-api-access-78m8r\") pod \"memcached-0\" (UID: \"bec2b4b2-9935-4ce8-b1e3-4e999d18a441\") " pod="openstack/memcached-0" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.368648 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bec2b4b2-9935-4ce8-b1e3-4e999d18a441-config-data\") pod \"memcached-0\" (UID: \"bec2b4b2-9935-4ce8-b1e3-4e999d18a441\") " pod="openstack/memcached-0" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.369107 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bec2b4b2-9935-4ce8-b1e3-4e999d18a441-kolla-config\") pod \"memcached-0\" (UID: \"bec2b4b2-9935-4ce8-b1e3-4e999d18a441\") " pod="openstack/memcached-0" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.383651 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78m8r\" (UniqueName: \"kubernetes.io/projected/bec2b4b2-9935-4ce8-b1e3-4e999d18a441-kube-api-access-78m8r\") pod \"memcached-0\" (UID: \"bec2b4b2-9935-4ce8-b1e3-4e999d18a441\") " pod="openstack/memcached-0" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.467193 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.637116 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.941983 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 11:03:23 crc kubenswrapper[4794]: W0310 11:03:23.951421 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbec2b4b2_9935_4ce8_b1e3_4e999d18a441.slice/crio-a972b38e14bc136a9e054bdea3c6a85cb1779fba4d566d7490fb27349ffdc766 WatchSource:0}: Error finding container a972b38e14bc136a9e054bdea3c6a85cb1779fba4d566d7490fb27349ffdc766: Status 404 returned error can't find the container with id a972b38e14bc136a9e054bdea3c6a85cb1779fba4d566d7490fb27349ffdc766 Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.985701 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b","Type":"ContainerStarted","Data":"bfc4c53ccdbb347f7baadd72109438231354bd34500259fa9f7925fca91384ae"} Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.988417 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"bec2b4b2-9935-4ce8-b1e3-4e999d18a441","Type":"ContainerStarted","Data":"a972b38e14bc136a9e054bdea3c6a85cb1779fba4d566d7490fb27349ffdc766"} Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.990163 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c100d9a4-5eb2-48f1-a419-24f8758331e3","Type":"ContainerStarted","Data":"72f2626ca291b786124c29e806756619e32a02b4cea1489f37e38fe292ca8027"} Mar 10 11:03:23 crc kubenswrapper[4794]: I0310 11:03:23.990197 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c100d9a4-5eb2-48f1-a419-24f8758331e3","Type":"ContainerStarted","Data":"ad97f26f2a347316272156cff7d5e2763026ecc9f08f3e3577f88f8a0e1b4ab8"} Mar 10 11:03:24 crc 
kubenswrapper[4794]: I0310 11:03:24.017252 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aee8bb33-0bd7-4b71-9032-193f164cebe3" path="/var/lib/kubelet/pods/aee8bb33-0bd7-4b71-9032-193f164cebe3/volumes" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.018280 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f095ff67-32dc-4f41-adea-62bcc1a5520b" path="/var/lib/kubelet/pods/f095ff67-32dc-4f41-adea-62bcc1a5520b/volumes" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.121915 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.122997 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.125353 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-c4c9m" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.125398 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.125406 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.125588 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.136702 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.199467 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.199503 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.199656 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.199718 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9f4d7844-fa75-42f8-b673-615539f708b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9f4d7844-fa75-42f8-b673-615539f708b5\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.199919 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.200013 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.200038 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9nzl\" (UniqueName: \"kubernetes.io/projected/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-kube-api-access-b9nzl\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.200091 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.301192 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9f4d7844-fa75-42f8-b673-615539f708b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9f4d7844-fa75-42f8-b673-615539f708b5\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.301396 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.301479 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.301524 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9nzl\" (UniqueName: \"kubernetes.io/projected/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-kube-api-access-b9nzl\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.301557 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.301619 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.301656 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.301690 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.302055 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.303123 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.303184 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.305118 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.305164 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9f4d7844-fa75-42f8-b673-615539f708b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9f4d7844-fa75-42f8-b673-615539f708b5\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0860ceeaa8a9bdba05f05182a13996a9545bc2b53e1155265cb101a721b43d06/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.305954 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.307892 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.310853 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.335740 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9nzl\" (UniqueName: \"kubernetes.io/projected/9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39-kube-api-access-b9nzl\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.348005 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9f4d7844-fa75-42f8-b673-615539f708b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9f4d7844-fa75-42f8-b673-615539f708b5\") pod \"openstack-cell1-galera-0\" (UID: \"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39\") " pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.444011 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:24 crc kubenswrapper[4794]: I0310 11:03:24.746947 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 11:03:25 crc kubenswrapper[4794]: I0310 11:03:25.006215 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"234dcbfa-1ba5-4960-952c-1ee50cfb2c23","Type":"ContainerStarted","Data":"37fa3a5e51255eca9e03fb94c20d30aee8427d15b76573220ce7cfa614cb9707"} Mar 10 11:03:25 crc kubenswrapper[4794]: I0310 11:03:25.009665 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"bec2b4b2-9935-4ce8-b1e3-4e999d18a441","Type":"ContainerStarted","Data":"d9ce2630eaa5c5fda3d08f7c69f10006db2a562b0858aa78615f3194ef788e78"} Mar 10 11:03:25 crc kubenswrapper[4794]: I0310 11:03:25.009942 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 10 11:03:25 crc kubenswrapper[4794]: I0310 11:03:25.012653 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39","Type":"ContainerStarted","Data":"9b70fa9f9448e93cb471e66dbfd452e212d04d72d3c5e673b01567c3d7b9fdcd"} Mar 10 11:03:25 crc kubenswrapper[4794]: I0310 11:03:25.012698 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39","Type":"ContainerStarted","Data":"dc3cacc84a651eb4f5b41cf6db50c0c72f6aa6bcaef57c268ac7bb35abf66a04"} Mar 10 11:03:25 crc kubenswrapper[4794]: I0310 11:03:25.058366 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.058343082 podStartE2EDuration="2.058343082s" podCreationTimestamp="2026-03-10 11:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:03:25.05666788 +0000 UTC m=+4753.812838708" watchObservedRunningTime="2026-03-10 11:03:25.058343082 +0000 UTC m=+4753.814513920" Mar 10 11:03:25 crc kubenswrapper[4794]: I0310 11:03:25.114489 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:25 crc kubenswrapper[4794]: I0310 11:03:25.114524 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:25 crc kubenswrapper[4794]: I0310 11:03:25.158706 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:26 crc kubenswrapper[4794]: I0310 11:03:26.093234 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:26 crc kubenswrapper[4794]: I0310 11:03:26.157476 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b89xg"] Mar 10 11:03:28 crc kubenswrapper[4794]: I0310 11:03:28.040065 4794 generic.go:334] "Generic (PLEG): container finished" podID="c100d9a4-5eb2-48f1-a419-24f8758331e3" containerID="72f2626ca291b786124c29e806756619e32a02b4cea1489f37e38fe292ca8027" exitCode=0 Mar 10 11:03:28 crc kubenswrapper[4794]: I0310 11:03:28.040210 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"c100d9a4-5eb2-48f1-a419-24f8758331e3","Type":"ContainerDied","Data":"72f2626ca291b786124c29e806756619e32a02b4cea1489f37e38fe292ca8027"} Mar 10 11:03:28 crc kubenswrapper[4794]: I0310 11:03:28.040267 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b89xg" podUID="80da5faf-3e24-412d-b8dc-7dbc30d930fe" containerName="registry-server" containerID="cri-o://c44d79a5498967a6b7bab20907f0efe2f2a2d77469d92c5711c3ad3542f66175" gracePeriod=2 Mar 10 11:03:28 crc kubenswrapper[4794]: I0310 11:03:28.677195 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:28 crc kubenswrapper[4794]: I0310 11:03:28.784938 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80da5faf-3e24-412d-b8dc-7dbc30d930fe-catalog-content\") pod \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\" (UID: \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\") " Mar 10 11:03:28 crc kubenswrapper[4794]: I0310 11:03:28.785272 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80da5faf-3e24-412d-b8dc-7dbc30d930fe-utilities\") pod \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\" (UID: \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\") " Mar 10 11:03:28 crc kubenswrapper[4794]: I0310 11:03:28.785468 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmc8s\" (UniqueName: \"kubernetes.io/projected/80da5faf-3e24-412d-b8dc-7dbc30d930fe-kube-api-access-qmc8s\") pod \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\" (UID: \"80da5faf-3e24-412d-b8dc-7dbc30d930fe\") " Mar 10 11:03:28 crc kubenswrapper[4794]: I0310 11:03:28.786736 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80da5faf-3e24-412d-b8dc-7dbc30d930fe-utilities" (OuterVolumeSpecName: "utilities") pod "80da5faf-3e24-412d-b8dc-7dbc30d930fe" (UID: "80da5faf-3e24-412d-b8dc-7dbc30d930fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:03:28 crc kubenswrapper[4794]: I0310 11:03:28.843717 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80da5faf-3e24-412d-b8dc-7dbc30d930fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80da5faf-3e24-412d-b8dc-7dbc30d930fe" (UID: "80da5faf-3e24-412d-b8dc-7dbc30d930fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:03:28 crc kubenswrapper[4794]: I0310 11:03:28.877286 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80da5faf-3e24-412d-b8dc-7dbc30d930fe-kube-api-access-qmc8s" (OuterVolumeSpecName: "kube-api-access-qmc8s") pod "80da5faf-3e24-412d-b8dc-7dbc30d930fe" (UID: "80da5faf-3e24-412d-b8dc-7dbc30d930fe"). InnerVolumeSpecName "kube-api-access-qmc8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:03:28 crc kubenswrapper[4794]: I0310 11:03:28.887496 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmc8s\" (UniqueName: \"kubernetes.io/projected/80da5faf-3e24-412d-b8dc-7dbc30d930fe-kube-api-access-qmc8s\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:28 crc kubenswrapper[4794]: I0310 11:03:28.887544 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80da5faf-3e24-412d-b8dc-7dbc30d930fe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:28 crc kubenswrapper[4794]: I0310 11:03:28.887564 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80da5faf-3e24-412d-b8dc-7dbc30d930fe-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.000659 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:03:29 crc kubenswrapper[4794]: E0310 11:03:29.001380 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.052101 4794 generic.go:334] "Generic (PLEG): container finished" podID="9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39" containerID="9b70fa9f9448e93cb471e66dbfd452e212d04d72d3c5e673b01567c3d7b9fdcd" exitCode=0 Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.052234 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39","Type":"ContainerDied","Data":"9b70fa9f9448e93cb471e66dbfd452e212d04d72d3c5e673b01567c3d7b9fdcd"} Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.055055 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c100d9a4-5eb2-48f1-a419-24f8758331e3","Type":"ContainerStarted","Data":"cd800ee71712ef2d042265d6d60f79125d71df5a506dd34b2ddafed8fac48316"} Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.062267 4794 generic.go:334] "Generic (PLEG): container finished" podID="80da5faf-3e24-412d-b8dc-7dbc30d930fe" containerID="c44d79a5498967a6b7bab20907f0efe2f2a2d77469d92c5711c3ad3542f66175" exitCode=0 Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.062311 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b89xg" event={"ID":"80da5faf-3e24-412d-b8dc-7dbc30d930fe","Type":"ContainerDied","Data":"c44d79a5498967a6b7bab20907f0efe2f2a2d77469d92c5711c3ad3542f66175"} Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.062357 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b89xg" event={"ID":"80da5faf-3e24-412d-b8dc-7dbc30d930fe","Type":"ContainerDied","Data":"1b03a8d4db5c10d51735ae82fce2c31e496d7a06553f6776036994e7ede66da0"} Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.062378 4794 scope.go:117] "RemoveContainer" containerID="c44d79a5498967a6b7bab20907f0efe2f2a2d77469d92c5711c3ad3542f66175" Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.062421 4794 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b89xg" Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.084193 4794 scope.go:117] "RemoveContainer" containerID="cf97f03feb752e7d2e1d6f7f1f8a04f264f5b3de5dbcb44ce159a1201ed2d023" Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.125874 4794 scope.go:117] "RemoveContainer" containerID="1688ead9c119db09fd06969df137452ad19edeb2cdbc128a30828b5a254958ef" Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.145066 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.145026218 podStartE2EDuration="8.145026218s" podCreationTimestamp="2026-03-10 11:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:03:29.119254827 +0000 UTC m=+4757.875425655" watchObservedRunningTime="2026-03-10 11:03:29.145026218 +0000 UTC m=+4757.901197086" Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.171151 4794 scope.go:117] "RemoveContainer" containerID="c44d79a5498967a6b7bab20907f0efe2f2a2d77469d92c5711c3ad3542f66175" Mar 10 11:03:29 crc kubenswrapper[4794]: E0310 11:03:29.171785 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c44d79a5498967a6b7bab20907f0efe2f2a2d77469d92c5711c3ad3542f66175\": container with ID starting with c44d79a5498967a6b7bab20907f0efe2f2a2d77469d92c5711c3ad3542f66175 not found: ID does not exist" containerID="c44d79a5498967a6b7bab20907f0efe2f2a2d77469d92c5711c3ad3542f66175" Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.171834 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c44d79a5498967a6b7bab20907f0efe2f2a2d77469d92c5711c3ad3542f66175"} err="failed to get container status \"c44d79a5498967a6b7bab20907f0efe2f2a2d77469d92c5711c3ad3542f66175\": rpc error: code = NotFound desc = could not find container \"c44d79a5498967a6b7bab20907f0efe2f2a2d77469d92c5711c3ad3542f66175\": container with ID starting with c44d79a5498967a6b7bab20907f0efe2f2a2d77469d92c5711c3ad3542f66175 not found: ID does not exist" Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.171865 4794 scope.go:117] "RemoveContainer" containerID="cf97f03feb752e7d2e1d6f7f1f8a04f264f5b3de5dbcb44ce159a1201ed2d023" Mar 10 11:03:29 crc kubenswrapper[4794]: E0310 11:03:29.172243 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf97f03feb752e7d2e1d6f7f1f8a04f264f5b3de5dbcb44ce159a1201ed2d023\": container with ID starting with cf97f03feb752e7d2e1d6f7f1f8a04f264f5b3de5dbcb44ce159a1201ed2d023 not found: ID does not exist" containerID="cf97f03feb752e7d2e1d6f7f1f8a04f264f5b3de5dbcb44ce159a1201ed2d023" Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.172307 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf97f03feb752e7d2e1d6f7f1f8a04f264f5b3de5dbcb44ce159a1201ed2d023"} err="failed to get container status \"cf97f03feb752e7d2e1d6f7f1f8a04f264f5b3de5dbcb44ce159a1201ed2d023\": rpc error: code = NotFound desc = could not find container \"cf97f03feb752e7d2e1d6f7f1f8a04f264f5b3de5dbcb44ce159a1201ed2d023\": container with ID starting with cf97f03feb752e7d2e1d6f7f1f8a04f264f5b3de5dbcb44ce159a1201ed2d023 not found: ID does not exist" Mar 10 11:03:29 crc 
kubenswrapper[4794]: I0310 11:03:29.172374 4794 scope.go:117] "RemoveContainer" containerID="1688ead9c119db09fd06969df137452ad19edeb2cdbc128a30828b5a254958ef" Mar 10 11:03:29 crc kubenswrapper[4794]: E0310 11:03:29.172769 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1688ead9c119db09fd06969df137452ad19edeb2cdbc128a30828b5a254958ef\": container with ID starting with 1688ead9c119db09fd06969df137452ad19edeb2cdbc128a30828b5a254958ef not found: ID does not exist" containerID="1688ead9c119db09fd06969df137452ad19edeb2cdbc128a30828b5a254958ef" Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.172793 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1688ead9c119db09fd06969df137452ad19edeb2cdbc128a30828b5a254958ef"} err="failed to get container status \"1688ead9c119db09fd06969df137452ad19edeb2cdbc128a30828b5a254958ef\": rpc error: code = NotFound desc = could not find container \"1688ead9c119db09fd06969df137452ad19edeb2cdbc128a30828b5a254958ef\": container with ID starting with 1688ead9c119db09fd06969df137452ad19edeb2cdbc128a30828b5a254958ef not found: ID does not exist" Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.188011 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b89xg"] Mar 10 11:03:29 crc kubenswrapper[4794]: I0310 11:03:29.212177 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b89xg"] Mar 10 11:03:30 crc kubenswrapper[4794]: I0310 11:03:30.014366 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80da5faf-3e24-412d-b8dc-7dbc30d930fe" path="/var/lib/kubelet/pods/80da5faf-3e24-412d-b8dc-7dbc30d930fe/volumes" Mar 10 11:03:30 crc kubenswrapper[4794]: I0310 11:03:30.076406 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39","Type":"ContainerStarted","Data":"52be047ed6bca6b6297a44f8a0561a8da8206f391cfefc0e19d67bb2fb6a0cc4"} Mar 10 11:03:30 crc kubenswrapper[4794]: I0310 11:03:30.112968 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.112785771 podStartE2EDuration="7.112785771s" podCreationTimestamp="2026-03-10 11:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:03:30.108745735 +0000 UTC m=+4758.864916593" watchObservedRunningTime="2026-03-10 11:03:30.112785771 +0000 UTC m=+4758.868956619" Mar 10 11:03:30 crc kubenswrapper[4794]: I0310 11:03:30.493505 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:30 crc kubenswrapper[4794]: I0310 11:03:30.781092 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:03:30 crc kubenswrapper[4794]: I0310 11:03:30.832059 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-6b6f7"] Mar 10 11:03:31 crc kubenswrapper[4794]: I0310 11:03:31.084238 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" podUID="a520b016-b09d-430c-a64b-83f450dab2ef" containerName="dnsmasq-dns" 
containerID="cri-o://e4e24e1ee07f4f3538b3d229b7b75f57e636295a525e347621294bd6e5a17772" gracePeriod=10 Mar 10 11:03:31 crc kubenswrapper[4794]: I0310 11:03:31.505969 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:31 crc kubenswrapper[4794]: I0310 11:03:31.645541 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a520b016-b09d-430c-a64b-83f450dab2ef-dns-svc\") pod \"a520b016-b09d-430c-a64b-83f450dab2ef\" (UID: \"a520b016-b09d-430c-a64b-83f450dab2ef\") " Mar 10 11:03:31 crc kubenswrapper[4794]: I0310 11:03:31.645633 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a520b016-b09d-430c-a64b-83f450dab2ef-config\") pod \"a520b016-b09d-430c-a64b-83f450dab2ef\" (UID: \"a520b016-b09d-430c-a64b-83f450dab2ef\") " Mar 10 11:03:31 crc kubenswrapper[4794]: I0310 11:03:31.645797 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv2rp\" (UniqueName: \"kubernetes.io/projected/a520b016-b09d-430c-a64b-83f450dab2ef-kube-api-access-lv2rp\") pod \"a520b016-b09d-430c-a64b-83f450dab2ef\" (UID: \"a520b016-b09d-430c-a64b-83f450dab2ef\") " Mar 10 11:03:31 crc kubenswrapper[4794]: I0310 11:03:31.652318 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a520b016-b09d-430c-a64b-83f450dab2ef-kube-api-access-lv2rp" (OuterVolumeSpecName: "kube-api-access-lv2rp") pod "a520b016-b09d-430c-a64b-83f450dab2ef" (UID: "a520b016-b09d-430c-a64b-83f450dab2ef"). InnerVolumeSpecName "kube-api-access-lv2rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:03:31 crc kubenswrapper[4794]: I0310 11:03:31.693983 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a520b016-b09d-430c-a64b-83f450dab2ef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a520b016-b09d-430c-a64b-83f450dab2ef" (UID: "a520b016-b09d-430c-a64b-83f450dab2ef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:03:31 crc kubenswrapper[4794]: I0310 11:03:31.711228 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a520b016-b09d-430c-a64b-83f450dab2ef-config" (OuterVolumeSpecName: "config") pod "a520b016-b09d-430c-a64b-83f450dab2ef" (UID: "a520b016-b09d-430c-a64b-83f450dab2ef"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:03:31 crc kubenswrapper[4794]: I0310 11:03:31.747709 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv2rp\" (UniqueName: \"kubernetes.io/projected/a520b016-b09d-430c-a64b-83f450dab2ef-kube-api-access-lv2rp\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:31 crc kubenswrapper[4794]: I0310 11:03:31.747749 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a520b016-b09d-430c-a64b-83f450dab2ef-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:31 crc kubenswrapper[4794]: I0310 11:03:31.747762 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a520b016-b09d-430c-a64b-83f450dab2ef-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:32 crc kubenswrapper[4794]: I0310 11:03:32.189549 4794 generic.go:334] "Generic (PLEG): container finished" podID="a520b016-b09d-430c-a64b-83f450dab2ef" containerID="e4e24e1ee07f4f3538b3d229b7b75f57e636295a525e347621294bd6e5a17772" exitCode=0 Mar 10 11:03:32 crc kubenswrapper[4794]: I0310 11:03:32.189591 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" event={"ID":"a520b016-b09d-430c-a64b-83f450dab2ef","Type":"ContainerDied","Data":"e4e24e1ee07f4f3538b3d229b7b75f57e636295a525e347621294bd6e5a17772"} Mar 10 11:03:32 crc kubenswrapper[4794]: I0310 11:03:32.189616 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:03:32 crc kubenswrapper[4794]: I0310 11:03:32.189627 4794 scope.go:117] "RemoveContainer" containerID="e4e24e1ee07f4f3538b3d229b7b75f57e636295a525e347621294bd6e5a17772" Mar 10 11:03:32 crc kubenswrapper[4794]: I0310 11:03:32.189615 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" event={"ID":"a520b016-b09d-430c-a64b-83f450dab2ef","Type":"ContainerDied","Data":"9e87a39d1e38764a954abbc5cac989d7a7527a29da4ebb0a4ef8123c4f7e92f4"} Mar 10 11:03:32 crc kubenswrapper[4794]: I0310 11:03:32.243162 4794 scope.go:117] "RemoveContainer" containerID="546a4fa4591c362f8981e9467637777c345b1ff0579e6b41f43217e55dac7217" Mar 10 11:03:32 crc kubenswrapper[4794]: I0310 11:03:32.266357 4794 scope.go:117] "RemoveContainer" containerID="e4e24e1ee07f4f3538b3d229b7b75f57e636295a525e347621294bd6e5a17772" Mar 10 11:03:32 crc kubenswrapper[4794]: E0310 11:03:32.266705 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4e24e1ee07f4f3538b3d229b7b75f57e636295a525e347621294bd6e5a17772\": container with ID starting with e4e24e1ee07f4f3538b3d229b7b75f57e636295a525e347621294bd6e5a17772 not found: ID does not exist" containerID="e4e24e1ee07f4f3538b3d229b7b75f57e636295a525e347621294bd6e5a17772" Mar 10 11:03:32 crc kubenswrapper[4794]: I0310 11:03:32.266732 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e24e1ee07f4f3538b3d229b7b75f57e636295a525e347621294bd6e5a17772"} err="failed to get container status \"e4e24e1ee07f4f3538b3d229b7b75f57e636295a525e347621294bd6e5a17772\": rpc error: code = NotFound desc = could not find container \"e4e24e1ee07f4f3538b3d229b7b75f57e636295a525e347621294bd6e5a17772\": container with ID starting with e4e24e1ee07f4f3538b3d229b7b75f57e636295a525e347621294bd6e5a17772 not found: ID does not exist" Mar 10 11:03:32 crc kubenswrapper[4794]: 
I0310 11:03:32.266756 4794 scope.go:117] "RemoveContainer" containerID="546a4fa4591c362f8981e9467637777c345b1ff0579e6b41f43217e55dac7217" Mar 10 11:03:32 crc kubenswrapper[4794]: E0310 11:03:32.267050 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"546a4fa4591c362f8981e9467637777c345b1ff0579e6b41f43217e55dac7217\": container with ID starting with 546a4fa4591c362f8981e9467637777c345b1ff0579e6b41f43217e55dac7217 not found: ID does not exist" containerID="546a4fa4591c362f8981e9467637777c345b1ff0579e6b41f43217e55dac7217" Mar 10 11:03:32 crc kubenswrapper[4794]: I0310 11:03:32.267072 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"546a4fa4591c362f8981e9467637777c345b1ff0579e6b41f43217e55dac7217"} err="failed to get container status \"546a4fa4591c362f8981e9467637777c345b1ff0579e6b41f43217e55dac7217\": rpc error: code = NotFound desc = could not find container \"546a4fa4591c362f8981e9467637777c345b1ff0579e6b41f43217e55dac7217\": container with ID starting with 546a4fa4591c362f8981e9467637777c345b1ff0579e6b41f43217e55dac7217 not found: ID does not exist" Mar 10 11:03:33 crc kubenswrapper[4794]: I0310 11:03:33.056910 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 10 11:03:33 crc kubenswrapper[4794]: I0310 11:03:33.056968 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 10 11:03:33 crc kubenswrapper[4794]: I0310 11:03:33.359928 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 10 11:03:33 crc kubenswrapper[4794]: I0310 11:03:33.432917 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 10 11:03:33 crc kubenswrapper[4794]: I0310 11:03:33.468477 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 10 11:03:34 crc kubenswrapper[4794]: I0310 11:03:34.444869 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:34 crc kubenswrapper[4794]: I0310 11:03:34.446183 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:36 crc kubenswrapper[4794]: I0310 11:03:36.844364 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:36 crc kubenswrapper[4794]: I0310 11:03:36.992599 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.722632 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-pc4zv"] Mar 10 11:03:41 crc kubenswrapper[4794]: E0310 11:03:41.725602 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a520b016-b09d-430c-a64b-83f450dab2ef" containerName="init" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.725801 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a520b016-b09d-430c-a64b-83f450dab2ef" containerName="init" Mar 10 11:03:41 crc kubenswrapper[4794]: E0310 11:03:41.726002 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80da5faf-3e24-412d-b8dc-7dbc30d930fe" containerName="registry-server" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 
11:03:41.726161 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="80da5faf-3e24-412d-b8dc-7dbc30d930fe" containerName="registry-server" Mar 10 11:03:41 crc kubenswrapper[4794]: E0310 11:03:41.726447 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80da5faf-3e24-412d-b8dc-7dbc30d930fe" containerName="extract-content" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.726641 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="80da5faf-3e24-412d-b8dc-7dbc30d930fe" containerName="extract-content" Mar 10 11:03:41 crc kubenswrapper[4794]: E0310 11:03:41.726837 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80da5faf-3e24-412d-b8dc-7dbc30d930fe" containerName="extract-utilities" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.727001 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="80da5faf-3e24-412d-b8dc-7dbc30d930fe" containerName="extract-utilities" Mar 10 11:03:41 crc kubenswrapper[4794]: E0310 11:03:41.727169 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a520b016-b09d-430c-a64b-83f450dab2ef" containerName="dnsmasq-dns" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.727362 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a520b016-b09d-430c-a64b-83f450dab2ef" containerName="dnsmasq-dns" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.727848 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a520b016-b09d-430c-a64b-83f450dab2ef" containerName="dnsmasq-dns" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.728046 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="80da5faf-3e24-412d-b8dc-7dbc30d930fe" containerName="registry-server" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.729293 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pc4zv" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.733013 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.762871 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pc4zv"] Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.868166 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjhn\" (UniqueName: \"kubernetes.io/projected/d6ccce44-c82f-4e33-b96a-7e864faf5d08-kube-api-access-cbjhn\") pod \"root-account-create-update-pc4zv\" (UID: \"d6ccce44-c82f-4e33-b96a-7e864faf5d08\") " pod="openstack/root-account-create-update-pc4zv" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.868299 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ccce44-c82f-4e33-b96a-7e864faf5d08-operator-scripts\") pod \"root-account-create-update-pc4zv\" (UID: \"d6ccce44-c82f-4e33-b96a-7e864faf5d08\") " pod="openstack/root-account-create-update-pc4zv" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.969275 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ccce44-c82f-4e33-b96a-7e864faf5d08-operator-scripts\") pod \"root-account-create-update-pc4zv\" (UID: \"d6ccce44-c82f-4e33-b96a-7e864faf5d08\") " pod="openstack/root-account-create-update-pc4zv" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.969372 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbjhn\" (UniqueName: \"kubernetes.io/projected/d6ccce44-c82f-4e33-b96a-7e864faf5d08-kube-api-access-cbjhn\") pod \"root-account-create-update-pc4zv\" (UID: \"d6ccce44-c82f-4e33-b96a-7e864faf5d08\") " pod="openstack/root-account-create-update-pc4zv" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.970394 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ccce44-c82f-4e33-b96a-7e864faf5d08-operator-scripts\") pod \"root-account-create-update-pc4zv\" (UID: \"d6ccce44-c82f-4e33-b96a-7e864faf5d08\") " pod="openstack/root-account-create-update-pc4zv" Mar 10 11:03:41 crc kubenswrapper[4794]: I0310 11:03:41.999026 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:03:41 crc kubenswrapper[4794]: E0310 11:03:41.999228 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:03:42 crc kubenswrapper[4794]: I0310 11:03:42.002980 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbjhn\" (UniqueName: \"kubernetes.io/projected/d6ccce44-c82f-4e33-b96a-7e864faf5d08-kube-api-access-cbjhn\") pod \"root-account-create-update-pc4zv\" (UID: \"d6ccce44-c82f-4e33-b96a-7e864faf5d08\") " pod="openstack/root-account-create-update-pc4zv" Mar 
10 11:03:42 crc kubenswrapper[4794]: I0310 11:03:42.062745 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pc4zv" Mar 10 11:03:42 crc kubenswrapper[4794]: I0310 11:03:42.474797 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pc4zv"] Mar 10 11:03:42 crc kubenswrapper[4794]: W0310 11:03:42.480015 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6ccce44_c82f_4e33_b96a_7e864faf5d08.slice/crio-dc00dab88c7ac39a750599a4d8e79336897b88b5ddc09bd32777bdd86f3ae561 WatchSource:0}: Error finding container dc00dab88c7ac39a750599a4d8e79336897b88b5ddc09bd32777bdd86f3ae561: Status 404 returned error can't find the container with id dc00dab88c7ac39a750599a4d8e79336897b88b5ddc09bd32777bdd86f3ae561 Mar 10 11:03:43 crc kubenswrapper[4794]: I0310 11:03:43.288190 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6ccce44-c82f-4e33-b96a-7e864faf5d08" containerID="3ba063e2e29bfe8f6cb6ea2e7ea28189306e6f28661a6c001da495e5a98f6bbe" exitCode=0 Mar 10 11:03:43 crc kubenswrapper[4794]: I0310 11:03:43.288289 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pc4zv" event={"ID":"d6ccce44-c82f-4e33-b96a-7e864faf5d08","Type":"ContainerDied","Data":"3ba063e2e29bfe8f6cb6ea2e7ea28189306e6f28661a6c001da495e5a98f6bbe"} Mar 10 11:03:43 crc kubenswrapper[4794]: I0310 11:03:43.288502 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pc4zv" event={"ID":"d6ccce44-c82f-4e33-b96a-7e864faf5d08","Type":"ContainerStarted","Data":"dc00dab88c7ac39a750599a4d8e79336897b88b5ddc09bd32777bdd86f3ae561"} Mar 10 11:03:44 crc kubenswrapper[4794]: I0310 11:03:44.638385 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pc4zv" Mar 10 11:03:44 crc kubenswrapper[4794]: I0310 11:03:44.710011 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ccce44-c82f-4e33-b96a-7e864faf5d08-operator-scripts\") pod \"d6ccce44-c82f-4e33-b96a-7e864faf5d08\" (UID: \"d6ccce44-c82f-4e33-b96a-7e864faf5d08\") " Mar 10 11:03:44 crc kubenswrapper[4794]: I0310 11:03:44.710069 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbjhn\" (UniqueName: \"kubernetes.io/projected/d6ccce44-c82f-4e33-b96a-7e864faf5d08-kube-api-access-cbjhn\") pod \"d6ccce44-c82f-4e33-b96a-7e864faf5d08\" (UID: \"d6ccce44-c82f-4e33-b96a-7e864faf5d08\") " Mar 10 11:03:44 crc kubenswrapper[4794]: I0310 11:03:44.710847 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ccce44-c82f-4e33-b96a-7e864faf5d08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6ccce44-c82f-4e33-b96a-7e864faf5d08" (UID: "d6ccce44-c82f-4e33-b96a-7e864faf5d08"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:03:44 crc kubenswrapper[4794]: I0310 11:03:44.715010 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ccce44-c82f-4e33-b96a-7e864faf5d08-kube-api-access-cbjhn" (OuterVolumeSpecName: "kube-api-access-cbjhn") pod "d6ccce44-c82f-4e33-b96a-7e864faf5d08" (UID: "d6ccce44-c82f-4e33-b96a-7e864faf5d08"). 
InnerVolumeSpecName "kube-api-access-cbjhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:03:44 crc kubenswrapper[4794]: I0310 11:03:44.811959 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ccce44-c82f-4e33-b96a-7e864faf5d08-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:44 crc kubenswrapper[4794]: I0310 11:03:44.811991 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbjhn\" (UniqueName: \"kubernetes.io/projected/d6ccce44-c82f-4e33-b96a-7e864faf5d08-kube-api-access-cbjhn\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:45 crc kubenswrapper[4794]: I0310 11:03:45.305986 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pc4zv" event={"ID":"d6ccce44-c82f-4e33-b96a-7e864faf5d08","Type":"ContainerDied","Data":"dc00dab88c7ac39a750599a4d8e79336897b88b5ddc09bd32777bdd86f3ae561"} Mar 10 11:03:45 crc kubenswrapper[4794]: I0310 11:03:45.306057 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc00dab88c7ac39a750599a4d8e79336897b88b5ddc09bd32777bdd86f3ae561" Mar 10 11:03:45 crc kubenswrapper[4794]: I0310 11:03:45.306086 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pc4zv" Mar 10 11:03:48 crc kubenswrapper[4794]: I0310 11:03:48.150145 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-pc4zv"] Mar 10 11:03:48 crc kubenswrapper[4794]: I0310 11:03:48.160623 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-pc4zv"] Mar 10 11:03:50 crc kubenswrapper[4794]: I0310 11:03:50.015629 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6ccce44-c82f-4e33-b96a-7e864faf5d08" path="/var/lib/kubelet/pods/d6ccce44-c82f-4e33-b96a-7e864faf5d08/volumes" Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.178241 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-fp5js"] Mar 10 11:03:53 crc kubenswrapper[4794]: E0310 11:03:53.179409 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ccce44-c82f-4e33-b96a-7e864faf5d08" containerName="mariadb-account-create-update" Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.179432 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ccce44-c82f-4e33-b96a-7e864faf5d08" containerName="mariadb-account-create-update" Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.179732 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6ccce44-c82f-4e33-b96a-7e864faf5d08" containerName="mariadb-account-create-update" Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.180579 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fp5js" Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.183593 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.188289 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fp5js"] Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.251729 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aebf8d4-3811-45eb-88fd-cc864f69e181-operator-scripts\") pod \"root-account-create-update-fp5js\" (UID: \"0aebf8d4-3811-45eb-88fd-cc864f69e181\") " pod="openstack/root-account-create-update-fp5js" Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.251763 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5c2w\" (UniqueName: \"kubernetes.io/projected/0aebf8d4-3811-45eb-88fd-cc864f69e181-kube-api-access-t5c2w\") pod \"root-account-create-update-fp5js\" (UID: \"0aebf8d4-3811-45eb-88fd-cc864f69e181\") " pod="openstack/root-account-create-update-fp5js" Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.353710 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aebf8d4-3811-45eb-88fd-cc864f69e181-operator-scripts\") pod \"root-account-create-update-fp5js\" (UID: \"0aebf8d4-3811-45eb-88fd-cc864f69e181\") " pod="openstack/root-account-create-update-fp5js" Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.353782 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5c2w\" (UniqueName: \"kubernetes.io/projected/0aebf8d4-3811-45eb-88fd-cc864f69e181-kube-api-access-t5c2w\") pod \"root-account-create-update-fp5js\" (UID: \"0aebf8d4-3811-45eb-88fd-cc864f69e181\") " pod="openstack/root-account-create-update-fp5js" Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.355249 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aebf8d4-3811-45eb-88fd-cc864f69e181-operator-scripts\") pod \"root-account-create-update-fp5js\" (UID: \"0aebf8d4-3811-45eb-88fd-cc864f69e181\") " pod="openstack/root-account-create-update-fp5js" Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.374617 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5c2w\" (UniqueName: \"kubernetes.io/projected/0aebf8d4-3811-45eb-88fd-cc864f69e181-kube-api-access-t5c2w\") pod \"root-account-create-update-fp5js\" (UID: \"0aebf8d4-3811-45eb-88fd-cc864f69e181\") " pod="openstack/root-account-create-update-fp5js" Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.538070 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fp5js" Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.836727 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fp5js"] Mar 10 11:03:53 crc kubenswrapper[4794]: W0310 11:03:53.842207 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aebf8d4_3811_45eb_88fd_cc864f69e181.slice/crio-34e6973e5311fe8a810be0b747194b1eca7b04b8d831694ce47853408c6e5d9e WatchSource:0}: Error finding container 34e6973e5311fe8a810be0b747194b1eca7b04b8d831694ce47853408c6e5d9e: Status 404 returned error can't find the container with id 34e6973e5311fe8a810be0b747194b1eca7b04b8d831694ce47853408c6e5d9e Mar 10 11:03:53 crc kubenswrapper[4794]: I0310 11:03:53.999648 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:03:53 crc kubenswrapper[4794]: E0310 11:03:53.999964 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:03:54 crc kubenswrapper[4794]: I0310 11:03:54.390759 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fp5js" event={"ID":"0aebf8d4-3811-45eb-88fd-cc864f69e181","Type":"ContainerStarted","Data":"c0e34bd502e79d6c39416a77bb318b8f87108b872e6f242bc54bbc3bcd908e30"} Mar 10 11:03:54 crc kubenswrapper[4794]: I0310 11:03:54.391144 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fp5js" event={"ID":"0aebf8d4-3811-45eb-88fd-cc864f69e181","Type":"ContainerStarted","Data":"34e6973e5311fe8a810be0b747194b1eca7b04b8d831694ce47853408c6e5d9e"} Mar 10 11:03:54 crc kubenswrapper[4794]: I0310 11:03:54.418736 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-fp5js" podStartSLOduration=1.4187107 podStartE2EDuration="1.4187107s" podCreationTimestamp="2026-03-10 11:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:03:54.410567937 +0000 UTC m=+4783.166738765" watchObservedRunningTime="2026-03-10 11:03:54.4187107 +0000 UTC m=+4783.174881558" Mar 10 11:03:55 crc kubenswrapper[4794]: I0310 11:03:55.403548 4794 generic.go:334] "Generic (PLEG): container finished" podID="0aebf8d4-3811-45eb-88fd-cc864f69e181" containerID="c0e34bd502e79d6c39416a77bb318b8f87108b872e6f242bc54bbc3bcd908e30" exitCode=0 Mar 10 11:03:55 crc kubenswrapper[4794]: I0310 11:03:55.403624 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fp5js" event={"ID":"0aebf8d4-3811-45eb-88fd-cc864f69e181","Type":"ContainerDied","Data":"c0e34bd502e79d6c39416a77bb318b8f87108b872e6f242bc54bbc3bcd908e30"} Mar 10 11:03:56 crc kubenswrapper[4794]: I0310 11:03:56.414806 4794 generic.go:334] "Generic (PLEG): container finished" podID="5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" containerID="bfc4c53ccdbb347f7baadd72109438231354bd34500259fa9f7925fca91384ae" exitCode=0 Mar 10 11:03:56 crc kubenswrapper[4794]: I0310 11:03:56.414915 
4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b","Type":"ContainerDied","Data":"bfc4c53ccdbb347f7baadd72109438231354bd34500259fa9f7925fca91384ae"} Mar 10 11:03:56 crc kubenswrapper[4794]: I0310 11:03:56.758369 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fp5js" Mar 10 11:03:56 crc kubenswrapper[4794]: I0310 11:03:56.812714 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aebf8d4-3811-45eb-88fd-cc864f69e181-operator-scripts\") pod \"0aebf8d4-3811-45eb-88fd-cc864f69e181\" (UID: \"0aebf8d4-3811-45eb-88fd-cc864f69e181\") " Mar 10 11:03:56 crc kubenswrapper[4794]: I0310 11:03:56.812849 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5c2w\" (UniqueName: \"kubernetes.io/projected/0aebf8d4-3811-45eb-88fd-cc864f69e181-kube-api-access-t5c2w\") pod \"0aebf8d4-3811-45eb-88fd-cc864f69e181\" (UID: \"0aebf8d4-3811-45eb-88fd-cc864f69e181\") " Mar 10 11:03:56 crc kubenswrapper[4794]: I0310 11:03:56.813289 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aebf8d4-3811-45eb-88fd-cc864f69e181-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0aebf8d4-3811-45eb-88fd-cc864f69e181" (UID: "0aebf8d4-3811-45eb-88fd-cc864f69e181"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:03:56 crc kubenswrapper[4794]: I0310 11:03:56.817723 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aebf8d4-3811-45eb-88fd-cc864f69e181-kube-api-access-t5c2w" (OuterVolumeSpecName: "kube-api-access-t5c2w") pod "0aebf8d4-3811-45eb-88fd-cc864f69e181" (UID: "0aebf8d4-3811-45eb-88fd-cc864f69e181"). InnerVolumeSpecName "kube-api-access-t5c2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:03:56 crc kubenswrapper[4794]: I0310 11:03:56.915084 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aebf8d4-3811-45eb-88fd-cc864f69e181-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:56 crc kubenswrapper[4794]: I0310 11:03:56.915124 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5c2w\" (UniqueName: \"kubernetes.io/projected/0aebf8d4-3811-45eb-88fd-cc864f69e181-kube-api-access-t5c2w\") on node \"crc\" DevicePath \"\"" Mar 10 11:03:57 crc kubenswrapper[4794]: I0310 11:03:57.424563 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b","Type":"ContainerStarted","Data":"a34cb3588cb3e8bbf28c0deb4db70508cb4c6244e58b6a11ecc7649ac0558a8e"} Mar 10 11:03:57 crc kubenswrapper[4794]: I0310 11:03:57.425172 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:03:57 crc kubenswrapper[4794]: I0310 11:03:57.426528 4794 generic.go:334] "Generic (PLEG): container finished" podID="234dcbfa-1ba5-4960-952c-1ee50cfb2c23" containerID="37fa3a5e51255eca9e03fb94c20d30aee8427d15b76573220ce7cfa614cb9707" exitCode=0 Mar 10 11:03:57 crc kubenswrapper[4794]: I0310 11:03:57.426626 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"234dcbfa-1ba5-4960-952c-1ee50cfb2c23","Type":"ContainerDied","Data":"37fa3a5e51255eca9e03fb94c20d30aee8427d15b76573220ce7cfa614cb9707"} Mar 10 11:03:57 crc kubenswrapper[4794]: I0310 11:03:57.428774 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fp5js" event={"ID":"0aebf8d4-3811-45eb-88fd-cc864f69e181","Type":"ContainerDied","Data":"34e6973e5311fe8a810be0b747194b1eca7b04b8d831694ce47853408c6e5d9e"} Mar 10 11:03:57 crc kubenswrapper[4794]: I0310 11:03:57.428808 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34e6973e5311fe8a810be0b747194b1eca7b04b8d831694ce47853408c6e5d9e" Mar 10 11:03:57 crc kubenswrapper[4794]: I0310 11:03:57.428846 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fp5js" Mar 10 11:03:57 crc kubenswrapper[4794]: I0310 11:03:57.463369 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.463300922 podStartE2EDuration="37.463300922s" podCreationTimestamp="2026-03-10 11:03:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:03:57.45167317 +0000 UTC m=+4786.207844018" watchObservedRunningTime="2026-03-10 11:03:57.463300922 +0000 UTC m=+4786.219471760" Mar 10 11:03:58 crc kubenswrapper[4794]: I0310 11:03:58.437995 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"234dcbfa-1ba5-4960-952c-1ee50cfb2c23","Type":"ContainerStarted","Data":"e455a93ac3fc772d8ab0521cc8ac075723b25337540b5d93ac2e5efff28c6244"} Mar 10 11:03:58 crc kubenswrapper[4794]: I0310 11:03:58.438947 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 11:03:58 crc kubenswrapper[4794]: I0310 11:03:58.464848 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.464833154 podStartE2EDuration="38.464833154s" podCreationTimestamp="2026-03-10 11:03:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:03:58.464191165 +0000 UTC m=+4787.220361983" watchObservedRunningTime="2026-03-10 11:03:58.464833154 +0000 UTC m=+4787.221003972" Mar 10 11:04:00 crc kubenswrapper[4794]: I0310 11:04:00.144836 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552344-bqfvt"] Mar 10 11:04:00 crc kubenswrapper[4794]: E0310 11:04:00.145287 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aebf8d4-3811-45eb-88fd-cc864f69e181" containerName="mariadb-account-create-update" Mar 10 11:04:00 crc kubenswrapper[4794]: I0310 11:04:00.145306 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aebf8d4-3811-45eb-88fd-cc864f69e181" containerName="mariadb-account-create-update" Mar 10 11:04:00 crc kubenswrapper[4794]: I0310 11:04:00.145698 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aebf8d4-3811-45eb-88fd-cc864f69e181" containerName="mariadb-account-create-update" Mar 10 11:04:00 crc kubenswrapper[4794]: I0310 11:04:00.146461 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552344-bqfvt" Mar 10 11:04:00 crc kubenswrapper[4794]: I0310 11:04:00.149569 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:04:00 crc kubenswrapper[4794]: I0310 11:04:00.150167 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:04:00 crc kubenswrapper[4794]: I0310 11:04:00.150630 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:04:00 crc kubenswrapper[4794]: I0310 11:04:00.160165 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552344-bqfvt"] Mar 10 11:04:00 crc kubenswrapper[4794]: I0310 11:04:00.258758 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42lsc\" (UniqueName: \"kubernetes.io/projected/cff9bdcc-d9bc-448f-9ac1-a2462480dc5e-kube-api-access-42lsc\") pod \"auto-csr-approver-29552344-bqfvt\" (UID: \"cff9bdcc-d9bc-448f-9ac1-a2462480dc5e\") " pod="openshift-infra/auto-csr-approver-29552344-bqfvt" Mar 10 11:04:00 crc kubenswrapper[4794]: I0310 11:04:00.360502 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42lsc\" (UniqueName: \"kubernetes.io/projected/cff9bdcc-d9bc-448f-9ac1-a2462480dc5e-kube-api-access-42lsc\") pod \"auto-csr-approver-29552344-bqfvt\" (UID: \"cff9bdcc-d9bc-448f-9ac1-a2462480dc5e\") " pod="openshift-infra/auto-csr-approver-29552344-bqfvt" Mar 10 11:04:00 crc kubenswrapper[4794]: I0310 11:04:00.388232 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42lsc\" (UniqueName: \"kubernetes.io/projected/cff9bdcc-d9bc-448f-9ac1-a2462480dc5e-kube-api-access-42lsc\") pod \"auto-csr-approver-29552344-bqfvt\" (UID: \"cff9bdcc-d9bc-448f-9ac1-a2462480dc5e\") " pod="openshift-infra/auto-csr-approver-29552344-bqfvt" Mar 10 11:04:00 crc kubenswrapper[4794]: I0310 11:04:00.491956 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552344-bqfvt" Mar 10 11:04:00 crc kubenswrapper[4794]: I0310 11:04:00.932967 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552344-bqfvt"] Mar 10 11:04:01 crc kubenswrapper[4794]: I0310 11:04:01.460982 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552344-bqfvt" event={"ID":"cff9bdcc-d9bc-448f-9ac1-a2462480dc5e","Type":"ContainerStarted","Data":"5c0ee65192fcac4ac4abe807e7e17b29511b9ad1ed9c6cb9a1a65c7c2799806e"} Mar 10 11:04:02 crc kubenswrapper[4794]: I0310 11:04:02.222649 4794 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","poda520b016-b09d-430c-a64b-83f450dab2ef"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort poda520b016-b09d-430c-a64b-83f450dab2ef] : Timed out while waiting for systemd to remove kubepods-besteffort-poda520b016_b09d_430c_a64b_83f450dab2ef.slice" Mar 10 11:04:02 crc kubenswrapper[4794]: E0310 11:04:02.223100 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort poda520b016-b09d-430c-a64b-83f450dab2ef] : unable to destroy cgroup paths for cgroup [kubepods besteffort poda520b016-b09d-430c-a64b-83f450dab2ef] : Timed out while waiting for systemd to remove kubepods-besteffort-poda520b016_b09d_430c_a64b_83f450dab2ef.slice" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" podUID="a520b016-b09d-430c-a64b-83f450dab2ef" Mar 10 11:04:02 crc kubenswrapper[4794]: I0310 11:04:02.469135 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-6b6f7" Mar 10 11:04:02 crc kubenswrapper[4794]: I0310 11:04:02.469874 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552344-bqfvt" event={"ID":"cff9bdcc-d9bc-448f-9ac1-a2462480dc5e","Type":"ContainerStarted","Data":"45b576e38ffe4e5303df02004b9f815700bf901686269567915d703e0c193c3b"} Mar 10 11:04:02 crc kubenswrapper[4794]: I0310 11:04:02.513142 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552344-bqfvt" podStartSLOduration=1.4780458 podStartE2EDuration="2.513119727s" podCreationTimestamp="2026-03-10 11:04:00 +0000 UTC" firstStartedPulling="2026-03-10 11:04:00.939381195 +0000 UTC m=+4789.695552033" lastFinishedPulling="2026-03-10 11:04:01.974455132 +0000 UTC m=+4790.730625960" observedRunningTime="2026-03-10 11:04:02.488689417 +0000 UTC m=+4791.244860265" watchObservedRunningTime="2026-03-10 11:04:02.513119727 +0000 UTC m=+4791.269290545" Mar 10 11:04:02 crc kubenswrapper[4794]: I0310 11:04:02.523532 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-6b6f7"] Mar 10 11:04:02 crc kubenswrapper[4794]: I0310 11:04:02.528861 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-6b6f7"] Mar 10 11:04:03 crc kubenswrapper[4794]: I0310 11:04:03.479483 4794 generic.go:334] "Generic (PLEG): container finished" podID="cff9bdcc-d9bc-448f-9ac1-a2462480dc5e" containerID="45b576e38ffe4e5303df02004b9f815700bf901686269567915d703e0c193c3b" exitCode=0 Mar 10 11:04:03 crc kubenswrapper[4794]: I0310 11:04:03.479524 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552344-bqfvt" 
event={"ID":"cff9bdcc-d9bc-448f-9ac1-a2462480dc5e","Type":"ContainerDied","Data":"45b576e38ffe4e5303df02004b9f815700bf901686269567915d703e0c193c3b"} Mar 10 11:04:04 crc kubenswrapper[4794]: I0310 11:04:04.013178 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a520b016-b09d-430c-a64b-83f450dab2ef" path="/var/lib/kubelet/pods/a520b016-b09d-430c-a64b-83f450dab2ef/volumes" Mar 10 11:04:04 crc kubenswrapper[4794]: I0310 11:04:04.860560 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552344-bqfvt" Mar 10 11:04:05 crc kubenswrapper[4794]: I0310 11:04:05.037982 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42lsc\" (UniqueName: \"kubernetes.io/projected/cff9bdcc-d9bc-448f-9ac1-a2462480dc5e-kube-api-access-42lsc\") pod \"cff9bdcc-d9bc-448f-9ac1-a2462480dc5e\" (UID: \"cff9bdcc-d9bc-448f-9ac1-a2462480dc5e\") " Mar 10 11:04:05 crc kubenswrapper[4794]: I0310 11:04:05.043175 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cff9bdcc-d9bc-448f-9ac1-a2462480dc5e-kube-api-access-42lsc" (OuterVolumeSpecName: "kube-api-access-42lsc") pod "cff9bdcc-d9bc-448f-9ac1-a2462480dc5e" (UID: "cff9bdcc-d9bc-448f-9ac1-a2462480dc5e"). InnerVolumeSpecName "kube-api-access-42lsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:04:05 crc kubenswrapper[4794]: I0310 11:04:05.093062 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552338-c5llz"] Mar 10 11:04:05 crc kubenswrapper[4794]: I0310 11:04:05.101909 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552338-c5llz"] Mar 10 11:04:05 crc kubenswrapper[4794]: I0310 11:04:05.140014 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42lsc\" (UniqueName: \"kubernetes.io/projected/cff9bdcc-d9bc-448f-9ac1-a2462480dc5e-kube-api-access-42lsc\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:05 crc kubenswrapper[4794]: I0310 11:04:05.506915 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552344-bqfvt" event={"ID":"cff9bdcc-d9bc-448f-9ac1-a2462480dc5e","Type":"ContainerDied","Data":"5c0ee65192fcac4ac4abe807e7e17b29511b9ad1ed9c6cb9a1a65c7c2799806e"} Mar 10 11:04:05 crc kubenswrapper[4794]: I0310 11:04:05.506983 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c0ee65192fcac4ac4abe807e7e17b29511b9ad1ed9c6cb9a1a65c7c2799806e" Mar 10 11:04:05 crc kubenswrapper[4794]: I0310 11:04:05.506980 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552344-bqfvt" Mar 10 11:04:06 crc kubenswrapper[4794]: I0310 11:04:06.010501 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e" path="/var/lib/kubelet/pods/49f1bbfe-ab6d-4237-b11a-8fc0f6f2d71e/volumes" Mar 10 11:04:06 crc kubenswrapper[4794]: I0310 11:04:06.998743 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:04:06 crc kubenswrapper[4794]: E0310 11:04:06.999046 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:04:11 crc kubenswrapper[4794]: I0310 11:04:11.628625 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:11 crc kubenswrapper[4794]: I0310 11:04:11.989475 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.016758 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2p246"] Mar 10 11:04:16 crc kubenswrapper[4794]: E0310 11:04:16.017832 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cff9bdcc-d9bc-448f-9ac1-a2462480dc5e" containerName="oc" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.017856 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff9bdcc-d9bc-448f-9ac1-a2462480dc5e" containerName="oc" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.018142 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cff9bdcc-d9bc-448f-9ac1-a2462480dc5e" containerName="oc" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.020165 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.033605 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2p246"] Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.109490 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3274f97-634f-4482-aa42-800ba96a25c8-catalog-content\") pod \"redhat-marketplace-2p246\" (UID: \"a3274f97-634f-4482-aa42-800ba96a25c8\") " pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.109582 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3274f97-634f-4482-aa42-800ba96a25c8-utilities\") pod \"redhat-marketplace-2p246\" (UID: \"a3274f97-634f-4482-aa42-800ba96a25c8\") " pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.109783 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krlfg\" (UniqueName: \"kubernetes.io/projected/a3274f97-634f-4482-aa42-800ba96a25c8-kube-api-access-krlfg\") pod \"redhat-marketplace-2p246\" (UID: \"a3274f97-634f-4482-aa42-800ba96a25c8\") " pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.211642 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3274f97-634f-4482-aa42-800ba96a25c8-utilities\") pod \"redhat-marketplace-2p246\" (UID: \"a3274f97-634f-4482-aa42-800ba96a25c8\") " pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.211774 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krlfg\" (UniqueName: \"kubernetes.io/projected/a3274f97-634f-4482-aa42-800ba96a25c8-kube-api-access-krlfg\") pod \"redhat-marketplace-2p246\" (UID: \"a3274f97-634f-4482-aa42-800ba96a25c8\") " pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.211806 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3274f97-634f-4482-aa42-800ba96a25c8-catalog-content\") pod \"redhat-marketplace-2p246\" (UID: \"a3274f97-634f-4482-aa42-800ba96a25c8\") " pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.212162 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3274f97-634f-4482-aa42-800ba96a25c8-utilities\") pod \"redhat-marketplace-2p246\" (UID: \"a3274f97-634f-4482-aa42-800ba96a25c8\") " pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.212206 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3274f97-634f-4482-aa42-800ba96a25c8-catalog-content\") pod \"redhat-marketplace-2p246\" (UID: \"a3274f97-634f-4482-aa42-800ba96a25c8\") " pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.238354 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-krlfg\" (UniqueName: \"kubernetes.io/projected/a3274f97-634f-4482-aa42-800ba96a25c8-kube-api-access-krlfg\") pod \"redhat-marketplace-2p246\" (UID: \"a3274f97-634f-4482-aa42-800ba96a25c8\") " pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.350247 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:16 crc kubenswrapper[4794]: I0310 11:04:16.839761 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2p246"] Mar 10 11:04:16 crc kubenswrapper[4794]: W0310 11:04:16.840566 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3274f97_634f_4482_aa42_800ba96a25c8.slice/crio-e61938dfcebe78eff78559bc9c8c97f7fe65a863f1f560c63d65527f46bc8477 WatchSource:0}: Error finding container e61938dfcebe78eff78559bc9c8c97f7fe65a863f1f560c63d65527f46bc8477: Status 404 returned error can't find the container with id e61938dfcebe78eff78559bc9c8c97f7fe65a863f1f560c63d65527f46bc8477 Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.380834 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-4n74d"] Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.383222 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.402923 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-4n74d"] Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.530700 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f4731c7-dafb-410c-9a54-5a857c02cfbf-config\") pod \"dnsmasq-dns-66d5bf7c87-4n74d\" (UID: \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\") " pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.530760 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f4731c7-dafb-410c-9a54-5a857c02cfbf-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-4n74d\" (UID: \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\") " pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.530903 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t82fl\" (UniqueName: \"kubernetes.io/projected/0f4731c7-dafb-410c-9a54-5a857c02cfbf-kube-api-access-t82fl\") pod \"dnsmasq-dns-66d5bf7c87-4n74d\" (UID: \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\") " pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.618638 4794 generic.go:334] "Generic (PLEG): container finished" podID="a3274f97-634f-4482-aa42-800ba96a25c8" containerID="a372f7be8a845cd1e4c33c59741158513277501076cc40beadf37a472742ecc3" exitCode=0 Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.618683 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p246" event={"ID":"a3274f97-634f-4482-aa42-800ba96a25c8","Type":"ContainerDied","Data":"a372f7be8a845cd1e4c33c59741158513277501076cc40beadf37a472742ecc3"} Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.618712 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p246" event={"ID":"a3274f97-634f-4482-aa42-800ba96a25c8","Type":"ContainerStarted","Data":"e61938dfcebe78eff78559bc9c8c97f7fe65a863f1f560c63d65527f46bc8477"} Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.632445 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t82fl\" (UniqueName: \"kubernetes.io/projected/0f4731c7-dafb-410c-9a54-5a857c02cfbf-kube-api-access-t82fl\") pod \"dnsmasq-dns-66d5bf7c87-4n74d\" (UID: \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\") " pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.632563 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f4731c7-dafb-410c-9a54-5a857c02cfbf-config\") pod \"dnsmasq-dns-66d5bf7c87-4n74d\" (UID: \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\") " pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.632600 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f4731c7-dafb-410c-9a54-5a857c02cfbf-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-4n74d\" (UID: \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\") " pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.633614 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f4731c7-dafb-410c-9a54-5a857c02cfbf-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-4n74d\" (UID: \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\") " pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.633697 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f4731c7-dafb-410c-9a54-5a857c02cfbf-config\") pod \"dnsmasq-dns-66d5bf7c87-4n74d\" (UID: \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\") " pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.663018 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t82fl\" (UniqueName: \"kubernetes.io/projected/0f4731c7-dafb-410c-9a54-5a857c02cfbf-kube-api-access-t82fl\") pod \"dnsmasq-dns-66d5bf7c87-4n74d\" (UID: \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\") " pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:04:17 crc kubenswrapper[4794]: I0310 11:04:17.716921 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:04:18 crc kubenswrapper[4794]: I0310 11:04:18.000237 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:04:18 crc kubenswrapper[4794]: E0310 11:04:18.000501 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:04:18 crc kubenswrapper[4794]: I0310 11:04:18.217319 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-4n74d"] Mar 10 11:04:18 crc kubenswrapper[4794]: W0310 11:04:18.222964 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f4731c7_dafb_410c_9a54_5a857c02cfbf.slice/crio-95f2fe1b84c2fc5c62eb13a0e481e27881cce9db4a13a1aa5f4d572d723b10d4 WatchSource:0}: Error finding container 95f2fe1b84c2fc5c62eb13a0e481e27881cce9db4a13a1aa5f4d572d723b10d4: Status 404 returned error can't find the container with id 95f2fe1b84c2fc5c62eb13a0e481e27881cce9db4a13a1aa5f4d572d723b10d4 Mar 10 11:04:18 crc kubenswrapper[4794]: I0310 11:04:18.627069 4794 generic.go:334] "Generic (PLEG): container finished" podID="0f4731c7-dafb-410c-9a54-5a857c02cfbf" containerID="bfb1e388d3989840d13d2f26911f6ca578d105595af361c6e10c5bccdaf9456b" exitCode=0 Mar 10 11:04:18 crc kubenswrapper[4794]: I0310 11:04:18.627153 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" event={"ID":"0f4731c7-dafb-410c-9a54-5a857c02cfbf","Type":"ContainerDied","Data":"bfb1e388d3989840d13d2f26911f6ca578d105595af361c6e10c5bccdaf9456b"} Mar 10 11:04:18 crc kubenswrapper[4794]: I0310 11:04:18.627196 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" event={"ID":"0f4731c7-dafb-410c-9a54-5a857c02cfbf","Type":"ContainerStarted","Data":"95f2fe1b84c2fc5c62eb13a0e481e27881cce9db4a13a1aa5f4d572d723b10d4"} Mar 10 11:04:18 crc kubenswrapper[4794]: I0310 11:04:18.629548 4794 generic.go:334] "Generic (PLEG): container finished" podID="a3274f97-634f-4482-aa42-800ba96a25c8" containerID="07ac5feac51ceba2ab92aaedb2afffdeb2b23e2594684ccbc66a92312e77fff6" exitCode=0 Mar 10 11:04:18 crc kubenswrapper[4794]: I0310 11:04:18.629588 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p246" event={"ID":"a3274f97-634f-4482-aa42-800ba96a25c8","Type":"ContainerDied","Data":"07ac5feac51ceba2ab92aaedb2afffdeb2b23e2594684ccbc66a92312e77fff6"} Mar 10 11:04:18 crc kubenswrapper[4794]: I0310 11:04:18.652889 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 11:04:18 crc kubenswrapper[4794]: I0310 11:04:18.753750 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 11:04:19 crc kubenswrapper[4794]: I0310 11:04:19.636809 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" event={"ID":"0f4731c7-dafb-410c-9a54-5a857c02cfbf","Type":"ContainerStarted","Data":"b2dfa8f9c79764e9aa59702297a1345dc8192a28f1d794b6282c926009c7869a"} Mar 10 11:04:19 crc 
kubenswrapper[4794]: I0310 11:04:19.637043 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:04:19 crc kubenswrapper[4794]: I0310 11:04:19.639707 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p246" event={"ID":"a3274f97-634f-4482-aa42-800ba96a25c8","Type":"ContainerStarted","Data":"70b736393e5b82bdad06ffdf1c7372847515c637cdba8490e3472173a430d0dd"} Mar 10 11:04:19 crc kubenswrapper[4794]: I0310 11:04:19.669277 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" podStartSLOduration=2.669258901 podStartE2EDuration="2.669258901s" podCreationTimestamp="2026-03-10 11:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:04:19.663594445 +0000 UTC m=+4808.419765263" watchObservedRunningTime="2026-03-10 11:04:19.669258901 +0000 UTC m=+4808.425429719" Mar 10 11:04:19 crc kubenswrapper[4794]: I0310 11:04:19.697731 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2p246" podStartSLOduration=3.242990267 podStartE2EDuration="4.697708106s" podCreationTimestamp="2026-03-10 11:04:15 +0000 UTC" firstStartedPulling="2026-03-10 11:04:17.621045061 +0000 UTC m=+4806.377215889" lastFinishedPulling="2026-03-10 11:04:19.07576291 +0000 UTC m=+4807.831933728" observedRunningTime="2026-03-10 11:04:19.69078824 +0000 UTC m=+4808.446959058" watchObservedRunningTime="2026-03-10 11:04:19.697708106 +0000 UTC m=+4808.453878924" Mar 10 11:04:20 crc kubenswrapper[4794]: I0310 11:04:20.521892 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="234dcbfa-1ba5-4960-952c-1ee50cfb2c23" containerName="rabbitmq" containerID="cri-o://e455a93ac3fc772d8ab0521cc8ac075723b25337540b5d93ac2e5efff28c6244" gracePeriod=604799 Mar 10 11:04:20 crc kubenswrapper[4794]: I0310 11:04:20.751033 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" containerName="rabbitmq" containerID="cri-o://a34cb3588cb3e8bbf28c0deb4db70508cb4c6244e58b6a11ecc7649ac0558a8e" gracePeriod=604799 Mar 10 11:04:20 crc kubenswrapper[4794]: I0310 11:04:20.772688 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d2q2f"] Mar 10 11:04:20 crc kubenswrapper[4794]: I0310 11:04:20.774051 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:20 crc kubenswrapper[4794]: I0310 11:04:20.793674 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d2q2f"] Mar 10 11:04:20 crc kubenswrapper[4794]: I0310 11:04:20.878590 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8c626f0-3ae2-44f4-83f1-660a9f69eb2a-catalog-content\") pod \"redhat-operators-d2q2f\" (UID: \"b8c626f0-3ae2-44f4-83f1-660a9f69eb2a\") " pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:20 crc kubenswrapper[4794]: I0310 11:04:20.878695 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8c626f0-3ae2-44f4-83f1-660a9f69eb2a-utilities\") pod \"redhat-operators-d2q2f\" (UID: \"b8c626f0-3ae2-44f4-83f1-660a9f69eb2a\") " pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:20 crc kubenswrapper[4794]: I0310 11:04:20.878733 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xmsw\" (UniqueName: \"kubernetes.io/projected/b8c626f0-3ae2-44f4-83f1-660a9f69eb2a-kube-api-access-2xmsw\") pod \"redhat-operators-d2q2f\" (UID: \"b8c626f0-3ae2-44f4-83f1-660a9f69eb2a\") " pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:20 crc kubenswrapper[4794]: I0310 11:04:20.980595 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8c626f0-3ae2-44f4-83f1-660a9f69eb2a-utilities\") pod \"redhat-operators-d2q2f\" (UID: \"b8c626f0-3ae2-44f4-83f1-660a9f69eb2a\") " pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:20 crc kubenswrapper[4794]: I0310 11:04:20.980664 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xmsw\" (UniqueName: \"kubernetes.io/projected/b8c626f0-3ae2-44f4-83f1-660a9f69eb2a-kube-api-access-2xmsw\") pod \"redhat-operators-d2q2f\" (UID: \"b8c626f0-3ae2-44f4-83f1-660a9f69eb2a\") " pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:20 crc kubenswrapper[4794]: I0310 11:04:20.980767 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8c626f0-3ae2-44f4-83f1-660a9f69eb2a-catalog-content\") pod \"redhat-operators-d2q2f\" (UID: \"b8c626f0-3ae2-44f4-83f1-660a9f69eb2a\") " pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:20 crc kubenswrapper[4794]: I0310 11:04:20.981126 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8c626f0-3ae2-44f4-83f1-660a9f69eb2a-utilities\") pod \"redhat-operators-d2q2f\" (UID: \"b8c626f0-3ae2-44f4-83f1-660a9f69eb2a\") " pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:20 crc kubenswrapper[4794]: I0310 11:04:20.981246 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8c626f0-3ae2-44f4-83f1-660a9f69eb2a-catalog-content\") pod \"redhat-operators-d2q2f\" (UID: \"b8c626f0-3ae2-44f4-83f1-660a9f69eb2a\") " pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:21 crc kubenswrapper[4794]: I0310 11:04:21.281582 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2xmsw\" (UniqueName: \"kubernetes.io/projected/b8c626f0-3ae2-44f4-83f1-660a9f69eb2a-kube-api-access-2xmsw\") pod \"redhat-operators-d2q2f\" (UID: \"b8c626f0-3ae2-44f4-83f1-660a9f69eb2a\") " pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:21 crc kubenswrapper[4794]: I0310 11:04:21.395808 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:21 crc kubenswrapper[4794]: I0310 11:04:21.625260 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.17:5672: connect: connection refused" Mar 10 11:04:21 crc kubenswrapper[4794]: I0310 11:04:21.644138 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d2q2f"] Mar 10 11:04:21 crc kubenswrapper[4794]: W0310 11:04:21.647778 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8c626f0_3ae2_44f4_83f1_660a9f69eb2a.slice/crio-f312c50ee448418ffc7f112d78de35d60511cb78211ce11ab2b319777fdf0351 WatchSource:0}: Error finding container f312c50ee448418ffc7f112d78de35d60511cb78211ce11ab2b319777fdf0351: Status 404 returned error can't find the container with id f312c50ee448418ffc7f112d78de35d60511cb78211ce11ab2b319777fdf0351 Mar 10 11:04:21 crc kubenswrapper[4794]: I0310 11:04:21.987641 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="234dcbfa-1ba5-4960-952c-1ee50cfb2c23" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.18:5672: connect: connection refused" Mar 10 11:04:22 crc kubenswrapper[4794]: I0310 11:04:22.667220 4794 generic.go:334] "Generic (PLEG): container finished" podID="b8c626f0-3ae2-44f4-83f1-660a9f69eb2a" containerID="cf7d770aef9f17304b70c50ece932b8527670ced668d01bff3482cf26397b064" exitCode=0 Mar 10 11:04:22 crc kubenswrapper[4794]: I0310 11:04:22.667299 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2q2f" event={"ID":"b8c626f0-3ae2-44f4-83f1-660a9f69eb2a","Type":"ContainerDied","Data":"cf7d770aef9f17304b70c50ece932b8527670ced668d01bff3482cf26397b064"} Mar 10 11:04:22 crc kubenswrapper[4794]: I0310 11:04:22.667374 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2q2f" event={"ID":"b8c626f0-3ae2-44f4-83f1-660a9f69eb2a","Type":"ContainerStarted","Data":"f312c50ee448418ffc7f112d78de35d60511cb78211ce11ab2b319777fdf0351"} Mar 10 11:04:26 crc kubenswrapper[4794]: I0310 11:04:26.350555 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:26 crc kubenswrapper[4794]: I0310 11:04:26.350916 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:26 crc kubenswrapper[4794]: I0310 11:04:26.415639 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:26 crc kubenswrapper[4794]: I0310 11:04:26.786748 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:26 crc kubenswrapper[4794]: I0310 11:04:26.850503 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-2p246"] Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.194775 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.280979 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.312006 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-plugins-conf\") pod \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.312267 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkcwt\" (UniqueName: \"kubernetes.io/projected/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-kube-api-access-jkcwt\") pod \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.312351 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-pod-info\") pod \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.312377 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-plugins\") pod \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.312497 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\") pod \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.312544 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-server-conf\") pod \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.312608 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-erlang-cookie\") pod \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.312632 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-erlang-cookie-secret\") pod \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.312651 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-confd\") pod 
\"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\" (UID: \"234dcbfa-1ba5-4960-952c-1ee50cfb2c23\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.313853 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "234dcbfa-1ba5-4960-952c-1ee50cfb2c23" (UID: "234dcbfa-1ba5-4960-952c-1ee50cfb2c23"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.314423 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "234dcbfa-1ba5-4960-952c-1ee50cfb2c23" (UID: "234dcbfa-1ba5-4960-952c-1ee50cfb2c23"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.314611 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "234dcbfa-1ba5-4960-952c-1ee50cfb2c23" (UID: "234dcbfa-1ba5-4960-952c-1ee50cfb2c23"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.323351 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-pod-info" (OuterVolumeSpecName: "pod-info") pod "234dcbfa-1ba5-4960-952c-1ee50cfb2c23" (UID: "234dcbfa-1ba5-4960-952c-1ee50cfb2c23"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.330499 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "234dcbfa-1ba5-4960-952c-1ee50cfb2c23" (UID: "234dcbfa-1ba5-4960-952c-1ee50cfb2c23"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.350617 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-kube-api-access-jkcwt" (OuterVolumeSpecName: "kube-api-access-jkcwt") pod "234dcbfa-1ba5-4960-952c-1ee50cfb2c23" (UID: "234dcbfa-1ba5-4960-952c-1ee50cfb2c23"). InnerVolumeSpecName "kube-api-access-jkcwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.351644 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9" (OuterVolumeSpecName: "persistence") pod "234dcbfa-1ba5-4960-952c-1ee50cfb2c23" (UID: "234dcbfa-1ba5-4960-952c-1ee50cfb2c23"). InnerVolumeSpecName "pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.397047 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-server-conf" (OuterVolumeSpecName: "server-conf") pod "234dcbfa-1ba5-4960-952c-1ee50cfb2c23" (UID: "234dcbfa-1ba5-4960-952c-1ee50cfb2c23"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.415231 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp545\" (UniqueName: \"kubernetes.io/projected/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-kube-api-access-bp545\") pod \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.415318 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-plugins-conf\") pod \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.415393 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-server-conf\") pod \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.415427 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-pod-info\") pod \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.415483 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-plugins\") pod \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.415540 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-erlang-cookie\") pod \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.415673 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc822ad3-d386-47fb-a341-58b137079423\") pod \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.415756 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-erlang-cookie-secret\") pod \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.415779 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-confd\") pod \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\" (UID: \"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b\") " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.416028 4794 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.416044 4794 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.416054 4794 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.416063 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkcwt\" (UniqueName: \"kubernetes.io/projected/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-kube-api-access-jkcwt\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.416072 4794 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.416081 4794 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.416101 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\") on node \"crc\" " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.416110 4794 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.416101 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" (UID: "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.416254 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" (UID: "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.416580 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" (UID: "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.417514 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-kube-api-access-bp545" (OuterVolumeSpecName: "kube-api-access-bp545") pod "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" (UID: "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b"). InnerVolumeSpecName "kube-api-access-bp545". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.417540 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "234dcbfa-1ba5-4960-952c-1ee50cfb2c23" (UID: "234dcbfa-1ba5-4960-952c-1ee50cfb2c23"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.419012 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-pod-info" (OuterVolumeSpecName: "pod-info") pod "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" (UID: "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.419662 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" (UID: "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.425575 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc822ad3-d386-47fb-a341-58b137079423" (OuterVolumeSpecName: "persistence") pod "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" (UID: "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b"). InnerVolumeSpecName "pvc-dc822ad3-d386-47fb-a341-58b137079423". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.435415 4794 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.435605 4794 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9") on node "crc" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.449968 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-server-conf" (OuterVolumeSpecName: "server-conf") pod "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" (UID: "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.474305 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" (UID: "5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.516991 4794 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.517049 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-dc822ad3-d386-47fb-a341-58b137079423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc822ad3-d386-47fb-a341-58b137079423\") on node \"crc\" " Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.517066 4794 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.517080 4794 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.517092 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp545\" (UniqueName: \"kubernetes.io/projected/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-kube-api-access-bp545\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.517108 4794 reconciler_common.go:293] "Volume detached for volume \"pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.517122 4794 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.517133 4794 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.517144 4794 reconciler_common.go:293] "Volume detached for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.517156 4794 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.517169 4794 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/234dcbfa-1ba5-4960-952c-1ee50cfb2c23-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.537187 4794 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.537369 4794 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-dc822ad3-d386-47fb-a341-58b137079423" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc822ad3-d386-47fb-a341-58b137079423") on node "crc" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.618433 4794 reconciler_common.go:293] "Volume detached for volume \"pvc-dc822ad3-d386-47fb-a341-58b137079423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc822ad3-d386-47fb-a341-58b137079423\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.712588 4794 generic.go:334] "Generic (PLEG): container finished" podID="5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" containerID="a34cb3588cb3e8bbf28c0deb4db70508cb4c6244e58b6a11ecc7649ac0558a8e" exitCode=0 Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.712640 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b","Type":"ContainerDied","Data":"a34cb3588cb3e8bbf28c0deb4db70508cb4c6244e58b6a11ecc7649ac0558a8e"} Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.712665 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b","Type":"ContainerDied","Data":"c7358810302df97f6c85085a925e56344657bcfa5db29e46af7a0edb14987a24"} Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.712682 4794 scope.go:117] "RemoveContainer" containerID="a34cb3588cb3e8bbf28c0deb4db70508cb4c6244e58b6a11ecc7649ac0558a8e" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.712788 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.718631 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.719084 4794 generic.go:334] "Generic (PLEG): container finished" podID="234dcbfa-1ba5-4960-952c-1ee50cfb2c23" containerID="e455a93ac3fc772d8ab0521cc8ac075723b25337540b5d93ac2e5efff28c6244" exitCode=0 Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.719306 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"234dcbfa-1ba5-4960-952c-1ee50cfb2c23","Type":"ContainerDied","Data":"e455a93ac3fc772d8ab0521cc8ac075723b25337540b5d93ac2e5efff28c6244"} Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.719475 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"234dcbfa-1ba5-4960-952c-1ee50cfb2c23","Type":"ContainerDied","Data":"676f53a52438fb8a5949313738b0fa164610706bc32a6b5a3ebb3e3863773d60"} Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.719371 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.792138 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-xmvxn"] Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.792494 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" podUID="000e40eb-9cb3-4535-ab24-652ebaf83d42" containerName="dnsmasq-dns" containerID="cri-o://7c28c5cc71d21c2b87a96f04ac62381eecc94256e79308f1abc7b4e599db4f7f" gracePeriod=10 Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.829756 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.836937 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.844418 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.852151 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.858689 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 11:04:27 crc kubenswrapper[4794]: E0310 11:04:27.858995 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" containerName="rabbitmq" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.859011 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" containerName="rabbitmq" Mar 10 11:04:27 crc kubenswrapper[4794]: E0310 11:04:27.859024 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="234dcbfa-1ba5-4960-952c-1ee50cfb2c23" containerName="rabbitmq" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.859031 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="234dcbfa-1ba5-4960-952c-1ee50cfb2c23" containerName="rabbitmq" Mar 10 11:04:27 crc kubenswrapper[4794]: E0310 11:04:27.859053 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" 
containerName="setup-container" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.859059 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" containerName="setup-container" Mar 10 11:04:27 crc kubenswrapper[4794]: E0310 11:04:27.859070 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="234dcbfa-1ba5-4960-952c-1ee50cfb2c23" containerName="setup-container" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.859076 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="234dcbfa-1ba5-4960-952c-1ee50cfb2c23" containerName="setup-container" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.859206 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" containerName="rabbitmq" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.859224 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="234dcbfa-1ba5-4960-952c-1ee50cfb2c23" containerName="rabbitmq" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.863968 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.867101 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.869380 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.871228 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.871770 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.872029 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.872145 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.872284 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-fsm2v" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.878864 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.879428 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.879613 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.879745 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-62tth" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.879877 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.882140 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.889712 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 
10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.924689 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/707decd3-40de-4bcf-873b-51fc83a7f136-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.924738 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/707decd3-40de-4bcf-873b-51fc83a7f136-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.924763 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/707decd3-40de-4bcf-873b-51fc83a7f136-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.924776 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf6wr\" (UniqueName: \"kubernetes.io/projected/707decd3-40de-4bcf-873b-51fc83a7f136-kube-api-access-lf6wr\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.924825 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/707decd3-40de-4bcf-873b-51fc83a7f136-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.924847 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/707decd3-40de-4bcf-873b-51fc83a7f136-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.924862 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/707decd3-40de-4bcf-873b-51fc83a7f136-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.925098 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/707decd3-40de-4bcf-873b-51fc83a7f136-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:27 crc kubenswrapper[4794]: I0310 11:04:27.925203 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dc822ad3-d386-47fb-a341-58b137079423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc822ad3-d386-47fb-a341-58b137079423\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.018218 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="234dcbfa-1ba5-4960-952c-1ee50cfb2c23" path="/var/lib/kubelet/pods/234dcbfa-1ba5-4960-952c-1ee50cfb2c23/volumes" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.018964 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b" path="/var/lib/kubelet/pods/5ab2d0d3-42ae-4af0-a0c8-518a7bb9a69b/volumes" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.026761 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/514f5ee1-5433-486a-ace2-ad62c7622526-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.026809 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/514f5ee1-5433-486a-ace2-ad62c7622526-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.026836 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.026894 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/707decd3-40de-4bcf-873b-51fc83a7f136-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.026926 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/707decd3-40de-4bcf-873b-51fc83a7f136-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.026951 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/707decd3-40de-4bcf-873b-51fc83a7f136-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.026986 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/514f5ee1-5433-486a-ace2-ad62c7622526-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.027002 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/514f5ee1-5433-486a-ace2-ad62c7622526-server-conf\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.027018 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chxwm\" (UniqueName: \"kubernetes.io/projected/514f5ee1-5433-486a-ace2-ad62c7622526-kube-api-access-chxwm\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.027036 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/514f5ee1-5433-486a-ace2-ad62c7622526-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.027078 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/514f5ee1-5433-486a-ace2-ad62c7622526-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.027097 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/707decd3-40de-4bcf-873b-51fc83a7f136-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.027117 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dc822ad3-d386-47fb-a341-58b137079423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc822ad3-d386-47fb-a341-58b137079423\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.027134 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/707decd3-40de-4bcf-873b-51fc83a7f136-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.027166 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/707decd3-40de-4bcf-873b-51fc83a7f136-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.027194 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/707decd3-40de-4bcf-873b-51fc83a7f136-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.027207 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf6wr\" (UniqueName: 
\"kubernetes.io/projected/707decd3-40de-4bcf-873b-51fc83a7f136-kube-api-access-lf6wr\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.027222 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/514f5ee1-5433-486a-ace2-ad62c7622526-pod-info\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.028614 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/707decd3-40de-4bcf-873b-51fc83a7f136-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.028918 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/707decd3-40de-4bcf-873b-51fc83a7f136-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.034163 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.034191 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dc822ad3-d386-47fb-a341-58b137079423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc822ad3-d386-47fb-a341-58b137079423\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f374e62a9af5e12b6ef694a4e09178c1818cc6901a6785cefe8026d0920279ff/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.046661 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/707decd3-40de-4bcf-873b-51fc83a7f136-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.051284 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/707decd3-40de-4bcf-873b-51fc83a7f136-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.051611 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/707decd3-40de-4bcf-873b-51fc83a7f136-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.060743 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/707decd3-40de-4bcf-873b-51fc83a7f136-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.062785 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dc822ad3-d386-47fb-a341-58b137079423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc822ad3-d386-47fb-a341-58b137079423\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.066073 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/707decd3-40de-4bcf-873b-51fc83a7f136-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.079503 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf6wr\" (UniqueName: \"kubernetes.io/projected/707decd3-40de-4bcf-873b-51fc83a7f136-kube-api-access-lf6wr\") pod \"rabbitmq-cell1-server-0\" (UID: \"707decd3-40de-4bcf-873b-51fc83a7f136\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.128531 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/514f5ee1-5433-486a-ace2-ad62c7622526-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.128567 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/514f5ee1-5433-486a-ace2-ad62c7622526-server-conf\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.128585 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chxwm\" (UniqueName: \"kubernetes.io/projected/514f5ee1-5433-486a-ace2-ad62c7622526-kube-api-access-chxwm\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.128604 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/514f5ee1-5433-486a-ace2-ad62c7622526-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.128665 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/514f5ee1-5433-486a-ace2-ad62c7622526-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.128712 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/514f5ee1-5433-486a-ace2-ad62c7622526-pod-info\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.128742 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/514f5ee1-5433-486a-ace2-ad62c7622526-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.128762 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/514f5ee1-5433-486a-ace2-ad62c7622526-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.128785 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.131439 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/514f5ee1-5433-486a-ace2-ad62c7622526-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.131524 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/514f5ee1-5433-486a-ace2-ad62c7622526-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.131588 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/514f5ee1-5433-486a-ace2-ad62c7622526-server-conf\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.132276 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/514f5ee1-5433-486a-ace2-ad62c7622526-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.143022 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/514f5ee1-5433-486a-ace2-ad62c7622526-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.144134 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/514f5ee1-5433-486a-ace2-ad62c7622526-pod-info\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.160020 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/514f5ee1-5433-486a-ace2-ad62c7622526-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.161040 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.161081 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/27739c6f70fd848ff8e7803db65322a299e8c241afa4bc43c94133a97797b4f8/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.167038 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chxwm\" (UniqueName: \"kubernetes.io/projected/514f5ee1-5433-486a-ace2-ad62c7622526-kube-api-access-chxwm\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.217675 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78950b35-609e-4037-9d4f-5e3ac7d5d3a9\") pod \"rabbitmq-server-0\" (UID: \"514f5ee1-5433-486a-ace2-ad62c7622526\") " pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.234765 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.245425 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.730256 4794 generic.go:334] "Generic (PLEG): container finished" podID="000e40eb-9cb3-4535-ab24-652ebaf83d42" containerID="7c28c5cc71d21c2b87a96f04ac62381eecc94256e79308f1abc7b4e599db4f7f" exitCode=0 Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.730339 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" event={"ID":"000e40eb-9cb3-4535-ab24-652ebaf83d42","Type":"ContainerDied","Data":"7c28c5cc71d21c2b87a96f04ac62381eecc94256e79308f1abc7b4e599db4f7f"} Mar 10 11:04:28 crc kubenswrapper[4794]: I0310 11:04:28.730593 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2p246" podUID="a3274f97-634f-4482-aa42-800ba96a25c8" containerName="registry-server" containerID="cri-o://70b736393e5b82bdad06ffdf1c7372847515c637cdba8490e3472173a430d0dd" gracePeriod=2 Mar 10 11:04:29 crc kubenswrapper[4794]: I0310 11:04:29.743054 4794 generic.go:334] "Generic (PLEG): container finished" podID="a3274f97-634f-4482-aa42-800ba96a25c8" containerID="70b736393e5b82bdad06ffdf1c7372847515c637cdba8490e3472173a430d0dd" exitCode=0 Mar 10 11:04:29 crc kubenswrapper[4794]: I0310 11:04:29.743091 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p246" event={"ID":"a3274f97-634f-4482-aa42-800ba96a25c8","Type":"ContainerDied","Data":"70b736393e5b82bdad06ffdf1c7372847515c637cdba8490e3472173a430d0dd"} Mar 10 11:04:30 crc kubenswrapper[4794]: I0310 11:04:30.780322 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" podUID="000e40eb-9cb3-4535-ab24-652ebaf83d42" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.16:5353: connect: connection refused" Mar 10 11:04:32 crc kubenswrapper[4794]: I0310 11:04:32.999484 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:04:33 crc kubenswrapper[4794]: E0310 11:04:32.999880 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.059561 4794 scope.go:117] "RemoveContainer" containerID="bfc4c53ccdbb347f7baadd72109438231354bd34500259fa9f7925fca91384ae" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.152705 4794 scope.go:117] "RemoveContainer" containerID="a34cb3588cb3e8bbf28c0deb4db70508cb4c6244e58b6a11ecc7649ac0558a8e" Mar 10 11:04:33 crc kubenswrapper[4794]: E0310 11:04:33.160735 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a34cb3588cb3e8bbf28c0deb4db70508cb4c6244e58b6a11ecc7649ac0558a8e\": container with ID starting with a34cb3588cb3e8bbf28c0deb4db70508cb4c6244e58b6a11ecc7649ac0558a8e not found: ID does not exist" containerID="a34cb3588cb3e8bbf28c0deb4db70508cb4c6244e58b6a11ecc7649ac0558a8e" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.161101 4794 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a34cb3588cb3e8bbf28c0deb4db70508cb4c6244e58b6a11ecc7649ac0558a8e"} err="failed to get container status \"a34cb3588cb3e8bbf28c0deb4db70508cb4c6244e58b6a11ecc7649ac0558a8e\": rpc error: code = NotFound desc = could not find container \"a34cb3588cb3e8bbf28c0deb4db70508cb4c6244e58b6a11ecc7649ac0558a8e\": container with ID starting with a34cb3588cb3e8bbf28c0deb4db70508cb4c6244e58b6a11ecc7649ac0558a8e not found: ID does not exist" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.161131 4794 scope.go:117] "RemoveContainer" containerID="bfc4c53ccdbb347f7baadd72109438231354bd34500259fa9f7925fca91384ae" Mar 10 11:04:33 crc kubenswrapper[4794]: E0310 11:04:33.162103 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfc4c53ccdbb347f7baadd72109438231354bd34500259fa9f7925fca91384ae\": container with ID starting with bfc4c53ccdbb347f7baadd72109438231354bd34500259fa9f7925fca91384ae not found: ID does not exist" containerID="bfc4c53ccdbb347f7baadd72109438231354bd34500259fa9f7925fca91384ae" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.162145 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc4c53ccdbb347f7baadd72109438231354bd34500259fa9f7925fca91384ae"} err="failed to get container status \"bfc4c53ccdbb347f7baadd72109438231354bd34500259fa9f7925fca91384ae\": rpc error: code = NotFound desc = could not find container \"bfc4c53ccdbb347f7baadd72109438231354bd34500259fa9f7925fca91384ae\": container with ID starting with bfc4c53ccdbb347f7baadd72109438231354bd34500259fa9f7925fca91384ae not found: ID does not exist" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.162173 4794 scope.go:117] "RemoveContainer" containerID="e455a93ac3fc772d8ab0521cc8ac075723b25337540b5d93ac2e5efff28c6244" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.222934 4794 scope.go:117] "RemoveContainer" containerID="37fa3a5e51255eca9e03fb94c20d30aee8427d15b76573220ce7cfa614cb9707" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.506461 4794 scope.go:117] "RemoveContainer" containerID="e455a93ac3fc772d8ab0521cc8ac075723b25337540b5d93ac2e5efff28c6244" Mar 10 11:04:33 crc kubenswrapper[4794]: E0310 11:04:33.507119 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e455a93ac3fc772d8ab0521cc8ac075723b25337540b5d93ac2e5efff28c6244\": container with ID starting with e455a93ac3fc772d8ab0521cc8ac075723b25337540b5d93ac2e5efff28c6244 not found: ID does not exist" containerID="e455a93ac3fc772d8ab0521cc8ac075723b25337540b5d93ac2e5efff28c6244" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.507164 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e455a93ac3fc772d8ab0521cc8ac075723b25337540b5d93ac2e5efff28c6244"} err="failed to get container status \"e455a93ac3fc772d8ab0521cc8ac075723b25337540b5d93ac2e5efff28c6244\": rpc error: code = NotFound desc = could not find container \"e455a93ac3fc772d8ab0521cc8ac075723b25337540b5d93ac2e5efff28c6244\": container with ID starting with e455a93ac3fc772d8ab0521cc8ac075723b25337540b5d93ac2e5efff28c6244 not found: ID does not exist" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.507197 4794 scope.go:117] "RemoveContainer" containerID="37fa3a5e51255eca9e03fb94c20d30aee8427d15b76573220ce7cfa614cb9707" Mar 10 11:04:33 crc kubenswrapper[4794]: E0310 11:04:33.507835 4794 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37fa3a5e51255eca9e03fb94c20d30aee8427d15b76573220ce7cfa614cb9707\": container with ID starting with 37fa3a5e51255eca9e03fb94c20d30aee8427d15b76573220ce7cfa614cb9707 not found: ID does not exist" containerID="37fa3a5e51255eca9e03fb94c20d30aee8427d15b76573220ce7cfa614cb9707" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.507914 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37fa3a5e51255eca9e03fb94c20d30aee8427d15b76573220ce7cfa614cb9707"} err="failed to get container status \"37fa3a5e51255eca9e03fb94c20d30aee8427d15b76573220ce7cfa614cb9707\": rpc error: code = NotFound desc = could not find container \"37fa3a5e51255eca9e03fb94c20d30aee8427d15b76573220ce7cfa614cb9707\": container with ID starting with 37fa3a5e51255eca9e03fb94c20d30aee8427d15b76573220ce7cfa614cb9707 not found: ID does not exist" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.517755 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.535525 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.645449 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/000e40eb-9cb3-4535-ab24-652ebaf83d42-config\") pod \"000e40eb-9cb3-4535-ab24-652ebaf83d42\" (UID: \"000e40eb-9cb3-4535-ab24-652ebaf83d42\") " Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.645509 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krlfg\" (UniqueName: \"kubernetes.io/projected/a3274f97-634f-4482-aa42-800ba96a25c8-kube-api-access-krlfg\") pod \"a3274f97-634f-4482-aa42-800ba96a25c8\" (UID: \"a3274f97-634f-4482-aa42-800ba96a25c8\") " Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.645565 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3274f97-634f-4482-aa42-800ba96a25c8-catalog-content\") pod \"a3274f97-634f-4482-aa42-800ba96a25c8\" (UID: \"a3274f97-634f-4482-aa42-800ba96a25c8\") " Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.645615 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3274f97-634f-4482-aa42-800ba96a25c8-utilities\") pod \"a3274f97-634f-4482-aa42-800ba96a25c8\" (UID: \"a3274f97-634f-4482-aa42-800ba96a25c8\") " Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.645677 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2lpk\" (UniqueName: \"kubernetes.io/projected/000e40eb-9cb3-4535-ab24-652ebaf83d42-kube-api-access-p2lpk\") pod \"000e40eb-9cb3-4535-ab24-652ebaf83d42\" (UID: \"000e40eb-9cb3-4535-ab24-652ebaf83d42\") " Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.645698 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/000e40eb-9cb3-4535-ab24-652ebaf83d42-dns-svc\") pod \"000e40eb-9cb3-4535-ab24-652ebaf83d42\" (UID: \"000e40eb-9cb3-4535-ab24-652ebaf83d42\") " Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.648051 4794 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3274f97-634f-4482-aa42-800ba96a25c8-utilities" (OuterVolumeSpecName: "utilities") pod "a3274f97-634f-4482-aa42-800ba96a25c8" (UID: "a3274f97-634f-4482-aa42-800ba96a25c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.669559 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3274f97-634f-4482-aa42-800ba96a25c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3274f97-634f-4482-aa42-800ba96a25c8" (UID: "a3274f97-634f-4482-aa42-800ba96a25c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.747650 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3274f97-634f-4482-aa42-800ba96a25c8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.747676 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3274f97-634f-4482-aa42-800ba96a25c8-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.775415 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2p246" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.775436 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p246" event={"ID":"a3274f97-634f-4482-aa42-800ba96a25c8","Type":"ContainerDied","Data":"e61938dfcebe78eff78559bc9c8c97f7fe65a863f1f560c63d65527f46bc8477"} Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.775512 4794 scope.go:117] "RemoveContainer" containerID="70b736393e5b82bdad06ffdf1c7372847515c637cdba8490e3472173a430d0dd" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.793415 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" event={"ID":"000e40eb-9cb3-4535-ab24-652ebaf83d42","Type":"ContainerDied","Data":"81047964819e8faed2e5a169055c3fc3ffce9083322fc3a8a3b3be91278c1c9e"} Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.793479 4794 scope.go:117] "RemoveContainer" containerID="07ac5feac51ceba2ab92aaedb2afffdeb2b23e2594684ccbc66a92312e77fff6" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.793490 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-xmvxn" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.831236 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/000e40eb-9cb3-4535-ab24-652ebaf83d42-kube-api-access-p2lpk" (OuterVolumeSpecName: "kube-api-access-p2lpk") pod "000e40eb-9cb3-4535-ab24-652ebaf83d42" (UID: "000e40eb-9cb3-4535-ab24-652ebaf83d42"). InnerVolumeSpecName "kube-api-access-p2lpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.839633 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3274f97-634f-4482-aa42-800ba96a25c8-kube-api-access-krlfg" (OuterVolumeSpecName: "kube-api-access-krlfg") pod "a3274f97-634f-4482-aa42-800ba96a25c8" (UID: "a3274f97-634f-4482-aa42-800ba96a25c8"). 
InnerVolumeSpecName "kube-api-access-krlfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.846669 4794 scope.go:117] "RemoveContainer" containerID="a372f7be8a845cd1e4c33c59741158513277501076cc40beadf37a472742ecc3" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.852520 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2lpk\" (UniqueName: \"kubernetes.io/projected/000e40eb-9cb3-4535-ab24-652ebaf83d42-kube-api-access-p2lpk\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.852566 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krlfg\" (UniqueName: \"kubernetes.io/projected/a3274f97-634f-4482-aa42-800ba96a25c8-kube-api-access-krlfg\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.868431 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000e40eb-9cb3-4535-ab24-652ebaf83d42-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "000e40eb-9cb3-4535-ab24-652ebaf83d42" (UID: "000e40eb-9cb3-4535-ab24-652ebaf83d42"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.890230 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.892051 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000e40eb-9cb3-4535-ab24-652ebaf83d42-config" (OuterVolumeSpecName: "config") pod "000e40eb-9cb3-4535-ab24-652ebaf83d42" (UID: "000e40eb-9cb3-4535-ab24-652ebaf83d42"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.892476 4794 scope.go:117] "RemoveContainer" containerID="7c28c5cc71d21c2b87a96f04ac62381eecc94256e79308f1abc7b4e599db4f7f" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.896368 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 11:04:33 crc kubenswrapper[4794]: W0310 11:04:33.899860 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod514f5ee1_5433_486a_ace2_ad62c7622526.slice/crio-9429fad2fbe338032189dd337c0d9ad350001c0fa84608ce78e11df148006232 WatchSource:0}: Error finding container 9429fad2fbe338032189dd337c0d9ad350001c0fa84608ce78e11df148006232: Status 404 returned error can't find the container with id 9429fad2fbe338032189dd337c0d9ad350001c0fa84608ce78e11df148006232 Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.915924 4794 scope.go:117] "RemoveContainer" containerID="2964c180f508d51a70babef19278b7ac73427a795532236fa29fe156d95051d5" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.953628 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/000e40eb-9cb3-4535-ab24-652ebaf83d42-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:33 crc kubenswrapper[4794]: I0310 11:04:33.953651 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/000e40eb-9cb3-4535-ab24-652ebaf83d42-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:34 crc kubenswrapper[4794]: I0310 11:04:34.167141 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-xmvxn"] Mar 10 11:04:34 crc kubenswrapper[4794]: I0310 11:04:34.182996 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-xmvxn"] Mar 10 11:04:34 crc kubenswrapper[4794]: I0310 11:04:34.198551 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2p246"] Mar 10 11:04:34 crc kubenswrapper[4794]: I0310 11:04:34.221666 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2p246"] Mar 10 11:04:34 crc kubenswrapper[4794]: I0310 11:04:34.805372 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"514f5ee1-5433-486a-ace2-ad62c7622526","Type":"ContainerStarted","Data":"9429fad2fbe338032189dd337c0d9ad350001c0fa84608ce78e11df148006232"} Mar 10 11:04:34 crc kubenswrapper[4794]: I0310 11:04:34.811145 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"707decd3-40de-4bcf-873b-51fc83a7f136","Type":"ContainerStarted","Data":"6515ca58d1b542e7de28db349922a8afc26ecfe764b5d6370ec27226cc3c39ad"} Mar 10 11:04:34 crc kubenswrapper[4794]: I0310 11:04:34.814749 4794 generic.go:334] "Generic (PLEG): container finished" podID="b8c626f0-3ae2-44f4-83f1-660a9f69eb2a" containerID="9a4ec50c37aa771fc095858ee122edd0ee9970357571551f9b95e7b3ed5ac9f4" exitCode=0 Mar 10 11:04:34 crc kubenswrapper[4794]: I0310 11:04:34.814933 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2q2f" event={"ID":"b8c626f0-3ae2-44f4-83f1-660a9f69eb2a","Type":"ContainerDied","Data":"9a4ec50c37aa771fc095858ee122edd0ee9970357571551f9b95e7b3ed5ac9f4"} Mar 10 11:04:35 crc kubenswrapper[4794]: I0310 11:04:35.829692 4794 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"514f5ee1-5433-486a-ace2-ad62c7622526","Type":"ContainerStarted","Data":"f0b6de910758e84bc501274c902ec0598831a00298f645e5ad2d33f753a52ea5"} Mar 10 11:04:35 crc kubenswrapper[4794]: I0310 11:04:35.833864 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2q2f" event={"ID":"b8c626f0-3ae2-44f4-83f1-660a9f69eb2a","Type":"ContainerStarted","Data":"e9de57e2ecf0abf8470a0e23cc9325a630cdb64ecdfd3074cfa45a71a2044d70"} Mar 10 11:04:35 crc kubenswrapper[4794]: I0310 11:04:35.895127 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d2q2f" podStartSLOduration=3.322355392 podStartE2EDuration="15.895099978s" podCreationTimestamp="2026-03-10 11:04:20 +0000 UTC" firstStartedPulling="2026-03-10 11:04:22.670460014 +0000 UTC m=+4811.426630832" lastFinishedPulling="2026-03-10 11:04:35.24320457 +0000 UTC m=+4823.999375418" observedRunningTime="2026-03-10 11:04:35.889971688 +0000 UTC m=+4824.646142516" watchObservedRunningTime="2026-03-10 11:04:35.895099978 +0000 UTC m=+4824.651270836" Mar 10 11:04:36 crc kubenswrapper[4794]: I0310 11:04:36.009179 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="000e40eb-9cb3-4535-ab24-652ebaf83d42" path="/var/lib/kubelet/pods/000e40eb-9cb3-4535-ab24-652ebaf83d42/volumes" Mar 10 11:04:36 crc kubenswrapper[4794]: I0310 11:04:36.010416 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3274f97-634f-4482-aa42-800ba96a25c8" path="/var/lib/kubelet/pods/a3274f97-634f-4482-aa42-800ba96a25c8/volumes" Mar 10 11:04:36 crc kubenswrapper[4794]: I0310 11:04:36.850166 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"707decd3-40de-4bcf-873b-51fc83a7f136","Type":"ContainerStarted","Data":"1132ca0a8b0f054fbf2bedb33cd4d26aaa82c1d84fba3d8e744180083f86d98c"} Mar 10 11:04:41 crc kubenswrapper[4794]: I0310 11:04:41.396014 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:41 crc kubenswrapper[4794]: I0310 11:04:41.397918 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:41 crc kubenswrapper[4794]: I0310 11:04:41.452947 4794 scope.go:117] "RemoveContainer" containerID="67c831ca716684d87c42bfa8a89dba497d232cab396336570e3399db0faee93d" Mar 10 11:04:42 crc kubenswrapper[4794]: I0310 11:04:42.474161 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d2q2f" podUID="b8c626f0-3ae2-44f4-83f1-660a9f69eb2a" containerName="registry-server" probeResult="failure" output=< Mar 10 11:04:42 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 11:04:42 crc kubenswrapper[4794]: > Mar 10 11:04:46 crc kubenswrapper[4794]: I0310 11:04:46.000319 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:04:46 crc kubenswrapper[4794]: E0310 11:04:46.001064 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:04:51 crc kubenswrapper[4794]: I0310 11:04:51.478083 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:51 crc kubenswrapper[4794]: I0310 11:04:51.559212 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d2q2f" Mar 10 11:04:51 crc kubenswrapper[4794]: I0310 11:04:51.827204 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d2q2f"] Mar 10 11:04:51 crc kubenswrapper[4794]: I0310 11:04:51.986403 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kf727"] Mar 10 11:04:51 crc kubenswrapper[4794]: I0310 11:04:51.986752 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kf727" podUID="75dcab4f-0176-43ea-81d3-4ecbff649959" containerName="registry-server" containerID="cri-o://0ff396ca2070d65925f02a7bd4dd416883336cbd1322412504065f692cc25cfe" gracePeriod=2 Mar 10 11:04:52 crc kubenswrapper[4794]: E0310 11:04:52.143305 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75dcab4f_0176_43ea_81d3_4ecbff649959.slice/crio-conmon-0ff396ca2070d65925f02a7bd4dd416883336cbd1322412504065f692cc25cfe.scope\": RecentStats: unable to find data in memory cache]" Mar 10 11:04:52 crc kubenswrapper[4794]: I0310 11:04:52.473077 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kf727" Mar 10 11:04:52 crc kubenswrapper[4794]: I0310 11:04:52.588989 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcab4f-0176-43ea-81d3-4ecbff649959-utilities\") pod \"75dcab4f-0176-43ea-81d3-4ecbff649959\" (UID: \"75dcab4f-0176-43ea-81d3-4ecbff649959\") " Mar 10 11:04:52 crc kubenswrapper[4794]: I0310 11:04:52.589037 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk8np\" (UniqueName: \"kubernetes.io/projected/75dcab4f-0176-43ea-81d3-4ecbff649959-kube-api-access-zk8np\") pod \"75dcab4f-0176-43ea-81d3-4ecbff649959\" (UID: \"75dcab4f-0176-43ea-81d3-4ecbff649959\") " Mar 10 11:04:52 crc kubenswrapper[4794]: I0310 11:04:52.589123 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dcab4f-0176-43ea-81d3-4ecbff649959-catalog-content\") pod \"75dcab4f-0176-43ea-81d3-4ecbff649959\" (UID: \"75dcab4f-0176-43ea-81d3-4ecbff649959\") " Mar 10 11:04:52 crc kubenswrapper[4794]: I0310 11:04:52.589648 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75dcab4f-0176-43ea-81d3-4ecbff649959-utilities" (OuterVolumeSpecName: "utilities") pod "75dcab4f-0176-43ea-81d3-4ecbff649959" (UID: "75dcab4f-0176-43ea-81d3-4ecbff649959"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:04:52 crc kubenswrapper[4794]: I0310 11:04:52.594610 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75dcab4f-0176-43ea-81d3-4ecbff649959-kube-api-access-zk8np" (OuterVolumeSpecName: "kube-api-access-zk8np") pod "75dcab4f-0176-43ea-81d3-4ecbff649959" (UID: "75dcab4f-0176-43ea-81d3-4ecbff649959"). InnerVolumeSpecName "kube-api-access-zk8np". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:04:52 crc kubenswrapper[4794]: I0310 11:04:52.691185 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcab4f-0176-43ea-81d3-4ecbff649959-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:52 crc kubenswrapper[4794]: I0310 11:04:52.691222 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk8np\" (UniqueName: \"kubernetes.io/projected/75dcab4f-0176-43ea-81d3-4ecbff649959-kube-api-access-zk8np\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:52 crc kubenswrapper[4794]: I0310 11:04:52.696608 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75dcab4f-0176-43ea-81d3-4ecbff649959-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75dcab4f-0176-43ea-81d3-4ecbff649959" (UID: "75dcab4f-0176-43ea-81d3-4ecbff649959"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:04:52 crc kubenswrapper[4794]: I0310 11:04:52.793080 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dcab4f-0176-43ea-81d3-4ecbff649959-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:04:52 crc kubenswrapper[4794]: I0310 11:04:52.999252 4794 generic.go:334] "Generic (PLEG): container finished" podID="75dcab4f-0176-43ea-81d3-4ecbff649959" containerID="0ff396ca2070d65925f02a7bd4dd416883336cbd1322412504065f692cc25cfe" exitCode=0 Mar 10 11:04:52 crc kubenswrapper[4794]: I0310 11:04:52.999656 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kf727" Mar 10 11:04:53 crc kubenswrapper[4794]: I0310 11:04:53.008139 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf727" event={"ID":"75dcab4f-0176-43ea-81d3-4ecbff649959","Type":"ContainerDied","Data":"0ff396ca2070d65925f02a7bd4dd416883336cbd1322412504065f692cc25cfe"} Mar 10 11:04:53 crc kubenswrapper[4794]: I0310 11:04:53.008171 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf727" event={"ID":"75dcab4f-0176-43ea-81d3-4ecbff649959","Type":"ContainerDied","Data":"aa2b025a251823cc00aa455fe365354de5df2ce5c64ffc85e251bfd6fb5cd95a"} Mar 10 11:04:53 crc kubenswrapper[4794]: I0310 11:04:53.008187 4794 scope.go:117] "RemoveContainer" containerID="0ff396ca2070d65925f02a7bd4dd416883336cbd1322412504065f692cc25cfe" Mar 10 11:04:53 crc kubenswrapper[4794]: I0310 11:04:53.029166 4794 scope.go:117] "RemoveContainer" containerID="c3e9d7eac2d3ffbd40e45ee87820d7ce0d466e670a6a3507ffeb59d342e45a27" Mar 10 11:04:53 crc kubenswrapper[4794]: I0310 11:04:53.035316 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kf727"] Mar 10 11:04:53 crc kubenswrapper[4794]: I0310 11:04:53.045635 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kf727"] Mar 10 11:04:53 crc kubenswrapper[4794]: I0310 11:04:53.049084 4794 scope.go:117] "RemoveContainer" containerID="e6cfec15be4ea0143c5065b8eaccafdca4b38588762970ab15fe0fe9e6a34436" Mar 10 11:04:53 crc kubenswrapper[4794]: I0310 11:04:53.085916 4794 scope.go:117] "RemoveContainer" containerID="0ff396ca2070d65925f02a7bd4dd416883336cbd1322412504065f692cc25cfe" Mar 10 11:04:53 crc kubenswrapper[4794]: E0310 11:04:53.088224 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ff396ca2070d65925f02a7bd4dd416883336cbd1322412504065f692cc25cfe\": container with ID starting with 0ff396ca2070d65925f02a7bd4dd416883336cbd1322412504065f692cc25cfe not found: ID does not exist" containerID="0ff396ca2070d65925f02a7bd4dd416883336cbd1322412504065f692cc25cfe" Mar 10 11:04:53 crc kubenswrapper[4794]: I0310 11:04:53.088263 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ff396ca2070d65925f02a7bd4dd416883336cbd1322412504065f692cc25cfe"} err="failed to get container status \"0ff396ca2070d65925f02a7bd4dd416883336cbd1322412504065f692cc25cfe\": rpc error: code = NotFound desc = could not find container \"0ff396ca2070d65925f02a7bd4dd416883336cbd1322412504065f692cc25cfe\": container with ID starting with 0ff396ca2070d65925f02a7bd4dd416883336cbd1322412504065f692cc25cfe not found: ID does not exist" Mar 10 11:04:53 crc kubenswrapper[4794]: I0310 11:04:53.088300 4794 scope.go:117] "RemoveContainer" containerID="c3e9d7eac2d3ffbd40e45ee87820d7ce0d466e670a6a3507ffeb59d342e45a27" Mar 10 11:04:53 crc kubenswrapper[4794]: E0310 11:04:53.093056 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3e9d7eac2d3ffbd40e45ee87820d7ce0d466e670a6a3507ffeb59d342e45a27\": container with ID starting with c3e9d7eac2d3ffbd40e45ee87820d7ce0d466e670a6a3507ffeb59d342e45a27 not found: ID does not exist" containerID="c3e9d7eac2d3ffbd40e45ee87820d7ce0d466e670a6a3507ffeb59d342e45a27" Mar 10 11:04:53 crc kubenswrapper[4794]: I0310 11:04:53.093099 4794 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3e9d7eac2d3ffbd40e45ee87820d7ce0d466e670a6a3507ffeb59d342e45a27"} err="failed to get container status \"c3e9d7eac2d3ffbd40e45ee87820d7ce0d466e670a6a3507ffeb59d342e45a27\": rpc error: code = NotFound desc = could not find container \"c3e9d7eac2d3ffbd40e45ee87820d7ce0d466e670a6a3507ffeb59d342e45a27\": container with ID starting with c3e9d7eac2d3ffbd40e45ee87820d7ce0d466e670a6a3507ffeb59d342e45a27 not found: ID does not exist" Mar 10 11:04:53 crc kubenswrapper[4794]: I0310 11:04:53.093156 4794 scope.go:117] "RemoveContainer" containerID="e6cfec15be4ea0143c5065b8eaccafdca4b38588762970ab15fe0fe9e6a34436" Mar 10 11:04:53 crc kubenswrapper[4794]: E0310 11:04:53.093518 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6cfec15be4ea0143c5065b8eaccafdca4b38588762970ab15fe0fe9e6a34436\": container with ID starting with e6cfec15be4ea0143c5065b8eaccafdca4b38588762970ab15fe0fe9e6a34436 not found: ID does not exist" containerID="e6cfec15be4ea0143c5065b8eaccafdca4b38588762970ab15fe0fe9e6a34436" Mar 10 11:04:53 crc kubenswrapper[4794]: I0310 11:04:53.093561 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6cfec15be4ea0143c5065b8eaccafdca4b38588762970ab15fe0fe9e6a34436"} err="failed to get container status \"e6cfec15be4ea0143c5065b8eaccafdca4b38588762970ab15fe0fe9e6a34436\": rpc error: code = NotFound desc = could not find container \"e6cfec15be4ea0143c5065b8eaccafdca4b38588762970ab15fe0fe9e6a34436\": container with ID starting with e6cfec15be4ea0143c5065b8eaccafdca4b38588762970ab15fe0fe9e6a34436 not found: ID does not exist" Mar 10 11:04:54 crc kubenswrapper[4794]: I0310 11:04:54.030557 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75dcab4f-0176-43ea-81d3-4ecbff649959" path="/var/lib/kubelet/pods/75dcab4f-0176-43ea-81d3-4ecbff649959/volumes" Mar 10 11:05:00 crc kubenswrapper[4794]: I0310 11:05:00.009182 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:05:00 crc kubenswrapper[4794]: E0310 11:05:00.010677 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:05:09 crc kubenswrapper[4794]: I0310 11:05:09.157685 4794 generic.go:334] "Generic (PLEG): container finished" podID="707decd3-40de-4bcf-873b-51fc83a7f136" containerID="1132ca0a8b0f054fbf2bedb33cd4d26aaa82c1d84fba3d8e744180083f86d98c" exitCode=0 Mar 10 11:05:09 crc kubenswrapper[4794]: I0310 11:05:09.157796 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"707decd3-40de-4bcf-873b-51fc83a7f136","Type":"ContainerDied","Data":"1132ca0a8b0f054fbf2bedb33cd4d26aaa82c1d84fba3d8e744180083f86d98c"} Mar 10 11:05:09 crc kubenswrapper[4794]: I0310 11:05:09.161168 4794 generic.go:334] "Generic (PLEG): container finished" podID="514f5ee1-5433-486a-ace2-ad62c7622526" containerID="f0b6de910758e84bc501274c902ec0598831a00298f645e5ad2d33f753a52ea5" exitCode=0 Mar 10 11:05:09 crc kubenswrapper[4794]: I0310 
11:05:09.161224 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"514f5ee1-5433-486a-ace2-ad62c7622526","Type":"ContainerDied","Data":"f0b6de910758e84bc501274c902ec0598831a00298f645e5ad2d33f753a52ea5"} Mar 10 11:05:10 crc kubenswrapper[4794]: I0310 11:05:10.172107 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"514f5ee1-5433-486a-ace2-ad62c7622526","Type":"ContainerStarted","Data":"fc5c598793c24699c139bcce7f4ef4a282fb02f4547c9d9763ae989fcd385ae2"} Mar 10 11:05:10 crc kubenswrapper[4794]: I0310 11:05:10.172953 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 11:05:10 crc kubenswrapper[4794]: I0310 11:05:10.174869 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"707decd3-40de-4bcf-873b-51fc83a7f136","Type":"ContainerStarted","Data":"f6f09583138386c960704afdd55b5ac55c91623cc684c63c68dc50e2346ccae5"} Mar 10 11:05:10 crc kubenswrapper[4794]: I0310 11:05:10.175095 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:05:10 crc kubenswrapper[4794]: I0310 11:05:10.262813 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=43.262788258 podStartE2EDuration="43.262788258s" podCreationTimestamp="2026-03-10 11:04:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:05:10.210495412 +0000 UTC m=+4858.966666260" watchObservedRunningTime="2026-03-10 11:05:10.262788258 +0000 UTC m=+4859.018959106" Mar 10 11:05:10 crc kubenswrapper[4794]: I0310 11:05:10.268409 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.268391473 podStartE2EDuration="43.268391473s" podCreationTimestamp="2026-03-10 11:04:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:05:10.257740862 +0000 UTC m=+4859.013911690" watchObservedRunningTime="2026-03-10 11:05:10.268391473 +0000 UTC m=+4859.024562321" Mar 10 11:05:12 crc kubenswrapper[4794]: I0310 11:05:12.007050 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:05:12 crc kubenswrapper[4794]: E0310 11:05:12.007296 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:05:23 crc kubenswrapper[4794]: I0310 11:05:23.000026 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:05:23 crc kubenswrapper[4794]: E0310 11:05:23.001227 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:05:28 crc kubenswrapper[4794]: I0310 11:05:28.239643 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 11:05:28 crc kubenswrapper[4794]: I0310 11:05:28.250080 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.545781 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 10 11:05:35 crc kubenswrapper[4794]: E0310 11:05:35.546711 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dcab4f-0176-43ea-81d3-4ecbff649959" containerName="extract-utilities" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.546740 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dcab4f-0176-43ea-81d3-4ecbff649959" containerName="extract-utilities" Mar 10 11:05:35 crc kubenswrapper[4794]: E0310 11:05:35.546752 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000e40eb-9cb3-4535-ab24-652ebaf83d42" containerName="init" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.546759 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="000e40eb-9cb3-4535-ab24-652ebaf83d42" containerName="init" Mar 10 11:05:35 crc kubenswrapper[4794]: E0310 11:05:35.546779 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3274f97-634f-4482-aa42-800ba96a25c8" containerName="extract-utilities" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.546789 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3274f97-634f-4482-aa42-800ba96a25c8" containerName="extract-utilities" Mar 10 11:05:35 crc kubenswrapper[4794]: E0310 11:05:35.546807 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3274f97-634f-4482-aa42-800ba96a25c8" containerName="registry-server" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.546814 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3274f97-634f-4482-aa42-800ba96a25c8" containerName="registry-server" Mar 10 11:05:35 crc kubenswrapper[4794]: E0310 11:05:35.546829 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3274f97-634f-4482-aa42-800ba96a25c8" containerName="extract-content" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.546836 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3274f97-634f-4482-aa42-800ba96a25c8" containerName="extract-content" Mar 10 11:05:35 crc kubenswrapper[4794]: E0310 11:05:35.546856 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000e40eb-9cb3-4535-ab24-652ebaf83d42" containerName="dnsmasq-dns" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.546865 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="000e40eb-9cb3-4535-ab24-652ebaf83d42" containerName="dnsmasq-dns" Mar 10 11:05:35 crc kubenswrapper[4794]: E0310 11:05:35.546875 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dcab4f-0176-43ea-81d3-4ecbff649959" containerName="extract-content" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.546882 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dcab4f-0176-43ea-81d3-4ecbff649959" containerName="extract-content" Mar 10 11:05:35 crc kubenswrapper[4794]: E0310 11:05:35.546897 4794 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dcab4f-0176-43ea-81d3-4ecbff649959" containerName="registry-server" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.546905 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dcab4f-0176-43ea-81d3-4ecbff649959" containerName="registry-server" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.547079 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3274f97-634f-4482-aa42-800ba96a25c8" containerName="registry-server" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.547096 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="000e40eb-9cb3-4535-ab24-652ebaf83d42" containerName="dnsmasq-dns" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.547114 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="75dcab4f-0176-43ea-81d3-4ecbff649959" containerName="registry-server" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.547696 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.550779 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-2zbln" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.555825 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.702453 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4gqw\" (UniqueName: \"kubernetes.io/projected/d2eb37b2-36f5-46d9-b9a6-09b69867a936-kube-api-access-g4gqw\") pod \"mariadb-client\" (UID: \"d2eb37b2-36f5-46d9-b9a6-09b69867a936\") " pod="openstack/mariadb-client" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.804386 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4gqw\" (UniqueName: \"kubernetes.io/projected/d2eb37b2-36f5-46d9-b9a6-09b69867a936-kube-api-access-g4gqw\") pod \"mariadb-client\" (UID: \"d2eb37b2-36f5-46d9-b9a6-09b69867a936\") " pod="openstack/mariadb-client" Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.841678 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4gqw\" (UniqueName: \"kubernetes.io/projected/d2eb37b2-36f5-46d9-b9a6-09b69867a936-kube-api-access-g4gqw\") pod \"mariadb-client\" (UID: \"d2eb37b2-36f5-46d9-b9a6-09b69867a936\") " pod="openstack/mariadb-client"
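Before admitting openstack/mariadb-client, the cpu_manager, state_mem, and memory_manager records above show the kubelet dropping per-container resource accounting left over from pods that no longer exist (the deleted marketplace and dnsmasq pods). A toy model of that RemoveStaleState housekeeping, with illustrative types rather than the kubelet's actual managers:

```go
// Toy model of the RemoveStaleState housekeeping above; the real logic
// lives in the kubelet's cpu_manager and memory_manager.
package resourcestate

// assignment keys per-container accounting by pod UID and container name,
// mirroring the podUID/containerName pairs in the log.
type assignment struct {
	podUID    string
	container string
}

// removeStaleState drops accounting for containers whose pod is no longer
// active, so a freshly admitted pod starts from a clean slate.
func removeStaleState(state map[assignment]struct{}, activePods map[string]bool) {
	for a := range state {
		if !activePods[a.podUID] {
			delete(state, a)
		}
	}
}
```

Mar 10 11:05:35 crc kubenswrapper[4794]: I0310 11:05:35.880467 4794 util.go:30] "No sandbox for pod can be found. 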
Need to start a new one" pod="openstack/mariadb-client" Mar 10 11:05:36 crc kubenswrapper[4794]: I0310 11:05:35.999000 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:05:36 crc kubenswrapper[4794]: E0310 11:05:36.000812 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:05:36 crc kubenswrapper[4794]: I0310 11:05:36.448736 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 11:05:36 crc kubenswrapper[4794]: W0310 11:05:36.458024 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2eb37b2_36f5_46d9_b9a6_09b69867a936.slice/crio-3c33d38034b89336394eba5031f3246b2c9b3917ecc7649eff796d5631fe430d WatchSource:0}: Error finding container 3c33d38034b89336394eba5031f3246b2c9b3917ecc7649eff796d5631fe430d: Status 404 returned error can't find the container with id 3c33d38034b89336394eba5031f3246b2c9b3917ecc7649eff796d5631fe430d Mar 10 11:05:37 crc kubenswrapper[4794]: I0310 11:05:37.428250 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d2eb37b2-36f5-46d9-b9a6-09b69867a936","Type":"ContainerStarted","Data":"1c5edf217e6327d76e66287721423af94f00f7a740cbe906ea4acf92bbe70cfc"} Mar 10 11:05:37 crc kubenswrapper[4794]: I0310 11:05:37.428656 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d2eb37b2-36f5-46d9-b9a6-09b69867a936","Type":"ContainerStarted","Data":"3c33d38034b89336394eba5031f3246b2c9b3917ecc7649eff796d5631fe430d"} Mar 10 11:05:37 crc kubenswrapper[4794]: I0310 11:05:37.460087 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=1.945542987 podStartE2EDuration="2.460053991s" podCreationTimestamp="2026-03-10 11:05:35 +0000 UTC" firstStartedPulling="2026-03-10 11:05:36.461177041 +0000 UTC m=+4885.217347879" lastFinishedPulling="2026-03-10 11:05:36.975688025 +0000 UTC m=+4885.731858883" observedRunningTime="2026-03-10 11:05:37.44972564 +0000 UTC m=+4886.205896488" watchObservedRunningTime="2026-03-10 11:05:37.460053991 +0000 UTC m=+4886.216224839" Mar 10 11:05:51 crc kubenswrapper[4794]: I0310 11:05:50.999573 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:05:51 crc kubenswrapper[4794]: E0310 11:05:51.001773 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:05:51 crc kubenswrapper[4794]: I0310 11:05:51.261629 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 10 11:05:51 crc kubenswrapper[4794]: I0310 11:05:51.261879 4794 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/mariadb-client" podUID="d2eb37b2-36f5-46d9-b9a6-09b69867a936" containerName="mariadb-client" containerID="cri-o://1c5edf217e6327d76e66287721423af94f00f7a740cbe906ea4acf92bbe70cfc" gracePeriod=30 Mar 10 11:05:51 crc kubenswrapper[4794]: I0310 11:05:51.580702 4794 generic.go:334] "Generic (PLEG): container finished" podID="d2eb37b2-36f5-46d9-b9a6-09b69867a936" containerID="1c5edf217e6327d76e66287721423af94f00f7a740cbe906ea4acf92bbe70cfc" exitCode=143 Mar 10 11:05:51 crc kubenswrapper[4794]: I0310 11:05:51.580839 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d2eb37b2-36f5-46d9-b9a6-09b69867a936","Type":"ContainerDied","Data":"1c5edf217e6327d76e66287721423af94f00f7a740cbe906ea4acf92bbe70cfc"} Mar 10 11:05:51 crc kubenswrapper[4794]: I0310 11:05:51.843756 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 11:05:51 crc kubenswrapper[4794]: I0310 11:05:51.897300 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4gqw\" (UniqueName: \"kubernetes.io/projected/d2eb37b2-36f5-46d9-b9a6-09b69867a936-kube-api-access-g4gqw\") pod \"d2eb37b2-36f5-46d9-b9a6-09b69867a936\" (UID: \"d2eb37b2-36f5-46d9-b9a6-09b69867a936\") " Mar 10 11:05:51 crc kubenswrapper[4794]: I0310 11:05:51.904951 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2eb37b2-36f5-46d9-b9a6-09b69867a936-kube-api-access-g4gqw" (OuterVolumeSpecName: "kube-api-access-g4gqw") pod "d2eb37b2-36f5-46d9-b9a6-09b69867a936" (UID: "d2eb37b2-36f5-46d9-b9a6-09b69867a936"). InnerVolumeSpecName "kube-api-access-g4gqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:05:51 crc kubenswrapper[4794]: I0310 11:05:51.999654 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4gqw\" (UniqueName: \"kubernetes.io/projected/d2eb37b2-36f5-46d9-b9a6-09b69867a936-kube-api-access-g4gqw\") on node \"crc\" DevicePath \"\"" Mar 10 11:05:52 crc kubenswrapper[4794]: I0310 11:05:52.592674 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d2eb37b2-36f5-46d9-b9a6-09b69867a936","Type":"ContainerDied","Data":"3c33d38034b89336394eba5031f3246b2c9b3917ecc7649eff796d5631fe430d"} Mar 10 11:05:52 crc kubenswrapper[4794]: I0310 11:05:52.592730 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 10 11:05:52 crc kubenswrapper[4794]: I0310 11:05:52.592756 4794 scope.go:117] "RemoveContainer" containerID="1c5edf217e6327d76e66287721423af94f00f7a740cbe906ea4acf92bbe70cfc" Mar 10 11:05:52 crc kubenswrapper[4794]: I0310 11:05:52.632391 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 10 11:05:52 crc kubenswrapper[4794]: I0310 11:05:52.643431 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 10 11:05:54 crc kubenswrapper[4794]: I0310 11:05:54.017656 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2eb37b2-36f5-46d9-b9a6-09b69867a936" path="/var/lib/kubelet/pods/d2eb37b2-36f5-46d9-b9a6-09b69867a936/volumes" Mar 10 11:06:00 crc kubenswrapper[4794]: I0310 11:06:00.165544 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552346-xcxk2"] Mar 10 11:06:00 crc kubenswrapper[4794]: E0310 11:06:00.166680 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2eb37b2-36f5-46d9-b9a6-09b69867a936" containerName="mariadb-client" Mar 10 11:06:00 crc kubenswrapper[4794]: I0310 11:06:00.166726 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2eb37b2-36f5-46d9-b9a6-09b69867a936" containerName="mariadb-client" Mar 10 11:06:00 crc kubenswrapper[4794]: I0310 11:06:00.167002 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2eb37b2-36f5-46d9-b9a6-09b69867a936" containerName="mariadb-client" Mar 10 11:06:00 crc kubenswrapper[4794]: I0310 11:06:00.167771 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552346-xcxk2" Mar 10 11:06:00 crc kubenswrapper[4794]: I0310 11:06:00.171882 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:06:00 crc kubenswrapper[4794]: I0310 11:06:00.172435 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:06:00 crc kubenswrapper[4794]: I0310 11:06:00.175744 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:06:00 crc kubenswrapper[4794]: I0310 11:06:00.190851 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552346-xcxk2"] Mar 10 11:06:00 crc kubenswrapper[4794]: I0310 11:06:00.248344 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5bfq\" (UniqueName: \"kubernetes.io/projected/851b3d8e-0191-4952-ac50-d09258590e83-kube-api-access-p5bfq\") pod \"auto-csr-approver-29552346-xcxk2\" (UID: \"851b3d8e-0191-4952-ac50-d09258590e83\") " pod="openshift-infra/auto-csr-approver-29552346-xcxk2" Mar 10 11:06:00 crc kubenswrapper[4794]: I0310 11:06:00.349682 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5bfq\" (UniqueName: \"kubernetes.io/projected/851b3d8e-0191-4952-ac50-d09258590e83-kube-api-access-p5bfq\") pod \"auto-csr-approver-29552346-xcxk2\" (UID: \"851b3d8e-0191-4952-ac50-d09258590e83\") " pod="openshift-infra/auto-csr-approver-29552346-xcxk2" Mar 10 11:06:00 crc kubenswrapper[4794]: I0310 11:06:00.382572 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5bfq\" (UniqueName: 
\"kubernetes.io/projected/851b3d8e-0191-4952-ac50-d09258590e83-kube-api-access-p5bfq\") pod \"auto-csr-approver-29552346-xcxk2\" (UID: \"851b3d8e-0191-4952-ac50-d09258590e83\") " pod="openshift-infra/auto-csr-approver-29552346-xcxk2" Mar 10 11:06:00 crc kubenswrapper[4794]: I0310 11:06:00.500406 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552346-xcxk2" Mar 10 11:06:00 crc kubenswrapper[4794]: I0310 11:06:00.760410 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552346-xcxk2"] Mar 10 11:06:01 crc kubenswrapper[4794]: I0310 11:06:01.680149 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552346-xcxk2" event={"ID":"851b3d8e-0191-4952-ac50-d09258590e83","Type":"ContainerStarted","Data":"b6b51734c7a3b5feaf8c86569a4a6ea165811b0a583997c8a14ff1347aeee198"} Mar 10 11:06:02 crc kubenswrapper[4794]: I0310 11:06:02.694184 4794 generic.go:334] "Generic (PLEG): container finished" podID="851b3d8e-0191-4952-ac50-d09258590e83" containerID="08c2c66effe54b7118bdf05a58b266c465935f44cf2c11e832bc73c547e0685e" exitCode=0 Mar 10 11:06:02 crc kubenswrapper[4794]: I0310 11:06:02.694309 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552346-xcxk2" event={"ID":"851b3d8e-0191-4952-ac50-d09258590e83","Type":"ContainerDied","Data":"08c2c66effe54b7118bdf05a58b266c465935f44cf2c11e832bc73c547e0685e"} Mar 10 11:06:04 crc kubenswrapper[4794]: I0310 11:06:04.056734 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552346-xcxk2" Mar 10 11:06:04 crc kubenswrapper[4794]: I0310 11:06:04.109622 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5bfq\" (UniqueName: \"kubernetes.io/projected/851b3d8e-0191-4952-ac50-d09258590e83-kube-api-access-p5bfq\") pod \"851b3d8e-0191-4952-ac50-d09258590e83\" (UID: \"851b3d8e-0191-4952-ac50-d09258590e83\") " Mar 10 11:06:04 crc kubenswrapper[4794]: I0310 11:06:04.114673 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/851b3d8e-0191-4952-ac50-d09258590e83-kube-api-access-p5bfq" (OuterVolumeSpecName: "kube-api-access-p5bfq") pod "851b3d8e-0191-4952-ac50-d09258590e83" (UID: "851b3d8e-0191-4952-ac50-d09258590e83"). InnerVolumeSpecName "kube-api-access-p5bfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:06:04 crc kubenswrapper[4794]: I0310 11:06:04.212249 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5bfq\" (UniqueName: \"kubernetes.io/projected/851b3d8e-0191-4952-ac50-d09258590e83-kube-api-access-p5bfq\") on node \"crc\" DevicePath \"\"" Mar 10 11:06:04 crc kubenswrapper[4794]: I0310 11:06:04.717637 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552346-xcxk2" event={"ID":"851b3d8e-0191-4952-ac50-d09258590e83","Type":"ContainerDied","Data":"b6b51734c7a3b5feaf8c86569a4a6ea165811b0a583997c8a14ff1347aeee198"} Mar 10 11:06:04 crc kubenswrapper[4794]: I0310 11:06:04.717957 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6b51734c7a3b5feaf8c86569a4a6ea165811b0a583997c8a14ff1347aeee198" Mar 10 11:06:04 crc kubenswrapper[4794]: I0310 11:06:04.717687 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552346-xcxk2" Mar 10 11:06:05 crc kubenswrapper[4794]: I0310 11:06:05.138892 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552340-2skgf"] Mar 10 11:06:05 crc kubenswrapper[4794]: I0310 11:06:05.169823 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552340-2skgf"] Mar 10 11:06:05 crc kubenswrapper[4794]: I0310 11:06:05.999071 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:06:05 crc kubenswrapper[4794]: E0310 11:06:05.999311 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:06:06 crc kubenswrapper[4794]: I0310 11:06:06.016399 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a69e0088-724c-47d3-b1fc-89279d3195ed" path="/var/lib/kubelet/pods/a69e0088-724c-47d3-b1fc-89279d3195ed/volumes" Mar 10 11:06:16 crc kubenswrapper[4794]: I0310 11:06:16.999056 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:06:17 crc kubenswrapper[4794]: E0310 11:06:16.999717 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:06:30 crc kubenswrapper[4794]: I0310 11:06:30.999227 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:06:31 crc kubenswrapper[4794]: E0310 11:06:31.000227 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:06:41 crc kubenswrapper[4794]: I0310 11:06:41.624972 4794 scope.go:117] "RemoveContainer" containerID="8e921242eb8bf483693b8e8aa4b940aae81dd090fbbcb3fdc5f3d00ba9d12870" Mar 10 11:06:41 crc kubenswrapper[4794]: I0310 11:06:41.662758 4794 scope.go:117] "RemoveContainer" containerID="6bba33afae504fb232f42fc6a6fd9f0a82b3d56cfb82e02962e93f6c279d986a" Mar 10 11:06:45 crc kubenswrapper[4794]: I0310 11:06:44.999810 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:06:45 crc kubenswrapper[4794]: E0310 11:06:45.000765 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:06:57 crc kubenswrapper[4794]: I0310 11:06:57.999302 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:06:58 crc kubenswrapper[4794]: I0310 11:06:58.323329 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"f76c37e6c1f055cf0ebe872366c01cd32a21819058acbb1bc0dcd8368831960a"} Mar 10 11:08:00 crc kubenswrapper[4794]: I0310 11:08:00.172151 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552348-twd8d"] Mar 10 11:08:00 crc kubenswrapper[4794]: E0310 11:08:00.175058 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="851b3d8e-0191-4952-ac50-d09258590e83" containerName="oc" Mar 10 11:08:00 crc kubenswrapper[4794]: I0310 11:08:00.175275 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="851b3d8e-0191-4952-ac50-d09258590e83" containerName="oc" Mar 10 11:08:00 crc kubenswrapper[4794]: I0310 11:08:00.175788 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="851b3d8e-0191-4952-ac50-d09258590e83" containerName="oc" Mar 10 11:08:00 crc kubenswrapper[4794]: I0310 11:08:00.177041 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552348-twd8d" Mar 10 11:08:00 crc kubenswrapper[4794]: I0310 11:08:00.181038 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:08:00 crc kubenswrapper[4794]: I0310 11:08:00.183087 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552348-twd8d"] Mar 10 11:08:00 crc kubenswrapper[4794]: I0310 11:08:00.185463 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:08:00 crc kubenswrapper[4794]: I0310 11:08:00.187535 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:08:00 crc kubenswrapper[4794]: I0310 11:08:00.354032 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ln42\" (UniqueName: \"kubernetes.io/projected/a58ccd65-b7c0-452b-82e1-01ce3abd6f4a-kube-api-access-5ln42\") pod \"auto-csr-approver-29552348-twd8d\" (UID: \"a58ccd65-b7c0-452b-82e1-01ce3abd6f4a\") " pod="openshift-infra/auto-csr-approver-29552348-twd8d" Mar 10 11:08:00 crc kubenswrapper[4794]: I0310 11:08:00.455372 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ln42\" (UniqueName: \"kubernetes.io/projected/a58ccd65-b7c0-452b-82e1-01ce3abd6f4a-kube-api-access-5ln42\") pod \"auto-csr-approver-29552348-twd8d\" (UID: \"a58ccd65-b7c0-452b-82e1-01ce3abd6f4a\") " pod="openshift-infra/auto-csr-approver-29552348-twd8d" Mar 10 11:08:00 crc kubenswrapper[4794]: I0310 11:08:00.481664 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ln42\" (UniqueName: \"kubernetes.io/projected/a58ccd65-b7c0-452b-82e1-01ce3abd6f4a-kube-api-access-5ln42\") pod \"auto-csr-approver-29552348-twd8d\" (UID: 
\"a58ccd65-b7c0-452b-82e1-01ce3abd6f4a\") " pod="openshift-infra/auto-csr-approver-29552348-twd8d" Mar 10 11:08:00 crc kubenswrapper[4794]: I0310 11:08:00.500712 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552348-twd8d" Mar 10 11:08:00 crc kubenswrapper[4794]: I0310 11:08:00.996014 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552348-twd8d"] Mar 10 11:08:01 crc kubenswrapper[4794]: I0310 11:08:01.932369 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552348-twd8d" event={"ID":"a58ccd65-b7c0-452b-82e1-01ce3abd6f4a","Type":"ContainerStarted","Data":"e9f8f9b9c47dafcf8d1b152067406e3ffcbb0a8d37a028e31448050729da8945"} Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.206000 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nvzj2"] Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.209064 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.222506 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nvzj2"] Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.286568 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c010d2-f5ee-4754-ac96-b226e94a6928-utilities\") pod \"certified-operators-nvzj2\" (UID: \"52c010d2-f5ee-4754-ac96-b226e94a6928\") " pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.286610 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c010d2-f5ee-4754-ac96-b226e94a6928-catalog-content\") pod \"certified-operators-nvzj2\" (UID: \"52c010d2-f5ee-4754-ac96-b226e94a6928\") " pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.286661 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl6wp\" (UniqueName: \"kubernetes.io/projected/52c010d2-f5ee-4754-ac96-b226e94a6928-kube-api-access-tl6wp\") pod \"certified-operators-nvzj2\" (UID: \"52c010d2-f5ee-4754-ac96-b226e94a6928\") " pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.388264 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c010d2-f5ee-4754-ac96-b226e94a6928-catalog-content\") pod \"certified-operators-nvzj2\" (UID: \"52c010d2-f5ee-4754-ac96-b226e94a6928\") " pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.388501 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl6wp\" (UniqueName: \"kubernetes.io/projected/52c010d2-f5ee-4754-ac96-b226e94a6928-kube-api-access-tl6wp\") pod \"certified-operators-nvzj2\" (UID: \"52c010d2-f5ee-4754-ac96-b226e94a6928\") " pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.388645 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/52c010d2-f5ee-4754-ac96-b226e94a6928-utilities\") pod \"certified-operators-nvzj2\" (UID: \"52c010d2-f5ee-4754-ac96-b226e94a6928\") " pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.389053 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c010d2-f5ee-4754-ac96-b226e94a6928-utilities\") pod \"certified-operators-nvzj2\" (UID: \"52c010d2-f5ee-4754-ac96-b226e94a6928\") " pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.389357 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c010d2-f5ee-4754-ac96-b226e94a6928-catalog-content\") pod \"certified-operators-nvzj2\" (UID: \"52c010d2-f5ee-4754-ac96-b226e94a6928\") " pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.412597 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl6wp\" (UniqueName: \"kubernetes.io/projected/52c010d2-f5ee-4754-ac96-b226e94a6928-kube-api-access-tl6wp\") pod \"certified-operators-nvzj2\" (UID: \"52c010d2-f5ee-4754-ac96-b226e94a6928\") " pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.573846 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.939158 4794 generic.go:334] "Generic (PLEG): container finished" podID="a58ccd65-b7c0-452b-82e1-01ce3abd6f4a" containerID="35f0095264ba3d9ef4a7353e888be2676eef26a6d0e18908f36f913529577de5" exitCode=0 Mar 10 11:08:02 crc kubenswrapper[4794]: I0310 11:08:02.939233 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552348-twd8d" event={"ID":"a58ccd65-b7c0-452b-82e1-01ce3abd6f4a","Type":"ContainerDied","Data":"35f0095264ba3d9ef4a7353e888be2676eef26a6d0e18908f36f913529577de5"} Mar 10 11:08:03 crc kubenswrapper[4794]: I0310 11:08:03.057699 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nvzj2"] Mar 10 11:08:03 crc kubenswrapper[4794]: W0310 11:08:03.062020 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52c010d2_f5ee_4754_ac96_b226e94a6928.slice/crio-760714f19fc0fcb680691541915c7f460871407f72f1a07b32f7bc2a9aafaad8 WatchSource:0}: Error finding container 760714f19fc0fcb680691541915c7f460871407f72f1a07b32f7bc2a9aafaad8: Status 404 returned error can't find the container with id 760714f19fc0fcb680691541915c7f460871407f72f1a07b32f7bc2a9aafaad8 Mar 10 11:08:03 crc kubenswrapper[4794]: I0310 11:08:03.950592 4794 generic.go:334] "Generic (PLEG): container finished" podID="52c010d2-f5ee-4754-ac96-b226e94a6928" containerID="f33181522cf1103b106095b1a061c101b9ed368099cdb65b10c0fb4468097c07" exitCode=0 Mar 10 11:08:03 crc kubenswrapper[4794]: I0310 11:08:03.950876 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvzj2" event={"ID":"52c010d2-f5ee-4754-ac96-b226e94a6928","Type":"ContainerDied","Data":"f33181522cf1103b106095b1a061c101b9ed368099cdb65b10c0fb4468097c07"} Mar 10 11:08:03 crc kubenswrapper[4794]: I0310 11:08:03.951290 4794 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-nvzj2" event={"ID":"52c010d2-f5ee-4754-ac96-b226e94a6928","Type":"ContainerStarted","Data":"760714f19fc0fcb680691541915c7f460871407f72f1a07b32f7bc2a9aafaad8"} Mar 10 11:08:04 crc kubenswrapper[4794]: I0310 11:08:04.420023 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552348-twd8d" Mar 10 11:08:04 crc kubenswrapper[4794]: I0310 11:08:04.537459 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ln42\" (UniqueName: \"kubernetes.io/projected/a58ccd65-b7c0-452b-82e1-01ce3abd6f4a-kube-api-access-5ln42\") pod \"a58ccd65-b7c0-452b-82e1-01ce3abd6f4a\" (UID: \"a58ccd65-b7c0-452b-82e1-01ce3abd6f4a\") " Mar 10 11:08:04 crc kubenswrapper[4794]: I0310 11:08:04.547761 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a58ccd65-b7c0-452b-82e1-01ce3abd6f4a-kube-api-access-5ln42" (OuterVolumeSpecName: "kube-api-access-5ln42") pod "a58ccd65-b7c0-452b-82e1-01ce3abd6f4a" (UID: "a58ccd65-b7c0-452b-82e1-01ce3abd6f4a"). InnerVolumeSpecName "kube-api-access-5ln42". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:08:04 crc kubenswrapper[4794]: I0310 11:08:04.640033 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ln42\" (UniqueName: \"kubernetes.io/projected/a58ccd65-b7c0-452b-82e1-01ce3abd6f4a-kube-api-access-5ln42\") on node \"crc\" DevicePath \"\"" Mar 10 11:08:04 crc kubenswrapper[4794]: I0310 11:08:04.959829 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552348-twd8d" event={"ID":"a58ccd65-b7c0-452b-82e1-01ce3abd6f4a","Type":"ContainerDied","Data":"e9f8f9b9c47dafcf8d1b152067406e3ffcbb0a8d37a028e31448050729da8945"} Mar 10 11:08:04 crc kubenswrapper[4794]: I0310 11:08:04.960102 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9f8f9b9c47dafcf8d1b152067406e3ffcbb0a8d37a028e31448050729da8945" Mar 10 11:08:04 crc kubenswrapper[4794]: I0310 11:08:04.960122 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552348-twd8d" Mar 10 11:08:05 crc kubenswrapper[4794]: I0310 11:08:05.492579 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552342-msf78"] Mar 10 11:08:05 crc kubenswrapper[4794]: I0310 11:08:05.498168 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552342-msf78"] Mar 10 11:08:05 crc kubenswrapper[4794]: I0310 11:08:05.970890 4794 generic.go:334] "Generic (PLEG): container finished" podID="52c010d2-f5ee-4754-ac96-b226e94a6928" containerID="9fe292ea9f7f2227fe2a80a8c7cc6e20d7026e20acdb62656d2b433a168df85d" exitCode=0 Mar 10 11:08:05 crc kubenswrapper[4794]: I0310 11:08:05.971013 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvzj2" event={"ID":"52c010d2-f5ee-4754-ac96-b226e94a6928","Type":"ContainerDied","Data":"9fe292ea9f7f2227fe2a80a8c7cc6e20d7026e20acdb62656d2b433a168df85d"} Mar 10 11:08:06 crc kubenswrapper[4794]: I0310 11:08:06.020403 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="863a1c75-0aca-4cc7-b986-5d6e3814f721" path="/var/lib/kubelet/pods/863a1c75-0aca-4cc7-b986-5d6e3814f721/volumes" Mar 10 11:08:06 crc kubenswrapper[4794]: I0310 11:08:06.989311 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvzj2" event={"ID":"52c010d2-f5ee-4754-ac96-b226e94a6928","Type":"ContainerStarted","Data":"1f0d0466696df0f936be9c94e4ff4c8aa06a30cdda7f1bd32a50e3e3903df37c"} Mar 10 11:08:07 crc kubenswrapper[4794]: I0310 11:08:07.024396 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nvzj2" podStartSLOduration=2.590733545 podStartE2EDuration="5.024372374s" podCreationTimestamp="2026-03-10 11:08:02 +0000 UTC" firstStartedPulling="2026-03-10 11:08:03.953806054 +0000 UTC m=+5032.709976912" lastFinishedPulling="2026-03-10 11:08:06.387444913 +0000 UTC m=+5035.143615741" observedRunningTime="2026-03-10 11:08:07.017938575 +0000 UTC m=+5035.774109443" watchObservedRunningTime="2026-03-10 11:08:07.024372374 +0000 UTC m=+5035.780543222" Mar 10 11:08:12 crc kubenswrapper[4794]: I0310 11:08:12.575158 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:12 crc kubenswrapper[4794]: I0310 11:08:12.575800 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:12 crc kubenswrapper[4794]: I0310 11:08:12.654362 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:13 crc kubenswrapper[4794]: I0310 11:08:13.124170 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:18 crc kubenswrapper[4794]: I0310 11:08:18.804261 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nvzj2"] Mar 10 11:08:18 crc kubenswrapper[4794]: I0310 11:08:18.807557 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nvzj2" podUID="52c010d2-f5ee-4754-ac96-b226e94a6928" containerName="registry-server" containerID="cri-o://1f0d0466696df0f936be9c94e4ff4c8aa06a30cdda7f1bd32a50e3e3903df37c" gracePeriod=2 Mar 10 11:08:19 crc 
kubenswrapper[4794]: I0310 11:08:19.112303 4794 generic.go:334] "Generic (PLEG): container finished" podID="52c010d2-f5ee-4754-ac96-b226e94a6928" containerID="1f0d0466696df0f936be9c94e4ff4c8aa06a30cdda7f1bd32a50e3e3903df37c" exitCode=0 Mar 10 11:08:19 crc kubenswrapper[4794]: I0310 11:08:19.112379 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvzj2" event={"ID":"52c010d2-f5ee-4754-ac96-b226e94a6928","Type":"ContainerDied","Data":"1f0d0466696df0f936be9c94e4ff4c8aa06a30cdda7f1bd32a50e3e3903df37c"} Mar 10 11:08:19 crc kubenswrapper[4794]: I0310 11:08:19.243709 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:19 crc kubenswrapper[4794]: I0310 11:08:19.363379 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl6wp\" (UniqueName: \"kubernetes.io/projected/52c010d2-f5ee-4754-ac96-b226e94a6928-kube-api-access-tl6wp\") pod \"52c010d2-f5ee-4754-ac96-b226e94a6928\" (UID: \"52c010d2-f5ee-4754-ac96-b226e94a6928\") " Mar 10 11:08:19 crc kubenswrapper[4794]: I0310 11:08:19.363763 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c010d2-f5ee-4754-ac96-b226e94a6928-utilities\") pod \"52c010d2-f5ee-4754-ac96-b226e94a6928\" (UID: \"52c010d2-f5ee-4754-ac96-b226e94a6928\") " Mar 10 11:08:19 crc kubenswrapper[4794]: I0310 11:08:19.363885 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c010d2-f5ee-4754-ac96-b226e94a6928-catalog-content\") pod \"52c010d2-f5ee-4754-ac96-b226e94a6928\" (UID: \"52c010d2-f5ee-4754-ac96-b226e94a6928\") " Mar 10 11:08:19 crc kubenswrapper[4794]: I0310 11:08:19.364610 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c010d2-f5ee-4754-ac96-b226e94a6928-utilities" (OuterVolumeSpecName: "utilities") pod "52c010d2-f5ee-4754-ac96-b226e94a6928" (UID: "52c010d2-f5ee-4754-ac96-b226e94a6928"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:08:19 crc kubenswrapper[4794]: I0310 11:08:19.369967 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c010d2-f5ee-4754-ac96-b226e94a6928-kube-api-access-tl6wp" (OuterVolumeSpecName: "kube-api-access-tl6wp") pod "52c010d2-f5ee-4754-ac96-b226e94a6928" (UID: "52c010d2-f5ee-4754-ac96-b226e94a6928"). InnerVolumeSpecName "kube-api-access-tl6wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:08:19 crc kubenswrapper[4794]: I0310 11:08:19.455766 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c010d2-f5ee-4754-ac96-b226e94a6928-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52c010d2-f5ee-4754-ac96-b226e94a6928" (UID: "52c010d2-f5ee-4754-ac96-b226e94a6928"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:08:19 crc kubenswrapper[4794]: I0310 11:08:19.466506 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c010d2-f5ee-4754-ac96-b226e94a6928-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:08:19 crc kubenswrapper[4794]: I0310 11:08:19.466577 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl6wp\" (UniqueName: \"kubernetes.io/projected/52c010d2-f5ee-4754-ac96-b226e94a6928-kube-api-access-tl6wp\") on node \"crc\" DevicePath \"\"" Mar 10 11:08:19 crc kubenswrapper[4794]: I0310 11:08:19.466612 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c010d2-f5ee-4754-ac96-b226e94a6928-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:08:20 crc kubenswrapper[4794]: I0310 11:08:20.126744 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvzj2" event={"ID":"52c010d2-f5ee-4754-ac96-b226e94a6928","Type":"ContainerDied","Data":"760714f19fc0fcb680691541915c7f460871407f72f1a07b32f7bc2a9aafaad8"} Mar 10 11:08:20 crc kubenswrapper[4794]: I0310 11:08:20.126847 4794 scope.go:117] "RemoveContainer" containerID="1f0d0466696df0f936be9c94e4ff4c8aa06a30cdda7f1bd32a50e3e3903df37c" Mar 10 11:08:20 crc kubenswrapper[4794]: I0310 11:08:20.127078 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvzj2" Mar 10 11:08:20 crc kubenswrapper[4794]: I0310 11:08:20.164705 4794 scope.go:117] "RemoveContainer" containerID="9fe292ea9f7f2227fe2a80a8c7cc6e20d7026e20acdb62656d2b433a168df85d" Mar 10 11:08:20 crc kubenswrapper[4794]: I0310 11:08:20.170034 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nvzj2"] Mar 10 11:08:20 crc kubenswrapper[4794]: I0310 11:08:20.183450 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nvzj2"] Mar 10 11:08:20 crc kubenswrapper[4794]: I0310 11:08:20.195022 4794 scope.go:117] "RemoveContainer" containerID="f33181522cf1103b106095b1a061c101b9ed368099cdb65b10c0fb4468097c07" Mar 10 11:08:22 crc kubenswrapper[4794]: I0310 11:08:22.014236 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52c010d2-f5ee-4754-ac96-b226e94a6928" path="/var/lib/kubelet/pods/52c010d2-f5ee-4754-ac96-b226e94a6928/volumes" Mar 10 11:08:41 crc kubenswrapper[4794]: I0310 11:08:41.780691 4794 scope.go:117] "RemoveContainer" containerID="df357288bd1762198d32a3df55b0077fb9466bc51d819d5a6ca9d6d73642a5bb" Mar 10 11:09:22 crc kubenswrapper[4794]: I0310 11:09:22.967227 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:09:22 crc kubenswrapper[4794]: I0310 11:09:22.967884 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:09:52 crc kubenswrapper[4794]: I0310 11:09:52.967226 4794 patch_prober.go:28] interesting 
pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:09:52 crc kubenswrapper[4794]: I0310 11:09:52.967867 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.167223 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552350-4wd5c"] Mar 10 11:10:00 crc kubenswrapper[4794]: E0310 11:10:00.168545 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58ccd65-b7c0-452b-82e1-01ce3abd6f4a" containerName="oc" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.168580 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58ccd65-b7c0-452b-82e1-01ce3abd6f4a" containerName="oc" Mar 10 11:10:00 crc kubenswrapper[4794]: E0310 11:10:00.168627 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c010d2-f5ee-4754-ac96-b226e94a6928" containerName="extract-utilities" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.168646 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c010d2-f5ee-4754-ac96-b226e94a6928" containerName="extract-utilities" Mar 10 11:10:00 crc kubenswrapper[4794]: E0310 11:10:00.168678 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c010d2-f5ee-4754-ac96-b226e94a6928" containerName="extract-content" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.168696 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c010d2-f5ee-4754-ac96-b226e94a6928" containerName="extract-content" Mar 10 11:10:00 crc kubenswrapper[4794]: E0310 11:10:00.168722 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c010d2-f5ee-4754-ac96-b226e94a6928" containerName="registry-server" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.168739 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c010d2-f5ee-4754-ac96-b226e94a6928" containerName="registry-server" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.169249 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c010d2-f5ee-4754-ac96-b226e94a6928" containerName="registry-server" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.169326 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a58ccd65-b7c0-452b-82e1-01ce3abd6f4a" containerName="oc" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.170865 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552350-4wd5c" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.174833 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.175609 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.178642 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.192995 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552350-4wd5c"] Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.276391 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mfxj\" (UniqueName: \"kubernetes.io/projected/29974508-802d-4263-91c7-c41b07c18dd7-kube-api-access-8mfxj\") pod \"auto-csr-approver-29552350-4wd5c\" (UID: \"29974508-802d-4263-91c7-c41b07c18dd7\") " pod="openshift-infra/auto-csr-approver-29552350-4wd5c" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.378262 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mfxj\" (UniqueName: \"kubernetes.io/projected/29974508-802d-4263-91c7-c41b07c18dd7-kube-api-access-8mfxj\") pod \"auto-csr-approver-29552350-4wd5c\" (UID: \"29974508-802d-4263-91c7-c41b07c18dd7\") " pod="openshift-infra/auto-csr-approver-29552350-4wd5c" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.414299 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mfxj\" (UniqueName: \"kubernetes.io/projected/29974508-802d-4263-91c7-c41b07c18dd7-kube-api-access-8mfxj\") pod \"auto-csr-approver-29552350-4wd5c\" (UID: \"29974508-802d-4263-91c7-c41b07c18dd7\") " pod="openshift-infra/auto-csr-approver-29552350-4wd5c" Mar 10 11:10:00 crc kubenswrapper[4794]: I0310 11:10:00.499189 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552350-4wd5c" Mar 10 11:10:01 crc kubenswrapper[4794]: I0310 11:10:01.108950 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552350-4wd5c"] Mar 10 11:10:01 crc kubenswrapper[4794]: I0310 11:10:01.119403 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 11:10:02 crc kubenswrapper[4794]: I0310 11:10:02.036415 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552350-4wd5c" event={"ID":"29974508-802d-4263-91c7-c41b07c18dd7","Type":"ContainerStarted","Data":"9ef95db3159e691ba2f08a9108da1927a6626c3ead73d806633251a002e982e1"} Mar 10 11:10:04 crc kubenswrapper[4794]: I0310 11:10:04.057218 4794 generic.go:334] "Generic (PLEG): container finished" podID="29974508-802d-4263-91c7-c41b07c18dd7" containerID="cb39e8056a47c3aad427bfa16fa4633242ace230c4fee6f0e3feec2436b38082" exitCode=0 Mar 10 11:10:04 crc kubenswrapper[4794]: I0310 11:10:04.058577 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552350-4wd5c" event={"ID":"29974508-802d-4263-91c7-c41b07c18dd7","Type":"ContainerDied","Data":"cb39e8056a47c3aad427bfa16fa4633242ace230c4fee6f0e3feec2436b38082"} Mar 10 11:10:05 crc kubenswrapper[4794]: I0310 11:10:05.445631 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552350-4wd5c" Mar 10 11:10:05 crc kubenswrapper[4794]: I0310 11:10:05.567290 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mfxj\" (UniqueName: \"kubernetes.io/projected/29974508-802d-4263-91c7-c41b07c18dd7-kube-api-access-8mfxj\") pod \"29974508-802d-4263-91c7-c41b07c18dd7\" (UID: \"29974508-802d-4263-91c7-c41b07c18dd7\") " Mar 10 11:10:05 crc kubenswrapper[4794]: I0310 11:10:05.575389 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29974508-802d-4263-91c7-c41b07c18dd7-kube-api-access-8mfxj" (OuterVolumeSpecName: "kube-api-access-8mfxj") pod "29974508-802d-4263-91c7-c41b07c18dd7" (UID: "29974508-802d-4263-91c7-c41b07c18dd7"). InnerVolumeSpecName "kube-api-access-8mfxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:10:05 crc kubenswrapper[4794]: I0310 11:10:05.669414 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mfxj\" (UniqueName: \"kubernetes.io/projected/29974508-802d-4263-91c7-c41b07c18dd7-kube-api-access-8mfxj\") on node \"crc\" DevicePath \"\"" Mar 10 11:10:06 crc kubenswrapper[4794]: I0310 11:10:06.092233 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552350-4wd5c" event={"ID":"29974508-802d-4263-91c7-c41b07c18dd7","Type":"ContainerDied","Data":"9ef95db3159e691ba2f08a9108da1927a6626c3ead73d806633251a002e982e1"} Mar 10 11:10:06 crc kubenswrapper[4794]: I0310 11:10:06.092608 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ef95db3159e691ba2f08a9108da1927a6626c3ead73d806633251a002e982e1" Mar 10 11:10:06 crc kubenswrapper[4794]: I0310 11:10:06.092325 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552350-4wd5c" Mar 10 11:10:06 crc kubenswrapper[4794]: I0310 11:10:06.520477 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552344-bqfvt"] Mar 10 11:10:06 crc kubenswrapper[4794]: I0310 11:10:06.526309 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552344-bqfvt"] Mar 10 11:10:08 crc kubenswrapper[4794]: I0310 11:10:08.018801 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cff9bdcc-d9bc-448f-9ac1-a2462480dc5e" path="/var/lib/kubelet/pods/cff9bdcc-d9bc-448f-9ac1-a2462480dc5e/volumes" Mar 10 11:10:22 crc kubenswrapper[4794]: I0310 11:10:22.968179 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:10:22 crc kubenswrapper[4794]: I0310 11:10:22.968902 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:10:22 crc kubenswrapper[4794]: I0310 11:10:22.968966 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 11:10:22 crc kubenswrapper[4794]: I0310 11:10:22.969741 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f76c37e6c1f055cf0ebe872366c01cd32a21819058acbb1bc0dcd8368831960a"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 11:10:22 crc kubenswrapper[4794]: I0310 11:10:22.969813 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://f76c37e6c1f055cf0ebe872366c01cd32a21819058acbb1bc0dcd8368831960a" gracePeriod=600 Mar 10 11:10:23 crc kubenswrapper[4794]: I0310 11:10:23.273175 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="f76c37e6c1f055cf0ebe872366c01cd32a21819058acbb1bc0dcd8368831960a" exitCode=0 Mar 10 11:10:23 crc kubenswrapper[4794]: I0310 11:10:23.273248 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"f76c37e6c1f055cf0ebe872366c01cd32a21819058acbb1bc0dcd8368831960a"} Mar 10 11:10:23 crc kubenswrapper[4794]: I0310 11:10:23.273491 4794 scope.go:117] "RemoveContainer" containerID="7246b508c9eeccdf8a5ec9276e239c3ec455273ced1f978a5745ca8a5e590e02" Mar 10 11:10:24 crc kubenswrapper[4794]: I0310 11:10:24.284745 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" 
event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157"} Mar 10 11:10:41 crc kubenswrapper[4794]: I0310 11:10:41.915148 4794 scope.go:117] "RemoveContainer" containerID="45b576e38ffe4e5303df02004b9f815700bf901686269567915d703e0c193c3b" Mar 10 11:10:41 crc kubenswrapper[4794]: I0310 11:10:41.976832 4794 scope.go:117] "RemoveContainer" containerID="3ba063e2e29bfe8f6cb6ea2e7ea28189306e6f28661a6c001da495e5a98f6bbe" Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.450389 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Mar 10 11:10:46 crc kubenswrapper[4794]: E0310 11:10:46.451273 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29974508-802d-4263-91c7-c41b07c18dd7" containerName="oc" Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.451293 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="29974508-802d-4263-91c7-c41b07c18dd7" containerName="oc" Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.451537 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="29974508-802d-4263-91c7-c41b07c18dd7" containerName="oc" Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.452207 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.461644 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.509138 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-2zbln" Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.616666 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-15da9a79-4d21-4867-9ffa-d1af79d9f24c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15da9a79-4d21-4867-9ffa-d1af79d9f24c\") pod \"mariadb-copy-data\" (UID: \"2d7b8834-0c36-41e1-837a-7bbd54185723\") " pod="openstack/mariadb-copy-data" Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.617264 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4bhh\" (UniqueName: \"kubernetes.io/projected/2d7b8834-0c36-41e1-837a-7bbd54185723-kube-api-access-c4bhh\") pod \"mariadb-copy-data\" (UID: \"2d7b8834-0c36-41e1-837a-7bbd54185723\") " pod="openstack/mariadb-copy-data" Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.719791 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4bhh\" (UniqueName: \"kubernetes.io/projected/2d7b8834-0c36-41e1-837a-7bbd54185723-kube-api-access-c4bhh\") pod \"mariadb-copy-data\" (UID: \"2d7b8834-0c36-41e1-837a-7bbd54185723\") " pod="openstack/mariadb-copy-data" Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.719958 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-15da9a79-4d21-4867-9ffa-d1af79d9f24c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15da9a79-4d21-4867-9ffa-d1af79d9f24c\") pod \"mariadb-copy-data\" (UID: \"2d7b8834-0c36-41e1-837a-7bbd54185723\") " pod="openstack/mariadb-copy-data" Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.723164 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.723200 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-15da9a79-4d21-4867-9ffa-d1af79d9f24c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15da9a79-4d21-4867-9ffa-d1af79d9f24c\") pod \"mariadb-copy-data\" (UID: \"2d7b8834-0c36-41e1-837a-7bbd54185723\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2cb11fa4d04b3a4f788300b04481275667fbcfa5c03b883b20d15ba4a13fcb87/globalmount\"" pod="openstack/mariadb-copy-data" Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.767594 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4bhh\" (UniqueName: \"kubernetes.io/projected/2d7b8834-0c36-41e1-837a-7bbd54185723-kube-api-access-c4bhh\") pod \"mariadb-copy-data\" (UID: \"2d7b8834-0c36-41e1-837a-7bbd54185723\") " pod="openstack/mariadb-copy-data" Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.774021 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-15da9a79-4d21-4867-9ffa-d1af79d9f24c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15da9a79-4d21-4867-9ffa-d1af79d9f24c\") pod \"mariadb-copy-data\" (UID: \"2d7b8834-0c36-41e1-837a-7bbd54185723\") " pod="openstack/mariadb-copy-data" Mar 10 11:10:46 crc kubenswrapper[4794]: I0310 11:10:46.825789 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 10 11:10:47 crc kubenswrapper[4794]: I0310 11:10:47.393542 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 10 11:10:47 crc kubenswrapper[4794]: W0310 11:10:47.398555 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d7b8834_0c36_41e1_837a_7bbd54185723.slice/crio-afcbf05b47ade214ca72ee512964ef9a686a3dc614e30cc538c1cbd8fccd8109 WatchSource:0}: Error finding container afcbf05b47ade214ca72ee512964ef9a686a3dc614e30cc538c1cbd8fccd8109: Status 404 returned error can't find the container with id afcbf05b47ade214ca72ee512964ef9a686a3dc614e30cc538c1cbd8fccd8109 Mar 10 11:10:47 crc kubenswrapper[4794]: I0310 11:10:47.544208 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"2d7b8834-0c36-41e1-837a-7bbd54185723","Type":"ContainerStarted","Data":"afcbf05b47ade214ca72ee512964ef9a686a3dc614e30cc538c1cbd8fccd8109"} Mar 10 11:10:48 crc kubenswrapper[4794]: I0310 11:10:48.555550 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"2d7b8834-0c36-41e1-837a-7bbd54185723","Type":"ContainerStarted","Data":"cae650f04ee1d518deb6a50279452b396e8a329dee721a601ab8865cc3c8229b"} Mar 10 11:10:48 crc kubenswrapper[4794]: I0310 11:10:48.574210 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.574193839 podStartE2EDuration="3.574193839s" podCreationTimestamp="2026-03-10 11:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:10:48.569662188 +0000 UTC m=+5197.325833016" watchObservedRunningTime="2026-03-10 11:10:48.574193839 +0000 UTC m=+5197.330364657" Mar 10 11:10:51 crc kubenswrapper[4794]: I0310 11:10:51.193425 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 10 11:10:51 
crc kubenswrapper[4794]: I0310 11:10:51.195391 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 11:10:51 crc kubenswrapper[4794]: I0310 11:10:51.246841 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 11:10:51 crc kubenswrapper[4794]: I0310 11:10:51.299071 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f59ll\" (UniqueName: \"kubernetes.io/projected/ce244c62-56ff-46f3-bd95-2581aa5670ef-kube-api-access-f59ll\") pod \"mariadb-client\" (UID: \"ce244c62-56ff-46f3-bd95-2581aa5670ef\") " pod="openstack/mariadb-client" Mar 10 11:10:51 crc kubenswrapper[4794]: I0310 11:10:51.401565 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f59ll\" (UniqueName: \"kubernetes.io/projected/ce244c62-56ff-46f3-bd95-2581aa5670ef-kube-api-access-f59ll\") pod \"mariadb-client\" (UID: \"ce244c62-56ff-46f3-bd95-2581aa5670ef\") " pod="openstack/mariadb-client" Mar 10 11:10:51 crc kubenswrapper[4794]: I0310 11:10:51.437320 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f59ll\" (UniqueName: \"kubernetes.io/projected/ce244c62-56ff-46f3-bd95-2581aa5670ef-kube-api-access-f59ll\") pod \"mariadb-client\" (UID: \"ce244c62-56ff-46f3-bd95-2581aa5670ef\") " pod="openstack/mariadb-client" Mar 10 11:10:51 crc kubenswrapper[4794]: I0310 11:10:51.543307 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 11:10:52 crc kubenswrapper[4794]: I0310 11:10:52.058503 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 11:10:52 crc kubenswrapper[4794]: W0310 11:10:52.061600 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce244c62_56ff_46f3_bd95_2581aa5670ef.slice/crio-3a2bc2171e56f08d818b2596ce47b5f301e5b31f19206102eeff985dede5e0a9 WatchSource:0}: Error finding container 3a2bc2171e56f08d818b2596ce47b5f301e5b31f19206102eeff985dede5e0a9: Status 404 returned error can't find the container with id 3a2bc2171e56f08d818b2596ce47b5f301e5b31f19206102eeff985dede5e0a9 Mar 10 11:10:52 crc kubenswrapper[4794]: I0310 11:10:52.594388 4794 generic.go:334] "Generic (PLEG): container finished" podID="ce244c62-56ff-46f3-bd95-2581aa5670ef" containerID="896b029fde6e7782a7bba68fbad03c78f2c6ef6299314b4c63d3d1fd26b436f7" exitCode=0 Mar 10 11:10:52 crc kubenswrapper[4794]: I0310 11:10:52.594462 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ce244c62-56ff-46f3-bd95-2581aa5670ef","Type":"ContainerDied","Data":"896b029fde6e7782a7bba68fbad03c78f2c6ef6299314b4c63d3d1fd26b436f7"} Mar 10 11:10:52 crc kubenswrapper[4794]: I0310 11:10:52.594492 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ce244c62-56ff-46f3-bd95-2581aa5670ef","Type":"ContainerStarted","Data":"3a2bc2171e56f08d818b2596ce47b5f301e5b31f19206102eeff985dede5e0a9"} Mar 10 11:10:53 crc kubenswrapper[4794]: I0310 11:10:53.894913 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 10 11:10:53 crc kubenswrapper[4794]: I0310 11:10:53.922502 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_ce244c62-56ff-46f3-bd95-2581aa5670ef/mariadb-client/0.log" Mar 10 11:10:53 crc kubenswrapper[4794]: I0310 11:10:53.962730 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 10 11:10:53 crc kubenswrapper[4794]: I0310 11:10:53.970487 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.042158 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f59ll\" (UniqueName: \"kubernetes.io/projected/ce244c62-56ff-46f3-bd95-2581aa5670ef-kube-api-access-f59ll\") pod \"ce244c62-56ff-46f3-bd95-2581aa5670ef\" (UID: \"ce244c62-56ff-46f3-bd95-2581aa5670ef\") " Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.051014 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce244c62-56ff-46f3-bd95-2581aa5670ef-kube-api-access-f59ll" (OuterVolumeSpecName: "kube-api-access-f59ll") pod "ce244c62-56ff-46f3-bd95-2581aa5670ef" (UID: "ce244c62-56ff-46f3-bd95-2581aa5670ef"). InnerVolumeSpecName "kube-api-access-f59ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.100746 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 10 11:10:54 crc kubenswrapper[4794]: E0310 11:10:54.101093 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce244c62-56ff-46f3-bd95-2581aa5670ef" containerName="mariadb-client" Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.101108 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce244c62-56ff-46f3-bd95-2581aa5670ef" containerName="mariadb-client" Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.101440 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce244c62-56ff-46f3-bd95-2581aa5670ef" containerName="mariadb-client" Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.102044 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.119604 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.144384 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f59ll\" (UniqueName: \"kubernetes.io/projected/ce244c62-56ff-46f3-bd95-2581aa5670ef-kube-api-access-f59ll\") on node \"crc\" DevicePath \"\"" Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.246528 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxm6l\" (UniqueName: \"kubernetes.io/projected/dbd46fbb-f815-4fd2-834d-f3314f941a93-kube-api-access-zxm6l\") pod \"mariadb-client\" (UID: \"dbd46fbb-f815-4fd2-834d-f3314f941a93\") " pod="openstack/mariadb-client" Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.348823 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxm6l\" (UniqueName: \"kubernetes.io/projected/dbd46fbb-f815-4fd2-834d-f3314f941a93-kube-api-access-zxm6l\") pod \"mariadb-client\" (UID: \"dbd46fbb-f815-4fd2-834d-f3314f941a93\") " pod="openstack/mariadb-client" Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.379446 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxm6l\" (UniqueName: \"kubernetes.io/projected/dbd46fbb-f815-4fd2-834d-f3314f941a93-kube-api-access-zxm6l\") pod \"mariadb-client\" (UID: \"dbd46fbb-f815-4fd2-834d-f3314f941a93\") " pod="openstack/mariadb-client" Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.433533 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.619643 4794 scope.go:117] "RemoveContainer" containerID="896b029fde6e7782a7bba68fbad03c78f2c6ef6299314b4c63d3d1fd26b436f7" Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.619880 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 10 11:10:54 crc kubenswrapper[4794]: I0310 11:10:54.924483 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 11:10:54 crc kubenswrapper[4794]: W0310 11:10:54.932517 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbd46fbb_f815_4fd2_834d_f3314f941a93.slice/crio-d58a85c05beb3ad21a1300efbb3364d66192529a3bfabe3c92505e985d8fdf20 WatchSource:0}: Error finding container d58a85c05beb3ad21a1300efbb3364d66192529a3bfabe3c92505e985d8fdf20: Status 404 returned error can't find the container with id d58a85c05beb3ad21a1300efbb3364d66192529a3bfabe3c92505e985d8fdf20 Mar 10 11:10:55 crc kubenswrapper[4794]: I0310 11:10:55.654208 4794 generic.go:334] "Generic (PLEG): container finished" podID="dbd46fbb-f815-4fd2-834d-f3314f941a93" containerID="24cd0b134f71c7f37fdb9b7d6615d5c3345af2426be56aab801e6ad3e27ba750" exitCode=0 Mar 10 11:10:55 crc kubenswrapper[4794]: I0310 11:10:55.654320 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"dbd46fbb-f815-4fd2-834d-f3314f941a93","Type":"ContainerDied","Data":"24cd0b134f71c7f37fdb9b7d6615d5c3345af2426be56aab801e6ad3e27ba750"} Mar 10 11:10:55 crc kubenswrapper[4794]: I0310 11:10:55.654573 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"dbd46fbb-f815-4fd2-834d-f3314f941a93","Type":"ContainerStarted","Data":"d58a85c05beb3ad21a1300efbb3364d66192529a3bfabe3c92505e985d8fdf20"} Mar 10 11:10:56 crc kubenswrapper[4794]: I0310 11:10:56.015790 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce244c62-56ff-46f3-bd95-2581aa5670ef" path="/var/lib/kubelet/pods/ce244c62-56ff-46f3-bd95-2581aa5670ef/volumes" Mar 10 11:10:57 crc kubenswrapper[4794]: I0310 11:10:57.079085 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 11:10:57 crc kubenswrapper[4794]: I0310 11:10:57.103144 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_dbd46fbb-f815-4fd2-834d-f3314f941a93/mariadb-client/0.log" Mar 10 11:10:57 crc kubenswrapper[4794]: I0310 11:10:57.139173 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 10 11:10:57 crc kubenswrapper[4794]: I0310 11:10:57.151302 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 10 11:10:57 crc kubenswrapper[4794]: I0310 11:10:57.210725 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxm6l\" (UniqueName: \"kubernetes.io/projected/dbd46fbb-f815-4fd2-834d-f3314f941a93-kube-api-access-zxm6l\") pod \"dbd46fbb-f815-4fd2-834d-f3314f941a93\" (UID: \"dbd46fbb-f815-4fd2-834d-f3314f941a93\") " Mar 10 11:10:57 crc kubenswrapper[4794]: I0310 11:10:57.218602 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd46fbb-f815-4fd2-834d-f3314f941a93-kube-api-access-zxm6l" (OuterVolumeSpecName: "kube-api-access-zxm6l") pod "dbd46fbb-f815-4fd2-834d-f3314f941a93" (UID: "dbd46fbb-f815-4fd2-834d-f3314f941a93"). InnerVolumeSpecName "kube-api-access-zxm6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:10:57 crc kubenswrapper[4794]: I0310 11:10:57.313499 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxm6l\" (UniqueName: \"kubernetes.io/projected/dbd46fbb-f815-4fd2-834d-f3314f941a93-kube-api-access-zxm6l\") on node \"crc\" DevicePath \"\"" Mar 10 11:10:57 crc kubenswrapper[4794]: I0310 11:10:57.680416 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d58a85c05beb3ad21a1300efbb3364d66192529a3bfabe3c92505e985d8fdf20" Mar 10 11:10:57 crc kubenswrapper[4794]: I0310 11:10:57.680503 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 11:10:58 crc kubenswrapper[4794]: I0310 11:10:58.019423 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd46fbb-f815-4fd2-834d-f3314f941a93" path="/var/lib/kubelet/pods/dbd46fbb-f815-4fd2-834d-f3314f941a93/volumes" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.728366 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 11:11:30 crc kubenswrapper[4794]: E0310 11:11:30.729531 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd46fbb-f815-4fd2-834d-f3314f941a93" containerName="mariadb-client" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.729554 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd46fbb-f815-4fd2-834d-f3314f941a93" containerName="mariadb-client" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.729831 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd46fbb-f815-4fd2-834d-f3314f941a93" containerName="mariadb-client" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.731232 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.735257 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.735579 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-zplth" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.742716 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.743702 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.749034 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.750895 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.783802 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.786776 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.800981 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.814996 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.905488 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.907375 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.910625 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.912394 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-vgz2k" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.929096 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.931880 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c452b797-4adc-4fa8-9fd4-bd0397013cbf-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.931951 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55ecbbbb-548a-4a82-8302-984dc85503e8-config\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.931989 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjtwg\" (UniqueName: \"kubernetes.io/projected/c452b797-4adc-4fa8-9fd4-bd0397013cbf-kube-api-access-gjtwg\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932025 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3a5e7975-08a6-45db-9fe8-1596427465d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a5e7975-08a6-45db-9fe8-1596427465d8\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932074 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c452b797-4adc-4fa8-9fd4-bd0397013cbf-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932145 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:30 crc 
kubenswrapper[4794]: I0310 11:11:30.932167 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ecbbbb-548a-4a82-8302-984dc85503e8-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932223 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9rxj\" (UniqueName: \"kubernetes.io/projected/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-kube-api-access-c9rxj\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932264 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-config\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932295 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55ecbbbb-548a-4a82-8302-984dc85503e8-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932353 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932376 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c452b797-4adc-4fa8-9fd4-bd0397013cbf-config\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932476 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ctpm\" (UniqueName: \"kubernetes.io/projected/55ecbbbb-548a-4a82-8302-984dc85503e8-kube-api-access-4ctpm\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932499 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932526 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c452b797-4adc-4fa8-9fd4-bd0397013cbf-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932560 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pvc-c35ffe12-7f51-46a5-b689-b8ee64039625\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c35ffe12-7f51-46a5-b689-b8ee64039625\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932580 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55ecbbbb-548a-4a82-8302-984dc85503e8-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932618 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-472d4339-073f-4518-895d-c37d89b004bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-472d4339-073f-4518-895d-c37d89b004bc\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.932836 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.946486 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.949592 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.955577 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.957293 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.977002 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 10 11:11:30 crc kubenswrapper[4794]: I0310 11:11:30.997285 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499101 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c35ffe12-7f51-46a5-b689-b8ee64039625\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c35ffe12-7f51-46a5-b689-b8ee64039625\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499147 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55ecbbbb-548a-4a82-8302-984dc85503e8-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499180 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78f18d30-feff-435e-a61b-8ac7020f133e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499212 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-472d4339-073f-4518-895d-c37d89b004bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-472d4339-073f-4518-895d-c37d89b004bc\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499240 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-38f5a92a-f5e0-4a03-a6fa-ddfefede10e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38f5a92a-f5e0-4a03-a6fa-ddfefede10e3\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499267 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78f18d30-feff-435e-a61b-8ac7020f133e-config\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499292 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c452b797-4adc-4fa8-9fd4-bd0397013cbf-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499315 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55ecbbbb-548a-4a82-8302-984dc85503e8-config\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499361 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gjtwg\" (UniqueName: \"kubernetes.io/projected/c452b797-4adc-4fa8-9fd4-bd0397013cbf-kube-api-access-gjtwg\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499390 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3a5e7975-08a6-45db-9fe8-1596427465d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a5e7975-08a6-45db-9fe8-1596427465d8\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499415 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhtlk\" (UniqueName: \"kubernetes.io/projected/78f18d30-feff-435e-a61b-8ac7020f133e-kube-api-access-bhtlk\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499442 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c452b797-4adc-4fa8-9fd4-bd0397013cbf-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499471 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f18d30-feff-435e-a61b-8ac7020f133e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499504 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499525 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ecbbbb-548a-4a82-8302-984dc85503e8-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499559 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9rxj\" (UniqueName: \"kubernetes.io/projected/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-kube-api-access-c9rxj\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499588 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-config\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499616 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55ecbbbb-548a-4a82-8302-984dc85503e8-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: 
\"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499640 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78f18d30-feff-435e-a61b-8ac7020f133e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499671 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499694 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c452b797-4adc-4fa8-9fd4-bd0397013cbf-config\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499744 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ctpm\" (UniqueName: \"kubernetes.io/projected/55ecbbbb-548a-4a82-8302-984dc85503e8-kube-api-access-4ctpm\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499765 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.499788 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c452b797-4adc-4fa8-9fd4-bd0397013cbf-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.500999 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c452b797-4adc-4fa8-9fd4-bd0397013cbf-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.503510 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55ecbbbb-548a-4a82-8302-984dc85503e8-config\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.504584 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-config\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.504898 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55ecbbbb-548a-4a82-8302-984dc85503e8-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" 
(UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.505838 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.506497 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c452b797-4adc-4fa8-9fd4-bd0397013cbf-config\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.506831 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.510397 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55ecbbbb-548a-4a82-8302-984dc85503e8-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.512956 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c452b797-4adc-4fa8-9fd4-bd0397013cbf-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.513954 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ecbbbb-548a-4a82-8302-984dc85503e8-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.514147 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c452b797-4adc-4fa8-9fd4-bd0397013cbf-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.524543 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.524572 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.524630 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c35ffe12-7f51-46a5-b689-b8ee64039625\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c35ffe12-7f51-46a5-b689-b8ee64039625\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a84d9382d27e035d753f24823dbb3306c4d406a7edf481cc297ea09eafa290e6/globalmount\"" pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.525027 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-472d4339-073f-4518-895d-c37d89b004bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-472d4339-073f-4518-895d-c37d89b004bc\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1714eef4f64bae3f605bb21d40dc354c5bd9bb97822aeec7f5522befa378fd40/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.525305 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.525409 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3a5e7975-08a6-45db-9fe8-1596427465d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a5e7975-08a6-45db-9fe8-1596427465d8\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/29a12e1bb65495d32ee1b629d401c62e90c478d5c840d5113813ff09ff7ec6b6/globalmount\"" pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.527410 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.534360 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ctpm\" (UniqueName: \"kubernetes.io/projected/55ecbbbb-548a-4a82-8302-984dc85503e8-kube-api-access-4ctpm\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.562755 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9rxj\" (UniqueName: \"kubernetes.io/projected/9ed6efd3-8e35-42f4-902a-d76dceaf7e3a-kube-api-access-c9rxj\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.563814 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjtwg\" (UniqueName: \"kubernetes.io/projected/c452b797-4adc-4fa8-9fd4-bd0397013cbf-kube-api-access-gjtwg\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.582887 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-3a5e7975-08a6-45db-9fe8-1596427465d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a5e7975-08a6-45db-9fe8-1596427465d8\") pod \"ovsdbserver-nb-1\" (UID: \"55ecbbbb-548a-4a82-8302-984dc85503e8\") " pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.587820 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c35ffe12-7f51-46a5-b689-b8ee64039625\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c35ffe12-7f51-46a5-b689-b8ee64039625\") pod \"ovsdbserver-nb-2\" (UID: \"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a\") " pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.595063 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-472d4339-073f-4518-895d-c37d89b004bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-472d4339-073f-4518-895d-c37d89b004bc\") pod \"ovsdbserver-nb-0\" (UID: \"c452b797-4adc-4fa8-9fd4-bd0397013cbf\") " pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.601808 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8fa6cfd3-23af-4d38-8e4d-5f56cb6445cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8fa6cfd3-23af-4d38-8e4d-5f56cb6445cb\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.601854 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74dc83e0-8027-42ed-b958-117726668c78-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.601879 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-75b4b618-eda3-4556-bb6f-688e673def40\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75b4b618-eda3-4556-bb6f-688e673def40\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.601914 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74dc83e0-8027-42ed-b958-117726668c78-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.601956 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llsxx\" (UniqueName: \"kubernetes.io/projected/74dc83e0-8027-42ed-b958-117726668c78-kube-api-access-llsxx\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.602029 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78f18d30-feff-435e-a61b-8ac7020f133e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.602066 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.602088 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74dc83e0-8027-42ed-b958-117726668c78-config\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.602116 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-38f5a92a-f5e0-4a03-a6fa-ddfefede10e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38f5a92a-f5e0-4a03-a6fa-ddfefede10e3\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.602140 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78f18d30-feff-435e-a61b-8ac7020f133e-config\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.602160 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74dc83e0-8027-42ed-b958-117726668c78-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.602192 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-config\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.602215 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhtlk\" (UniqueName: \"kubernetes.io/projected/78f18d30-feff-435e-a61b-8ac7020f133e-kube-api-access-bhtlk\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.602257 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f18d30-feff-435e-a61b-8ac7020f133e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.602287 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.602317 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nljck\" (UniqueName: 
\"kubernetes.io/projected/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-kube-api-access-nljck\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.603697 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78f18d30-feff-435e-a61b-8ac7020f133e-config\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.606511 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.606606 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78f18d30-feff-435e-a61b-8ac7020f133e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.607133 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.607164 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-38f5a92a-f5e0-4a03-a6fa-ddfefede10e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38f5a92a-f5e0-4a03-a6fa-ddfefede10e3\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a5c99108e8e80b8c8ebec3e52f06391b664d5a2369f73685bd1918e5dab52913/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.607182 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78f18d30-feff-435e-a61b-8ac7020f133e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.608365 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78f18d30-feff-435e-a61b-8ac7020f133e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.610306 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f18d30-feff-435e-a61b-8ac7020f133e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.623060 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhtlk\" (UniqueName: \"kubernetes.io/projected/78f18d30-feff-435e-a61b-8ac7020f133e-kube-api-access-bhtlk\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.637349 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-38f5a92a-f5e0-4a03-a6fa-ddfefede10e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38f5a92a-f5e0-4a03-a6fa-ddfefede10e3\") pod \"ovsdbserver-sb-0\" (UID: \"78f18d30-feff-435e-a61b-8ac7020f133e\") " pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.665521 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.684674 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.707988 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74dc83e0-8027-42ed-b958-117726668c78-config\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.708742 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74dc83e0-8027-42ed-b958-117726668c78-config\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.708827 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74dc83e0-8027-42ed-b958-117726668c78-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.709244 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-config\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.709306 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.709355 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nljck\" (UniqueName: \"kubernetes.io/projected/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-kube-api-access-nljck\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.709389 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.709450 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8fa6cfd3-23af-4d38-8e4d-5f56cb6445cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8fa6cfd3-23af-4d38-8e4d-5f56cb6445cb\") pod \"ovsdbserver-sb-1\" (UID: 
\"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.709550 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74dc83e0-8027-42ed-b958-117726668c78-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.709577 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-75b4b618-eda3-4556-bb6f-688e673def40\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75b4b618-eda3-4556-bb6f-688e673def40\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.709615 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74dc83e0-8027-42ed-b958-117726668c78-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.709656 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llsxx\" (UniqueName: \"kubernetes.io/projected/74dc83e0-8027-42ed-b958-117726668c78-kube-api-access-llsxx\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.709689 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.710188 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.710221 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-config\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.710377 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74dc83e0-8027-42ed-b958-117726668c78-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.710680 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74dc83e0-8027-42ed-b958-117726668c78-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.710752 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.711245 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.711281 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-75b4b618-eda3-4556-bb6f-688e673def40\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75b4b618-eda3-4556-bb6f-688e673def40\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9d25d4a7ec2f6b29c91fe33c22269bcd883b6049c835e8bd9daa654a8b9d159b/globalmount\"" pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.711563 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.711633 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8fa6cfd3-23af-4d38-8e4d-5f56cb6445cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8fa6cfd3-23af-4d38-8e4d-5f56cb6445cb\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/09079bd57183a5de7fa40833ba29b4c15e300c222d3f962ec4c58cae37106f2b/globalmount\"" pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.715293 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.715869 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74dc83e0-8027-42ed-b958-117726668c78-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.723659 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.728842 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llsxx\" (UniqueName: \"kubernetes.io/projected/74dc83e0-8027-42ed-b958-117726668c78-kube-api-access-llsxx\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.729272 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nljck\" (UniqueName: \"kubernetes.io/projected/fb9f9744-a951-4f33-b9a7-2b2ed542dd84-kube-api-access-nljck\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.746577 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-75b4b618-eda3-4556-bb6f-688e673def40\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75b4b618-eda3-4556-bb6f-688e673def40\") pod \"ovsdbserver-sb-2\" (UID: \"74dc83e0-8027-42ed-b958-117726668c78\") " pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.747384 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8fa6cfd3-23af-4d38-8e4d-5f56cb6445cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8fa6cfd3-23af-4d38-8e4d-5f56cb6445cb\") pod \"ovsdbserver-sb-1\" (UID: \"fb9f9744-a951-4f33-b9a7-2b2ed542dd84\") " pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.848590 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.881384 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:31 crc kubenswrapper[4794]: I0310 11:11:31.896993 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:32 crc kubenswrapper[4794]: I0310 11:11:32.222564 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 10 11:11:32 crc kubenswrapper[4794]: I0310 11:11:32.307549 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 11:11:32 crc kubenswrapper[4794]: W0310 11:11:32.315789 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc452b797_4adc_4fa8_9fd4_bd0397013cbf.slice/crio-51262861811b39cae3aecf6fd077753d425f240ef7e5fed9f22bdc55930c15aa WatchSource:0}: Error finding container 51262861811b39cae3aecf6fd077753d425f240ef7e5fed9f22bdc55930c15aa: Status 404 returned error can't find the container with id 51262861811b39cae3aecf6fd077753d425f240ef7e5fed9f22bdc55930c15aa Mar 10 11:11:32 crc kubenswrapper[4794]: I0310 11:11:32.447461 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 10 11:11:32 crc kubenswrapper[4794]: W0310 11:11:32.448953 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb9f9744_a951_4f33_b9a7_2b2ed542dd84.slice/crio-bb07b55fae89c3f369eefbf3fd616cd850ccbc5569366f805ab61369aa1b7d28 WatchSource:0}: Error finding container bb07b55fae89c3f369eefbf3fd616cd850ccbc5569366f805ab61369aa1b7d28: Status 404 returned error can't find the container with id bb07b55fae89c3f369eefbf3fd616cd850ccbc5569366f805ab61369aa1b7d28 Mar 10 11:11:32 crc kubenswrapper[4794]: I0310 11:11:32.561947 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 10 11:11:32 crc kubenswrapper[4794]: W0310 11:11:32.563988 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74dc83e0_8027_42ed_b958_117726668c78.slice/crio-d2b155b38305f09def622cd3c0d70fcda74a5cc562226604fd6b9a3216ae486c WatchSource:0}: Error finding container d2b155b38305f09def622cd3c0d70fcda74a5cc562226604fd6b9a3216ae486c: Status 404 returned error can't find the container with id d2b155b38305f09def622cd3c0d70fcda74a5cc562226604fd6b9a3216ae486c Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.017125 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"fb9f9744-a951-4f33-b9a7-2b2ed542dd84","Type":"ContainerStarted","Data":"cebc09252055af3fc318506e3d9644fc9fc1139d65facbc24a3336a7a4fb4f70"} Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.017189 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"fb9f9744-a951-4f33-b9a7-2b2ed542dd84","Type":"ContainerStarted","Data":"c79aceb1921ebb0530e27888e819c20b7a351d4998e683f75e92051fc4ef7058"} Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.017201 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"fb9f9744-a951-4f33-b9a7-2b2ed542dd84","Type":"ContainerStarted","Data":"bb07b55fae89c3f369eefbf3fd616cd850ccbc5569366f805ab61369aa1b7d28"} Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.018273 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a","Type":"ContainerStarted","Data":"7c7bccc711f3676c23d1fc5cd6690bd6f549390beba05e481ade5d2b63ccfddd"} Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 
11:11:33.018318 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a","Type":"ContainerStarted","Data":"8f1caeef7dd7f8a762b9a46b3b5afdf3e3e5b0b60385bfb8d8aa4e35a3da8f1a"} Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.018328 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"9ed6efd3-8e35-42f4-902a-d76dceaf7e3a","Type":"ContainerStarted","Data":"11995a99a9526c4949a8bed4ad4ec884d4dcaee3a05c865a972ee0e776fc964b"} Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.023750 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c452b797-4adc-4fa8-9fd4-bd0397013cbf","Type":"ContainerStarted","Data":"fe182c27d40e2d884319d4617a45155617dc15ef2a55812bd798cfb6367463dd"} Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.023832 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c452b797-4adc-4fa8-9fd4-bd0397013cbf","Type":"ContainerStarted","Data":"46527187c87cd1b680216bfbef088c0c5ffe290c653de8c98de1a6d97f8d93d9"} Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.023864 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c452b797-4adc-4fa8-9fd4-bd0397013cbf","Type":"ContainerStarted","Data":"51262861811b39cae3aecf6fd077753d425f240ef7e5fed9f22bdc55930c15aa"} Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.026498 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"74dc83e0-8027-42ed-b958-117726668c78","Type":"ContainerStarted","Data":"32315bfe84324b6e3d5f833dd8f9c7a302603456d4207fbdcd548bac5b269427"} Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.026538 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"74dc83e0-8027-42ed-b958-117726668c78","Type":"ContainerStarted","Data":"9ad581946e31e816006bd77c64c223da638575c3748435a1a930137f6acb297b"} Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.026552 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"74dc83e0-8027-42ed-b958-117726668c78","Type":"ContainerStarted","Data":"d2b155b38305f09def622cd3c0d70fcda74a5cc562226604fd6b9a3216ae486c"} Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.045827 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.04580603 podStartE2EDuration="4.04580603s" podCreationTimestamp="2026-03-10 11:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:11:33.037161841 +0000 UTC m=+5241.793332659" watchObservedRunningTime="2026-03-10 11:11:33.04580603 +0000 UTC m=+5241.801976858" Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.060518 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.060495877 podStartE2EDuration="4.060495877s" podCreationTimestamp="2026-03-10 11:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:11:33.058611219 +0000 UTC m=+5241.814782037" watchObservedRunningTime="2026-03-10 11:11:33.060495877 +0000 UTC m=+5241.816666705" Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.081571 
4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.081551232 podStartE2EDuration="4.081551232s" podCreationTimestamp="2026-03-10 11:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:11:33.075085211 +0000 UTC m=+5241.831256039" watchObservedRunningTime="2026-03-10 11:11:33.081551232 +0000 UTC m=+5241.837722070" Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.097780 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 11:11:33 crc kubenswrapper[4794]: I0310 11:11:33.103182 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.103166495 podStartE2EDuration="4.103166495s" podCreationTimestamp="2026-03-10 11:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:11:33.10142035 +0000 UTC m=+5241.857591178" watchObservedRunningTime="2026-03-10 11:11:33.103166495 +0000 UTC m=+5241.859337313" Mar 10 11:11:34 crc kubenswrapper[4794]: I0310 11:11:34.033451 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 10 11:11:34 crc kubenswrapper[4794]: I0310 11:11:34.040110 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"78f18d30-feff-435e-a61b-8ac7020f133e","Type":"ContainerStarted","Data":"10e639a654cdfb57c25e411b9d081142301037e6b171bcc09d31210380b44d63"} Mar 10 11:11:34 crc kubenswrapper[4794]: I0310 11:11:34.040302 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"78f18d30-feff-435e-a61b-8ac7020f133e","Type":"ContainerStarted","Data":"64229afe0987880e361bbdcdb62ad89dff06db4f0bdc96f10607c99435b28f42"} Mar 10 11:11:34 crc kubenswrapper[4794]: I0310 11:11:34.040536 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"78f18d30-feff-435e-a61b-8ac7020f133e","Type":"ContainerStarted","Data":"4be7b9028bdc84e87059ed3931f72d369119556fbd9913109b313ae9e4f6cbb7"} Mar 10 11:11:34 crc kubenswrapper[4794]: W0310 11:11:34.039058 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55ecbbbb_548a_4a82_8302_984dc85503e8.slice/crio-da4a91c736ef4231e67d2b5900749260b58a8b44925ed499cb5b67fdd6647042 WatchSource:0}: Error finding container da4a91c736ef4231e67d2b5900749260b58a8b44925ed499cb5b67fdd6647042: Status 404 returned error can't find the container with id da4a91c736ef4231e67d2b5900749260b58a8b44925ed499cb5b67fdd6647042 Mar 10 11:11:34 crc kubenswrapper[4794]: I0310 11:11:34.066877 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.066849572 podStartE2EDuration="5.066849572s" podCreationTimestamp="2026-03-10 11:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:11:34.057936855 +0000 UTC m=+5242.814107703" watchObservedRunningTime="2026-03-10 11:11:34.066849572 +0000 UTC m=+5242.823020410" Mar 10 11:11:34 crc kubenswrapper[4794]: I0310 11:11:34.666275 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:34 
crc kubenswrapper[4794]: I0310 11:11:34.684893 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:34 crc kubenswrapper[4794]: I0310 11:11:34.849616 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:34 crc kubenswrapper[4794]: I0310 11:11:34.881961 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:34 crc kubenswrapper[4794]: I0310 11:11:34.897115 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:35 crc kubenswrapper[4794]: I0310 11:11:35.053691 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"55ecbbbb-548a-4a82-8302-984dc85503e8","Type":"ContainerStarted","Data":"9845e34a28fe5874fadbcfddab7268dfaf68697f390f2ba32b23754942136fe6"} Mar 10 11:11:35 crc kubenswrapper[4794]: I0310 11:11:35.053776 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"55ecbbbb-548a-4a82-8302-984dc85503e8","Type":"ContainerStarted","Data":"01587758cb504d7db84c0b9160109b441981596518495b5da1f053f9ef9a8268"} Mar 10 11:11:35 crc kubenswrapper[4794]: I0310 11:11:35.053820 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"55ecbbbb-548a-4a82-8302-984dc85503e8","Type":"ContainerStarted","Data":"da4a91c736ef4231e67d2b5900749260b58a8b44925ed499cb5b67fdd6647042"} Mar 10 11:11:35 crc kubenswrapper[4794]: I0310 11:11:35.082625 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=6.082607421 podStartE2EDuration="6.082607421s" podCreationTimestamp="2026-03-10 11:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:11:35.082245229 +0000 UTC m=+5243.838416087" watchObservedRunningTime="2026-03-10 11:11:35.082607421 +0000 UTC m=+5243.838778249" Mar 10 11:11:36 crc kubenswrapper[4794]: I0310 11:11:36.666566 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:36 crc kubenswrapper[4794]: I0310 11:11:36.684763 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:36 crc kubenswrapper[4794]: I0310 11:11:36.724865 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:36 crc kubenswrapper[4794]: I0310 11:11:36.849707 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:36 crc kubenswrapper[4794]: I0310 11:11:36.882536 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:36 crc kubenswrapper[4794]: I0310 11:11:36.897140 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:37 crc kubenswrapper[4794]: I0310 11:11:37.724329 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:37 crc kubenswrapper[4794]: I0310 11:11:37.742027 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:37 crc kubenswrapper[4794]: I0310 11:11:37.758121 4794 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:37 crc kubenswrapper[4794]: I0310 11:11:37.794615 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:37 crc kubenswrapper[4794]: I0310 11:11:37.800990 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 10 11:11:37 crc kubenswrapper[4794]: I0310 11:11:37.814769 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Mar 10 11:11:37 crc kubenswrapper[4794]: I0310 11:11:37.918352 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:37 crc kubenswrapper[4794]: I0310 11:11:37.967480 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:37 crc kubenswrapper[4794]: I0310 11:11:37.975002 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.030469 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.033975 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-778cb79699-rnv47"] Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.035224 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.043109 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.050402 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-778cb79699-rnv47"] Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.053430 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.122040 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.146024 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-255w8\" (UniqueName: \"kubernetes.io/projected/30970c88-fc35-425c-a989-063bf1dc5ff8-kube-api-access-255w8\") pod \"dnsmasq-dns-778cb79699-rnv47\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.146390 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-dns-svc\") pod \"dnsmasq-dns-778cb79699-rnv47\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.146789 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-ovsdbserver-nb\") pod \"dnsmasq-dns-778cb79699-rnv47\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:38 crc 
kubenswrapper[4794]: I0310 11:11:38.146887 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-config\") pod \"dnsmasq-dns-778cb79699-rnv47\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.247903 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-ovsdbserver-nb\") pod \"dnsmasq-dns-778cb79699-rnv47\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.247966 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-config\") pod \"dnsmasq-dns-778cb79699-rnv47\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.248027 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-255w8\" (UniqueName: \"kubernetes.io/projected/30970c88-fc35-425c-a989-063bf1dc5ff8-kube-api-access-255w8\") pod \"dnsmasq-dns-778cb79699-rnv47\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.248093 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-dns-svc\") pod \"dnsmasq-dns-778cb79699-rnv47\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.249141 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-ovsdbserver-nb\") pod \"dnsmasq-dns-778cb79699-rnv47\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.249210 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-config\") pod \"dnsmasq-dns-778cb79699-rnv47\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.249366 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-dns-svc\") pod \"dnsmasq-dns-778cb79699-rnv47\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.281104 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-255w8\" (UniqueName: \"kubernetes.io/projected/30970c88-fc35-425c-a989-063bf1dc5ff8-kube-api-access-255w8\") pod \"dnsmasq-dns-778cb79699-rnv47\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.368653 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.441657 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-778cb79699-rnv47"] Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.465617 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cd799f5b5-hgcjl"] Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.473030 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd799f5b5-hgcjl"] Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.473130 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.476802 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.655697 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-dns-svc\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.655745 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-config\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.655870 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.655903 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.656243 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg2xv\" (UniqueName: \"kubernetes.io/projected/69066222-1571-4ff1-86a6-1681967612da-kube-api-access-jg2xv\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.757508 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.757763 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.757833 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg2xv\" (UniqueName: \"kubernetes.io/projected/69066222-1571-4ff1-86a6-1681967612da-kube-api-access-jg2xv\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.757886 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-dns-svc\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.757905 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-config\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.758317 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.758574 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-config\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.759619 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-dns-svc\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.759646 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.812225 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg2xv\" (UniqueName: \"kubernetes.io/projected/69066222-1571-4ff1-86a6-1681967612da-kube-api-access-jg2xv\") pod \"dnsmasq-dns-6cd799f5b5-hgcjl\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:38 crc kubenswrapper[4794]: I0310 11:11:38.879747 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-778cb79699-rnv47"] Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.096985 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-778cb79699-rnv47" event={"ID":"30970c88-fc35-425c-a989-063bf1dc5ff8","Type":"ContainerStarted","Data":"19efc96fad918a2f445952450251dc4e93aca7be6672c63aa2ec2dbc15b74f29"} Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.097032 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778cb79699-rnv47" event={"ID":"30970c88-fc35-425c-a989-063bf1dc5ff8","Type":"ContainerStarted","Data":"0f2dbee8c75f115148c12debd7c0c0fc8567417925555d1c3cc60d9638c05e5d"} Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.097185 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-778cb79699-rnv47" podUID="30970c88-fc35-425c-a989-063bf1dc5ff8" containerName="init" containerID="cri-o://19efc96fad918a2f445952450251dc4e93aca7be6672c63aa2ec2dbc15b74f29" gracePeriod=10 Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.098356 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.171445 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.494184 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.570828 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-255w8\" (UniqueName: \"kubernetes.io/projected/30970c88-fc35-425c-a989-063bf1dc5ff8-kube-api-access-255w8\") pod \"30970c88-fc35-425c-a989-063bf1dc5ff8\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.570921 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-config\") pod \"30970c88-fc35-425c-a989-063bf1dc5ff8\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.571060 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-ovsdbserver-nb\") pod \"30970c88-fc35-425c-a989-063bf1dc5ff8\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.571716 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-dns-svc\") pod \"30970c88-fc35-425c-a989-063bf1dc5ff8\" (UID: \"30970c88-fc35-425c-a989-063bf1dc5ff8\") " Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.575986 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30970c88-fc35-425c-a989-063bf1dc5ff8-kube-api-access-255w8" (OuterVolumeSpecName: "kube-api-access-255w8") pod "30970c88-fc35-425c-a989-063bf1dc5ff8" (UID: "30970c88-fc35-425c-a989-063bf1dc5ff8"). InnerVolumeSpecName "kube-api-access-255w8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.589644 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-config" (OuterVolumeSpecName: "config") pod "30970c88-fc35-425c-a989-063bf1dc5ff8" (UID: "30970c88-fc35-425c-a989-063bf1dc5ff8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.596007 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "30970c88-fc35-425c-a989-063bf1dc5ff8" (UID: "30970c88-fc35-425c-a989-063bf1dc5ff8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.602805 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "30970c88-fc35-425c-a989-063bf1dc5ff8" (UID: "30970c88-fc35-425c-a989-063bf1dc5ff8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.643138 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd799f5b5-hgcjl"] Mar 10 11:11:39 crc kubenswrapper[4794]: W0310 11:11:39.644316 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69066222_1571_4ff1_86a6_1681967612da.slice/crio-c8a67e3985b11cb4dfe4ca9eb4dd117d03f21eca89533f0833164b9770bec688 WatchSource:0}: Error finding container c8a67e3985b11cb4dfe4ca9eb4dd117d03f21eca89533f0833164b9770bec688: Status 404 returned error can't find the container with id c8a67e3985b11cb4dfe4ca9eb4dd117d03f21eca89533f0833164b9770bec688 Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.674005 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-255w8\" (UniqueName: \"kubernetes.io/projected/30970c88-fc35-425c-a989-063bf1dc5ff8-kube-api-access-255w8\") on node \"crc\" DevicePath \"\"" Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.674057 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.674078 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 11:11:39 crc kubenswrapper[4794]: I0310 11:11:39.674097 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30970c88-fc35-425c-a989-063bf1dc5ff8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:11:40 crc kubenswrapper[4794]: I0310 11:11:40.112600 4794 generic.go:334] "Generic (PLEG): container finished" podID="69066222-1571-4ff1-86a6-1681967612da" containerID="ac432e8dea49d1b4563bfb647e53d8494d32c81c68c5a809f8e03285c79b8ccd" exitCode=0 Mar 10 11:11:40 crc kubenswrapper[4794]: I0310 11:11:40.112746 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" 
event={"ID":"69066222-1571-4ff1-86a6-1681967612da","Type":"ContainerDied","Data":"ac432e8dea49d1b4563bfb647e53d8494d32c81c68c5a809f8e03285c79b8ccd"} Mar 10 11:11:40 crc kubenswrapper[4794]: I0310 11:11:40.112823 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" event={"ID":"69066222-1571-4ff1-86a6-1681967612da","Type":"ContainerStarted","Data":"c8a67e3985b11cb4dfe4ca9eb4dd117d03f21eca89533f0833164b9770bec688"} Mar 10 11:11:40 crc kubenswrapper[4794]: I0310 11:11:40.115499 4794 generic.go:334] "Generic (PLEG): container finished" podID="30970c88-fc35-425c-a989-063bf1dc5ff8" containerID="19efc96fad918a2f445952450251dc4e93aca7be6672c63aa2ec2dbc15b74f29" exitCode=0 Mar 10 11:11:40 crc kubenswrapper[4794]: I0310 11:11:40.115785 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778cb79699-rnv47" event={"ID":"30970c88-fc35-425c-a989-063bf1dc5ff8","Type":"ContainerDied","Data":"19efc96fad918a2f445952450251dc4e93aca7be6672c63aa2ec2dbc15b74f29"} Mar 10 11:11:40 crc kubenswrapper[4794]: I0310 11:11:40.115855 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-778cb79699-rnv47" Mar 10 11:11:40 crc kubenswrapper[4794]: I0310 11:11:40.115884 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778cb79699-rnv47" event={"ID":"30970c88-fc35-425c-a989-063bf1dc5ff8","Type":"ContainerDied","Data":"0f2dbee8c75f115148c12debd7c0c0fc8567417925555d1c3cc60d9638c05e5d"} Mar 10 11:11:40 crc kubenswrapper[4794]: I0310 11:11:40.115931 4794 scope.go:117] "RemoveContainer" containerID="19efc96fad918a2f445952450251dc4e93aca7be6672c63aa2ec2dbc15b74f29" Mar 10 11:11:40 crc kubenswrapper[4794]: I0310 11:11:40.166151 4794 scope.go:117] "RemoveContainer" containerID="19efc96fad918a2f445952450251dc4e93aca7be6672c63aa2ec2dbc15b74f29" Mar 10 11:11:40 crc kubenswrapper[4794]: E0310 11:11:40.166658 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19efc96fad918a2f445952450251dc4e93aca7be6672c63aa2ec2dbc15b74f29\": container with ID starting with 19efc96fad918a2f445952450251dc4e93aca7be6672c63aa2ec2dbc15b74f29 not found: ID does not exist" containerID="19efc96fad918a2f445952450251dc4e93aca7be6672c63aa2ec2dbc15b74f29" Mar 10 11:11:40 crc kubenswrapper[4794]: I0310 11:11:40.166693 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19efc96fad918a2f445952450251dc4e93aca7be6672c63aa2ec2dbc15b74f29"} err="failed to get container status \"19efc96fad918a2f445952450251dc4e93aca7be6672c63aa2ec2dbc15b74f29\": rpc error: code = NotFound desc = could not find container \"19efc96fad918a2f445952450251dc4e93aca7be6672c63aa2ec2dbc15b74f29\": container with ID starting with 19efc96fad918a2f445952450251dc4e93aca7be6672c63aa2ec2dbc15b74f29 not found: ID does not exist" Mar 10 11:11:40 crc kubenswrapper[4794]: I0310 11:11:40.206649 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-778cb79699-rnv47"] Mar 10 11:11:40 crc kubenswrapper[4794]: I0310 11:11:40.211990 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-778cb79699-rnv47"] Mar 10 11:11:41 crc kubenswrapper[4794]: I0310 11:11:41.131810 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" 
event={"ID":"69066222-1571-4ff1-86a6-1681967612da","Type":"ContainerStarted","Data":"03da0622d5760d58291153ca42e768b1658694e688253040567b816d0b833cd0"} Mar 10 11:11:41 crc kubenswrapper[4794]: I0310 11:11:41.132249 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:41 crc kubenswrapper[4794]: I0310 11:11:41.171259 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" podStartSLOduration=3.171235555 podStartE2EDuration="3.171235555s" podCreationTimestamp="2026-03-10 11:11:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:11:41.168695416 +0000 UTC m=+5249.924866254" watchObservedRunningTime="2026-03-10 11:11:41.171235555 +0000 UTC m=+5249.927406383" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.032045 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30970c88-fc35-425c-a989-063bf1dc5ff8" path="/var/lib/kubelet/pods/30970c88-fc35-425c-a989-063bf1dc5ff8/volumes" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.033029 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 10 11:11:42 crc kubenswrapper[4794]: E0310 11:11:42.033373 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30970c88-fc35-425c-a989-063bf1dc5ff8" containerName="init" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.033390 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="30970c88-fc35-425c-a989-063bf1dc5ff8" containerName="init" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.033593 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="30970c88-fc35-425c-a989-063bf1dc5ff8" containerName="init" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.034187 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.034382 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.036470 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.120171 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6ph8\" (UniqueName: \"kubernetes.io/projected/84da0233-958e-4c22-b27c-3a0881846fb3-kube-api-access-g6ph8\") pod \"ovn-copy-data\" (UID: \"84da0233-958e-4c22-b27c-3a0881846fb3\") " pod="openstack/ovn-copy-data" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.120671 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c50601f3-8eda-4ed4-9103-3471358ec992\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c50601f3-8eda-4ed4-9103-3471358ec992\") pod \"ovn-copy-data\" (UID: \"84da0233-958e-4c22-b27c-3a0881846fb3\") " pod="openstack/ovn-copy-data" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.120809 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/84da0233-958e-4c22-b27c-3a0881846fb3-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"84da0233-958e-4c22-b27c-3a0881846fb3\") " pod="openstack/ovn-copy-data" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.224431 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6ph8\" (UniqueName: \"kubernetes.io/projected/84da0233-958e-4c22-b27c-3a0881846fb3-kube-api-access-g6ph8\") pod \"ovn-copy-data\" (UID: \"84da0233-958e-4c22-b27c-3a0881846fb3\") " pod="openstack/ovn-copy-data" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.224510 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c50601f3-8eda-4ed4-9103-3471358ec992\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c50601f3-8eda-4ed4-9103-3471358ec992\") pod \"ovn-copy-data\" (UID: \"84da0233-958e-4c22-b27c-3a0881846fb3\") " pod="openstack/ovn-copy-data" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.224539 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/84da0233-958e-4c22-b27c-3a0881846fb3-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"84da0233-958e-4c22-b27c-3a0881846fb3\") " pod="openstack/ovn-copy-data" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.229838 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.229967 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c50601f3-8eda-4ed4-9103-3471358ec992\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c50601f3-8eda-4ed4-9103-3471358ec992\") pod \"ovn-copy-data\" (UID: \"84da0233-958e-4c22-b27c-3a0881846fb3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7c57bc7577322c149b55eedc681f600c82105ed7beac371801e064032205218c/globalmount\"" pod="openstack/ovn-copy-data" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.241720 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/84da0233-958e-4c22-b27c-3a0881846fb3-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"84da0233-958e-4c22-b27c-3a0881846fb3\") " pod="openstack/ovn-copy-data" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.245974 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6ph8\" (UniqueName: \"kubernetes.io/projected/84da0233-958e-4c22-b27c-3a0881846fb3-kube-api-access-g6ph8\") pod \"ovn-copy-data\" (UID: \"84da0233-958e-4c22-b27c-3a0881846fb3\") " pod="openstack/ovn-copy-data" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.260592 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c50601f3-8eda-4ed4-9103-3471358ec992\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c50601f3-8eda-4ed4-9103-3471358ec992\") pod \"ovn-copy-data\" (UID: \"84da0233-958e-4c22-b27c-3a0881846fb3\") " pod="openstack/ovn-copy-data" Mar 10 11:11:42 crc kubenswrapper[4794]: I0310 11:11:42.367408 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 10 11:11:43 crc kubenswrapper[4794]: I0310 11:11:43.058993 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 10 11:11:43 crc kubenswrapper[4794]: I0310 11:11:43.149833 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"84da0233-958e-4c22-b27c-3a0881846fb3","Type":"ContainerStarted","Data":"1924ccd8aadfe2c3e652f89c10cdf0cabfa1a4466a434f8774421fc1aa8fe7c9"} Mar 10 11:11:44 crc kubenswrapper[4794]: I0310 11:11:44.159617 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"84da0233-958e-4c22-b27c-3a0881846fb3","Type":"ContainerStarted","Data":"09dc834cf10cfb1813d5549f474b71a0e3bba108fcbaa9f3ef535529d414d9f0"} Mar 10 11:11:44 crc kubenswrapper[4794]: I0310 11:11:44.192048 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.445921747 podStartE2EDuration="4.192017295s" podCreationTimestamp="2026-03-10 11:11:40 +0000 UTC" firstStartedPulling="2026-03-10 11:11:43.073895201 +0000 UTC m=+5251.830066049" lastFinishedPulling="2026-03-10 11:11:43.819990759 +0000 UTC m=+5252.576161597" observedRunningTime="2026-03-10 11:11:44.178106722 +0000 UTC m=+5252.934277560" watchObservedRunningTime="2026-03-10 11:11:44.192017295 +0000 UTC m=+5252.948188153" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.100167 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.178960 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-4n74d"] Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.179421 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" podUID="0f4731c7-dafb-410c-9a54-5a857c02cfbf" containerName="dnsmasq-dns" containerID="cri-o://b2dfa8f9c79764e9aa59702297a1345dc8192a28f1d794b6282c926009c7869a" gracePeriod=10 Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.702403 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.785578 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f4731c7-dafb-410c-9a54-5a857c02cfbf-config\") pod \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\" (UID: \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\") " Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.785684 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f4731c7-dafb-410c-9a54-5a857c02cfbf-dns-svc\") pod \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\" (UID: \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\") " Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.785720 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t82fl\" (UniqueName: \"kubernetes.io/projected/0f4731c7-dafb-410c-9a54-5a857c02cfbf-kube-api-access-t82fl\") pod \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\" (UID: \"0f4731c7-dafb-410c-9a54-5a857c02cfbf\") " Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.810931 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f4731c7-dafb-410c-9a54-5a857c02cfbf-kube-api-access-t82fl" (OuterVolumeSpecName: "kube-api-access-t82fl") pod "0f4731c7-dafb-410c-9a54-5a857c02cfbf" (UID: "0f4731c7-dafb-410c-9a54-5a857c02cfbf"). InnerVolumeSpecName "kube-api-access-t82fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.837429 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f4731c7-dafb-410c-9a54-5a857c02cfbf-config" (OuterVolumeSpecName: "config") pod "0f4731c7-dafb-410c-9a54-5a857c02cfbf" (UID: "0f4731c7-dafb-410c-9a54-5a857c02cfbf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.847216 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 10 11:11:49 crc kubenswrapper[4794]: E0310 11:11:49.847620 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f4731c7-dafb-410c-9a54-5a857c02cfbf" containerName="dnsmasq-dns" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.847641 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f4731c7-dafb-410c-9a54-5a857c02cfbf" containerName="dnsmasq-dns" Mar 10 11:11:49 crc kubenswrapper[4794]: E0310 11:11:49.847654 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f4731c7-dafb-410c-9a54-5a857c02cfbf" containerName="init" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.847660 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f4731c7-dafb-410c-9a54-5a857c02cfbf" containerName="init" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.847817 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f4731c7-dafb-410c-9a54-5a857c02cfbf" containerName="dnsmasq-dns" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.848665 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.851692 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.851901 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.852108 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-xqxrf" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.856869 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.866250 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f4731c7-dafb-410c-9a54-5a857c02cfbf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f4731c7-dafb-410c-9a54-5a857c02cfbf" (UID: "0f4731c7-dafb-410c-9a54-5a857c02cfbf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.886993 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f4731c7-dafb-410c-9a54-5a857c02cfbf-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.887029 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f4731c7-dafb-410c-9a54-5a857c02cfbf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.887042 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t82fl\" (UniqueName: \"kubernetes.io/projected/0f4731c7-dafb-410c-9a54-5a857c02cfbf-kube-api-access-t82fl\") on node \"crc\" DevicePath \"\"" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.988258 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0436aee5-3421-4a09-b2c9-468430d109ec-config\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.988310 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0436aee5-3421-4a09-b2c9-468430d109ec-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.988347 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0436aee5-3421-4a09-b2c9-468430d109ec-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.988385 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0436aee5-3421-4a09-b2c9-468430d109ec-scripts\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:49 crc kubenswrapper[4794]: I0310 11:11:49.988408 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gd7mm\" (UniqueName: \"kubernetes.io/projected/0436aee5-3421-4a09-b2c9-468430d109ec-kube-api-access-gd7mm\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.090003 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0436aee5-3421-4a09-b2c9-468430d109ec-scripts\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.090055 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd7mm\" (UniqueName: \"kubernetes.io/projected/0436aee5-3421-4a09-b2c9-468430d109ec-kube-api-access-gd7mm\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.090159 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0436aee5-3421-4a09-b2c9-468430d109ec-config\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.090198 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0436aee5-3421-4a09-b2c9-468430d109ec-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.090219 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0436aee5-3421-4a09-b2c9-468430d109ec-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.091019 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0436aee5-3421-4a09-b2c9-468430d109ec-scripts\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.091299 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0436aee5-3421-4a09-b2c9-468430d109ec-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.092089 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0436aee5-3421-4a09-b2c9-468430d109ec-config\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.094588 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0436aee5-3421-4a09-b2c9-468430d109ec-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.106503 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd7mm\" (UniqueName: 
\"kubernetes.io/projected/0436aee5-3421-4a09-b2c9-468430d109ec-kube-api-access-gd7mm\") pod \"ovn-northd-0\" (UID: \"0436aee5-3421-4a09-b2c9-468430d109ec\") " pod="openstack/ovn-northd-0" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.171810 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.232418 4794 generic.go:334] "Generic (PLEG): container finished" podID="0f4731c7-dafb-410c-9a54-5a857c02cfbf" containerID="b2dfa8f9c79764e9aa59702297a1345dc8192a28f1d794b6282c926009c7869a" exitCode=0 Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.232472 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" event={"ID":"0f4731c7-dafb-410c-9a54-5a857c02cfbf","Type":"ContainerDied","Data":"b2dfa8f9c79764e9aa59702297a1345dc8192a28f1d794b6282c926009c7869a"} Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.232508 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" event={"ID":"0f4731c7-dafb-410c-9a54-5a857c02cfbf","Type":"ContainerDied","Data":"95f2fe1b84c2fc5c62eb13a0e481e27881cce9db4a13a1aa5f4d572d723b10d4"} Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.232529 4794 scope.go:117] "RemoveContainer" containerID="b2dfa8f9c79764e9aa59702297a1345dc8192a28f1d794b6282c926009c7869a" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.232678 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-4n74d" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.271174 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-4n74d"] Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.283513 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-4n74d"] Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.283562 4794 scope.go:117] "RemoveContainer" containerID="bfb1e388d3989840d13d2f26911f6ca578d105595af361c6e10c5bccdaf9456b" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.360880 4794 scope.go:117] "RemoveContainer" containerID="b2dfa8f9c79764e9aa59702297a1345dc8192a28f1d794b6282c926009c7869a" Mar 10 11:11:50 crc kubenswrapper[4794]: E0310 11:11:50.361935 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2dfa8f9c79764e9aa59702297a1345dc8192a28f1d794b6282c926009c7869a\": container with ID starting with b2dfa8f9c79764e9aa59702297a1345dc8192a28f1d794b6282c926009c7869a not found: ID does not exist" containerID="b2dfa8f9c79764e9aa59702297a1345dc8192a28f1d794b6282c926009c7869a" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.362000 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2dfa8f9c79764e9aa59702297a1345dc8192a28f1d794b6282c926009c7869a"} err="failed to get container status \"b2dfa8f9c79764e9aa59702297a1345dc8192a28f1d794b6282c926009c7869a\": rpc error: code = NotFound desc = could not find container \"b2dfa8f9c79764e9aa59702297a1345dc8192a28f1d794b6282c926009c7869a\": container with ID starting with b2dfa8f9c79764e9aa59702297a1345dc8192a28f1d794b6282c926009c7869a not found: ID does not exist" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.362029 4794 scope.go:117] "RemoveContainer" containerID="bfb1e388d3989840d13d2f26911f6ca578d105595af361c6e10c5bccdaf9456b" Mar 10 11:11:50 crc 
kubenswrapper[4794]: E0310 11:11:50.362709 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfb1e388d3989840d13d2f26911f6ca578d105595af361c6e10c5bccdaf9456b\": container with ID starting with bfb1e388d3989840d13d2f26911f6ca578d105595af361c6e10c5bccdaf9456b not found: ID does not exist" containerID="bfb1e388d3989840d13d2f26911f6ca578d105595af361c6e10c5bccdaf9456b" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.362745 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfb1e388d3989840d13d2f26911f6ca578d105595af361c6e10c5bccdaf9456b"} err="failed to get container status \"bfb1e388d3989840d13d2f26911f6ca578d105595af361c6e10c5bccdaf9456b\": rpc error: code = NotFound desc = could not find container \"bfb1e388d3989840d13d2f26911f6ca578d105595af361c6e10c5bccdaf9456b\": container with ID starting with bfb1e388d3989840d13d2f26911f6ca578d105595af361c6e10c5bccdaf9456b not found: ID does not exist" Mar 10 11:11:50 crc kubenswrapper[4794]: I0310 11:11:50.647838 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 11:11:51 crc kubenswrapper[4794]: I0310 11:11:51.258448 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0436aee5-3421-4a09-b2c9-468430d109ec","Type":"ContainerStarted","Data":"6ee4068a36a0fae982dcfe7cba70f490a4fd1a9e3c87f104ec161c347912e770"} Mar 10 11:11:51 crc kubenswrapper[4794]: I0310 11:11:51.258494 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0436aee5-3421-4a09-b2c9-468430d109ec","Type":"ContainerStarted","Data":"60e745b5a0efa61b540ab79c701ad2835496dd4c4f8dfb11fdca5ec84df2b18d"} Mar 10 11:11:51 crc kubenswrapper[4794]: I0310 11:11:51.258511 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0436aee5-3421-4a09-b2c9-468430d109ec","Type":"ContainerStarted","Data":"803587ee44106e661cc1c04aac3274d0f9a46930f70b62f4ab83eefa52644fff"} Mar 10 11:11:51 crc kubenswrapper[4794]: I0310 11:11:51.259036 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 10 11:11:51 crc kubenswrapper[4794]: I0310 11:11:51.300643 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.300621098 podStartE2EDuration="2.300621098s" podCreationTimestamp="2026-03-10 11:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:11:51.29005056 +0000 UTC m=+5260.046221388" watchObservedRunningTime="2026-03-10 11:11:51.300621098 +0000 UTC m=+5260.056791926" Mar 10 11:11:52 crc kubenswrapper[4794]: I0310 11:11:52.033413 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f4731c7-dafb-410c-9a54-5a857c02cfbf" path="/var/lib/kubelet/pods/0f4731c7-dafb-410c-9a54-5a857c02cfbf/volumes" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.383357 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-d5rbx"] Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.384585 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-d5rbx" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.398231 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-d5rbx"] Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.474367 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2a13-account-create-update-9pmtz"] Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.475751 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2a13-account-create-update-9pmtz" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.479283 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.483637 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2a13-account-create-update-9pmtz"] Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.496715 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gd6m\" (UniqueName: \"kubernetes.io/projected/206a0153-5800-403c-8749-e68d34d36a81-kube-api-access-6gd6m\") pod \"keystone-db-create-d5rbx\" (UID: \"206a0153-5800-403c-8749-e68d34d36a81\") " pod="openstack/keystone-db-create-d5rbx" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.496771 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/206a0153-5800-403c-8749-e68d34d36a81-operator-scripts\") pod \"keystone-db-create-d5rbx\" (UID: \"206a0153-5800-403c-8749-e68d34d36a81\") " pod="openstack/keystone-db-create-d5rbx" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.597820 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/206a0153-5800-403c-8749-e68d34d36a81-operator-scripts\") pod \"keystone-db-create-d5rbx\" (UID: \"206a0153-5800-403c-8749-e68d34d36a81\") " pod="openstack/keystone-db-create-d5rbx" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.597903 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kfnk\" (UniqueName: \"kubernetes.io/projected/b2416ded-2ad4-4cef-af44-8c6e944834d8-kube-api-access-4kfnk\") pod \"keystone-2a13-account-create-update-9pmtz\" (UID: \"b2416ded-2ad4-4cef-af44-8c6e944834d8\") " pod="openstack/keystone-2a13-account-create-update-9pmtz" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.597981 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gd6m\" (UniqueName: \"kubernetes.io/projected/206a0153-5800-403c-8749-e68d34d36a81-kube-api-access-6gd6m\") pod \"keystone-db-create-d5rbx\" (UID: \"206a0153-5800-403c-8749-e68d34d36a81\") " pod="openstack/keystone-db-create-d5rbx" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.598063 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2416ded-2ad4-4cef-af44-8c6e944834d8-operator-scripts\") pod \"keystone-2a13-account-create-update-9pmtz\" (UID: \"b2416ded-2ad4-4cef-af44-8c6e944834d8\") " pod="openstack/keystone-2a13-account-create-update-9pmtz" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.598541 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/206a0153-5800-403c-8749-e68d34d36a81-operator-scripts\") pod \"keystone-db-create-d5rbx\" (UID: \"206a0153-5800-403c-8749-e68d34d36a81\") " pod="openstack/keystone-db-create-d5rbx" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.621180 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gd6m\" (UniqueName: \"kubernetes.io/projected/206a0153-5800-403c-8749-e68d34d36a81-kube-api-access-6gd6m\") pod \"keystone-db-create-d5rbx\" (UID: \"206a0153-5800-403c-8749-e68d34d36a81\") " pod="openstack/keystone-db-create-d5rbx" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.699408 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2416ded-2ad4-4cef-af44-8c6e944834d8-operator-scripts\") pod \"keystone-2a13-account-create-update-9pmtz\" (UID: \"b2416ded-2ad4-4cef-af44-8c6e944834d8\") " pod="openstack/keystone-2a13-account-create-update-9pmtz" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.699544 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kfnk\" (UniqueName: \"kubernetes.io/projected/b2416ded-2ad4-4cef-af44-8c6e944834d8-kube-api-access-4kfnk\") pod \"keystone-2a13-account-create-update-9pmtz\" (UID: \"b2416ded-2ad4-4cef-af44-8c6e944834d8\") " pod="openstack/keystone-2a13-account-create-update-9pmtz" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.700764 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2416ded-2ad4-4cef-af44-8c6e944834d8-operator-scripts\") pod \"keystone-2a13-account-create-update-9pmtz\" (UID: \"b2416ded-2ad4-4cef-af44-8c6e944834d8\") " pod="openstack/keystone-2a13-account-create-update-9pmtz" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.704244 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-d5rbx" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.718119 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kfnk\" (UniqueName: \"kubernetes.io/projected/b2416ded-2ad4-4cef-af44-8c6e944834d8-kube-api-access-4kfnk\") pod \"keystone-2a13-account-create-update-9pmtz\" (UID: \"b2416ded-2ad4-4cef-af44-8c6e944834d8\") " pod="openstack/keystone-2a13-account-create-update-9pmtz" Mar 10 11:11:55 crc kubenswrapper[4794]: I0310 11:11:55.794688 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2a13-account-create-update-9pmtz" Mar 10 11:11:56 crc kubenswrapper[4794]: I0310 11:11:56.024334 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-d5rbx"] Mar 10 11:11:56 crc kubenswrapper[4794]: W0310 11:11:56.048178 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod206a0153_5800_403c_8749_e68d34d36a81.slice/crio-d71456099f728332a03ad7c371d99c5d3745df5e09cd4c8eb414bd6c528b2c9e WatchSource:0}: Error finding container d71456099f728332a03ad7c371d99c5d3745df5e09cd4c8eb414bd6c528b2c9e: Status 404 returned error can't find the container with id d71456099f728332a03ad7c371d99c5d3745df5e09cd4c8eb414bd6c528b2c9e Mar 10 11:11:56 crc kubenswrapper[4794]: I0310 11:11:56.312266 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-d5rbx" event={"ID":"206a0153-5800-403c-8749-e68d34d36a81","Type":"ContainerStarted","Data":"e6435e00dc9ed00c63a47426d0fe37e7439e843bd0c8aa87fdef1360a6df02b0"} Mar 10 11:11:56 crc kubenswrapper[4794]: I0310 11:11:56.312322 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-d5rbx" event={"ID":"206a0153-5800-403c-8749-e68d34d36a81","Type":"ContainerStarted","Data":"d71456099f728332a03ad7c371d99c5d3745df5e09cd4c8eb414bd6c528b2c9e"} Mar 10 11:11:56 crc kubenswrapper[4794]: I0310 11:11:56.333278 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-d5rbx" podStartSLOduration=1.333259923 podStartE2EDuration="1.333259923s" podCreationTimestamp="2026-03-10 11:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:11:56.325958295 +0000 UTC m=+5265.082129113" watchObservedRunningTime="2026-03-10 11:11:56.333259923 +0000 UTC m=+5265.089430731" Mar 10 11:11:56 crc kubenswrapper[4794]: I0310 11:11:56.389581 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2a13-account-create-update-9pmtz"] Mar 10 11:11:56 crc kubenswrapper[4794]: W0310 11:11:56.470438 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2416ded_2ad4_4cef_af44_8c6e944834d8.slice/crio-40fbca779ea20ccc4d2f5e45c7cf6c83dadd427c0acba33c4420da4e94ffb7cf WatchSource:0}: Error finding container 40fbca779ea20ccc4d2f5e45c7cf6c83dadd427c0acba33c4420da4e94ffb7cf: Status 404 returned error can't find the container with id 40fbca779ea20ccc4d2f5e45c7cf6c83dadd427c0acba33c4420da4e94ffb7cf Mar 10 11:11:57 crc kubenswrapper[4794]: I0310 11:11:57.328175 4794 generic.go:334] "Generic (PLEG): container finished" podID="b2416ded-2ad4-4cef-af44-8c6e944834d8" containerID="a6d7da3bc9b8ea52a168771183dfea67df181fe896297055a7347ee15be0d992" exitCode=0 Mar 10 11:11:57 crc kubenswrapper[4794]: I0310 11:11:57.328302 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2a13-account-create-update-9pmtz" event={"ID":"b2416ded-2ad4-4cef-af44-8c6e944834d8","Type":"ContainerDied","Data":"a6d7da3bc9b8ea52a168771183dfea67df181fe896297055a7347ee15be0d992"} Mar 10 11:11:57 crc kubenswrapper[4794]: I0310 11:11:57.328351 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2a13-account-create-update-9pmtz" 
event={"ID":"b2416ded-2ad4-4cef-af44-8c6e944834d8","Type":"ContainerStarted","Data":"40fbca779ea20ccc4d2f5e45c7cf6c83dadd427c0acba33c4420da4e94ffb7cf"} Mar 10 11:11:57 crc kubenswrapper[4794]: I0310 11:11:57.331026 4794 generic.go:334] "Generic (PLEG): container finished" podID="206a0153-5800-403c-8749-e68d34d36a81" containerID="e6435e00dc9ed00c63a47426d0fe37e7439e843bd0c8aa87fdef1360a6df02b0" exitCode=0 Mar 10 11:11:57 crc kubenswrapper[4794]: I0310 11:11:57.331094 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-d5rbx" event={"ID":"206a0153-5800-403c-8749-e68d34d36a81","Type":"ContainerDied","Data":"e6435e00dc9ed00c63a47426d0fe37e7439e843bd0c8aa87fdef1360a6df02b0"} Mar 10 11:11:58 crc kubenswrapper[4794]: I0310 11:11:58.881256 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-d5rbx" Mar 10 11:11:58 crc kubenswrapper[4794]: I0310 11:11:58.896484 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2a13-account-create-update-9pmtz" Mar 10 11:11:58 crc kubenswrapper[4794]: I0310 11:11:58.957005 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gd6m\" (UniqueName: \"kubernetes.io/projected/206a0153-5800-403c-8749-e68d34d36a81-kube-api-access-6gd6m\") pod \"206a0153-5800-403c-8749-e68d34d36a81\" (UID: \"206a0153-5800-403c-8749-e68d34d36a81\") " Mar 10 11:11:58 crc kubenswrapper[4794]: I0310 11:11:58.957174 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kfnk\" (UniqueName: \"kubernetes.io/projected/b2416ded-2ad4-4cef-af44-8c6e944834d8-kube-api-access-4kfnk\") pod \"b2416ded-2ad4-4cef-af44-8c6e944834d8\" (UID: \"b2416ded-2ad4-4cef-af44-8c6e944834d8\") " Mar 10 11:11:58 crc kubenswrapper[4794]: I0310 11:11:58.957209 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2416ded-2ad4-4cef-af44-8c6e944834d8-operator-scripts\") pod \"b2416ded-2ad4-4cef-af44-8c6e944834d8\" (UID: \"b2416ded-2ad4-4cef-af44-8c6e944834d8\") " Mar 10 11:11:58 crc kubenswrapper[4794]: I0310 11:11:58.957293 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/206a0153-5800-403c-8749-e68d34d36a81-operator-scripts\") pod \"206a0153-5800-403c-8749-e68d34d36a81\" (UID: \"206a0153-5800-403c-8749-e68d34d36a81\") " Mar 10 11:11:58 crc kubenswrapper[4794]: I0310 11:11:58.958187 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2416ded-2ad4-4cef-af44-8c6e944834d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2416ded-2ad4-4cef-af44-8c6e944834d8" (UID: "b2416ded-2ad4-4cef-af44-8c6e944834d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:11:58 crc kubenswrapper[4794]: I0310 11:11:58.958513 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/206a0153-5800-403c-8749-e68d34d36a81-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "206a0153-5800-403c-8749-e68d34d36a81" (UID: "206a0153-5800-403c-8749-e68d34d36a81"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:11:58 crc kubenswrapper[4794]: I0310 11:11:58.962500 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/206a0153-5800-403c-8749-e68d34d36a81-kube-api-access-6gd6m" (OuterVolumeSpecName: "kube-api-access-6gd6m") pod "206a0153-5800-403c-8749-e68d34d36a81" (UID: "206a0153-5800-403c-8749-e68d34d36a81"). InnerVolumeSpecName "kube-api-access-6gd6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:11:58 crc kubenswrapper[4794]: I0310 11:11:58.976533 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2416ded-2ad4-4cef-af44-8c6e944834d8-kube-api-access-4kfnk" (OuterVolumeSpecName: "kube-api-access-4kfnk") pod "b2416ded-2ad4-4cef-af44-8c6e944834d8" (UID: "b2416ded-2ad4-4cef-af44-8c6e944834d8"). InnerVolumeSpecName "kube-api-access-4kfnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:11:59 crc kubenswrapper[4794]: I0310 11:11:59.059597 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kfnk\" (UniqueName: \"kubernetes.io/projected/b2416ded-2ad4-4cef-af44-8c6e944834d8-kube-api-access-4kfnk\") on node \"crc\" DevicePath \"\"" Mar 10 11:11:59 crc kubenswrapper[4794]: I0310 11:11:59.059646 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2416ded-2ad4-4cef-af44-8c6e944834d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:11:59 crc kubenswrapper[4794]: I0310 11:11:59.059666 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/206a0153-5800-403c-8749-e68d34d36a81-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:11:59 crc kubenswrapper[4794]: I0310 11:11:59.059689 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gd6m\" (UniqueName: \"kubernetes.io/projected/206a0153-5800-403c-8749-e68d34d36a81-kube-api-access-6gd6m\") on node \"crc\" DevicePath \"\"" Mar 10 11:11:59 crc kubenswrapper[4794]: I0310 11:11:59.351955 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-d5rbx" event={"ID":"206a0153-5800-403c-8749-e68d34d36a81","Type":"ContainerDied","Data":"d71456099f728332a03ad7c371d99c5d3745df5e09cd4c8eb414bd6c528b2c9e"} Mar 10 11:11:59 crc kubenswrapper[4794]: I0310 11:11:59.352013 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d71456099f728332a03ad7c371d99c5d3745df5e09cd4c8eb414bd6c528b2c9e" Mar 10 11:11:59 crc kubenswrapper[4794]: I0310 11:11:59.352574 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-d5rbx" Mar 10 11:11:59 crc kubenswrapper[4794]: I0310 11:11:59.354043 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2a13-account-create-update-9pmtz" event={"ID":"b2416ded-2ad4-4cef-af44-8c6e944834d8","Type":"ContainerDied","Data":"40fbca779ea20ccc4d2f5e45c7cf6c83dadd427c0acba33c4420da4e94ffb7cf"} Mar 10 11:11:59 crc kubenswrapper[4794]: I0310 11:11:59.354078 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40fbca779ea20ccc4d2f5e45c7cf6c83dadd427c0acba33c4420da4e94ffb7cf" Mar 10 11:11:59 crc kubenswrapper[4794]: I0310 11:11:59.354113 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2a13-account-create-update-9pmtz" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.137455 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552352-5jscq"] Mar 10 11:12:00 crc kubenswrapper[4794]: E0310 11:12:00.137795 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2416ded-2ad4-4cef-af44-8c6e944834d8" containerName="mariadb-account-create-update" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.137808 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2416ded-2ad4-4cef-af44-8c6e944834d8" containerName="mariadb-account-create-update" Mar 10 11:12:00 crc kubenswrapper[4794]: E0310 11:12:00.137825 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="206a0153-5800-403c-8749-e68d34d36a81" containerName="mariadb-database-create" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.137834 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="206a0153-5800-403c-8749-e68d34d36a81" containerName="mariadb-database-create" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.138020 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="206a0153-5800-403c-8749-e68d34d36a81" containerName="mariadb-database-create" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.138049 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2416ded-2ad4-4cef-af44-8c6e944834d8" containerName="mariadb-account-create-update" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.138650 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552352-5jscq" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.141707 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.141919 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.146903 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552352-5jscq"] Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.149293 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.181266 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc7sh\" (UniqueName: \"kubernetes.io/projected/007279e7-7980-459f-8eba-f63a2dab9526-kube-api-access-zc7sh\") pod \"auto-csr-approver-29552352-5jscq\" (UID: \"007279e7-7980-459f-8eba-f63a2dab9526\") " pod="openshift-infra/auto-csr-approver-29552352-5jscq" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.248627 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.283522 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc7sh\" (UniqueName: \"kubernetes.io/projected/007279e7-7980-459f-8eba-f63a2dab9526-kube-api-access-zc7sh\") pod \"auto-csr-approver-29552352-5jscq\" (UID: \"007279e7-7980-459f-8eba-f63a2dab9526\") " pod="openshift-infra/auto-csr-approver-29552352-5jscq" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.318100 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zc7sh\" (UniqueName: \"kubernetes.io/projected/007279e7-7980-459f-8eba-f63a2dab9526-kube-api-access-zc7sh\") pod \"auto-csr-approver-29552352-5jscq\" (UID: \"007279e7-7980-459f-8eba-f63a2dab9526\") " pod="openshift-infra/auto-csr-approver-29552352-5jscq" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.463444 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552352-5jscq" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.943069 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552352-5jscq"] Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.970232 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-brrxw"] Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.971507 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-brrxw" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.972924 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.973394 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-blbrh" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.973716 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.974060 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 11:12:00 crc kubenswrapper[4794]: I0310 11:12:00.978743 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-brrxw"] Mar 10 11:12:01 crc kubenswrapper[4794]: I0310 11:12:01.002784 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-combined-ca-bundle\") pod \"keystone-db-sync-brrxw\" (UID: \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\") " pod="openstack/keystone-db-sync-brrxw" Mar 10 11:12:01 crc kubenswrapper[4794]: I0310 11:12:01.002870 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp24v\" (UniqueName: \"kubernetes.io/projected/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-kube-api-access-gp24v\") pod \"keystone-db-sync-brrxw\" (UID: \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\") " pod="openstack/keystone-db-sync-brrxw" Mar 10 11:12:01 crc kubenswrapper[4794]: I0310 11:12:01.002889 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-config-data\") pod \"keystone-db-sync-brrxw\" (UID: \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\") " pod="openstack/keystone-db-sync-brrxw" Mar 10 11:12:01 crc kubenswrapper[4794]: I0310 11:12:01.104191 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp24v\" (UniqueName: \"kubernetes.io/projected/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-kube-api-access-gp24v\") pod \"keystone-db-sync-brrxw\" (UID: \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\") " pod="openstack/keystone-db-sync-brrxw" Mar 10 11:12:01 crc kubenswrapper[4794]: I0310 11:12:01.104243 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-config-data\") pod \"keystone-db-sync-brrxw\" (UID: \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\") " pod="openstack/keystone-db-sync-brrxw" Mar 10 11:12:01 crc kubenswrapper[4794]: I0310 11:12:01.104390 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-combined-ca-bundle\") pod \"keystone-db-sync-brrxw\" (UID: \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\") " pod="openstack/keystone-db-sync-brrxw" Mar 10 11:12:01 crc kubenswrapper[4794]: I0310 11:12:01.110483 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-config-data\") pod \"keystone-db-sync-brrxw\" (UID: \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\") " pod="openstack/keystone-db-sync-brrxw" Mar 10 11:12:01 crc kubenswrapper[4794]: I0310 11:12:01.111224 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-combined-ca-bundle\") pod \"keystone-db-sync-brrxw\" (UID: \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\") " pod="openstack/keystone-db-sync-brrxw" Mar 10 11:12:01 crc kubenswrapper[4794]: I0310 11:12:01.128358 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp24v\" (UniqueName: \"kubernetes.io/projected/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-kube-api-access-gp24v\") pod \"keystone-db-sync-brrxw\" (UID: \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\") " pod="openstack/keystone-db-sync-brrxw" Mar 10 11:12:01 crc kubenswrapper[4794]: I0310 11:12:01.294030 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-brrxw" Mar 10 11:12:01 crc kubenswrapper[4794]: I0310 11:12:01.387407 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552352-5jscq" event={"ID":"007279e7-7980-459f-8eba-f63a2dab9526","Type":"ContainerStarted","Data":"f7af02cfa01e068885d42ebd83eabf19cdfd720d53723644072c21618cd3696c"} Mar 10 11:12:01 crc kubenswrapper[4794]: I0310 11:12:01.741995 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-brrxw"] Mar 10 11:12:01 crc kubenswrapper[4794]: W0310 11:12:01.751371 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b3770a1_b9eb_416d_bdcc_af3a4eb0d204.slice/crio-0832df9fc517542942d81f55a9f521e65a31dca37942fb30ee0997d46dee967a WatchSource:0}: Error finding container 0832df9fc517542942d81f55a9f521e65a31dca37942fb30ee0997d46dee967a: Status 404 returned error can't find the container with id 0832df9fc517542942d81f55a9f521e65a31dca37942fb30ee0997d46dee967a Mar 10 11:12:02 crc kubenswrapper[4794]: I0310 11:12:02.405525 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-brrxw" event={"ID":"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204","Type":"ContainerStarted","Data":"cac451694e0fe7d0944aa0de56cf184c2c9bd2b792ea57e8d7bee677ac996613"} Mar 10 11:12:02 crc kubenswrapper[4794]: I0310 11:12:02.405874 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-brrxw" event={"ID":"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204","Type":"ContainerStarted","Data":"0832df9fc517542942d81f55a9f521e65a31dca37942fb30ee0997d46dee967a"} Mar 10 11:12:02 crc kubenswrapper[4794]: I0310 11:12:02.442388 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-brrxw" podStartSLOduration=2.442281243 podStartE2EDuration="2.442281243s" podCreationTimestamp="2026-03-10 11:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:12:02.432762046 +0000 UTC m=+5271.188932884" watchObservedRunningTime="2026-03-10 11:12:02.442281243 +0000 UTC m=+5271.198452111" Mar 10 11:12:02 crc kubenswrapper[4794]: I0310 11:12:02.459967 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552352-5jscq" podStartSLOduration=1.323837799 podStartE2EDuration="2.459944022s" podCreationTimestamp="2026-03-10 11:12:00 +0000 UTC" firstStartedPulling="2026-03-10 11:12:00.944091112 +0000 UTC m=+5269.700261920" lastFinishedPulling="2026-03-10 11:12:02.080197315 +0000 UTC m=+5270.836368143" observedRunningTime="2026-03-10 11:12:02.459271111 +0000 UTC m=+5271.215441959" watchObservedRunningTime="2026-03-10 11:12:02.459944022 +0000 UTC m=+5271.216114850" Mar 10 11:12:03 crc kubenswrapper[4794]: I0310 11:12:03.420627 4794 generic.go:334] "Generic (PLEG): container finished" podID="6b3770a1-b9eb-416d-bdcc-af3a4eb0d204" containerID="cac451694e0fe7d0944aa0de56cf184c2c9bd2b792ea57e8d7bee677ac996613" exitCode=0 Mar 10 11:12:03 crc kubenswrapper[4794]: I0310 11:12:03.420704 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-brrxw" event={"ID":"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204","Type":"ContainerDied","Data":"cac451694e0fe7d0944aa0de56cf184c2c9bd2b792ea57e8d7bee677ac996613"} Mar 10 11:12:03 crc kubenswrapper[4794]: I0310 11:12:03.422926 4794 generic.go:334] 
"Generic (PLEG): container finished" podID="007279e7-7980-459f-8eba-f63a2dab9526" containerID="d63c99472b5229335c5be601564544dbe2c58765c09f451d9337f60f910194b5" exitCode=0 Mar 10 11:12:03 crc kubenswrapper[4794]: I0310 11:12:03.422964 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552352-5jscq" event={"ID":"007279e7-7980-459f-8eba-f63a2dab9526","Type":"ContainerDied","Data":"d63c99472b5229335c5be601564544dbe2c58765c09f451d9337f60f910194b5"} Mar 10 11:12:04 crc kubenswrapper[4794]: I0310 11:12:04.954917 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-brrxw" Mar 10 11:12:04 crc kubenswrapper[4794]: I0310 11:12:04.960998 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552352-5jscq" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.085706 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp24v\" (UniqueName: \"kubernetes.io/projected/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-kube-api-access-gp24v\") pod \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\" (UID: \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\") " Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.085916 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-combined-ca-bundle\") pod \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\" (UID: \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\") " Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.085990 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-config-data\") pod \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\" (UID: \"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204\") " Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.086009 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc7sh\" (UniqueName: \"kubernetes.io/projected/007279e7-7980-459f-8eba-f63a2dab9526-kube-api-access-zc7sh\") pod \"007279e7-7980-459f-8eba-f63a2dab9526\" (UID: \"007279e7-7980-459f-8eba-f63a2dab9526\") " Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.094442 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552346-xcxk2"] Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.099419 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007279e7-7980-459f-8eba-f63a2dab9526-kube-api-access-zc7sh" (OuterVolumeSpecName: "kube-api-access-zc7sh") pod "007279e7-7980-459f-8eba-f63a2dab9526" (UID: "007279e7-7980-459f-8eba-f63a2dab9526"). InnerVolumeSpecName "kube-api-access-zc7sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.099478 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-kube-api-access-gp24v" (OuterVolumeSpecName: "kube-api-access-gp24v") pod "6b3770a1-b9eb-416d-bdcc-af3a4eb0d204" (UID: "6b3770a1-b9eb-416d-bdcc-af3a4eb0d204"). InnerVolumeSpecName "kube-api-access-gp24v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.100429 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552346-xcxk2"] Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.119602 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b3770a1-b9eb-416d-bdcc-af3a4eb0d204" (UID: "6b3770a1-b9eb-416d-bdcc-af3a4eb0d204"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.135132 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-config-data" (OuterVolumeSpecName: "config-data") pod "6b3770a1-b9eb-416d-bdcc-af3a4eb0d204" (UID: "6b3770a1-b9eb-416d-bdcc-af3a4eb0d204"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.188828 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.188889 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc7sh\" (UniqueName: \"kubernetes.io/projected/007279e7-7980-459f-8eba-f63a2dab9526-kube-api-access-zc7sh\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.188919 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp24v\" (UniqueName: \"kubernetes.io/projected/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-kube-api-access-gp24v\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.188938 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.291236 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6458df987c-wl7hm"] Mar 10 11:12:05 crc kubenswrapper[4794]: E0310 11:12:05.291544 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007279e7-7980-459f-8eba-f63a2dab9526" containerName="oc" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.291557 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="007279e7-7980-459f-8eba-f63a2dab9526" containerName="oc" Mar 10 11:12:05 crc kubenswrapper[4794]: E0310 11:12:05.291591 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3770a1-b9eb-416d-bdcc-af3a4eb0d204" containerName="keystone-db-sync" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.291597 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3770a1-b9eb-416d-bdcc-af3a4eb0d204" containerName="keystone-db-sync" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.291729 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b3770a1-b9eb-416d-bdcc-af3a4eb0d204" containerName="keystone-db-sync" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.291748 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="007279e7-7980-459f-8eba-f63a2dab9526" containerName="oc" Mar 10 11:12:05 crc 
kubenswrapper[4794]: I0310 11:12:05.292653 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.308706 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6458df987c-wl7hm"] Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.338939 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-d6twh"] Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.340797 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.342928 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.387556 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d6twh"] Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.392107 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-ovsdbserver-sb\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.392272 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzcrj\" (UniqueName: \"kubernetes.io/projected/e8aac033-ebad-4d19-9064-ddb49641ddb8-kube-api-access-dzcrj\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.392371 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-config-data\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.392437 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-credential-keys\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.392509 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pplbz\" (UniqueName: \"kubernetes.io/projected/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-kube-api-access-pplbz\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.392536 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-ovsdbserver-nb\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.392598 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-scripts\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.392628 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-combined-ca-bundle\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.392686 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-fernet-keys\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.392716 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-config\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.392738 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-dns-svc\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.443644 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-brrxw" event={"ID":"6b3770a1-b9eb-416d-bdcc-af3a4eb0d204","Type":"ContainerDied","Data":"0832df9fc517542942d81f55a9f521e65a31dca37942fb30ee0997d46dee967a"} Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.443702 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0832df9fc517542942d81f55a9f521e65a31dca37942fb30ee0997d46dee967a" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.443669 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-brrxw" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.445490 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552352-5jscq" event={"ID":"007279e7-7980-459f-8eba-f63a2dab9526","Type":"ContainerDied","Data":"f7af02cfa01e068885d42ebd83eabf19cdfd720d53723644072c21618cd3696c"} Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.445589 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7af02cfa01e068885d42ebd83eabf19cdfd720d53723644072c21618cd3696c" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.445544 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552352-5jscq" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.496126 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pplbz\" (UniqueName: \"kubernetes.io/projected/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-kube-api-access-pplbz\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.496361 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-ovsdbserver-nb\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.496475 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-scripts\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.496550 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-combined-ca-bundle\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.496608 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-fernet-keys\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.496670 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-config\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.496741 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-dns-svc\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.497286 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-ovsdbserver-sb\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.497389 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzcrj\" (UniqueName: \"kubernetes.io/projected/e8aac033-ebad-4d19-9064-ddb49641ddb8-kube-api-access-dzcrj\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc 
kubenswrapper[4794]: I0310 11:12:05.497493 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-config-data\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.497607 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-credential-keys\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.498133 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-dns-svc\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.498143 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-ovsdbserver-nb\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.498249 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-ovsdbserver-sb\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.498756 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-config\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.500733 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-fernet-keys\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.500960 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-combined-ca-bundle\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.501166 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-credential-keys\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.501179 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-scripts\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.501906 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-config-data\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.512609 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pplbz\" (UniqueName: \"kubernetes.io/projected/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-kube-api-access-pplbz\") pod \"dnsmasq-dns-6458df987c-wl7hm\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.521964 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzcrj\" (UniqueName: \"kubernetes.io/projected/e8aac033-ebad-4d19-9064-ddb49641ddb8-kube-api-access-dzcrj\") pod \"keystone-bootstrap-d6twh\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.654220 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:05 crc kubenswrapper[4794]: I0310 11:12:05.675420 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:06 crc kubenswrapper[4794]: I0310 11:12:06.015834 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="851b3d8e-0191-4952-ac50-d09258590e83" path="/var/lib/kubelet/pods/851b3d8e-0191-4952-ac50-d09258590e83/volumes" Mar 10 11:12:06 crc kubenswrapper[4794]: I0310 11:12:06.135891 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6458df987c-wl7hm"] Mar 10 11:12:06 crc kubenswrapper[4794]: W0310 11:12:06.139171 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74d63a04_2b17_4bae_b9c9_61b0e0ed9712.slice/crio-7329585322b6c6b653f4f97f1e793082814ff1fc8c57e63f5945c596513c3100 WatchSource:0}: Error finding container 7329585322b6c6b653f4f97f1e793082814ff1fc8c57e63f5945c596513c3100: Status 404 returned error can't find the container with id 7329585322b6c6b653f4f97f1e793082814ff1fc8c57e63f5945c596513c3100 Mar 10 11:12:06 crc kubenswrapper[4794]: I0310 11:12:06.202084 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d6twh"] Mar 10 11:12:06 crc kubenswrapper[4794]: W0310 11:12:06.205261 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8aac033_ebad_4d19_9064_ddb49641ddb8.slice/crio-1864e59f1b6353535b4198b349f3f9076c04f0bd0524a51bbc1726d7f10546d7 WatchSource:0}: Error finding container 1864e59f1b6353535b4198b349f3f9076c04f0bd0524a51bbc1726d7f10546d7: Status 404 returned error can't find the container with id 1864e59f1b6353535b4198b349f3f9076c04f0bd0524a51bbc1726d7f10546d7 Mar 10 11:12:06 crc kubenswrapper[4794]: I0310 11:12:06.455991 4794 generic.go:334] "Generic (PLEG): container finished" podID="74d63a04-2b17-4bae-b9c9-61b0e0ed9712" 
containerID="5c83452d0cd456dda1fc33fd7b6b8234e5ccd2b6ddb4fd680d0fb30f86242056" exitCode=0 Mar 10 11:12:06 crc kubenswrapper[4794]: I0310 11:12:06.456086 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6458df987c-wl7hm" event={"ID":"74d63a04-2b17-4bae-b9c9-61b0e0ed9712","Type":"ContainerDied","Data":"5c83452d0cd456dda1fc33fd7b6b8234e5ccd2b6ddb4fd680d0fb30f86242056"} Mar 10 11:12:06 crc kubenswrapper[4794]: I0310 11:12:06.456125 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6458df987c-wl7hm" event={"ID":"74d63a04-2b17-4bae-b9c9-61b0e0ed9712","Type":"ContainerStarted","Data":"7329585322b6c6b653f4f97f1e793082814ff1fc8c57e63f5945c596513c3100"} Mar 10 11:12:06 crc kubenswrapper[4794]: I0310 11:12:06.459935 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d6twh" event={"ID":"e8aac033-ebad-4d19-9064-ddb49641ddb8","Type":"ContainerStarted","Data":"1e230fd967502ae8d4656b2a5f6d05681b7bec15ea475101de690e31149cbaa2"} Mar 10 11:12:06 crc kubenswrapper[4794]: I0310 11:12:06.459969 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d6twh" event={"ID":"e8aac033-ebad-4d19-9064-ddb49641ddb8","Type":"ContainerStarted","Data":"1864e59f1b6353535b4198b349f3f9076c04f0bd0524a51bbc1726d7f10546d7"} Mar 10 11:12:06 crc kubenswrapper[4794]: I0310 11:12:06.548926 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-d6twh" podStartSLOduration=1.548903132 podStartE2EDuration="1.548903132s" podCreationTimestamp="2026-03-10 11:12:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:12:06.540007475 +0000 UTC m=+5275.296178303" watchObservedRunningTime="2026-03-10 11:12:06.548903132 +0000 UTC m=+5275.305073960" Mar 10 11:12:07 crc kubenswrapper[4794]: I0310 11:12:07.482913 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6458df987c-wl7hm" event={"ID":"74d63a04-2b17-4bae-b9c9-61b0e0ed9712","Type":"ContainerStarted","Data":"4600d8eb077528eddd3573099d0e77ce2d10b965ec996e12ebc9a0bc4c8d6842"} Mar 10 11:12:07 crc kubenswrapper[4794]: I0310 11:12:07.483406 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:07 crc kubenswrapper[4794]: I0310 11:12:07.523760 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6458df987c-wl7hm" podStartSLOduration=2.5237372259999997 podStartE2EDuration="2.523737226s" podCreationTimestamp="2026-03-10 11:12:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:12:07.511596408 +0000 UTC m=+5276.267767266" watchObservedRunningTime="2026-03-10 11:12:07.523737226 +0000 UTC m=+5276.279908054" Mar 10 11:12:09 crc kubenswrapper[4794]: I0310 11:12:09.508253 4794 generic.go:334] "Generic (PLEG): container finished" podID="e8aac033-ebad-4d19-9064-ddb49641ddb8" containerID="1e230fd967502ae8d4656b2a5f6d05681b7bec15ea475101de690e31149cbaa2" exitCode=0 Mar 10 11:12:09 crc kubenswrapper[4794]: I0310 11:12:09.508388 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d6twh" event={"ID":"e8aac033-ebad-4d19-9064-ddb49641ddb8","Type":"ContainerDied","Data":"1e230fd967502ae8d4656b2a5f6d05681b7bec15ea475101de690e31149cbaa2"} Mar 10 11:12:10 
crc kubenswrapper[4794]: I0310 11:12:10.922289 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.002088 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-scripts\") pod \"e8aac033-ebad-4d19-9064-ddb49641ddb8\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.002222 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzcrj\" (UniqueName: \"kubernetes.io/projected/e8aac033-ebad-4d19-9064-ddb49641ddb8-kube-api-access-dzcrj\") pod \"e8aac033-ebad-4d19-9064-ddb49641ddb8\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.002249 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-combined-ca-bundle\") pod \"e8aac033-ebad-4d19-9064-ddb49641ddb8\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.002279 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-config-data\") pod \"e8aac033-ebad-4d19-9064-ddb49641ddb8\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.002306 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-credential-keys\") pod \"e8aac033-ebad-4d19-9064-ddb49641ddb8\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.002402 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-fernet-keys\") pod \"e8aac033-ebad-4d19-9064-ddb49641ddb8\" (UID: \"e8aac033-ebad-4d19-9064-ddb49641ddb8\") " Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.008115 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e8aac033-ebad-4d19-9064-ddb49641ddb8" (UID: "e8aac033-ebad-4d19-9064-ddb49641ddb8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.008161 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8aac033-ebad-4d19-9064-ddb49641ddb8-kube-api-access-dzcrj" (OuterVolumeSpecName: "kube-api-access-dzcrj") pod "e8aac033-ebad-4d19-9064-ddb49641ddb8" (UID: "e8aac033-ebad-4d19-9064-ddb49641ddb8"). InnerVolumeSpecName "kube-api-access-dzcrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.008826 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e8aac033-ebad-4d19-9064-ddb49641ddb8" (UID: "e8aac033-ebad-4d19-9064-ddb49641ddb8"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.009213 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-scripts" (OuterVolumeSpecName: "scripts") pod "e8aac033-ebad-4d19-9064-ddb49641ddb8" (UID: "e8aac033-ebad-4d19-9064-ddb49641ddb8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.026293 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-config-data" (OuterVolumeSpecName: "config-data") pod "e8aac033-ebad-4d19-9064-ddb49641ddb8" (UID: "e8aac033-ebad-4d19-9064-ddb49641ddb8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.027005 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8aac033-ebad-4d19-9064-ddb49641ddb8" (UID: "e8aac033-ebad-4d19-9064-ddb49641ddb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.105051 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzcrj\" (UniqueName: \"kubernetes.io/projected/e8aac033-ebad-4d19-9064-ddb49641ddb8-kube-api-access-dzcrj\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.105119 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.105147 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.105171 4794 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.105195 4794 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.105219 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8aac033-ebad-4d19-9064-ddb49641ddb8-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.535475 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-d6twh" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.535491 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d6twh" event={"ID":"e8aac033-ebad-4d19-9064-ddb49641ddb8","Type":"ContainerDied","Data":"1864e59f1b6353535b4198b349f3f9076c04f0bd0524a51bbc1726d7f10546d7"} Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.535554 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1864e59f1b6353535b4198b349f3f9076c04f0bd0524a51bbc1726d7f10546d7" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.612009 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-d6twh"] Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.624811 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-d6twh"] Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.718949 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-69ksj"] Mar 10 11:12:11 crc kubenswrapper[4794]: E0310 11:12:11.719548 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8aac033-ebad-4d19-9064-ddb49641ddb8" containerName="keystone-bootstrap" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.719586 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8aac033-ebad-4d19-9064-ddb49641ddb8" containerName="keystone-bootstrap" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.719931 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8aac033-ebad-4d19-9064-ddb49641ddb8" containerName="keystone-bootstrap" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.721229 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.725017 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.725696 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.725814 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.726185 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.726848 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-blbrh" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.729555 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-69ksj"] Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.848269 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-scripts\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.848682 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-config-data\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.848858 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-combined-ca-bundle\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.848950 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj25x\" (UniqueName: \"kubernetes.io/projected/4a267651-30fa-4587-9da7-25c34d836c67-kube-api-access-bj25x\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.849062 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-credential-keys\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.849294 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-fernet-keys\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.950214 4794 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-combined-ca-bundle\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.950278 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj25x\" (UniqueName: \"kubernetes.io/projected/4a267651-30fa-4587-9da7-25c34d836c67-kube-api-access-bj25x\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.950317 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-credential-keys\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.950389 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-fernet-keys\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.950425 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-scripts\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.950454 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-config-data\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.958660 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-config-data\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.960311 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-combined-ca-bundle\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.977502 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-scripts\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.978938 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-credential-keys\") pod \"keystone-bootstrap-69ksj\" (UID: 
\"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.979734 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-fernet-keys\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:11 crc kubenswrapper[4794]: I0310 11:12:11.982400 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj25x\" (UniqueName: \"kubernetes.io/projected/4a267651-30fa-4587-9da7-25c34d836c67-kube-api-access-bj25x\") pod \"keystone-bootstrap-69ksj\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:12 crc kubenswrapper[4794]: I0310 11:12:12.017732 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8aac033-ebad-4d19-9064-ddb49641ddb8" path="/var/lib/kubelet/pods/e8aac033-ebad-4d19-9064-ddb49641ddb8/volumes" Mar 10 11:12:12 crc kubenswrapper[4794]: I0310 11:12:12.060672 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-blbrh" Mar 10 11:12:12 crc kubenswrapper[4794]: I0310 11:12:12.069002 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:12 crc kubenswrapper[4794]: I0310 11:12:12.534900 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-69ksj"] Mar 10 11:12:12 crc kubenswrapper[4794]: W0310 11:12:12.544843 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a267651_30fa_4587_9da7_25c34d836c67.slice/crio-adf12d8f7c046a2b38d28123626c7fc264bec3bd26598bbe073d0504d783a4ae WatchSource:0}: Error finding container adf12d8f7c046a2b38d28123626c7fc264bec3bd26598bbe073d0504d783a4ae: Status 404 returned error can't find the container with id adf12d8f7c046a2b38d28123626c7fc264bec3bd26598bbe073d0504d783a4ae Mar 10 11:12:12 crc kubenswrapper[4794]: I0310 11:12:12.551613 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 11:12:13 crc kubenswrapper[4794]: I0310 11:12:13.566455 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-69ksj" event={"ID":"4a267651-30fa-4587-9da7-25c34d836c67","Type":"ContainerStarted","Data":"5a17284487e5576d7d0597281abbadd9180a4c08785164d5d45f5bbc798aef41"} Mar 10 11:12:13 crc kubenswrapper[4794]: I0310 11:12:13.566991 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-69ksj" event={"ID":"4a267651-30fa-4587-9da7-25c34d836c67","Type":"ContainerStarted","Data":"adf12d8f7c046a2b38d28123626c7fc264bec3bd26598bbe073d0504d783a4ae"} Mar 10 11:12:13 crc kubenswrapper[4794]: I0310 11:12:13.590608 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-69ksj" podStartSLOduration=2.590580203 podStartE2EDuration="2.590580203s" podCreationTimestamp="2026-03-10 11:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:12:13.581233232 +0000 UTC m=+5282.337404080" watchObservedRunningTime="2026-03-10 11:12:13.590580203 +0000 UTC m=+5282.346751061" Mar 10 11:12:15 crc kubenswrapper[4794]: I0310 11:12:15.588264 
4794 generic.go:334] "Generic (PLEG): container finished" podID="4a267651-30fa-4587-9da7-25c34d836c67" containerID="5a17284487e5576d7d0597281abbadd9180a4c08785164d5d45f5bbc798aef41" exitCode=0 Mar 10 11:12:15 crc kubenswrapper[4794]: I0310 11:12:15.588387 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-69ksj" event={"ID":"4a267651-30fa-4587-9da7-25c34d836c67","Type":"ContainerDied","Data":"5a17284487e5576d7d0597281abbadd9180a4c08785164d5d45f5bbc798aef41"} Mar 10 11:12:15 crc kubenswrapper[4794]: I0310 11:12:15.656640 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:12:15 crc kubenswrapper[4794]: I0310 11:12:15.752490 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd799f5b5-hgcjl"] Mar 10 11:12:15 crc kubenswrapper[4794]: I0310 11:12:15.752798 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" podUID="69066222-1571-4ff1-86a6-1681967612da" containerName="dnsmasq-dns" containerID="cri-o://03da0622d5760d58291153ca42e768b1658694e688253040567b816d0b833cd0" gracePeriod=10 Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.241802 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.330239 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-config\") pod \"69066222-1571-4ff1-86a6-1681967612da\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.330424 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg2xv\" (UniqueName: \"kubernetes.io/projected/69066222-1571-4ff1-86a6-1681967612da-kube-api-access-jg2xv\") pod \"69066222-1571-4ff1-86a6-1681967612da\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.330472 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-dns-svc\") pod \"69066222-1571-4ff1-86a6-1681967612da\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.330539 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-ovsdbserver-sb\") pod \"69066222-1571-4ff1-86a6-1681967612da\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.330603 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-ovsdbserver-nb\") pod \"69066222-1571-4ff1-86a6-1681967612da\" (UID: \"69066222-1571-4ff1-86a6-1681967612da\") " Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.350246 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69066222-1571-4ff1-86a6-1681967612da-kube-api-access-jg2xv" (OuterVolumeSpecName: "kube-api-access-jg2xv") pod "69066222-1571-4ff1-86a6-1681967612da" (UID: "69066222-1571-4ff1-86a6-1681967612da"). 
InnerVolumeSpecName "kube-api-access-jg2xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.379857 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69066222-1571-4ff1-86a6-1681967612da" (UID: "69066222-1571-4ff1-86a6-1681967612da"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.379618 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-config" (OuterVolumeSpecName: "config") pod "69066222-1571-4ff1-86a6-1681967612da" (UID: "69066222-1571-4ff1-86a6-1681967612da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.384422 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69066222-1571-4ff1-86a6-1681967612da" (UID: "69066222-1571-4ff1-86a6-1681967612da"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.386411 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "69066222-1571-4ff1-86a6-1681967612da" (UID: "69066222-1571-4ff1-86a6-1681967612da"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.432477 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg2xv\" (UniqueName: \"kubernetes.io/projected/69066222-1571-4ff1-86a6-1681967612da-kube-api-access-jg2xv\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.432511 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.432521 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.432529 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.432539 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69066222-1571-4ff1-86a6-1681967612da-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.609666 4794 generic.go:334] "Generic (PLEG): container finished" podID="69066222-1571-4ff1-86a6-1681967612da" containerID="03da0622d5760d58291153ca42e768b1658694e688253040567b816d0b833cd0" exitCode=0 Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.609742 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" event={"ID":"69066222-1571-4ff1-86a6-1681967612da","Type":"ContainerDied","Data":"03da0622d5760d58291153ca42e768b1658694e688253040567b816d0b833cd0"} Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.609760 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.609910 4794 scope.go:117] "RemoveContainer" containerID="03da0622d5760d58291153ca42e768b1658694e688253040567b816d0b833cd0" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.609880 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd799f5b5-hgcjl" event={"ID":"69066222-1571-4ff1-86a6-1681967612da","Type":"ContainerDied","Data":"c8a67e3985b11cb4dfe4ca9eb4dd117d03f21eca89533f0833164b9770bec688"} Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.666133 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd799f5b5-hgcjl"] Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.667538 4794 scope.go:117] "RemoveContainer" containerID="ac432e8dea49d1b4563bfb647e53d8494d32c81c68c5a809f8e03285c79b8ccd" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.676583 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cd799f5b5-hgcjl"] Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.697779 4794 scope.go:117] "RemoveContainer" containerID="03da0622d5760d58291153ca42e768b1658694e688253040567b816d0b833cd0" Mar 10 11:12:16 crc kubenswrapper[4794]: E0310 11:12:16.698737 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03da0622d5760d58291153ca42e768b1658694e688253040567b816d0b833cd0\": container with ID starting with 03da0622d5760d58291153ca42e768b1658694e688253040567b816d0b833cd0 not found: ID does not exist" containerID="03da0622d5760d58291153ca42e768b1658694e688253040567b816d0b833cd0" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.698782 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03da0622d5760d58291153ca42e768b1658694e688253040567b816d0b833cd0"} err="failed to get container status \"03da0622d5760d58291153ca42e768b1658694e688253040567b816d0b833cd0\": rpc error: code = NotFound desc = could not find container \"03da0622d5760d58291153ca42e768b1658694e688253040567b816d0b833cd0\": container with ID starting with 03da0622d5760d58291153ca42e768b1658694e688253040567b816d0b833cd0 not found: ID does not exist" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.698810 4794 scope.go:117] "RemoveContainer" containerID="ac432e8dea49d1b4563bfb647e53d8494d32c81c68c5a809f8e03285c79b8ccd" Mar 10 11:12:16 crc kubenswrapper[4794]: E0310 11:12:16.699145 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac432e8dea49d1b4563bfb647e53d8494d32c81c68c5a809f8e03285c79b8ccd\": container with ID starting with ac432e8dea49d1b4563bfb647e53d8494d32c81c68c5a809f8e03285c79b8ccd not found: ID does not exist" containerID="ac432e8dea49d1b4563bfb647e53d8494d32c81c68c5a809f8e03285c79b8ccd" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.699193 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac432e8dea49d1b4563bfb647e53d8494d32c81c68c5a809f8e03285c79b8ccd"} err="failed to get container status 
\"ac432e8dea49d1b4563bfb647e53d8494d32c81c68c5a809f8e03285c79b8ccd\": rpc error: code = NotFound desc = could not find container \"ac432e8dea49d1b4563bfb647e53d8494d32c81c68c5a809f8e03285c79b8ccd\": container with ID starting with ac432e8dea49d1b4563bfb647e53d8494d32c81c68c5a809f8e03285c79b8ccd not found: ID does not exist" Mar 10 11:12:16 crc kubenswrapper[4794]: I0310 11:12:16.998160 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.041542 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-credential-keys\") pod \"4a267651-30fa-4587-9da7-25c34d836c67\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.041640 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-config-data\") pod \"4a267651-30fa-4587-9da7-25c34d836c67\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.041667 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-fernet-keys\") pod \"4a267651-30fa-4587-9da7-25c34d836c67\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.041744 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-scripts\") pod \"4a267651-30fa-4587-9da7-25c34d836c67\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.041764 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj25x\" (UniqueName: \"kubernetes.io/projected/4a267651-30fa-4587-9da7-25c34d836c67-kube-api-access-bj25x\") pod \"4a267651-30fa-4587-9da7-25c34d836c67\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.041814 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-combined-ca-bundle\") pod \"4a267651-30fa-4587-9da7-25c34d836c67\" (UID: \"4a267651-30fa-4587-9da7-25c34d836c67\") " Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.047780 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-scripts" (OuterVolumeSpecName: "scripts") pod "4a267651-30fa-4587-9da7-25c34d836c67" (UID: "4a267651-30fa-4587-9da7-25c34d836c67"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.047810 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4a267651-30fa-4587-9da7-25c34d836c67" (UID: "4a267651-30fa-4587-9da7-25c34d836c67"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.048952 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4a267651-30fa-4587-9da7-25c34d836c67" (UID: "4a267651-30fa-4587-9da7-25c34d836c67"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.050054 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a267651-30fa-4587-9da7-25c34d836c67-kube-api-access-bj25x" (OuterVolumeSpecName: "kube-api-access-bj25x") pod "4a267651-30fa-4587-9da7-25c34d836c67" (UID: "4a267651-30fa-4587-9da7-25c34d836c67"). InnerVolumeSpecName "kube-api-access-bj25x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.081778 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a267651-30fa-4587-9da7-25c34d836c67" (UID: "4a267651-30fa-4587-9da7-25c34d836c67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.083511 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-config-data" (OuterVolumeSpecName: "config-data") pod "4a267651-30fa-4587-9da7-25c34d836c67" (UID: "4a267651-30fa-4587-9da7-25c34d836c67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.144124 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.144493 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj25x\" (UniqueName: \"kubernetes.io/projected/4a267651-30fa-4587-9da7-25c34d836c67-kube-api-access-bj25x\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.144506 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.144517 4794 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.144526 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.144533 4794 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a267651-30fa-4587-9da7-25c34d836c67-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.622641 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-69ksj" event={"ID":"4a267651-30fa-4587-9da7-25c34d836c67","Type":"ContainerDied","Data":"adf12d8f7c046a2b38d28123626c7fc264bec3bd26598bbe073d0504d783a4ae"} Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.622709 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-69ksj" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.622727 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adf12d8f7c046a2b38d28123626c7fc264bec3bd26598bbe073d0504d783a4ae" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.858121 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-84ff5f675d-mcgwb"] Mar 10 11:12:17 crc kubenswrapper[4794]: E0310 11:12:17.858838 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69066222-1571-4ff1-86a6-1681967612da" containerName="init" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.858885 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="69066222-1571-4ff1-86a6-1681967612da" containerName="init" Mar 10 11:12:17 crc kubenswrapper[4794]: E0310 11:12:17.858929 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69066222-1571-4ff1-86a6-1681967612da" containerName="dnsmasq-dns" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.858949 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="69066222-1571-4ff1-86a6-1681967612da" containerName="dnsmasq-dns" Mar 10 11:12:17 crc kubenswrapper[4794]: E0310 11:12:17.858973 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a267651-30fa-4587-9da7-25c34d836c67" containerName="keystone-bootstrap" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.858992 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a267651-30fa-4587-9da7-25c34d836c67" containerName="keystone-bootstrap" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.859427 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a267651-30fa-4587-9da7-25c34d836c67" containerName="keystone-bootstrap" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.859490 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="69066222-1571-4ff1-86a6-1681967612da" containerName="dnsmasq-dns" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.860722 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.873225 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-84ff5f675d-mcgwb"] Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.876045 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.876207 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.876729 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.877010 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-blbrh" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.959292 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-combined-ca-bundle\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.959421 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdf9j\" (UniqueName: \"kubernetes.io/projected/5c1d5e7f-f99a-46af-b1d6-9c016759827a-kube-api-access-xdf9j\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.959536 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-scripts\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.959766 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-credential-keys\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.959844 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-config-data\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:17 crc kubenswrapper[4794]: I0310 11:12:17.959874 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-fernet-keys\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:18 crc kubenswrapper[4794]: I0310 11:12:18.010655 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69066222-1571-4ff1-86a6-1681967612da" path="/var/lib/kubelet/pods/69066222-1571-4ff1-86a6-1681967612da/volumes" Mar 10 11:12:18 crc 
kubenswrapper[4794]: I0310 11:12:18.060685 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-combined-ca-bundle\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:18 crc kubenswrapper[4794]: I0310 11:12:18.060753 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdf9j\" (UniqueName: \"kubernetes.io/projected/5c1d5e7f-f99a-46af-b1d6-9c016759827a-kube-api-access-xdf9j\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:18 crc kubenswrapper[4794]: I0310 11:12:18.060797 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-scripts\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:18 crc kubenswrapper[4794]: I0310 11:12:18.060908 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-credential-keys\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:18 crc kubenswrapper[4794]: I0310 11:12:18.060960 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-config-data\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:18 crc kubenswrapper[4794]: I0310 11:12:18.060995 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-fernet-keys\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:18 crc kubenswrapper[4794]: I0310 11:12:18.064506 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-scripts\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:18 crc kubenswrapper[4794]: I0310 11:12:18.064788 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-combined-ca-bundle\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:18 crc kubenswrapper[4794]: I0310 11:12:18.065101 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-config-data\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:18 crc kubenswrapper[4794]: I0310 11:12:18.065709 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-fernet-keys\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:18 crc kubenswrapper[4794]: I0310 11:12:18.066443 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c1d5e7f-f99a-46af-b1d6-9c016759827a-credential-keys\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:18 crc kubenswrapper[4794]: I0310 11:12:18.076301 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdf9j\" (UniqueName: \"kubernetes.io/projected/5c1d5e7f-f99a-46af-b1d6-9c016759827a-kube-api-access-xdf9j\") pod \"keystone-84ff5f675d-mcgwb\" (UID: \"5c1d5e7f-f99a-46af-b1d6-9c016759827a\") " pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:18 crc kubenswrapper[4794]: I0310 11:12:18.234043 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:18 crc kubenswrapper[4794]: I0310 11:12:18.765690 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-84ff5f675d-mcgwb"] Mar 10 11:12:18 crc kubenswrapper[4794]: W0310 11:12:18.789523 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c1d5e7f_f99a_46af_b1d6_9c016759827a.slice/crio-786e73cb0c384b720d0d5174fbb53e664ee83c63b329757d37e6e60c076e4e5c WatchSource:0}: Error finding container 786e73cb0c384b720d0d5174fbb53e664ee83c63b329757d37e6e60c076e4e5c: Status 404 returned error can't find the container with id 786e73cb0c384b720d0d5174fbb53e664ee83c63b329757d37e6e60c076e4e5c Mar 10 11:12:19 crc kubenswrapper[4794]: I0310 11:12:19.648286 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-84ff5f675d-mcgwb" event={"ID":"5c1d5e7f-f99a-46af-b1d6-9c016759827a","Type":"ContainerStarted","Data":"e0a8f22d5629247bf5ddcc24138c03db00f419620ca535b2fb672bb28e0892b9"} Mar 10 11:12:19 crc kubenswrapper[4794]: I0310 11:12:19.648627 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:19 crc kubenswrapper[4794]: I0310 11:12:19.648643 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-84ff5f675d-mcgwb" event={"ID":"5c1d5e7f-f99a-46af-b1d6-9c016759827a","Type":"ContainerStarted","Data":"786e73cb0c384b720d0d5174fbb53e664ee83c63b329757d37e6e60c076e4e5c"} Mar 10 11:12:19 crc kubenswrapper[4794]: I0310 11:12:19.673019 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-84ff5f675d-mcgwb" podStartSLOduration=2.672988714 podStartE2EDuration="2.672988714s" podCreationTimestamp="2026-03-10 11:12:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:12:19.672841409 +0000 UTC m=+5288.429012277" watchObservedRunningTime="2026-03-10 11:12:19.672988714 +0000 UTC m=+5288.429159572" Mar 10 11:12:42 crc kubenswrapper[4794]: I0310 11:12:42.116964 4794 scope.go:117] "RemoveContainer" containerID="08c2c66effe54b7118bdf05a58b266c465935f44cf2c11e832bc73c547e0685e" Mar 10 11:12:49 crc kubenswrapper[4794]: I0310 11:12:49.629754 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/keystone-84ff5f675d-mcgwb" Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.400836 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.402978 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.405079 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.410418 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-v9npx" Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.412084 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.416804 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.554864 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9629922b-07aa-4027-a102-5d957b2ca8af-openstack-config\") pod \"openstackclient\" (UID: \"9629922b-07aa-4027-a102-5d957b2ca8af\") " pod="openstack/openstackclient" Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.555016 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb54l\" (UniqueName: \"kubernetes.io/projected/9629922b-07aa-4027-a102-5d957b2ca8af-kube-api-access-jb54l\") pod \"openstackclient\" (UID: \"9629922b-07aa-4027-a102-5d957b2ca8af\") " pod="openstack/openstackclient" Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.555151 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9629922b-07aa-4027-a102-5d957b2ca8af-openstack-config-secret\") pod \"openstackclient\" (UID: \"9629922b-07aa-4027-a102-5d957b2ca8af\") " pod="openstack/openstackclient" Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.656681 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb54l\" (UniqueName: \"kubernetes.io/projected/9629922b-07aa-4027-a102-5d957b2ca8af-kube-api-access-jb54l\") pod \"openstackclient\" (UID: \"9629922b-07aa-4027-a102-5d957b2ca8af\") " pod="openstack/openstackclient" Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.656914 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9629922b-07aa-4027-a102-5d957b2ca8af-openstack-config-secret\") pod \"openstackclient\" (UID: \"9629922b-07aa-4027-a102-5d957b2ca8af\") " pod="openstack/openstackclient" Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.657047 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9629922b-07aa-4027-a102-5d957b2ca8af-openstack-config\") pod \"openstackclient\" (UID: \"9629922b-07aa-4027-a102-5d957b2ca8af\") " pod="openstack/openstackclient" Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.658861 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/9629922b-07aa-4027-a102-5d957b2ca8af-openstack-config\") pod \"openstackclient\" (UID: \"9629922b-07aa-4027-a102-5d957b2ca8af\") " pod="openstack/openstackclient" Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.664706 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9629922b-07aa-4027-a102-5d957b2ca8af-openstack-config-secret\") pod \"openstackclient\" (UID: \"9629922b-07aa-4027-a102-5d957b2ca8af\") " pod="openstack/openstackclient" Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.676471 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb54l\" (UniqueName: \"kubernetes.io/projected/9629922b-07aa-4027-a102-5d957b2ca8af-kube-api-access-jb54l\") pod \"openstackclient\" (UID: \"9629922b-07aa-4027-a102-5d957b2ca8af\") " pod="openstack/openstackclient" Mar 10 11:12:50 crc kubenswrapper[4794]: I0310 11:12:50.733865 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 11:12:51 crc kubenswrapper[4794]: I0310 11:12:51.061135 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 11:12:52 crc kubenswrapper[4794]: I0310 11:12:52.006655 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9629922b-07aa-4027-a102-5d957b2ca8af","Type":"ContainerStarted","Data":"2eca94f4dde83de73167c8dab6b4d9a2d353616bc86ccdcb0e93d1b10c6e8e36"} Mar 10 11:12:52 crc kubenswrapper[4794]: I0310 11:12:52.007837 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9629922b-07aa-4027-a102-5d957b2ca8af","Type":"ContainerStarted","Data":"6316a78d391b0b67be523040099819831382df6b803f07a1b1169c03967ad58d"} Mar 10 11:12:52 crc kubenswrapper[4794]: I0310 11:12:52.048446 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.048425287 podStartE2EDuration="2.048425287s" podCreationTimestamp="2026-03-10 11:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:12:52.041747319 +0000 UTC m=+5320.797918167" watchObservedRunningTime="2026-03-10 11:12:52.048425287 +0000 UTC m=+5320.804596115" Mar 10 11:12:52 crc kubenswrapper[4794]: I0310 11:12:52.967752 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:12:52 crc kubenswrapper[4794]: I0310 11:12:52.968395 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:13:22 crc kubenswrapper[4794]: I0310 11:13:22.968110 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:13:22 crc kubenswrapper[4794]: 
I0310 11:13:22.968972 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:13:52 crc kubenswrapper[4794]: I0310 11:13:52.967852 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:13:52 crc kubenswrapper[4794]: I0310 11:13:52.968567 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:13:52 crc kubenswrapper[4794]: I0310 11:13:52.968611 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 11:13:52 crc kubenswrapper[4794]: I0310 11:13:52.969358 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 11:13:52 crc kubenswrapper[4794]: I0310 11:13:52.969428 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" gracePeriod=600 Mar 10 11:13:53 crc kubenswrapper[4794]: E0310 11:13:53.115325 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:13:53 crc kubenswrapper[4794]: I0310 11:13:53.612633 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" exitCode=0 Mar 10 11:13:53 crc kubenswrapper[4794]: I0310 11:13:53.612705 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157"} Mar 10 11:13:53 crc kubenswrapper[4794]: I0310 11:13:53.612752 4794 scope.go:117] "RemoveContainer" containerID="f76c37e6c1f055cf0ebe872366c01cd32a21819058acbb1bc0dcd8368831960a" Mar 10 11:13:53 crc kubenswrapper[4794]: I0310 11:13:53.613571 4794 scope.go:117] "RemoveContainer" 
containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:13:53 crc kubenswrapper[4794]: E0310 11:13:53.614059 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:13:57 crc kubenswrapper[4794]: I0310 11:13:57.056304 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-fp5js"] Mar 10 11:13:57 crc kubenswrapper[4794]: I0310 11:13:57.061932 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-fp5js"] Mar 10 11:13:58 crc kubenswrapper[4794]: I0310 11:13:58.010454 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aebf8d4-3811-45eb-88fd-cc864f69e181" path="/var/lib/kubelet/pods/0aebf8d4-3811-45eb-88fd-cc864f69e181/volumes" Mar 10 11:14:00 crc kubenswrapper[4794]: I0310 11:14:00.143917 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552354-cxv5m"] Mar 10 11:14:00 crc kubenswrapper[4794]: I0310 11:14:00.145316 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552354-cxv5m" Mar 10 11:14:00 crc kubenswrapper[4794]: I0310 11:14:00.147531 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:14:00 crc kubenswrapper[4794]: I0310 11:14:00.147694 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:14:00 crc kubenswrapper[4794]: I0310 11:14:00.147893 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:14:00 crc kubenswrapper[4794]: I0310 11:14:00.153757 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552354-cxv5m"] Mar 10 11:14:00 crc kubenswrapper[4794]: I0310 11:14:00.249693 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sjfs\" (UniqueName: \"kubernetes.io/projected/49befbd9-bbf3-477e-9490-83a2e7e8eac6-kube-api-access-4sjfs\") pod \"auto-csr-approver-29552354-cxv5m\" (UID: \"49befbd9-bbf3-477e-9490-83a2e7e8eac6\") " pod="openshift-infra/auto-csr-approver-29552354-cxv5m" Mar 10 11:14:00 crc kubenswrapper[4794]: I0310 11:14:00.351730 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sjfs\" (UniqueName: \"kubernetes.io/projected/49befbd9-bbf3-477e-9490-83a2e7e8eac6-kube-api-access-4sjfs\") pod \"auto-csr-approver-29552354-cxv5m\" (UID: \"49befbd9-bbf3-477e-9490-83a2e7e8eac6\") " pod="openshift-infra/auto-csr-approver-29552354-cxv5m" Mar 10 11:14:00 crc kubenswrapper[4794]: I0310 11:14:00.376148 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sjfs\" (UniqueName: \"kubernetes.io/projected/49befbd9-bbf3-477e-9490-83a2e7e8eac6-kube-api-access-4sjfs\") pod \"auto-csr-approver-29552354-cxv5m\" (UID: \"49befbd9-bbf3-477e-9490-83a2e7e8eac6\") " pod="openshift-infra/auto-csr-approver-29552354-cxv5m" Mar 10 11:14:00 crc kubenswrapper[4794]: I0310 11:14:00.470824 
4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552354-cxv5m" Mar 10 11:14:00 crc kubenswrapper[4794]: I0310 11:14:00.968315 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552354-cxv5m"] Mar 10 11:14:01 crc kubenswrapper[4794]: I0310 11:14:01.687259 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552354-cxv5m" event={"ID":"49befbd9-bbf3-477e-9490-83a2e7e8eac6","Type":"ContainerStarted","Data":"57910ac1739727921f309a13c51b6845aa9b48168bbb306b8d759bb6c3f119f6"} Mar 10 11:14:02 crc kubenswrapper[4794]: I0310 11:14:02.698854 4794 generic.go:334] "Generic (PLEG): container finished" podID="49befbd9-bbf3-477e-9490-83a2e7e8eac6" containerID="b303cf68544edd697504ba6c368778ffb2e4e4129a74c2e6ccfa9a94953a0d0d" exitCode=0 Mar 10 11:14:02 crc kubenswrapper[4794]: I0310 11:14:02.698920 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552354-cxv5m" event={"ID":"49befbd9-bbf3-477e-9490-83a2e7e8eac6","Type":"ContainerDied","Data":"b303cf68544edd697504ba6c368778ffb2e4e4129a74c2e6ccfa9a94953a0d0d"} Mar 10 11:14:04 crc kubenswrapper[4794]: I0310 11:14:04.003388 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:14:04 crc kubenswrapper[4794]: E0310 11:14:04.003757 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:14:04 crc kubenswrapper[4794]: I0310 11:14:04.031771 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552354-cxv5m" Mar 10 11:14:04 crc kubenswrapper[4794]: I0310 11:14:04.216747 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sjfs\" (UniqueName: \"kubernetes.io/projected/49befbd9-bbf3-477e-9490-83a2e7e8eac6-kube-api-access-4sjfs\") pod \"49befbd9-bbf3-477e-9490-83a2e7e8eac6\" (UID: \"49befbd9-bbf3-477e-9490-83a2e7e8eac6\") " Mar 10 11:14:04 crc kubenswrapper[4794]: I0310 11:14:04.223638 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49befbd9-bbf3-477e-9490-83a2e7e8eac6-kube-api-access-4sjfs" (OuterVolumeSpecName: "kube-api-access-4sjfs") pod "49befbd9-bbf3-477e-9490-83a2e7e8eac6" (UID: "49befbd9-bbf3-477e-9490-83a2e7e8eac6"). InnerVolumeSpecName "kube-api-access-4sjfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:14:04 crc kubenswrapper[4794]: I0310 11:14:04.318857 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sjfs\" (UniqueName: \"kubernetes.io/projected/49befbd9-bbf3-477e-9490-83a2e7e8eac6-kube-api-access-4sjfs\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:04 crc kubenswrapper[4794]: I0310 11:14:04.722255 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552354-cxv5m" event={"ID":"49befbd9-bbf3-477e-9490-83a2e7e8eac6","Type":"ContainerDied","Data":"57910ac1739727921f309a13c51b6845aa9b48168bbb306b8d759bb6c3f119f6"} Mar 10 11:14:04 crc kubenswrapper[4794]: I0310 11:14:04.722609 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57910ac1739727921f309a13c51b6845aa9b48168bbb306b8d759bb6c3f119f6" Mar 10 11:14:04 crc kubenswrapper[4794]: I0310 11:14:04.722291 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552354-cxv5m" Mar 10 11:14:05 crc kubenswrapper[4794]: I0310 11:14:05.084960 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552348-twd8d"] Mar 10 11:14:05 crc kubenswrapper[4794]: I0310 11:14:05.091469 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552348-twd8d"] Mar 10 11:14:06 crc kubenswrapper[4794]: I0310 11:14:06.007720 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a58ccd65-b7c0-452b-82e1-01ce3abd6f4a" path="/var/lib/kubelet/pods/a58ccd65-b7c0-452b-82e1-01ce3abd6f4a/volumes" Mar 10 11:14:17 crc kubenswrapper[4794]: I0310 11:14:17.999320 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:14:18 crc kubenswrapper[4794]: E0310 11:14:18.000034 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.290236 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kzcfk"] Mar 10 11:14:22 crc kubenswrapper[4794]: E0310 11:14:22.291021 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49befbd9-bbf3-477e-9490-83a2e7e8eac6" containerName="oc" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.291043 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="49befbd9-bbf3-477e-9490-83a2e7e8eac6" containerName="oc" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.291365 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="49befbd9-bbf3-477e-9490-83a2e7e8eac6" containerName="oc" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.293466 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.307818 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzcfk"] Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.362710 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e59577-61f1-45bf-adf7-3f55a2d92cf9-catalog-content\") pod \"community-operators-kzcfk\" (UID: \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\") " pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.363183 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e59577-61f1-45bf-adf7-3f55a2d92cf9-utilities\") pod \"community-operators-kzcfk\" (UID: \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\") " pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.363304 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwpqq\" (UniqueName: \"kubernetes.io/projected/52e59577-61f1-45bf-adf7-3f55a2d92cf9-kube-api-access-rwpqq\") pod \"community-operators-kzcfk\" (UID: \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\") " pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.465138 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwpqq\" (UniqueName: \"kubernetes.io/projected/52e59577-61f1-45bf-adf7-3f55a2d92cf9-kube-api-access-rwpqq\") pod \"community-operators-kzcfk\" (UID: \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\") " pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.465230 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e59577-61f1-45bf-adf7-3f55a2d92cf9-catalog-content\") pod \"community-operators-kzcfk\" (UID: \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\") " pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.465278 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e59577-61f1-45bf-adf7-3f55a2d92cf9-utilities\") pod \"community-operators-kzcfk\" (UID: \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\") " pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.465766 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e59577-61f1-45bf-adf7-3f55a2d92cf9-catalog-content\") pod \"community-operators-kzcfk\" (UID: \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\") " pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.465778 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e59577-61f1-45bf-adf7-3f55a2d92cf9-utilities\") pod \"community-operators-kzcfk\" (UID: \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\") " pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.487892 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rwpqq\" (UniqueName: \"kubernetes.io/projected/52e59577-61f1-45bf-adf7-3f55a2d92cf9-kube-api-access-rwpqq\") pod \"community-operators-kzcfk\" (UID: \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\") " pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.618362 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:22 crc kubenswrapper[4794]: I0310 11:14:22.952776 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzcfk"] Mar 10 11:14:23 crc kubenswrapper[4794]: I0310 11:14:23.954255 4794 generic.go:334] "Generic (PLEG): container finished" podID="52e59577-61f1-45bf-adf7-3f55a2d92cf9" containerID="d172c1142e10efee9b223ed6b1c5306910d8fc257eb2ab9be08aef5790852226" exitCode=0 Mar 10 11:14:23 crc kubenswrapper[4794]: I0310 11:14:23.954344 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzcfk" event={"ID":"52e59577-61f1-45bf-adf7-3f55a2d92cf9","Type":"ContainerDied","Data":"d172c1142e10efee9b223ed6b1c5306910d8fc257eb2ab9be08aef5790852226"} Mar 10 11:14:23 crc kubenswrapper[4794]: I0310 11:14:23.954636 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzcfk" event={"ID":"52e59577-61f1-45bf-adf7-3f55a2d92cf9","Type":"ContainerStarted","Data":"8996562a879f0cb8302c243e0b91409bc0c52b00f3db5fd07f233491ec07b0de"} Mar 10 11:14:24 crc kubenswrapper[4794]: I0310 11:14:24.966599 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzcfk" event={"ID":"52e59577-61f1-45bf-adf7-3f55a2d92cf9","Type":"ContainerStarted","Data":"541971d4fed5db771b3a328c3a9d79737e76ac60b3ff542bb49354e6feec5c9d"} Mar 10 11:14:25 crc kubenswrapper[4794]: I0310 11:14:25.980786 4794 generic.go:334] "Generic (PLEG): container finished" podID="52e59577-61f1-45bf-adf7-3f55a2d92cf9" containerID="541971d4fed5db771b3a328c3a9d79737e76ac60b3ff542bb49354e6feec5c9d" exitCode=0 Mar 10 11:14:25 crc kubenswrapper[4794]: I0310 11:14:25.980829 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzcfk" event={"ID":"52e59577-61f1-45bf-adf7-3f55a2d92cf9","Type":"ContainerDied","Data":"541971d4fed5db771b3a328c3a9d79737e76ac60b3ff542bb49354e6feec5c9d"} Mar 10 11:14:26 crc kubenswrapper[4794]: I0310 11:14:26.990940 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzcfk" event={"ID":"52e59577-61f1-45bf-adf7-3f55a2d92cf9","Type":"ContainerStarted","Data":"3e7cdcd6096d66570d2456be2aca2aaee04189f8f57d88a4cb076875d54a0822"} Mar 10 11:14:27 crc kubenswrapper[4794]: I0310 11:14:27.014830 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kzcfk" podStartSLOduration=2.527852049 podStartE2EDuration="5.014812608s" podCreationTimestamp="2026-03-10 11:14:22 +0000 UTC" firstStartedPulling="2026-03-10 11:14:23.957283195 +0000 UTC m=+5412.713454043" lastFinishedPulling="2026-03-10 11:14:26.444243774 +0000 UTC m=+5415.200414602" observedRunningTime="2026-03-10 11:14:27.014388955 +0000 UTC m=+5415.770559783" watchObservedRunningTime="2026-03-10 11:14:27.014812608 +0000 UTC m=+5415.770983416" Mar 10 11:14:30 crc kubenswrapper[4794]: I0310 11:14:30.999312 4794 scope.go:117] "RemoveContainer" 
containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:14:31 crc kubenswrapper[4794]: E0310 11:14:31.000037 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:14:32 crc kubenswrapper[4794]: I0310 11:14:32.618942 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:32 crc kubenswrapper[4794]: I0310 11:14:32.619161 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:32 crc kubenswrapper[4794]: I0310 11:14:32.686693 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:33 crc kubenswrapper[4794]: I0310 11:14:33.094231 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:33 crc kubenswrapper[4794]: I0310 11:14:33.150766 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kzcfk"] Mar 10 11:14:35 crc kubenswrapper[4794]: I0310 11:14:35.057612 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kzcfk" podUID="52e59577-61f1-45bf-adf7-3f55a2d92cf9" containerName="registry-server" containerID="cri-o://3e7cdcd6096d66570d2456be2aca2aaee04189f8f57d88a4cb076875d54a0822" gracePeriod=2 Mar 10 11:14:35 crc kubenswrapper[4794]: I0310 11:14:35.505581 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:35 crc kubenswrapper[4794]: I0310 11:14:35.611031 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e59577-61f1-45bf-adf7-3f55a2d92cf9-utilities\") pod \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\" (UID: \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\") " Mar 10 11:14:35 crc kubenswrapper[4794]: I0310 11:14:35.611162 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwpqq\" (UniqueName: \"kubernetes.io/projected/52e59577-61f1-45bf-adf7-3f55a2d92cf9-kube-api-access-rwpqq\") pod \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\" (UID: \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\") " Mar 10 11:14:35 crc kubenswrapper[4794]: I0310 11:14:35.611199 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e59577-61f1-45bf-adf7-3f55a2d92cf9-catalog-content\") pod \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\" (UID: \"52e59577-61f1-45bf-adf7-3f55a2d92cf9\") " Mar 10 11:14:35 crc kubenswrapper[4794]: I0310 11:14:35.612203 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52e59577-61f1-45bf-adf7-3f55a2d92cf9-utilities" (OuterVolumeSpecName: "utilities") pod "52e59577-61f1-45bf-adf7-3f55a2d92cf9" (UID: "52e59577-61f1-45bf-adf7-3f55a2d92cf9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:14:35 crc kubenswrapper[4794]: I0310 11:14:35.619724 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52e59577-61f1-45bf-adf7-3f55a2d92cf9-kube-api-access-rwpqq" (OuterVolumeSpecName: "kube-api-access-rwpqq") pod "52e59577-61f1-45bf-adf7-3f55a2d92cf9" (UID: "52e59577-61f1-45bf-adf7-3f55a2d92cf9"). InnerVolumeSpecName "kube-api-access-rwpqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:14:35 crc kubenswrapper[4794]: I0310 11:14:35.713582 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e59577-61f1-45bf-adf7-3f55a2d92cf9-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:35 crc kubenswrapper[4794]: I0310 11:14:35.713642 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwpqq\" (UniqueName: \"kubernetes.io/projected/52e59577-61f1-45bf-adf7-3f55a2d92cf9-kube-api-access-rwpqq\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:35 crc kubenswrapper[4794]: I0310 11:14:35.845452 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52e59577-61f1-45bf-adf7-3f55a2d92cf9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52e59577-61f1-45bf-adf7-3f55a2d92cf9" (UID: "52e59577-61f1-45bf-adf7-3f55a2d92cf9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:14:35 crc kubenswrapper[4794]: I0310 11:14:35.916871 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e59577-61f1-45bf-adf7-3f55a2d92cf9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.071551 4794 generic.go:334] "Generic (PLEG): container finished" podID="52e59577-61f1-45bf-adf7-3f55a2d92cf9" containerID="3e7cdcd6096d66570d2456be2aca2aaee04189f8f57d88a4cb076875d54a0822" exitCode=0 Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.071638 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzcfk" event={"ID":"52e59577-61f1-45bf-adf7-3f55a2d92cf9","Type":"ContainerDied","Data":"3e7cdcd6096d66570d2456be2aca2aaee04189f8f57d88a4cb076875d54a0822"} Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.071711 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzcfk" event={"ID":"52e59577-61f1-45bf-adf7-3f55a2d92cf9","Type":"ContainerDied","Data":"8996562a879f0cb8302c243e0b91409bc0c52b00f3db5fd07f233491ec07b0de"} Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.071747 4794 scope.go:117] "RemoveContainer" containerID="3e7cdcd6096d66570d2456be2aca2aaee04189f8f57d88a4cb076875d54a0822" Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.072085 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kzcfk" Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.106362 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kzcfk"] Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.107588 4794 scope.go:117] "RemoveContainer" containerID="541971d4fed5db771b3a328c3a9d79737e76ac60b3ff542bb49354e6feec5c9d" Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.117491 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kzcfk"] Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.132848 4794 scope.go:117] "RemoveContainer" containerID="d172c1142e10efee9b223ed6b1c5306910d8fc257eb2ab9be08aef5790852226" Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.182383 4794 scope.go:117] "RemoveContainer" containerID="3e7cdcd6096d66570d2456be2aca2aaee04189f8f57d88a4cb076875d54a0822" Mar 10 11:14:36 crc kubenswrapper[4794]: E0310 11:14:36.182996 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e7cdcd6096d66570d2456be2aca2aaee04189f8f57d88a4cb076875d54a0822\": container with ID starting with 3e7cdcd6096d66570d2456be2aca2aaee04189f8f57d88a4cb076875d54a0822 not found: ID does not exist" containerID="3e7cdcd6096d66570d2456be2aca2aaee04189f8f57d88a4cb076875d54a0822" Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.183023 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e7cdcd6096d66570d2456be2aca2aaee04189f8f57d88a4cb076875d54a0822"} err="failed to get container status \"3e7cdcd6096d66570d2456be2aca2aaee04189f8f57d88a4cb076875d54a0822\": rpc error: code = NotFound desc = could not find container \"3e7cdcd6096d66570d2456be2aca2aaee04189f8f57d88a4cb076875d54a0822\": container with ID starting with 3e7cdcd6096d66570d2456be2aca2aaee04189f8f57d88a4cb076875d54a0822 not found: ID does not exist" Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.183041 4794 scope.go:117] "RemoveContainer" containerID="541971d4fed5db771b3a328c3a9d79737e76ac60b3ff542bb49354e6feec5c9d" Mar 10 11:14:36 crc kubenswrapper[4794]: E0310 11:14:36.183649 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"541971d4fed5db771b3a328c3a9d79737e76ac60b3ff542bb49354e6feec5c9d\": container with ID starting with 541971d4fed5db771b3a328c3a9d79737e76ac60b3ff542bb49354e6feec5c9d not found: ID does not exist" containerID="541971d4fed5db771b3a328c3a9d79737e76ac60b3ff542bb49354e6feec5c9d" Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.183679 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"541971d4fed5db771b3a328c3a9d79737e76ac60b3ff542bb49354e6feec5c9d"} err="failed to get container status \"541971d4fed5db771b3a328c3a9d79737e76ac60b3ff542bb49354e6feec5c9d\": rpc error: code = NotFound desc = could not find container \"541971d4fed5db771b3a328c3a9d79737e76ac60b3ff542bb49354e6feec5c9d\": container with ID starting with 541971d4fed5db771b3a328c3a9d79737e76ac60b3ff542bb49354e6feec5c9d not found: ID does not exist" Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.183699 4794 scope.go:117] "RemoveContainer" containerID="d172c1142e10efee9b223ed6b1c5306910d8fc257eb2ab9be08aef5790852226" Mar 10 11:14:36 crc kubenswrapper[4794]: E0310 11:14:36.184172 4794 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d172c1142e10efee9b223ed6b1c5306910d8fc257eb2ab9be08aef5790852226\": container with ID starting with d172c1142e10efee9b223ed6b1c5306910d8fc257eb2ab9be08aef5790852226 not found: ID does not exist" containerID="d172c1142e10efee9b223ed6b1c5306910d8fc257eb2ab9be08aef5790852226" Mar 10 11:14:36 crc kubenswrapper[4794]: I0310 11:14:36.184190 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d172c1142e10efee9b223ed6b1c5306910d8fc257eb2ab9be08aef5790852226"} err="failed to get container status \"d172c1142e10efee9b223ed6b1c5306910d8fc257eb2ab9be08aef5790852226\": rpc error: code = NotFound desc = could not find container \"d172c1142e10efee9b223ed6b1c5306910d8fc257eb2ab9be08aef5790852226\": container with ID starting with d172c1142e10efee9b223ed6b1c5306910d8fc257eb2ab9be08aef5790852226 not found: ID does not exist" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.011384 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52e59577-61f1-45bf-adf7-3f55a2d92cf9" path="/var/lib/kubelet/pods/52e59577-61f1-45bf-adf7-3f55a2d92cf9/volumes" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.267084 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-sqjmt"] Mar 10 11:14:38 crc kubenswrapper[4794]: E0310 11:14:38.267645 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e59577-61f1-45bf-adf7-3f55a2d92cf9" containerName="registry-server" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.267674 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e59577-61f1-45bf-adf7-3f55a2d92cf9" containerName="registry-server" Mar 10 11:14:38 crc kubenswrapper[4794]: E0310 11:14:38.267705 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e59577-61f1-45bf-adf7-3f55a2d92cf9" containerName="extract-utilities" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.267714 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e59577-61f1-45bf-adf7-3f55a2d92cf9" containerName="extract-utilities" Mar 10 11:14:38 crc kubenswrapper[4794]: E0310 11:14:38.267740 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e59577-61f1-45bf-adf7-3f55a2d92cf9" containerName="extract-content" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.267748 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e59577-61f1-45bf-adf7-3f55a2d92cf9" containerName="extract-content" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.267943 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="52e59577-61f1-45bf-adf7-3f55a2d92cf9" containerName="registry-server" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.268657 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sqjmt" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.274798 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-e47a-account-create-update-89jdk"] Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.276024 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e47a-account-create-update-89jdk" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.277837 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.291115 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sqjmt"] Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.329717 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e47a-account-create-update-89jdk"] Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.358745 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02aec78e-96f3-4d8a-81cb-412e143471ca-operator-scripts\") pod \"barbican-db-create-sqjmt\" (UID: \"02aec78e-96f3-4d8a-81cb-412e143471ca\") " pod="openstack/barbican-db-create-sqjmt" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.359029 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/578d197c-2cae-4127-ab66-d2159303508b-operator-scripts\") pod \"barbican-e47a-account-create-update-89jdk\" (UID: \"578d197c-2cae-4127-ab66-d2159303508b\") " pod="openstack/barbican-e47a-account-create-update-89jdk" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.359099 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crdzj\" (UniqueName: \"kubernetes.io/projected/578d197c-2cae-4127-ab66-d2159303508b-kube-api-access-crdzj\") pod \"barbican-e47a-account-create-update-89jdk\" (UID: \"578d197c-2cae-4127-ab66-d2159303508b\") " pod="openstack/barbican-e47a-account-create-update-89jdk" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.359153 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs4hh\" (UniqueName: \"kubernetes.io/projected/02aec78e-96f3-4d8a-81cb-412e143471ca-kube-api-access-hs4hh\") pod \"barbican-db-create-sqjmt\" (UID: \"02aec78e-96f3-4d8a-81cb-412e143471ca\") " pod="openstack/barbican-db-create-sqjmt" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.460466 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/578d197c-2cae-4127-ab66-d2159303508b-operator-scripts\") pod \"barbican-e47a-account-create-update-89jdk\" (UID: \"578d197c-2cae-4127-ab66-d2159303508b\") " pod="openstack/barbican-e47a-account-create-update-89jdk" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.460516 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crdzj\" (UniqueName: \"kubernetes.io/projected/578d197c-2cae-4127-ab66-d2159303508b-kube-api-access-crdzj\") pod \"barbican-e47a-account-create-update-89jdk\" (UID: \"578d197c-2cae-4127-ab66-d2159303508b\") " pod="openstack/barbican-e47a-account-create-update-89jdk" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.460543 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs4hh\" (UniqueName: \"kubernetes.io/projected/02aec78e-96f3-4d8a-81cb-412e143471ca-kube-api-access-hs4hh\") pod \"barbican-db-create-sqjmt\" (UID: \"02aec78e-96f3-4d8a-81cb-412e143471ca\") " pod="openstack/barbican-db-create-sqjmt" Mar 10 11:14:38 crc 
kubenswrapper[4794]: I0310 11:14:38.460570 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02aec78e-96f3-4d8a-81cb-412e143471ca-operator-scripts\") pod \"barbican-db-create-sqjmt\" (UID: \"02aec78e-96f3-4d8a-81cb-412e143471ca\") " pod="openstack/barbican-db-create-sqjmt" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.461279 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02aec78e-96f3-4d8a-81cb-412e143471ca-operator-scripts\") pod \"barbican-db-create-sqjmt\" (UID: \"02aec78e-96f3-4d8a-81cb-412e143471ca\") " pod="openstack/barbican-db-create-sqjmt" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.461516 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/578d197c-2cae-4127-ab66-d2159303508b-operator-scripts\") pod \"barbican-e47a-account-create-update-89jdk\" (UID: \"578d197c-2cae-4127-ab66-d2159303508b\") " pod="openstack/barbican-e47a-account-create-update-89jdk" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.480707 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs4hh\" (UniqueName: \"kubernetes.io/projected/02aec78e-96f3-4d8a-81cb-412e143471ca-kube-api-access-hs4hh\") pod \"barbican-db-create-sqjmt\" (UID: \"02aec78e-96f3-4d8a-81cb-412e143471ca\") " pod="openstack/barbican-db-create-sqjmt" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.480807 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crdzj\" (UniqueName: \"kubernetes.io/projected/578d197c-2cae-4127-ab66-d2159303508b-kube-api-access-crdzj\") pod \"barbican-e47a-account-create-update-89jdk\" (UID: \"578d197c-2cae-4127-ab66-d2159303508b\") " pod="openstack/barbican-e47a-account-create-update-89jdk" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.669173 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sqjmt" Mar 10 11:14:38 crc kubenswrapper[4794]: I0310 11:14:38.680301 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e47a-account-create-update-89jdk" Mar 10 11:14:39 crc kubenswrapper[4794]: I0310 11:14:39.211197 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sqjmt"] Mar 10 11:14:39 crc kubenswrapper[4794]: I0310 11:14:39.218105 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e47a-account-create-update-89jdk"] Mar 10 11:14:40 crc kubenswrapper[4794]: I0310 11:14:40.106784 4794 generic.go:334] "Generic (PLEG): container finished" podID="578d197c-2cae-4127-ab66-d2159303508b" containerID="46da1e90b20ca9a718374fb93120cea7c6e2d502fb42fcc666f60cf0d81a0be3" exitCode=0 Mar 10 11:14:40 crc kubenswrapper[4794]: I0310 11:14:40.106892 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e47a-account-create-update-89jdk" event={"ID":"578d197c-2cae-4127-ab66-d2159303508b","Type":"ContainerDied","Data":"46da1e90b20ca9a718374fb93120cea7c6e2d502fb42fcc666f60cf0d81a0be3"} Mar 10 11:14:40 crc kubenswrapper[4794]: I0310 11:14:40.107140 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e47a-account-create-update-89jdk" event={"ID":"578d197c-2cae-4127-ab66-d2159303508b","Type":"ContainerStarted","Data":"3d5bde43843c3a8b513af9965e264829bf66e5a1a6bef8ad224278b906ebfd9d"} Mar 10 11:14:40 crc kubenswrapper[4794]: I0310 11:14:40.109325 4794 generic.go:334] "Generic (PLEG): container finished" podID="02aec78e-96f3-4d8a-81cb-412e143471ca" containerID="1b76cae5e4145b0868ffa23664644938667774bfc0547d584bee7ac98f723c3b" exitCode=0 Mar 10 11:14:40 crc kubenswrapper[4794]: I0310 11:14:40.109405 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sqjmt" event={"ID":"02aec78e-96f3-4d8a-81cb-412e143471ca","Type":"ContainerDied","Data":"1b76cae5e4145b0868ffa23664644938667774bfc0547d584bee7ac98f723c3b"} Mar 10 11:14:40 crc kubenswrapper[4794]: I0310 11:14:40.109428 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sqjmt" event={"ID":"02aec78e-96f3-4d8a-81cb-412e143471ca","Type":"ContainerStarted","Data":"606913c198897876d369795ac9c46286de7acff8b1a66d335903982a5c0e172e"} Mar 10 11:14:41 crc kubenswrapper[4794]: I0310 11:14:41.531639 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sqjmt" Mar 10 11:14:41 crc kubenswrapper[4794]: I0310 11:14:41.537389 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e47a-account-create-update-89jdk" Mar 10 11:14:41 crc kubenswrapper[4794]: I0310 11:14:41.627910 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs4hh\" (UniqueName: \"kubernetes.io/projected/02aec78e-96f3-4d8a-81cb-412e143471ca-kube-api-access-hs4hh\") pod \"02aec78e-96f3-4d8a-81cb-412e143471ca\" (UID: \"02aec78e-96f3-4d8a-81cb-412e143471ca\") " Mar 10 11:14:41 crc kubenswrapper[4794]: I0310 11:14:41.628086 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02aec78e-96f3-4d8a-81cb-412e143471ca-operator-scripts\") pod \"02aec78e-96f3-4d8a-81cb-412e143471ca\" (UID: \"02aec78e-96f3-4d8a-81cb-412e143471ca\") " Mar 10 11:14:41 crc kubenswrapper[4794]: I0310 11:14:41.628985 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02aec78e-96f3-4d8a-81cb-412e143471ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02aec78e-96f3-4d8a-81cb-412e143471ca" (UID: "02aec78e-96f3-4d8a-81cb-412e143471ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:14:41 crc kubenswrapper[4794]: I0310 11:14:41.634006 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02aec78e-96f3-4d8a-81cb-412e143471ca-kube-api-access-hs4hh" (OuterVolumeSpecName: "kube-api-access-hs4hh") pod "02aec78e-96f3-4d8a-81cb-412e143471ca" (UID: "02aec78e-96f3-4d8a-81cb-412e143471ca"). InnerVolumeSpecName "kube-api-access-hs4hh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:14:41 crc kubenswrapper[4794]: I0310 11:14:41.729876 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crdzj\" (UniqueName: \"kubernetes.io/projected/578d197c-2cae-4127-ab66-d2159303508b-kube-api-access-crdzj\") pod \"578d197c-2cae-4127-ab66-d2159303508b\" (UID: \"578d197c-2cae-4127-ab66-d2159303508b\") " Mar 10 11:14:41 crc kubenswrapper[4794]: I0310 11:14:41.730010 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/578d197c-2cae-4127-ab66-d2159303508b-operator-scripts\") pod \"578d197c-2cae-4127-ab66-d2159303508b\" (UID: \"578d197c-2cae-4127-ab66-d2159303508b\") " Mar 10 11:14:41 crc kubenswrapper[4794]: I0310 11:14:41.730430 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02aec78e-96f3-4d8a-81cb-412e143471ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:41 crc kubenswrapper[4794]: I0310 11:14:41.730448 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs4hh\" (UniqueName: \"kubernetes.io/projected/02aec78e-96f3-4d8a-81cb-412e143471ca-kube-api-access-hs4hh\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:41 crc kubenswrapper[4794]: I0310 11:14:41.730884 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/578d197c-2cae-4127-ab66-d2159303508b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "578d197c-2cae-4127-ab66-d2159303508b" (UID: "578d197c-2cae-4127-ab66-d2159303508b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:14:41 crc kubenswrapper[4794]: I0310 11:14:41.733503 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/578d197c-2cae-4127-ab66-d2159303508b-kube-api-access-crdzj" (OuterVolumeSpecName: "kube-api-access-crdzj") pod "578d197c-2cae-4127-ab66-d2159303508b" (UID: "578d197c-2cae-4127-ab66-d2159303508b"). InnerVolumeSpecName "kube-api-access-crdzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:14:41 crc kubenswrapper[4794]: I0310 11:14:41.832548 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crdzj\" (UniqueName: \"kubernetes.io/projected/578d197c-2cae-4127-ab66-d2159303508b-kube-api-access-crdzj\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:41 crc kubenswrapper[4794]: I0310 11:14:41.832590 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/578d197c-2cae-4127-ab66-d2159303508b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:42 crc kubenswrapper[4794]: I0310 11:14:42.131695 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sqjmt" event={"ID":"02aec78e-96f3-4d8a-81cb-412e143471ca","Type":"ContainerDied","Data":"606913c198897876d369795ac9c46286de7acff8b1a66d335903982a5c0e172e"} Mar 10 11:14:42 crc kubenswrapper[4794]: I0310 11:14:42.131736 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="606913c198897876d369795ac9c46286de7acff8b1a66d335903982a5c0e172e" Mar 10 11:14:42 crc kubenswrapper[4794]: I0310 11:14:42.131737 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sqjmt" Mar 10 11:14:42 crc kubenswrapper[4794]: I0310 11:14:42.134428 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e47a-account-create-update-89jdk" event={"ID":"578d197c-2cae-4127-ab66-d2159303508b","Type":"ContainerDied","Data":"3d5bde43843c3a8b513af9965e264829bf66e5a1a6bef8ad224278b906ebfd9d"} Mar 10 11:14:42 crc kubenswrapper[4794]: I0310 11:14:42.134555 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d5bde43843c3a8b513af9965e264829bf66e5a1a6bef8ad224278b906ebfd9d" Mar 10 11:14:42 crc kubenswrapper[4794]: I0310 11:14:42.134463 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e47a-account-create-update-89jdk" Mar 10 11:14:42 crc kubenswrapper[4794]: I0310 11:14:42.274050 4794 scope.go:117] "RemoveContainer" containerID="35f0095264ba3d9ef4a7353e888be2676eef26a6d0e18908f36f913529577de5" Mar 10 11:14:42 crc kubenswrapper[4794]: I0310 11:14:42.313208 4794 scope.go:117] "RemoveContainer" containerID="c0e34bd502e79d6c39416a77bb318b8f87108b872e6f242bc54bbc3bcd908e30" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.549961 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-ptn9j"] Mar 10 11:14:43 crc kubenswrapper[4794]: E0310 11:14:43.550289 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578d197c-2cae-4127-ab66-d2159303508b" containerName="mariadb-account-create-update" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.550301 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="578d197c-2cae-4127-ab66-d2159303508b" containerName="mariadb-account-create-update" Mar 10 11:14:43 crc kubenswrapper[4794]: E0310 11:14:43.550329 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02aec78e-96f3-4d8a-81cb-412e143471ca" containerName="mariadb-database-create" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.550335 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="02aec78e-96f3-4d8a-81cb-412e143471ca" containerName="mariadb-database-create" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.550697 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="02aec78e-96f3-4d8a-81cb-412e143471ca" containerName="mariadb-database-create" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.550718 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="578d197c-2cae-4127-ab66-d2159303508b" containerName="mariadb-account-create-update" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.551222 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ptn9j" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.553737 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.562606 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nmbwb" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.565922 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ptn9j"] Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.674008 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shj5q\" (UniqueName: \"kubernetes.io/projected/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-kube-api-access-shj5q\") pod \"barbican-db-sync-ptn9j\" (UID: \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\") " pod="openstack/barbican-db-sync-ptn9j" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.674059 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-combined-ca-bundle\") pod \"barbican-db-sync-ptn9j\" (UID: \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\") " pod="openstack/barbican-db-sync-ptn9j" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.674216 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-db-sync-config-data\") pod \"barbican-db-sync-ptn9j\" (UID: \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\") " pod="openstack/barbican-db-sync-ptn9j" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.775469 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shj5q\" (UniqueName: \"kubernetes.io/projected/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-kube-api-access-shj5q\") pod \"barbican-db-sync-ptn9j\" (UID: \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\") " pod="openstack/barbican-db-sync-ptn9j" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.775527 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-combined-ca-bundle\") pod \"barbican-db-sync-ptn9j\" (UID: \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\") " pod="openstack/barbican-db-sync-ptn9j" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.775571 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-db-sync-config-data\") pod \"barbican-db-sync-ptn9j\" (UID: \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\") " pod="openstack/barbican-db-sync-ptn9j" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.780974 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-db-sync-config-data\") pod \"barbican-db-sync-ptn9j\" (UID: \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\") " pod="openstack/barbican-db-sync-ptn9j" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.781091 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-combined-ca-bundle\") pod \"barbican-db-sync-ptn9j\" (UID: \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\") " pod="openstack/barbican-db-sync-ptn9j" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.795392 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shj5q\" (UniqueName: \"kubernetes.io/projected/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-kube-api-access-shj5q\") pod \"barbican-db-sync-ptn9j\" (UID: \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\") " pod="openstack/barbican-db-sync-ptn9j" Mar 10 11:14:43 crc kubenswrapper[4794]: I0310 11:14:43.865531 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ptn9j" Mar 10 11:14:44 crc kubenswrapper[4794]: I0310 11:14:44.124910 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ptn9j"] Mar 10 11:14:44 crc kubenswrapper[4794]: I0310 11:14:44.174061 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ptn9j" event={"ID":"6fb3e755-7a30-4965-b6a1-01e1a65e97f5","Type":"ContainerStarted","Data":"6581799f31bf7f7b302c6ba5ad848e73c06a3a922eb7db9054a62f232ef8b377"} Mar 10 11:14:45 crc kubenswrapper[4794]: I0310 11:14:44.999146 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:14:45 crc kubenswrapper[4794]: E0310 11:14:44.999701 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:14:45 crc kubenswrapper[4794]: I0310 11:14:45.185418 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ptn9j" event={"ID":"6fb3e755-7a30-4965-b6a1-01e1a65e97f5","Type":"ContainerStarted","Data":"a00e706210e6e39ad85924a95fe1833658e45660d98e81d480c96460dd273372"} Mar 10 11:14:45 crc kubenswrapper[4794]: I0310 11:14:45.209686 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-ptn9j" podStartSLOduration=2.209663382 podStartE2EDuration="2.209663382s" podCreationTimestamp="2026-03-10 11:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:14:45.202155507 +0000 UTC m=+5433.958326365" watchObservedRunningTime="2026-03-10 11:14:45.209663382 +0000 UTC m=+5433.965834190" Mar 10 11:14:46 crc kubenswrapper[4794]: I0310 11:14:46.198922 4794 generic.go:334] "Generic (PLEG): container finished" podID="6fb3e755-7a30-4965-b6a1-01e1a65e97f5" containerID="a00e706210e6e39ad85924a95fe1833658e45660d98e81d480c96460dd273372" exitCode=0 Mar 10 11:14:46 crc kubenswrapper[4794]: I0310 11:14:46.198967 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ptn9j" event={"ID":"6fb3e755-7a30-4965-b6a1-01e1a65e97f5","Type":"ContainerDied","Data":"a00e706210e6e39ad85924a95fe1833658e45660d98e81d480c96460dd273372"} Mar 10 11:14:47 crc kubenswrapper[4794]: I0310 11:14:47.589801 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ptn9j" Mar 10 11:14:47 crc kubenswrapper[4794]: I0310 11:14:47.743235 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shj5q\" (UniqueName: \"kubernetes.io/projected/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-kube-api-access-shj5q\") pod \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\" (UID: \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\") " Mar 10 11:14:47 crc kubenswrapper[4794]: I0310 11:14:47.743303 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-db-sync-config-data\") pod \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\" (UID: \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\") " Mar 10 11:14:47 crc kubenswrapper[4794]: I0310 11:14:47.743437 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-combined-ca-bundle\") pod \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\" (UID: \"6fb3e755-7a30-4965-b6a1-01e1a65e97f5\") " Mar 10 11:14:47 crc kubenswrapper[4794]: I0310 11:14:47.749106 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-kube-api-access-shj5q" (OuterVolumeSpecName: "kube-api-access-shj5q") pod "6fb3e755-7a30-4965-b6a1-01e1a65e97f5" (UID: "6fb3e755-7a30-4965-b6a1-01e1a65e97f5"). InnerVolumeSpecName "kube-api-access-shj5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:14:47 crc kubenswrapper[4794]: I0310 11:14:47.749411 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6fb3e755-7a30-4965-b6a1-01e1a65e97f5" (UID: "6fb3e755-7a30-4965-b6a1-01e1a65e97f5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:14:47 crc kubenswrapper[4794]: I0310 11:14:47.784935 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fb3e755-7a30-4965-b6a1-01e1a65e97f5" (UID: "6fb3e755-7a30-4965-b6a1-01e1a65e97f5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:14:47 crc kubenswrapper[4794]: I0310 11:14:47.845021 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:47 crc kubenswrapper[4794]: I0310 11:14:47.845054 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shj5q\" (UniqueName: \"kubernetes.io/projected/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-kube-api-access-shj5q\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:47 crc kubenswrapper[4794]: I0310 11:14:47.845065 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fb3e755-7a30-4965-b6a1-01e1a65e97f5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.217658 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ptn9j" event={"ID":"6fb3e755-7a30-4965-b6a1-01e1a65e97f5","Type":"ContainerDied","Data":"6581799f31bf7f7b302c6ba5ad848e73c06a3a922eb7db9054a62f232ef8b377"} Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.217699 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6581799f31bf7f7b302c6ba5ad848e73c06a3a922eb7db9054a62f232ef8b377" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.217751 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ptn9j" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.362503 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7dcbcf68fc-sr8rx"] Mar 10 11:14:48 crc kubenswrapper[4794]: E0310 11:14:48.362842 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb3e755-7a30-4965-b6a1-01e1a65e97f5" containerName="barbican-db-sync" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.362866 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb3e755-7a30-4965-b6a1-01e1a65e97f5" containerName="barbican-db-sync" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.363013 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb3e755-7a30-4965-b6a1-01e1a65e97f5" containerName="barbican-db-sync" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.363870 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: W0310 11:14:48.366370 4794 reflector.go:561] object-"openstack"/"barbican-worker-config-data": failed to list *v1.Secret: secrets "barbican-worker-config-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Mar 10 11:14:48 crc kubenswrapper[4794]: E0310 11:14:48.366423 4794 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"barbican-worker-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"barbican-worker-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.366537 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nmbwb" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.370743 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.415938 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-86667778fb-jhxgj"] Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.417710 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.421386 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7dcbcf68fc-sr8rx"] Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.426282 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.437419 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-86667778fb-jhxgj"] Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.448444 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-577f6d597f-sd6mk"] Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.449844 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.473755 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-577f6d597f-sd6mk"] Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.554688 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jsdc\" (UniqueName: \"kubernetes.io/projected/68c9337f-a87e-4734-9150-fb69adfa8e63-kube-api-access-4jsdc\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.554769 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c9337f-a87e-4734-9150-fb69adfa8e63-combined-ca-bundle\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.554821 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/657ac79d-5ffa-4505-8fad-1df78ff74c78-logs\") pod \"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.554909 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68c9337f-a87e-4734-9150-fb69adfa8e63-config-data-custom\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.554937 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-ovsdbserver-sb\") pod \"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.554982 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-dns-svc\") pod \"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.555033 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657ac79d-5ffa-4505-8fad-1df78ff74c78-combined-ca-bundle\") pod \"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.555091 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68c9337f-a87e-4734-9150-fb69adfa8e63-logs\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " 
pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.555113 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-ovsdbserver-nb\") pod \"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.555163 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/657ac79d-5ffa-4505-8fad-1df78ff74c78-config-data-custom\") pod \"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.555232 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mljg4\" (UniqueName: \"kubernetes.io/projected/657ac79d-5ffa-4505-8fad-1df78ff74c78-kube-api-access-mljg4\") pod \"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.555344 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-config\") pod \"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.555388 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68c9337f-a87e-4734-9150-fb69adfa8e63-config-data\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.555427 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9cbg\" (UniqueName: \"kubernetes.io/projected/e19bd983-6b82-4879-84d5-3e2362654043-kube-api-access-c9cbg\") pod \"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.555458 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657ac79d-5ffa-4505-8fad-1df78ff74c78-config-data\") pod \"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.588432 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55d6f8766b-cgldc"] Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.589673 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.591574 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.605286 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55d6f8766b-cgldc"] Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656409 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8dbf764-272f-4b4e-b7c7-f005c3219494-config-data-custom\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656463 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657ac79d-5ffa-4505-8fad-1df78ff74c78-combined-ca-bundle\") pod \"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656485 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68c9337f-a87e-4734-9150-fb69adfa8e63-logs\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656503 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-ovsdbserver-nb\") pod \"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656523 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8dbf764-272f-4b4e-b7c7-f005c3219494-config-data\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656542 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dbf764-272f-4b4e-b7c7-f005c3219494-combined-ca-bundle\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656560 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/657ac79d-5ffa-4505-8fad-1df78ff74c78-config-data-custom\") pod \"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656581 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mljg4\" (UniqueName: \"kubernetes.io/projected/657ac79d-5ffa-4505-8fad-1df78ff74c78-kube-api-access-mljg4\") pod 
\"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656613 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f79rv\" (UniqueName: \"kubernetes.io/projected/f8dbf764-272f-4b4e-b7c7-f005c3219494-kube-api-access-f79rv\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656630 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-config\") pod \"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656644 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68c9337f-a87e-4734-9150-fb69adfa8e63-config-data\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656663 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9cbg\" (UniqueName: \"kubernetes.io/projected/e19bd983-6b82-4879-84d5-3e2362654043-kube-api-access-c9cbg\") pod \"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656685 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657ac79d-5ffa-4505-8fad-1df78ff74c78-config-data\") pod \"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656721 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jsdc\" (UniqueName: \"kubernetes.io/projected/68c9337f-a87e-4734-9150-fb69adfa8e63-kube-api-access-4jsdc\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656737 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8dbf764-272f-4b4e-b7c7-f005c3219494-logs\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656756 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c9337f-a87e-4734-9150-fb69adfa8e63-combined-ca-bundle\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656776 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/657ac79d-5ffa-4505-8fad-1df78ff74c78-logs\") pod \"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656793 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68c9337f-a87e-4734-9150-fb69adfa8e63-config-data-custom\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656809 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-ovsdbserver-sb\") pod \"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.656824 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-dns-svc\") pod \"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.657228 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-ovsdbserver-nb\") pod \"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.657408 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-dns-svc\") pod \"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.658053 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/657ac79d-5ffa-4505-8fad-1df78ff74c78-logs\") pod \"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.658412 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-ovsdbserver-sb\") pod \"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.658499 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68c9337f-a87e-4734-9150-fb69adfa8e63-logs\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.658614 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-config\") pod 
\"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.661147 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657ac79d-5ffa-4505-8fad-1df78ff74c78-combined-ca-bundle\") pod \"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.662432 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68c9337f-a87e-4734-9150-fb69adfa8e63-config-data\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.671996 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/657ac79d-5ffa-4505-8fad-1df78ff74c78-config-data-custom\") pod \"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.672361 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657ac79d-5ffa-4505-8fad-1df78ff74c78-config-data\") pod \"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.672454 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mljg4\" (UniqueName: \"kubernetes.io/projected/657ac79d-5ffa-4505-8fad-1df78ff74c78-kube-api-access-mljg4\") pod \"barbican-keystone-listener-86667778fb-jhxgj\" (UID: \"657ac79d-5ffa-4505-8fad-1df78ff74c78\") " pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.673472 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jsdc\" (UniqueName: \"kubernetes.io/projected/68c9337f-a87e-4734-9150-fb69adfa8e63-kube-api-access-4jsdc\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.673899 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c9337f-a87e-4734-9150-fb69adfa8e63-combined-ca-bundle\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.674286 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9cbg\" (UniqueName: \"kubernetes.io/projected/e19bd983-6b82-4879-84d5-3e2362654043-kube-api-access-c9cbg\") pod \"dnsmasq-dns-577f6d597f-sd6mk\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.740977 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.758247 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f79rv\" (UniqueName: \"kubernetes.io/projected/f8dbf764-272f-4b4e-b7c7-f005c3219494-kube-api-access-f79rv\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.758348 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8dbf764-272f-4b4e-b7c7-f005c3219494-logs\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.758405 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8dbf764-272f-4b4e-b7c7-f005c3219494-config-data-custom\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.758440 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8dbf764-272f-4b4e-b7c7-f005c3219494-config-data\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.758461 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dbf764-272f-4b4e-b7c7-f005c3219494-combined-ca-bundle\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.758854 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8dbf764-272f-4b4e-b7c7-f005c3219494-logs\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.761911 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dbf764-272f-4b4e-b7c7-f005c3219494-combined-ca-bundle\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.763139 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8dbf764-272f-4b4e-b7c7-f005c3219494-config-data\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.763448 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8dbf764-272f-4b4e-b7c7-f005c3219494-config-data-custom\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 
11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.766886 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.783588 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f79rv\" (UniqueName: \"kubernetes.io/projected/f8dbf764-272f-4b4e-b7c7-f005c3219494-kube-api-access-f79rv\") pod \"barbican-api-55d6f8766b-cgldc\" (UID: \"f8dbf764-272f-4b4e-b7c7-f005c3219494\") " pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:48 crc kubenswrapper[4794]: I0310 11:14:48.904771 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:49 crc kubenswrapper[4794]: I0310 11:14:49.283362 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-86667778fb-jhxgj"] Mar 10 11:14:49 crc kubenswrapper[4794]: I0310 11:14:49.346148 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-577f6d597f-sd6mk"] Mar 10 11:14:49 crc kubenswrapper[4794]: I0310 11:14:49.402116 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55d6f8766b-cgldc"] Mar 10 11:14:49 crc kubenswrapper[4794]: W0310 11:14:49.405656 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8dbf764_272f_4b4e_b7c7_f005c3219494.slice/crio-c4216450b9e3c74290a6cf500f69fa7a706c0a952929a5dfc98db933b82b935a WatchSource:0}: Error finding container c4216450b9e3c74290a6cf500f69fa7a706c0a952929a5dfc98db933b82b935a: Status 404 returned error can't find the container with id c4216450b9e3c74290a6cf500f69fa7a706c0a952929a5dfc98db933b82b935a Mar 10 11:14:49 crc kubenswrapper[4794]: I0310 11:14:49.534830 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 10 11:14:49 crc kubenswrapper[4794]: I0310 11:14:49.542070 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68c9337f-a87e-4734-9150-fb69adfa8e63-config-data-custom\") pod \"barbican-worker-7dcbcf68fc-sr8rx\" (UID: \"68c9337f-a87e-4734-9150-fb69adfa8e63\") " pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:49 crc kubenswrapper[4794]: I0310 11:14:49.584983 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7dcbcf68fc-sr8rx"
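
The 'secrets "barbican-worker-config-data" is forbidden ... no relationship found between node 'crc' and this object' warning from 11:14:48 resolves itself in the entries above: once the node authorizer learns the pod-to-node relationship, the reflector cache populates (11:14:49.534830) and the pending config-data-custom mount for barbican-worker-7dcbcf68fc-sr8rx succeeds. A rough sketch for picking forbidden-then-recovered secrets out of a journal like this one (Python; the two regexes mirror the exact message shapes above and are the only assumption):

import re
import sys

# Matches: secrets "NAME" is forbidden (quotes appear both escaped and unescaped in the journal)
FORBIDDEN = re.compile(r'secrets \\?"(?P<name>[^"\\]+)\\?" is forbidden')
# Matches: Caches populated for *v1.Secret from object-"NS"/"NAME"
POPULATED = re.compile(r'Caches populated for \*v1\.Secret from object-"[^"]+"/"(?P<name>[^"]+)"')

denied, recovered = set(), set()
for line in sys.stdin:
    denied.update(m["name"] for m in FORBIDDEN.finditer(line))
    recovered.update(m["name"] for m in POPULATED.finditer(line) if m["name"] in denied)

print("denied then recovered:", sorted(recovered))
print("still denied:", sorted(denied - recovered))
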
Need to start a new one" pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.038459 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7dcbcf68fc-sr8rx"] Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.248370 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55d6f8766b-cgldc" event={"ID":"f8dbf764-272f-4b4e-b7c7-f005c3219494","Type":"ContainerStarted","Data":"f451eb3f8105ae53c1aff56f0a6f231a83f63700c93b92a86a566cbc1a5d0634"} Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.248688 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55d6f8766b-cgldc" event={"ID":"f8dbf764-272f-4b4e-b7c7-f005c3219494","Type":"ContainerStarted","Data":"10a6a20fad0d3e3226195c307b1c42f9d20209cc028cfa98c39c1a174c78e4ae"} Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.248709 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.248721 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.248730 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55d6f8766b-cgldc" event={"ID":"f8dbf764-272f-4b4e-b7c7-f005c3219494","Type":"ContainerStarted","Data":"c4216450b9e3c74290a6cf500f69fa7a706c0a952929a5dfc98db933b82b935a"} Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.253432 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" event={"ID":"657ac79d-5ffa-4505-8fad-1df78ff74c78","Type":"ContainerStarted","Data":"65e1aeac05d8c0f3b21fb7a29a06025ec138edd8009c990d2dec682d8d128ed1"} Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.253474 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" event={"ID":"657ac79d-5ffa-4505-8fad-1df78ff74c78","Type":"ContainerStarted","Data":"dea6f728418bf94c117bcea6ea3c7244acf8ef59cdba4403d0b9017e70d5ef8d"} Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.253489 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" event={"ID":"657ac79d-5ffa-4505-8fad-1df78ff74c78","Type":"ContainerStarted","Data":"cbc763052f056b476068e5e1721c998ce9b2ae9492b5a294a403147fe2fc4098"} Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.255975 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" event={"ID":"68c9337f-a87e-4734-9150-fb69adfa8e63","Type":"ContainerStarted","Data":"9e52552b9f5384e4cbb5086707b2818213a6ed5e16673c1fb69bb33073f6cdba"} Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.256003 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" event={"ID":"68c9337f-a87e-4734-9150-fb69adfa8e63","Type":"ContainerStarted","Data":"f10bbb71404264252a919703f9c881ad42e2427f56b396ef1dafdfda6d70bbce"} Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.257528 4794 generic.go:334] "Generic (PLEG): container finished" podID="e19bd983-6b82-4879-84d5-3e2362654043" containerID="fafd23b30900529ae35c00b34d2118eb31aa999de34597a0be3c90e018211f69" exitCode=0 Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.257562 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" event={"ID":"e19bd983-6b82-4879-84d5-3e2362654043","Type":"ContainerDied","Data":"fafd23b30900529ae35c00b34d2118eb31aa999de34597a0be3c90e018211f69"} Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.257581 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" event={"ID":"e19bd983-6b82-4879-84d5-3e2362654043","Type":"ContainerStarted","Data":"abbcea27794053843d1c5e9a5c2027b8114d3d66aad2f605b35f61acad0ef240"} Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.287502 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55d6f8766b-cgldc" podStartSLOduration=2.287471621 podStartE2EDuration="2.287471621s" podCreationTimestamp="2026-03-10 11:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:14:50.276393397 +0000 UTC m=+5439.032564235" watchObservedRunningTime="2026-03-10 11:14:50.287471621 +0000 UTC m=+5439.043642479" Mar 10 11:14:50 crc kubenswrapper[4794]: I0310 11:14:50.325135 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-86667778fb-jhxgj" podStartSLOduration=2.325115702 podStartE2EDuration="2.325115702s" podCreationTimestamp="2026-03-10 11:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:14:50.319950382 +0000 UTC m=+5439.076121220" watchObservedRunningTime="2026-03-10 11:14:50.325115702 +0000 UTC m=+5439.081286520" Mar 10 11:14:51 crc kubenswrapper[4794]: I0310 11:14:51.268232 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" event={"ID":"68c9337f-a87e-4734-9150-fb69adfa8e63","Type":"ContainerStarted","Data":"df1f0698a4709d3ed16b9e5896994bf4e4941597033071394792c6dd772b2896"} Mar 10 11:14:51 crc kubenswrapper[4794]: I0310 11:14:51.270454 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" event={"ID":"e19bd983-6b82-4879-84d5-3e2362654043","Type":"ContainerStarted","Data":"449a157ed020ee7fa7b9a0ca2b2cf80739ed77686010443ce9ef638877596952"} Mar 10 11:14:51 crc kubenswrapper[4794]: I0310 11:14:51.271395 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:51 crc kubenswrapper[4794]: I0310 11:14:51.296579 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7dcbcf68fc-sr8rx" podStartSLOduration=3.296550741 podStartE2EDuration="3.296550741s" podCreationTimestamp="2026-03-10 11:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:14:51.287688366 +0000 UTC m=+5440.043859194" watchObservedRunningTime="2026-03-10 11:14:51.296550741 +0000 UTC m=+5440.052721559" Mar 10 11:14:51 crc kubenswrapper[4794]: I0310 11:14:51.317215 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" podStartSLOduration=3.317194824 podStartE2EDuration="3.317194824s" podCreationTimestamp="2026-03-10 11:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:14:51.312782557 +0000 UTC m=+5440.068953395" 
watchObservedRunningTime="2026-03-10 11:14:51.317194824 +0000 UTC m=+5440.073365652" Mar 10 11:14:58 crc kubenswrapper[4794]: I0310 11:14:57.999540 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:14:58 crc kubenswrapper[4794]: E0310 11:14:58.000554 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:14:58 crc kubenswrapper[4794]: I0310 11:14:58.770090 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:14:58 crc kubenswrapper[4794]: I0310 11:14:58.824534 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6458df987c-wl7hm"] Mar 10 11:14:58 crc kubenswrapper[4794]: I0310 11:14:58.825112 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6458df987c-wl7hm" podUID="74d63a04-2b17-4bae-b9c9-61b0e0ed9712" containerName="dnsmasq-dns" containerID="cri-o://4600d8eb077528eddd3573099d0e77ce2d10b965ec996e12ebc9a0bc4c8d6842" gracePeriod=10 Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.316117 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.356628 4794 generic.go:334] "Generic (PLEG): container finished" podID="74d63a04-2b17-4bae-b9c9-61b0e0ed9712" containerID="4600d8eb077528eddd3573099d0e77ce2d10b965ec996e12ebc9a0bc4c8d6842" exitCode=0 Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.356664 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6458df987c-wl7hm" event={"ID":"74d63a04-2b17-4bae-b9c9-61b0e0ed9712","Type":"ContainerDied","Data":"4600d8eb077528eddd3573099d0e77ce2d10b965ec996e12ebc9a0bc4c8d6842"} Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.356690 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6458df987c-wl7hm" event={"ID":"74d63a04-2b17-4bae-b9c9-61b0e0ed9712","Type":"ContainerDied","Data":"7329585322b6c6b653f4f97f1e793082814ff1fc8c57e63f5945c596513c3100"} Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.356707 4794 scope.go:117] "RemoveContainer" containerID="4600d8eb077528eddd3573099d0e77ce2d10b965ec996e12ebc9a0bc4c8d6842" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.356821 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6458df987c-wl7hm"
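
The machine-config-daemon entries that keep recurring (11:14:44 earlier, 11:14:58 here, again at 11:15:08 further down) are the kubelet's crash-loop backoff at work: "back-off 5m0s restarting failed container" means the restart delay has reached its ceiling, and each sync until the timer expires is skipped with this error. As a sketch only, assuming the commonly documented kubelet defaults of a 10s initial delay doubling per crash up to a 5m cap (this log shows the cap, not the ladder):

# Assumed defaults: 10s initial backoff, doubling per crash, capped at 300s (5m).
delay_s, cap_s = 10, 300
ladder = []
while delay_s < cap_s:
    ladder.append(delay_s)
    delay_s *= 2
ladder.append(cap_s)
print(ladder)  # [10, 20, 40, 80, 160, 300]; "back-off 5m0s" is the capped state
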
Need to start a new one" pod="openstack/dnsmasq-dns-6458df987c-wl7hm" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.378323 4794 scope.go:117] "RemoveContainer" containerID="5c83452d0cd456dda1fc33fd7b6b8234e5ccd2b6ddb4fd680d0fb30f86242056" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.402123 4794 scope.go:117] "RemoveContainer" containerID="4600d8eb077528eddd3573099d0e77ce2d10b965ec996e12ebc9a0bc4c8d6842" Mar 10 11:14:59 crc kubenswrapper[4794]: E0310 11:14:59.402552 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4600d8eb077528eddd3573099d0e77ce2d10b965ec996e12ebc9a0bc4c8d6842\": container with ID starting with 4600d8eb077528eddd3573099d0e77ce2d10b965ec996e12ebc9a0bc4c8d6842 not found: ID does not exist" containerID="4600d8eb077528eddd3573099d0e77ce2d10b965ec996e12ebc9a0bc4c8d6842" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.402580 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4600d8eb077528eddd3573099d0e77ce2d10b965ec996e12ebc9a0bc4c8d6842"} err="failed to get container status \"4600d8eb077528eddd3573099d0e77ce2d10b965ec996e12ebc9a0bc4c8d6842\": rpc error: code = NotFound desc = could not find container \"4600d8eb077528eddd3573099d0e77ce2d10b965ec996e12ebc9a0bc4c8d6842\": container with ID starting with 4600d8eb077528eddd3573099d0e77ce2d10b965ec996e12ebc9a0bc4c8d6842 not found: ID does not exist" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.402599 4794 scope.go:117] "RemoveContainer" containerID="5c83452d0cd456dda1fc33fd7b6b8234e5ccd2b6ddb4fd680d0fb30f86242056" Mar 10 11:14:59 crc kubenswrapper[4794]: E0310 11:14:59.402888 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c83452d0cd456dda1fc33fd7b6b8234e5ccd2b6ddb4fd680d0fb30f86242056\": container with ID starting with 5c83452d0cd456dda1fc33fd7b6b8234e5ccd2b6ddb4fd680d0fb30f86242056 not found: ID does not exist" containerID="5c83452d0cd456dda1fc33fd7b6b8234e5ccd2b6ddb4fd680d0fb30f86242056" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.402906 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c83452d0cd456dda1fc33fd7b6b8234e5ccd2b6ddb4fd680d0fb30f86242056"} err="failed to get container status \"5c83452d0cd456dda1fc33fd7b6b8234e5ccd2b6ddb4fd680d0fb30f86242056\": rpc error: code = NotFound desc = could not find container \"5c83452d0cd456dda1fc33fd7b6b8234e5ccd2b6ddb4fd680d0fb30f86242056\": container with ID starting with 5c83452d0cd456dda1fc33fd7b6b8234e5ccd2b6ddb4fd680d0fb30f86242056 not found: ID does not exist" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.455500 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-config\") pod \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.455574 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-ovsdbserver-nb\") pod \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.455680 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-dns-svc\") pod \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.455707 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pplbz\" (UniqueName: \"kubernetes.io/projected/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-kube-api-access-pplbz\") pod \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.455741 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-ovsdbserver-sb\") pod \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\" (UID: \"74d63a04-2b17-4bae-b9c9-61b0e0ed9712\") " Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.466557 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-kube-api-access-pplbz" (OuterVolumeSpecName: "kube-api-access-pplbz") pod "74d63a04-2b17-4bae-b9c9-61b0e0ed9712" (UID: "74d63a04-2b17-4bae-b9c9-61b0e0ed9712"). InnerVolumeSpecName "kube-api-access-pplbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.507996 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74d63a04-2b17-4bae-b9c9-61b0e0ed9712" (UID: "74d63a04-2b17-4bae-b9c9-61b0e0ed9712"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.508049 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-config" (OuterVolumeSpecName: "config") pod "74d63a04-2b17-4bae-b9c9-61b0e0ed9712" (UID: "74d63a04-2b17-4bae-b9c9-61b0e0ed9712"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.508190 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74d63a04-2b17-4bae-b9c9-61b0e0ed9712" (UID: "74d63a04-2b17-4bae-b9c9-61b0e0ed9712"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.524748 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "74d63a04-2b17-4bae-b9c9-61b0e0ed9712" (UID: "74d63a04-2b17-4bae-b9c9-61b0e0ed9712"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.557613 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.557649 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.557662 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.557674 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.557686 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pplbz\" (UniqueName: \"kubernetes.io/projected/74d63a04-2b17-4bae-b9c9-61b0e0ed9712-kube-api-access-pplbz\") on node \"crc\" DevicePath \"\"" Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.706520 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6458df987c-wl7hm"] Mar 10 11:14:59 crc kubenswrapper[4794]: I0310 11:14:59.717608 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6458df987c-wl7hm"] Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.016429 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d63a04-2b17-4bae-b9c9-61b0e0ed9712" path="/var/lib/kubelet/pods/74d63a04-2b17-4bae-b9c9-61b0e0ed9712/volumes" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.147894 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds"] Mar 10 11:15:00 crc kubenswrapper[4794]: E0310 11:15:00.148466 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d63a04-2b17-4bae-b9c9-61b0e0ed9712" containerName="init" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.148551 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d63a04-2b17-4bae-b9c9-61b0e0ed9712" containerName="init" Mar 10 11:15:00 crc kubenswrapper[4794]: E0310 11:15:00.148642 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d63a04-2b17-4bae-b9c9-61b0e0ed9712" containerName="dnsmasq-dns" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.148724 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d63a04-2b17-4bae-b9c9-61b0e0ed9712" containerName="dnsmasq-dns" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.148989 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d63a04-2b17-4bae-b9c9-61b0e0ed9712" containerName="dnsmasq-dns" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.149688 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.153096 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.153154 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.161753 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds"] Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.168154 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dxn6\" (UniqueName: \"kubernetes.io/projected/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-kube-api-access-2dxn6\") pod \"collect-profiles-29552355-4jxds\" (UID: \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.168237 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-config-volume\") pod \"collect-profiles-29552355-4jxds\" (UID: \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.168308 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-secret-volume\") pod \"collect-profiles-29552355-4jxds\" (UID: \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.262703 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.269420 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-secret-volume\") pod \"collect-profiles-29552355-4jxds\" (UID: \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.269499 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dxn6\" (UniqueName: \"kubernetes.io/projected/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-kube-api-access-2dxn6\") pod \"collect-profiles-29552355-4jxds\" (UID: \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.269551 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-config-volume\") pod \"collect-profiles-29552355-4jxds\" (UID: \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.270817 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-config-volume\") pod \"collect-profiles-29552355-4jxds\" (UID: \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.273898 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-secret-volume\") pod \"collect-profiles-29552355-4jxds\" (UID: \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.287270 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dxn6\" (UniqueName: \"kubernetes.io/projected/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-kube-api-access-2dxn6\") pod \"collect-profiles-29552355-4jxds\" (UID: \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.348199 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55d6f8766b-cgldc" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.469868 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" Mar 10 11:15:00 crc kubenswrapper[4794]: I0310 11:15:00.949650 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds"] Mar 10 11:15:00 crc kubenswrapper[4794]: W0310 11:15:00.957832 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8fdc89e_e1d3_4819_ab36_a9a1ff3c2a52.slice/crio-73ba472c0c039fabd95e5732aba1b537b0af963c6e11fe09f0dca968ad97db52 WatchSource:0}: Error finding container 73ba472c0c039fabd95e5732aba1b537b0af963c6e11fe09f0dca968ad97db52: Status 404 returned error can't find the container with id 73ba472c0c039fabd95e5732aba1b537b0af963c6e11fe09f0dca968ad97db52 Mar 10 11:15:01 crc kubenswrapper[4794]: I0310 11:15:01.373828 4794 generic.go:334] "Generic (PLEG): container finished" podID="a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52" containerID="440ab1bba25534ba1b40e651a5c6c53cc936bc33df805415e947df3381bdfd0e" exitCode=0 Mar 10 11:15:01 crc kubenswrapper[4794]: I0310 11:15:01.373897 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" event={"ID":"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52","Type":"ContainerDied","Data":"440ab1bba25534ba1b40e651a5c6c53cc936bc33df805415e947df3381bdfd0e"} Mar 10 11:15:01 crc kubenswrapper[4794]: I0310 11:15:01.374081 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" event={"ID":"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52","Type":"ContainerStarted","Data":"73ba472c0c039fabd95e5732aba1b537b0af963c6e11fe09f0dca968ad97db52"} Mar 10 11:15:02 crc kubenswrapper[4794]: I0310 11:15:02.755013 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" Mar 10 11:15:02 crc kubenswrapper[4794]: I0310 11:15:02.918698 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dxn6\" (UniqueName: \"kubernetes.io/projected/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-kube-api-access-2dxn6\") pod \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\" (UID: \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\") " Mar 10 11:15:02 crc kubenswrapper[4794]: I0310 11:15:02.918884 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-config-volume\") pod \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\" (UID: \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\") " Mar 10 11:15:02 crc kubenswrapper[4794]: I0310 11:15:02.919080 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-secret-volume\") pod \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\" (UID: \"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52\") " Mar 10 11:15:02 crc kubenswrapper[4794]: I0310 11:15:02.919992 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-config-volume" (OuterVolumeSpecName: "config-volume") pod "a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52" (UID: "a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:15:02 crc kubenswrapper[4794]: I0310 11:15:02.928182 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-kube-api-access-2dxn6" (OuterVolumeSpecName: "kube-api-access-2dxn6") pod "a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52" (UID: "a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52"). InnerVolumeSpecName "kube-api-access-2dxn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:15:02 crc kubenswrapper[4794]: I0310 11:15:02.928587 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52" (UID: "a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:15:03 crc kubenswrapper[4794]: I0310 11:15:03.021485 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:03 crc kubenswrapper[4794]: I0310 11:15:03.021814 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dxn6\" (UniqueName: \"kubernetes.io/projected/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-kube-api-access-2dxn6\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:03 crc kubenswrapper[4794]: I0310 11:15:03.021827 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:03 crc kubenswrapper[4794]: I0310 11:15:03.392821 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" event={"ID":"a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52","Type":"ContainerDied","Data":"73ba472c0c039fabd95e5732aba1b537b0af963c6e11fe09f0dca968ad97db52"} Mar 10 11:15:03 crc kubenswrapper[4794]: I0310 11:15:03.392857 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73ba472c0c039fabd95e5732aba1b537b0af963c6e11fe09f0dca968ad97db52" Mar 10 11:15:03 crc kubenswrapper[4794]: I0310 11:15:03.392897 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds" Mar 10 11:15:03 crc kubenswrapper[4794]: I0310 11:15:03.862468 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5"] Mar 10 11:15:03 crc kubenswrapper[4794]: I0310 11:15:03.868561 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552310-7jcl5"] Mar 10 11:15:04 crc kubenswrapper[4794]: I0310 11:15:04.018201 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="decc335d-1858-4f2c-b1d4-880766c7b327" path="/var/lib/kubelet/pods/decc335d-1858-4f2c-b1d4-880766c7b327/volumes" Mar 10 11:15:08 crc kubenswrapper[4794]: I0310 11:15:08.999449 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:15:09 crc kubenswrapper[4794]: E0310 11:15:09.000261 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.614144 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-gdldc"] Mar 10 11:15:12 crc kubenswrapper[4794]: E0310 11:15:12.615123 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52" containerName="collect-profiles" Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.615139 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52" containerName="collect-profiles" Mar 10 11:15:12 crc kubenswrapper[4794]: 
I0310 11:15:12.615372 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52" containerName="collect-profiles" Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.616085 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gdldc" Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.622651 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gdldc"] Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.709511 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0-operator-scripts\") pod \"neutron-db-create-gdldc\" (UID: \"88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0\") " pod="openstack/neutron-db-create-gdldc" Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.709593 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxcvq\" (UniqueName: \"kubernetes.io/projected/88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0-kube-api-access-kxcvq\") pod \"neutron-db-create-gdldc\" (UID: \"88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0\") " pod="openstack/neutron-db-create-gdldc" Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.811115 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0-operator-scripts\") pod \"neutron-db-create-gdldc\" (UID: \"88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0\") " pod="openstack/neutron-db-create-gdldc" Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.811229 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxcvq\" (UniqueName: \"kubernetes.io/projected/88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0-kube-api-access-kxcvq\") pod \"neutron-db-create-gdldc\" (UID: \"88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0\") " pod="openstack/neutron-db-create-gdldc" Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.812730 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0-operator-scripts\") pod \"neutron-db-create-gdldc\" (UID: \"88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0\") " pod="openstack/neutron-db-create-gdldc" Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.823320 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-a98d-account-create-update-8vn77"] Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.824587 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a98d-account-create-update-8vn77" Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.829359 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.841138 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a98d-account-create-update-8vn77"] Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.864183 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxcvq\" (UniqueName: \"kubernetes.io/projected/88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0-kube-api-access-kxcvq\") pod \"neutron-db-create-gdldc\" (UID: \"88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0\") " pod="openstack/neutron-db-create-gdldc" Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.912970 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/747409d2-0f42-4b0e-9daa-3a2aae24a583-operator-scripts\") pod \"neutron-a98d-account-create-update-8vn77\" (UID: \"747409d2-0f42-4b0e-9daa-3a2aae24a583\") " pod="openstack/neutron-a98d-account-create-update-8vn77" Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.913018 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hjgs\" (UniqueName: \"kubernetes.io/projected/747409d2-0f42-4b0e-9daa-3a2aae24a583-kube-api-access-4hjgs\") pod \"neutron-a98d-account-create-update-8vn77\" (UID: \"747409d2-0f42-4b0e-9daa-3a2aae24a583\") " pod="openstack/neutron-a98d-account-create-update-8vn77" Mar 10 11:15:12 crc kubenswrapper[4794]: I0310 11:15:12.942286 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-gdldc" Mar 10 11:15:13 crc kubenswrapper[4794]: I0310 11:15:13.014816 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/747409d2-0f42-4b0e-9daa-3a2aae24a583-operator-scripts\") pod \"neutron-a98d-account-create-update-8vn77\" (UID: \"747409d2-0f42-4b0e-9daa-3a2aae24a583\") " pod="openstack/neutron-a98d-account-create-update-8vn77" Mar 10 11:15:13 crc kubenswrapper[4794]: I0310 11:15:13.014861 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hjgs\" (UniqueName: \"kubernetes.io/projected/747409d2-0f42-4b0e-9daa-3a2aae24a583-kube-api-access-4hjgs\") pod \"neutron-a98d-account-create-update-8vn77\" (UID: \"747409d2-0f42-4b0e-9daa-3a2aae24a583\") " pod="openstack/neutron-a98d-account-create-update-8vn77" Mar 10 11:15:13 crc kubenswrapper[4794]: I0310 11:15:13.015703 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/747409d2-0f42-4b0e-9daa-3a2aae24a583-operator-scripts\") pod \"neutron-a98d-account-create-update-8vn77\" (UID: \"747409d2-0f42-4b0e-9daa-3a2aae24a583\") " pod="openstack/neutron-a98d-account-create-update-8vn77" Mar 10 11:15:13 crc kubenswrapper[4794]: I0310 11:15:13.029758 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hjgs\" (UniqueName: \"kubernetes.io/projected/747409d2-0f42-4b0e-9daa-3a2aae24a583-kube-api-access-4hjgs\") pod \"neutron-a98d-account-create-update-8vn77\" (UID: \"747409d2-0f42-4b0e-9daa-3a2aae24a583\") " pod="openstack/neutron-a98d-account-create-update-8vn77" Mar 10 11:15:13 crc kubenswrapper[4794]: I0310 11:15:13.149937 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a98d-account-create-update-8vn77" Mar 10 11:15:13 crc kubenswrapper[4794]: I0310 11:15:13.408931 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gdldc"] Mar 10 11:15:13 crc kubenswrapper[4794]: W0310 11:15:13.416685 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88f6ae7d_9e45_43bf_a76f_e3f9b2d13be0.slice/crio-9cbc30de21047b6844b508871778842e64e3040151a4b4f347d12c5f3bfed527 WatchSource:0}: Error finding container 9cbc30de21047b6844b508871778842e64e3040151a4b4f347d12c5f3bfed527: Status 404 returned error can't find the container with id 9cbc30de21047b6844b508871778842e64e3040151a4b4f347d12c5f3bfed527 Mar 10 11:15:13 crc kubenswrapper[4794]: I0310 11:15:13.492851 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gdldc" event={"ID":"88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0","Type":"ContainerStarted","Data":"9cbc30de21047b6844b508871778842e64e3040151a4b4f347d12c5f3bfed527"} Mar 10 11:15:13 crc kubenswrapper[4794]: W0310 11:15:13.582900 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod747409d2_0f42_4b0e_9daa_3a2aae24a583.slice/crio-6df0f63fb93f67c1a0f50856668ac0a4c3840f257237a58d14872b5ccfd07729 WatchSource:0}: Error finding container 6df0f63fb93f67c1a0f50856668ac0a4c3840f257237a58d14872b5ccfd07729: Status 404 returned error can't find the container with id 6df0f63fb93f67c1a0f50856668ac0a4c3840f257237a58d14872b5ccfd07729 Mar 10 11:15:13 crc kubenswrapper[4794]: I0310 11:15:13.584083 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a98d-account-create-update-8vn77"] Mar 10 11:15:14 crc kubenswrapper[4794]: I0310 11:15:14.511614 4794 generic.go:334] "Generic (PLEG): container finished" podID="747409d2-0f42-4b0e-9daa-3a2aae24a583" containerID="b37a53f395fddadf0dac7239afbff2042aa89978fd87bcd7c0081f6643ae744a" exitCode=0 Mar 10 11:15:14 crc kubenswrapper[4794]: I0310 11:15:14.511744 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a98d-account-create-update-8vn77" event={"ID":"747409d2-0f42-4b0e-9daa-3a2aae24a583","Type":"ContainerDied","Data":"b37a53f395fddadf0dac7239afbff2042aa89978fd87bcd7c0081f6643ae744a"} Mar 10 11:15:14 crc kubenswrapper[4794]: I0310 11:15:14.511789 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a98d-account-create-update-8vn77" event={"ID":"747409d2-0f42-4b0e-9daa-3a2aae24a583","Type":"ContainerStarted","Data":"6df0f63fb93f67c1a0f50856668ac0a4c3840f257237a58d14872b5ccfd07729"} Mar 10 11:15:14 crc kubenswrapper[4794]: I0310 11:15:14.518142 4794 generic.go:334] "Generic (PLEG): container finished" podID="88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0" containerID="540523ba9349540477c0ccb3362badc240145da4c88f1e70720b28dd75d7c9d6" exitCode=0 Mar 10 11:15:14 crc kubenswrapper[4794]: I0310 11:15:14.518221 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gdldc" event={"ID":"88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0","Type":"ContainerDied","Data":"540523ba9349540477c0ccb3362badc240145da4c88f1e70720b28dd75d7c9d6"} Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.029762 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-gdldc" Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.039275 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a98d-account-create-update-8vn77" Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.175814 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxcvq\" (UniqueName: \"kubernetes.io/projected/88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0-kube-api-access-kxcvq\") pod \"88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0\" (UID: \"88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0\") " Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.176194 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0-operator-scripts\") pod \"88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0\" (UID: \"88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0\") " Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.176319 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/747409d2-0f42-4b0e-9daa-3a2aae24a583-operator-scripts\") pod \"747409d2-0f42-4b0e-9daa-3a2aae24a583\" (UID: \"747409d2-0f42-4b0e-9daa-3a2aae24a583\") " Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.176458 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hjgs\" (UniqueName: \"kubernetes.io/projected/747409d2-0f42-4b0e-9daa-3a2aae24a583-kube-api-access-4hjgs\") pod \"747409d2-0f42-4b0e-9daa-3a2aae24a583\" (UID: \"747409d2-0f42-4b0e-9daa-3a2aae24a583\") " Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.177119 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0" (UID: "88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.177247 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/747409d2-0f42-4b0e-9daa-3a2aae24a583-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "747409d2-0f42-4b0e-9daa-3a2aae24a583" (UID: "747409d2-0f42-4b0e-9daa-3a2aae24a583"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.178842 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.178889 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/747409d2-0f42-4b0e-9daa-3a2aae24a583-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.183222 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0-kube-api-access-kxcvq" (OuterVolumeSpecName: "kube-api-access-kxcvq") pod "88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0" (UID: "88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0"). 
InnerVolumeSpecName "kube-api-access-kxcvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.183606 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/747409d2-0f42-4b0e-9daa-3a2aae24a583-kube-api-access-4hjgs" (OuterVolumeSpecName: "kube-api-access-4hjgs") pod "747409d2-0f42-4b0e-9daa-3a2aae24a583" (UID: "747409d2-0f42-4b0e-9daa-3a2aae24a583"). InnerVolumeSpecName "kube-api-access-4hjgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.280611 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxcvq\" (UniqueName: \"kubernetes.io/projected/88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0-kube-api-access-kxcvq\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.280642 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hjgs\" (UniqueName: \"kubernetes.io/projected/747409d2-0f42-4b0e-9daa-3a2aae24a583-kube-api-access-4hjgs\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.542941 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gdldc" event={"ID":"88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0","Type":"ContainerDied","Data":"9cbc30de21047b6844b508871778842e64e3040151a4b4f347d12c5f3bfed527"} Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.542986 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gdldc" Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.543008 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cbc30de21047b6844b508871778842e64e3040151a4b4f347d12c5f3bfed527" Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.545875 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a98d-account-create-update-8vn77" event={"ID":"747409d2-0f42-4b0e-9daa-3a2aae24a583","Type":"ContainerDied","Data":"6df0f63fb93f67c1a0f50856668ac0a4c3840f257237a58d14872b5ccfd07729"} Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.545942 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6df0f63fb93f67c1a0f50856668ac0a4c3840f257237a58d14872b5ccfd07729" Mar 10 11:15:16 crc kubenswrapper[4794]: I0310 11:15:16.545962 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a98d-account-create-update-8vn77" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.097780 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vlmhb"] Mar 10 11:15:18 crc kubenswrapper[4794]: E0310 11:15:18.098719 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="747409d2-0f42-4b0e-9daa-3a2aae24a583" containerName="mariadb-account-create-update" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.098742 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="747409d2-0f42-4b0e-9daa-3a2aae24a583" containerName="mariadb-account-create-update" Mar 10 11:15:18 crc kubenswrapper[4794]: E0310 11:15:18.098777 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0" containerName="mariadb-database-create" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.098790 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0" containerName="mariadb-database-create" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.099086 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="747409d2-0f42-4b0e-9daa-3a2aae24a583" containerName="mariadb-account-create-update" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.099129 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0" containerName="mariadb-database-create" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.100055 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vlmhb" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.104848 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.104907 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.111622 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-d8f55" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.111799 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vlmhb"] Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.223730 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7ac128-d40e-4917-b3f9-4d87040dddbe-combined-ca-bundle\") pod \"neutron-db-sync-vlmhb\" (UID: \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\") " pod="openstack/neutron-db-sync-vlmhb" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.223863 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a7ac128-d40e-4917-b3f9-4d87040dddbe-config\") pod \"neutron-db-sync-vlmhb\" (UID: \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\") " pod="openstack/neutron-db-sync-vlmhb" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.223914 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqbqf\" (UniqueName: \"kubernetes.io/projected/5a7ac128-d40e-4917-b3f9-4d87040dddbe-kube-api-access-pqbqf\") pod \"neutron-db-sync-vlmhb\" (UID: \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\") " pod="openstack/neutron-db-sync-vlmhb" Mar 10 11:15:18 crc 
kubenswrapper[4794]: I0310 11:15:18.325304 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7ac128-d40e-4917-b3f9-4d87040dddbe-combined-ca-bundle\") pod \"neutron-db-sync-vlmhb\" (UID: \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\") " pod="openstack/neutron-db-sync-vlmhb" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.325441 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a7ac128-d40e-4917-b3f9-4d87040dddbe-config\") pod \"neutron-db-sync-vlmhb\" (UID: \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\") " pod="openstack/neutron-db-sync-vlmhb" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.325491 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqbqf\" (UniqueName: \"kubernetes.io/projected/5a7ac128-d40e-4917-b3f9-4d87040dddbe-kube-api-access-pqbqf\") pod \"neutron-db-sync-vlmhb\" (UID: \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\") " pod="openstack/neutron-db-sync-vlmhb" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.331047 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7ac128-d40e-4917-b3f9-4d87040dddbe-combined-ca-bundle\") pod \"neutron-db-sync-vlmhb\" (UID: \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\") " pod="openstack/neutron-db-sync-vlmhb" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.333315 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a7ac128-d40e-4917-b3f9-4d87040dddbe-config\") pod \"neutron-db-sync-vlmhb\" (UID: \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\") " pod="openstack/neutron-db-sync-vlmhb" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.344971 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqbqf\" (UniqueName: \"kubernetes.io/projected/5a7ac128-d40e-4917-b3f9-4d87040dddbe-kube-api-access-pqbqf\") pod \"neutron-db-sync-vlmhb\" (UID: \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\") " pod="openstack/neutron-db-sync-vlmhb" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.438706 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vlmhb" Mar 10 11:15:18 crc kubenswrapper[4794]: I0310 11:15:18.931174 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vlmhb"] Mar 10 11:15:19 crc kubenswrapper[4794]: I0310 11:15:19.588284 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vlmhb" event={"ID":"5a7ac128-d40e-4917-b3f9-4d87040dddbe","Type":"ContainerStarted","Data":"65123d0ed48834e813578fec9bb56c6a2df752c50179e00ad5ca7fe24b94d03b"} Mar 10 11:15:19 crc kubenswrapper[4794]: I0310 11:15:19.588808 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vlmhb" event={"ID":"5a7ac128-d40e-4917-b3f9-4d87040dddbe","Type":"ContainerStarted","Data":"49506bdd25b3969a3c3839adf4af216f8788c69dcee001b8d5bb2988c23a1dc7"} Mar 10 11:15:19 crc kubenswrapper[4794]: I0310 11:15:19.624486 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vlmhb" podStartSLOduration=1.6244579030000001 podStartE2EDuration="1.624457903s" podCreationTimestamp="2026-03-10 11:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:15:19.614258405 +0000 UTC m=+5468.370429253" watchObservedRunningTime="2026-03-10 11:15:19.624457903 +0000 UTC m=+5468.380628751" Mar 10 11:15:22 crc kubenswrapper[4794]: I0310 11:15:22.014637 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:15:22 crc kubenswrapper[4794]: E0310 11:15:22.015452 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:15:23 crc kubenswrapper[4794]: I0310 11:15:23.629876 4794 generic.go:334] "Generic (PLEG): container finished" podID="5a7ac128-d40e-4917-b3f9-4d87040dddbe" containerID="65123d0ed48834e813578fec9bb56c6a2df752c50179e00ad5ca7fe24b94d03b" exitCode=0 Mar 10 11:15:23 crc kubenswrapper[4794]: I0310 11:15:23.630416 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vlmhb" event={"ID":"5a7ac128-d40e-4917-b3f9-4d87040dddbe","Type":"ContainerDied","Data":"65123d0ed48834e813578fec9bb56c6a2df752c50179e00ad5ca7fe24b94d03b"} Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.068124 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vlmhb" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.158633 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqbqf\" (UniqueName: \"kubernetes.io/projected/5a7ac128-d40e-4917-b3f9-4d87040dddbe-kube-api-access-pqbqf\") pod \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\" (UID: \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\") " Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.158935 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7ac128-d40e-4917-b3f9-4d87040dddbe-combined-ca-bundle\") pod \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\" (UID: \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\") " Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.158983 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a7ac128-d40e-4917-b3f9-4d87040dddbe-config\") pod \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\" (UID: \"5a7ac128-d40e-4917-b3f9-4d87040dddbe\") " Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.178886 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a7ac128-d40e-4917-b3f9-4d87040dddbe-kube-api-access-pqbqf" (OuterVolumeSpecName: "kube-api-access-pqbqf") pod "5a7ac128-d40e-4917-b3f9-4d87040dddbe" (UID: "5a7ac128-d40e-4917-b3f9-4d87040dddbe"). InnerVolumeSpecName "kube-api-access-pqbqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.199516 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7ac128-d40e-4917-b3f9-4d87040dddbe-config" (OuterVolumeSpecName: "config") pod "5a7ac128-d40e-4917-b3f9-4d87040dddbe" (UID: "5a7ac128-d40e-4917-b3f9-4d87040dddbe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.203583 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7ac128-d40e-4917-b3f9-4d87040dddbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a7ac128-d40e-4917-b3f9-4d87040dddbe" (UID: "5a7ac128-d40e-4917-b3f9-4d87040dddbe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.260498 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7ac128-d40e-4917-b3f9-4d87040dddbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.260533 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a7ac128-d40e-4917-b3f9-4d87040dddbe-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.260545 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqbqf\" (UniqueName: \"kubernetes.io/projected/5a7ac128-d40e-4917-b3f9-4d87040dddbe-kube-api-access-pqbqf\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.656127 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vlmhb" event={"ID":"5a7ac128-d40e-4917-b3f9-4d87040dddbe","Type":"ContainerDied","Data":"49506bdd25b3969a3c3839adf4af216f8788c69dcee001b8d5bb2988c23a1dc7"} Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.656198 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49506bdd25b3969a3c3839adf4af216f8788c69dcee001b8d5bb2988c23a1dc7" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.656358 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vlmhb" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.834007 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bff8cd95-2t5br"] Mar 10 11:15:25 crc kubenswrapper[4794]: E0310 11:15:25.835263 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7ac128-d40e-4917-b3f9-4d87040dddbe" containerName="neutron-db-sync" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.835316 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7ac128-d40e-4917-b3f9-4d87040dddbe" containerName="neutron-db-sync" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.835730 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a7ac128-d40e-4917-b3f9-4d87040dddbe" containerName="neutron-db-sync" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.837567 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.842893 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bff8cd95-2t5br"] Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.952320 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78455f4569-hjwg5"] Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.956653 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.960696 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-d8f55" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.961008 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.961083 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.964379 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78455f4569-hjwg5"] Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.977247 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-ovsdbserver-nb\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.977324 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-dns-svc\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.977372 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdbgd\" (UniqueName: \"kubernetes.io/projected/ee972ab7-a06d-4966-b74f-b7ab83336c5c-kube-api-access-pdbgd\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.977399 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-ovsdbserver-sb\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:25 crc kubenswrapper[4794]: I0310 11:15:25.977462 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-config\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.079164 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-ovsdbserver-nb\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.079245 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d290cfd0-f0e3-4319-bb64-48ec66b84613-httpd-config\") pod \"neutron-78455f4569-hjwg5\" (UID: \"d290cfd0-f0e3-4319-bb64-48ec66b84613\") " pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 
11:15:26.079289 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-dns-svc\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.079308 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdbgd\" (UniqueName: \"kubernetes.io/projected/ee972ab7-a06d-4966-b74f-b7ab83336c5c-kube-api-access-pdbgd\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.079324 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-ovsdbserver-sb\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.079419 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56kxd\" (UniqueName: \"kubernetes.io/projected/d290cfd0-f0e3-4319-bb64-48ec66b84613-kube-api-access-56kxd\") pod \"neutron-78455f4569-hjwg5\" (UID: \"d290cfd0-f0e3-4319-bb64-48ec66b84613\") " pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.079441 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d290cfd0-f0e3-4319-bb64-48ec66b84613-combined-ca-bundle\") pod \"neutron-78455f4569-hjwg5\" (UID: \"d290cfd0-f0e3-4319-bb64-48ec66b84613\") " pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.079463 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-config\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.079485 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d290cfd0-f0e3-4319-bb64-48ec66b84613-config\") pod \"neutron-78455f4569-hjwg5\" (UID: \"d290cfd0-f0e3-4319-bb64-48ec66b84613\") " pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.080699 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-ovsdbserver-sb\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.080738 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-ovsdbserver-nb\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.080763 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-dns-svc\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.080995 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-config\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.097877 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdbgd\" (UniqueName: \"kubernetes.io/projected/ee972ab7-a06d-4966-b74f-b7ab83336c5c-kube-api-access-pdbgd\") pod \"dnsmasq-dns-bff8cd95-2t5br\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.165138 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.180474 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d290cfd0-f0e3-4319-bb64-48ec66b84613-httpd-config\") pod \"neutron-78455f4569-hjwg5\" (UID: \"d290cfd0-f0e3-4319-bb64-48ec66b84613\") " pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.180561 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56kxd\" (UniqueName: \"kubernetes.io/projected/d290cfd0-f0e3-4319-bb64-48ec66b84613-kube-api-access-56kxd\") pod \"neutron-78455f4569-hjwg5\" (UID: \"d290cfd0-f0e3-4319-bb64-48ec66b84613\") " pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.180586 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d290cfd0-f0e3-4319-bb64-48ec66b84613-combined-ca-bundle\") pod \"neutron-78455f4569-hjwg5\" (UID: \"d290cfd0-f0e3-4319-bb64-48ec66b84613\") " pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.180612 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d290cfd0-f0e3-4319-bb64-48ec66b84613-config\") pod \"neutron-78455f4569-hjwg5\" (UID: \"d290cfd0-f0e3-4319-bb64-48ec66b84613\") " pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.185392 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d290cfd0-f0e3-4319-bb64-48ec66b84613-config\") pod \"neutron-78455f4569-hjwg5\" (UID: \"d290cfd0-f0e3-4319-bb64-48ec66b84613\") " pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.185562 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d290cfd0-f0e3-4319-bb64-48ec66b84613-combined-ca-bundle\") pod \"neutron-78455f4569-hjwg5\" (UID: \"d290cfd0-f0e3-4319-bb64-48ec66b84613\") " pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.186298 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d290cfd0-f0e3-4319-bb64-48ec66b84613-httpd-config\") pod \"neutron-78455f4569-hjwg5\" (UID: \"d290cfd0-f0e3-4319-bb64-48ec66b84613\") " pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.198178 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56kxd\" (UniqueName: \"kubernetes.io/projected/d290cfd0-f0e3-4319-bb64-48ec66b84613-kube-api-access-56kxd\") pod \"neutron-78455f4569-hjwg5\" (UID: \"d290cfd0-f0e3-4319-bb64-48ec66b84613\") " pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.285249 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.590843 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bff8cd95-2t5br"] Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.677556 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bff8cd95-2t5br" event={"ID":"ee972ab7-a06d-4966-b74f-b7ab83336c5c","Type":"ContainerStarted","Data":"4130f25a88c9631f93363534700d2f05010b5c1da5e173ae6ece98141d18d10a"} Mar 10 11:15:26 crc kubenswrapper[4794]: I0310 11:15:26.801598 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78455f4569-hjwg5"] Mar 10 11:15:26 crc kubenswrapper[4794]: W0310 11:15:26.804982 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd290cfd0_f0e3_4319_bb64_48ec66b84613.slice/crio-90593aeda220de0e34a6f5bd6884ed35d67a13f85c9aa14a057b1d1626bb97df WatchSource:0}: Error finding container 90593aeda220de0e34a6f5bd6884ed35d67a13f85c9aa14a057b1d1626bb97df: Status 404 returned error can't find the container with id 90593aeda220de0e34a6f5bd6884ed35d67a13f85c9aa14a057b1d1626bb97df Mar 10 11:15:27 crc kubenswrapper[4794]: I0310 11:15:27.687791 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78455f4569-hjwg5" event={"ID":"d290cfd0-f0e3-4319-bb64-48ec66b84613","Type":"ContainerStarted","Data":"0d9283f71d202426c07caa5dd9248cbe9815c4d2c80cb8c8ff192a06e62aa784"} Mar 10 11:15:27 crc kubenswrapper[4794]: I0310 11:15:27.688135 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:15:27 crc kubenswrapper[4794]: I0310 11:15:27.688158 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78455f4569-hjwg5" event={"ID":"d290cfd0-f0e3-4319-bb64-48ec66b84613","Type":"ContainerStarted","Data":"aebb09bd2e008ede0a1c24626ea82161dbdb4454e9a3a3946b21c1fd56961c49"} Mar 10 11:15:27 crc kubenswrapper[4794]: I0310 11:15:27.688178 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78455f4569-hjwg5" event={"ID":"d290cfd0-f0e3-4319-bb64-48ec66b84613","Type":"ContainerStarted","Data":"90593aeda220de0e34a6f5bd6884ed35d67a13f85c9aa14a057b1d1626bb97df"} Mar 10 11:15:27 crc kubenswrapper[4794]: I0310 11:15:27.690595 4794 generic.go:334] "Generic (PLEG): container finished" podID="ee972ab7-a06d-4966-b74f-b7ab83336c5c" containerID="1cec9374ae96c49ac1f4862ac7c3f27ee54028d1c65452f1d88fc5333b0fee73" exitCode=0 Mar 10 11:15:27 crc kubenswrapper[4794]: I0310 11:15:27.690639 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bff8cd95-2t5br" 
event={"ID":"ee972ab7-a06d-4966-b74f-b7ab83336c5c","Type":"ContainerDied","Data":"1cec9374ae96c49ac1f4862ac7c3f27ee54028d1c65452f1d88fc5333b0fee73"} Mar 10 11:15:27 crc kubenswrapper[4794]: I0310 11:15:27.717479 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-78455f4569-hjwg5" podStartSLOduration=2.717454359 podStartE2EDuration="2.717454359s" podCreationTimestamp="2026-03-10 11:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:15:27.712821675 +0000 UTC m=+5476.468992523" watchObservedRunningTime="2026-03-10 11:15:27.717454359 +0000 UTC m=+5476.473625177" Mar 10 11:15:28 crc kubenswrapper[4794]: I0310 11:15:28.705019 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bff8cd95-2t5br" event={"ID":"ee972ab7-a06d-4966-b74f-b7ab83336c5c","Type":"ContainerStarted","Data":"cbd98f6a40d8e166d172365aad5e50af304540f1cec541537547518b4bc1da6d"} Mar 10 11:15:28 crc kubenswrapper[4794]: I0310 11:15:28.735965 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bff8cd95-2t5br" podStartSLOduration=3.7359452920000003 podStartE2EDuration="3.735945292s" podCreationTimestamp="2026-03-10 11:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:15:28.727404656 +0000 UTC m=+5477.483575534" watchObservedRunningTime="2026-03-10 11:15:28.735945292 +0000 UTC m=+5477.492116120" Mar 10 11:15:29 crc kubenswrapper[4794]: I0310 11:15:29.715790 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.000612 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:15:36 crc kubenswrapper[4794]: E0310 11:15:36.002017 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.166660 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.256632 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-577f6d597f-sd6mk"] Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.256960 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" podUID="e19bd983-6b82-4879-84d5-3e2362654043" containerName="dnsmasq-dns" containerID="cri-o://449a157ed020ee7fa7b9a0ca2b2cf80739ed77686010443ce9ef638877596952" gracePeriod=10 Mar 10 11:15:36 crc kubenswrapper[4794]: E0310 11:15:36.311323 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19bd983_6b82_4879_84d5_3e2362654043.slice/crio-449a157ed020ee7fa7b9a0ca2b2cf80739ed77686010443ce9ef638877596952.scope\": RecentStats: unable to find data in 
memory cache]" Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.792076 4794 generic.go:334] "Generic (PLEG): container finished" podID="e19bd983-6b82-4879-84d5-3e2362654043" containerID="449a157ed020ee7fa7b9a0ca2b2cf80739ed77686010443ce9ef638877596952" exitCode=0 Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.792167 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" event={"ID":"e19bd983-6b82-4879-84d5-3e2362654043","Type":"ContainerDied","Data":"449a157ed020ee7fa7b9a0ca2b2cf80739ed77686010443ce9ef638877596952"} Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.792588 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" event={"ID":"e19bd983-6b82-4879-84d5-3e2362654043","Type":"ContainerDied","Data":"abbcea27794053843d1c5e9a5c2027b8114d3d66aad2f605b35f61acad0ef240"} Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.792603 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abbcea27794053843d1c5e9a5c2027b8114d3d66aad2f605b35f61acad0ef240" Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.809791 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.897131 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-ovsdbserver-nb\") pod \"e19bd983-6b82-4879-84d5-3e2362654043\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.897253 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-dns-svc\") pod \"e19bd983-6b82-4879-84d5-3e2362654043\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.897373 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-ovsdbserver-sb\") pod \"e19bd983-6b82-4879-84d5-3e2362654043\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.897405 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-config\") pod \"e19bd983-6b82-4879-84d5-3e2362654043\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.897452 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9cbg\" (UniqueName: \"kubernetes.io/projected/e19bd983-6b82-4879-84d5-3e2362654043-kube-api-access-c9cbg\") pod \"e19bd983-6b82-4879-84d5-3e2362654043\" (UID: \"e19bd983-6b82-4879-84d5-3e2362654043\") " Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.917489 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e19bd983-6b82-4879-84d5-3e2362654043-kube-api-access-c9cbg" (OuterVolumeSpecName: "kube-api-access-c9cbg") pod "e19bd983-6b82-4879-84d5-3e2362654043" (UID: "e19bd983-6b82-4879-84d5-3e2362654043"). InnerVolumeSpecName "kube-api-access-c9cbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.941045 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e19bd983-6b82-4879-84d5-3e2362654043" (UID: "e19bd983-6b82-4879-84d5-3e2362654043"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.943685 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-config" (OuterVolumeSpecName: "config") pod "e19bd983-6b82-4879-84d5-3e2362654043" (UID: "e19bd983-6b82-4879-84d5-3e2362654043"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.944196 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e19bd983-6b82-4879-84d5-3e2362654043" (UID: "e19bd983-6b82-4879-84d5-3e2362654043"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:15:36 crc kubenswrapper[4794]: I0310 11:15:36.956983 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e19bd983-6b82-4879-84d5-3e2362654043" (UID: "e19bd983-6b82-4879-84d5-3e2362654043"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:15:37 crc kubenswrapper[4794]: I0310 11:15:37.000449 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:37 crc kubenswrapper[4794]: I0310 11:15:37.000502 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9cbg\" (UniqueName: \"kubernetes.io/projected/e19bd983-6b82-4879-84d5-3e2362654043-kube-api-access-c9cbg\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:37 crc kubenswrapper[4794]: I0310 11:15:37.000514 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:37 crc kubenswrapper[4794]: I0310 11:15:37.000527 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:37 crc kubenswrapper[4794]: I0310 11:15:37.000555 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e19bd983-6b82-4879-84d5-3e2362654043-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 11:15:37 crc kubenswrapper[4794]: I0310 11:15:37.799581 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-577f6d597f-sd6mk" Mar 10 11:15:37 crc kubenswrapper[4794]: I0310 11:15:37.830010 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-577f6d597f-sd6mk"] Mar 10 11:15:37 crc kubenswrapper[4794]: I0310 11:15:37.837446 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-577f6d597f-sd6mk"] Mar 10 11:15:38 crc kubenswrapper[4794]: I0310 11:15:38.013457 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e19bd983-6b82-4879-84d5-3e2362654043" path="/var/lib/kubelet/pods/e19bd983-6b82-4879-84d5-3e2362654043/volumes" Mar 10 11:15:42 crc kubenswrapper[4794]: I0310 11:15:42.415651 4794 scope.go:117] "RemoveContainer" containerID="53e9eb9caa6ee4f53b43c42b40530a755d11265dd8e0eea792637b9e0b86997a" Mar 10 11:15:48 crc kubenswrapper[4794]: I0310 11:15:47.999938 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:15:48 crc kubenswrapper[4794]: E0310 11:15:48.001160 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:15:56 crc kubenswrapper[4794]: I0310 11:15:56.309621 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-78455f4569-hjwg5" Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.140243 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552356-flccj"] Mar 10 11:16:00 crc kubenswrapper[4794]: E0310 11:16:00.141223 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19bd983-6b82-4879-84d5-3e2362654043" containerName="init" Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.141239 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19bd983-6b82-4879-84d5-3e2362654043" containerName="init" Mar 10 11:16:00 crc kubenswrapper[4794]: E0310 11:16:00.141268 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19bd983-6b82-4879-84d5-3e2362654043" containerName="dnsmasq-dns" Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.141276 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19bd983-6b82-4879-84d5-3e2362654043" containerName="dnsmasq-dns" Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.141509 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e19bd983-6b82-4879-84d5-3e2362654043" containerName="dnsmasq-dns" Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.142046 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552356-flccj" Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.148959 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.149247 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.152143 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.158035 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552356-flccj"] Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.313539 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw464\" (UniqueName: \"kubernetes.io/projected/97e542d0-c411-44de-beb8-e5cffa10f1a7-kube-api-access-bw464\") pod \"auto-csr-approver-29552356-flccj\" (UID: \"97e542d0-c411-44de-beb8-e5cffa10f1a7\") " pod="openshift-infra/auto-csr-approver-29552356-flccj" Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.416497 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw464\" (UniqueName: \"kubernetes.io/projected/97e542d0-c411-44de-beb8-e5cffa10f1a7-kube-api-access-bw464\") pod \"auto-csr-approver-29552356-flccj\" (UID: \"97e542d0-c411-44de-beb8-e5cffa10f1a7\") " pod="openshift-infra/auto-csr-approver-29552356-flccj" Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.445005 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw464\" (UniqueName: \"kubernetes.io/projected/97e542d0-c411-44de-beb8-e5cffa10f1a7-kube-api-access-bw464\") pod \"auto-csr-approver-29552356-flccj\" (UID: \"97e542d0-c411-44de-beb8-e5cffa10f1a7\") " pod="openshift-infra/auto-csr-approver-29552356-flccj" Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.466800 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552356-flccj" Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.951653 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552356-flccj"] Mar 10 11:16:00 crc kubenswrapper[4794]: W0310 11:16:00.963549 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97e542d0_c411_44de_beb8_e5cffa10f1a7.slice/crio-ffcde6b64f92d96594ec72a81e3f55f032860c5d6e807f6beb11c56aeb601077 WatchSource:0}: Error finding container ffcde6b64f92d96594ec72a81e3f55f032860c5d6e807f6beb11c56aeb601077: Status 404 returned error can't find the container with id ffcde6b64f92d96594ec72a81e3f55f032860c5d6e807f6beb11c56aeb601077 Mar 10 11:16:00 crc kubenswrapper[4794]: I0310 11:16:00.966467 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 11:16:01 crc kubenswrapper[4794]: I0310 11:16:01.001366 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:16:01 crc kubenswrapper[4794]: E0310 11:16:01.002254 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:16:01 crc kubenswrapper[4794]: I0310 11:16:01.069948 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552356-flccj" event={"ID":"97e542d0-c411-44de-beb8-e5cffa10f1a7","Type":"ContainerStarted","Data":"ffcde6b64f92d96594ec72a81e3f55f032860c5d6e807f6beb11c56aeb601077"} Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.089178 4794 generic.go:334] "Generic (PLEG): container finished" podID="97e542d0-c411-44de-beb8-e5cffa10f1a7" containerID="f11c8070e6ccf8a777e1b80ded382feeccedf6744a66a5a374be673e2f8ec2b8" exitCode=0 Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.089278 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552356-flccj" event={"ID":"97e542d0-c411-44de-beb8-e5cffa10f1a7","Type":"ContainerDied","Data":"f11c8070e6ccf8a777e1b80ded382feeccedf6744a66a5a374be673e2f8ec2b8"} Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.365557 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-nbph9"] Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.366503 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-nbph9" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.377753 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nbph9"] Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.471067 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5620-account-create-update-scwqm"] Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.471906 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f233e4b9-a4e1-4c20-99a5-54632094041d-operator-scripts\") pod \"glance-db-create-nbph9\" (UID: \"f233e4b9-a4e1-4c20-99a5-54632094041d\") " pod="openstack/glance-db-create-nbph9" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.471958 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bjjf\" (UniqueName: \"kubernetes.io/projected/f233e4b9-a4e1-4c20-99a5-54632094041d-kube-api-access-8bjjf\") pod \"glance-db-create-nbph9\" (UID: \"f233e4b9-a4e1-4c20-99a5-54632094041d\") " pod="openstack/glance-db-create-nbph9" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.472164 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5620-account-create-update-scwqm" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.474392 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.481870 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5620-account-create-update-scwqm"] Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.573776 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f233e4b9-a4e1-4c20-99a5-54632094041d-operator-scripts\") pod \"glance-db-create-nbph9\" (UID: \"f233e4b9-a4e1-4c20-99a5-54632094041d\") " pod="openstack/glance-db-create-nbph9" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.573833 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwgxl\" (UniqueName: \"kubernetes.io/projected/55b82cce-abab-4c3f-869e-e4910f4d2435-kube-api-access-qwgxl\") pod \"glance-5620-account-create-update-scwqm\" (UID: \"55b82cce-abab-4c3f-869e-e4910f4d2435\") " pod="openstack/glance-5620-account-create-update-scwqm" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.573853 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bjjf\" (UniqueName: \"kubernetes.io/projected/f233e4b9-a4e1-4c20-99a5-54632094041d-kube-api-access-8bjjf\") pod \"glance-db-create-nbph9\" (UID: \"f233e4b9-a4e1-4c20-99a5-54632094041d\") " pod="openstack/glance-db-create-nbph9" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.573887 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55b82cce-abab-4c3f-869e-e4910f4d2435-operator-scripts\") pod \"glance-5620-account-create-update-scwqm\" (UID: \"55b82cce-abab-4c3f-869e-e4910f4d2435\") " pod="openstack/glance-5620-account-create-update-scwqm" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.574692 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f233e4b9-a4e1-4c20-99a5-54632094041d-operator-scripts\") pod \"glance-db-create-nbph9\" (UID: \"f233e4b9-a4e1-4c20-99a5-54632094041d\") " pod="openstack/glance-db-create-nbph9" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.598467 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bjjf\" (UniqueName: \"kubernetes.io/projected/f233e4b9-a4e1-4c20-99a5-54632094041d-kube-api-access-8bjjf\") pod \"glance-db-create-nbph9\" (UID: \"f233e4b9-a4e1-4c20-99a5-54632094041d\") " pod="openstack/glance-db-create-nbph9" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.675873 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwgxl\" (UniqueName: \"kubernetes.io/projected/55b82cce-abab-4c3f-869e-e4910f4d2435-kube-api-access-qwgxl\") pod \"glance-5620-account-create-update-scwqm\" (UID: \"55b82cce-abab-4c3f-869e-e4910f4d2435\") " pod="openstack/glance-5620-account-create-update-scwqm" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.675952 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55b82cce-abab-4c3f-869e-e4910f4d2435-operator-scripts\") pod \"glance-5620-account-create-update-scwqm\" (UID: \"55b82cce-abab-4c3f-869e-e4910f4d2435\") " pod="openstack/glance-5620-account-create-update-scwqm" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.676869 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55b82cce-abab-4c3f-869e-e4910f4d2435-operator-scripts\") pod \"glance-5620-account-create-update-scwqm\" (UID: \"55b82cce-abab-4c3f-869e-e4910f4d2435\") " pod="openstack/glance-5620-account-create-update-scwqm" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.687054 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nbph9" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.691524 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwgxl\" (UniqueName: \"kubernetes.io/projected/55b82cce-abab-4c3f-869e-e4910f4d2435-kube-api-access-qwgxl\") pod \"glance-5620-account-create-update-scwqm\" (UID: \"55b82cce-abab-4c3f-869e-e4910f4d2435\") " pod="openstack/glance-5620-account-create-update-scwqm" Mar 10 11:16:03 crc kubenswrapper[4794]: I0310 11:16:03.792657 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5620-account-create-update-scwqm" Mar 10 11:16:04 crc kubenswrapper[4794]: I0310 11:16:04.212926 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nbph9"] Mar 10 11:16:04 crc kubenswrapper[4794]: W0310 11:16:04.228517 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf233e4b9_a4e1_4c20_99a5_54632094041d.slice/crio-88a14cd22d737f64265c0eeed5f9b0e2580fb07b2dc86ce49840c88c517403b9 WatchSource:0}: Error finding container 88a14cd22d737f64265c0eeed5f9b0e2580fb07b2dc86ce49840c88c517403b9: Status 404 returned error can't find the container with id 88a14cd22d737f64265c0eeed5f9b0e2580fb07b2dc86ce49840c88c517403b9 Mar 10 11:16:04 crc kubenswrapper[4794]: I0310 11:16:04.317581 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5620-account-create-update-scwqm"] Mar 10 11:16:04 crc kubenswrapper[4794]: W0310 11:16:04.327271 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55b82cce_abab_4c3f_869e_e4910f4d2435.slice/crio-3631a60c1f91ec37c75d6633df5dfd593bf29fdc47f56cb0d3a4269acffbaaab WatchSource:0}: Error finding container 3631a60c1f91ec37c75d6633df5dfd593bf29fdc47f56cb0d3a4269acffbaaab: Status 404 returned error can't find the container with id 3631a60c1f91ec37c75d6633df5dfd593bf29fdc47f56cb0d3a4269acffbaaab Mar 10 11:16:04 crc kubenswrapper[4794]: I0310 11:16:04.336795 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552356-flccj" Mar 10 11:16:04 crc kubenswrapper[4794]: I0310 11:16:04.490838 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw464\" (UniqueName: \"kubernetes.io/projected/97e542d0-c411-44de-beb8-e5cffa10f1a7-kube-api-access-bw464\") pod \"97e542d0-c411-44de-beb8-e5cffa10f1a7\" (UID: \"97e542d0-c411-44de-beb8-e5cffa10f1a7\") " Mar 10 11:16:04 crc kubenswrapper[4794]: I0310 11:16:04.496190 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e542d0-c411-44de-beb8-e5cffa10f1a7-kube-api-access-bw464" (OuterVolumeSpecName: "kube-api-access-bw464") pod "97e542d0-c411-44de-beb8-e5cffa10f1a7" (UID: "97e542d0-c411-44de-beb8-e5cffa10f1a7"). InnerVolumeSpecName "kube-api-access-bw464". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:16:04 crc kubenswrapper[4794]: I0310 11:16:04.593467 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw464\" (UniqueName: \"kubernetes.io/projected/97e542d0-c411-44de-beb8-e5cffa10f1a7-kube-api-access-bw464\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:05 crc kubenswrapper[4794]: I0310 11:16:05.105619 4794 generic.go:334] "Generic (PLEG): container finished" podID="55b82cce-abab-4c3f-869e-e4910f4d2435" containerID="0dbe5a9bbb2cc0e10af62f800ea9466d45a3ab1efbdd5694adf30e7f1e9ed1b9" exitCode=0 Mar 10 11:16:05 crc kubenswrapper[4794]: I0310 11:16:05.105668 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5620-account-create-update-scwqm" event={"ID":"55b82cce-abab-4c3f-869e-e4910f4d2435","Type":"ContainerDied","Data":"0dbe5a9bbb2cc0e10af62f800ea9466d45a3ab1efbdd5694adf30e7f1e9ed1b9"} Mar 10 11:16:05 crc kubenswrapper[4794]: I0310 11:16:05.106031 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5620-account-create-update-scwqm" event={"ID":"55b82cce-abab-4c3f-869e-e4910f4d2435","Type":"ContainerStarted","Data":"3631a60c1f91ec37c75d6633df5dfd593bf29fdc47f56cb0d3a4269acffbaaab"} Mar 10 11:16:05 crc kubenswrapper[4794]: I0310 11:16:05.107848 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552356-flccj" Mar 10 11:16:05 crc kubenswrapper[4794]: I0310 11:16:05.109772 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552356-flccj" event={"ID":"97e542d0-c411-44de-beb8-e5cffa10f1a7","Type":"ContainerDied","Data":"ffcde6b64f92d96594ec72a81e3f55f032860c5d6e807f6beb11c56aeb601077"} Mar 10 11:16:05 crc kubenswrapper[4794]: I0310 11:16:05.109833 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffcde6b64f92d96594ec72a81e3f55f032860c5d6e807f6beb11c56aeb601077" Mar 10 11:16:05 crc kubenswrapper[4794]: I0310 11:16:05.111633 4794 generic.go:334] "Generic (PLEG): container finished" podID="f233e4b9-a4e1-4c20-99a5-54632094041d" containerID="4878c96a1ee18ad5d4b6a0bd581fdea7d5b56e4a4eddbe56c45938b20143151f" exitCode=0 Mar 10 11:16:05 crc kubenswrapper[4794]: I0310 11:16:05.111679 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nbph9" event={"ID":"f233e4b9-a4e1-4c20-99a5-54632094041d","Type":"ContainerDied","Data":"4878c96a1ee18ad5d4b6a0bd581fdea7d5b56e4a4eddbe56c45938b20143151f"} Mar 10 11:16:05 crc kubenswrapper[4794]: I0310 11:16:05.111706 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nbph9" event={"ID":"f233e4b9-a4e1-4c20-99a5-54632094041d","Type":"ContainerStarted","Data":"88a14cd22d737f64265c0eeed5f9b0e2580fb07b2dc86ce49840c88c517403b9"} Mar 10 11:16:05 crc kubenswrapper[4794]: I0310 11:16:05.429569 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552350-4wd5c"] Mar 10 11:16:05 crc kubenswrapper[4794]: I0310 11:16:05.441637 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552350-4wd5c"] Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.024007 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29974508-802d-4263-91c7-c41b07c18dd7" path="/var/lib/kubelet/pods/29974508-802d-4263-91c7-c41b07c18dd7/volumes" Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.622569 4794 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/glance-5620-account-create-update-scwqm" Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.630470 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nbph9" Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.737949 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwgxl\" (UniqueName: \"kubernetes.io/projected/55b82cce-abab-4c3f-869e-e4910f4d2435-kube-api-access-qwgxl\") pod \"55b82cce-abab-4c3f-869e-e4910f4d2435\" (UID: \"55b82cce-abab-4c3f-869e-e4910f4d2435\") " Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.738040 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bjjf\" (UniqueName: \"kubernetes.io/projected/f233e4b9-a4e1-4c20-99a5-54632094041d-kube-api-access-8bjjf\") pod \"f233e4b9-a4e1-4c20-99a5-54632094041d\" (UID: \"f233e4b9-a4e1-4c20-99a5-54632094041d\") " Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.738067 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55b82cce-abab-4c3f-869e-e4910f4d2435-operator-scripts\") pod \"55b82cce-abab-4c3f-869e-e4910f4d2435\" (UID: \"55b82cce-abab-4c3f-869e-e4910f4d2435\") " Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.738152 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f233e4b9-a4e1-4c20-99a5-54632094041d-operator-scripts\") pod \"f233e4b9-a4e1-4c20-99a5-54632094041d\" (UID: \"f233e4b9-a4e1-4c20-99a5-54632094041d\") " Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.738882 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b82cce-abab-4c3f-869e-e4910f4d2435-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55b82cce-abab-4c3f-869e-e4910f4d2435" (UID: "55b82cce-abab-4c3f-869e-e4910f4d2435"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.738930 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f233e4b9-a4e1-4c20-99a5-54632094041d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f233e4b9-a4e1-4c20-99a5-54632094041d" (UID: "f233e4b9-a4e1-4c20-99a5-54632094041d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.742840 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f233e4b9-a4e1-4c20-99a5-54632094041d-kube-api-access-8bjjf" (OuterVolumeSpecName: "kube-api-access-8bjjf") pod "f233e4b9-a4e1-4c20-99a5-54632094041d" (UID: "f233e4b9-a4e1-4c20-99a5-54632094041d"). InnerVolumeSpecName "kube-api-access-8bjjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.743105 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55b82cce-abab-4c3f-869e-e4910f4d2435-kube-api-access-qwgxl" (OuterVolumeSpecName: "kube-api-access-qwgxl") pod "55b82cce-abab-4c3f-869e-e4910f4d2435" (UID: "55b82cce-abab-4c3f-869e-e4910f4d2435"). InnerVolumeSpecName "kube-api-access-qwgxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.843743 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f233e4b9-a4e1-4c20-99a5-54632094041d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.843789 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwgxl\" (UniqueName: \"kubernetes.io/projected/55b82cce-abab-4c3f-869e-e4910f4d2435-kube-api-access-qwgxl\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.843802 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bjjf\" (UniqueName: \"kubernetes.io/projected/f233e4b9-a4e1-4c20-99a5-54632094041d-kube-api-access-8bjjf\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:06 crc kubenswrapper[4794]: I0310 11:16:06.843814 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55b82cce-abab-4c3f-869e-e4910f4d2435-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:07 crc kubenswrapper[4794]: I0310 11:16:07.135114 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nbph9" event={"ID":"f233e4b9-a4e1-4c20-99a5-54632094041d","Type":"ContainerDied","Data":"88a14cd22d737f64265c0eeed5f9b0e2580fb07b2dc86ce49840c88c517403b9"} Mar 10 11:16:07 crc kubenswrapper[4794]: I0310 11:16:07.135160 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88a14cd22d737f64265c0eeed5f9b0e2580fb07b2dc86ce49840c88c517403b9" Mar 10 11:16:07 crc kubenswrapper[4794]: I0310 11:16:07.135183 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nbph9" Mar 10 11:16:07 crc kubenswrapper[4794]: I0310 11:16:07.137295 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5620-account-create-update-scwqm" event={"ID":"55b82cce-abab-4c3f-869e-e4910f4d2435","Type":"ContainerDied","Data":"3631a60c1f91ec37c75d6633df5dfd593bf29fdc47f56cb0d3a4269acffbaaab"} Mar 10 11:16:07 crc kubenswrapper[4794]: I0310 11:16:07.137318 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3631a60c1f91ec37c75d6633df5dfd593bf29fdc47f56cb0d3a4269acffbaaab" Mar 10 11:16:07 crc kubenswrapper[4794]: I0310 11:16:07.137392 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5620-account-create-update-scwqm" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.760994 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-z96r7"] Mar 10 11:16:08 crc kubenswrapper[4794]: E0310 11:16:08.761707 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55b82cce-abab-4c3f-869e-e4910f4d2435" containerName="mariadb-account-create-update" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.761723 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b82cce-abab-4c3f-869e-e4910f4d2435" containerName="mariadb-account-create-update" Mar 10 11:16:08 crc kubenswrapper[4794]: E0310 11:16:08.761750 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e542d0-c411-44de-beb8-e5cffa10f1a7" containerName="oc" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.761759 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e542d0-c411-44de-beb8-e5cffa10f1a7" containerName="oc" Mar 10 11:16:08 crc kubenswrapper[4794]: E0310 11:16:08.761789 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f233e4b9-a4e1-4c20-99a5-54632094041d" containerName="mariadb-database-create" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.761797 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f233e4b9-a4e1-4c20-99a5-54632094041d" containerName="mariadb-database-create" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.761980 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="55b82cce-abab-4c3f-869e-e4910f4d2435" containerName="mariadb-account-create-update" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.761997 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f233e4b9-a4e1-4c20-99a5-54632094041d" containerName="mariadb-database-create" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.762021 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e542d0-c411-44de-beb8-e5cffa10f1a7" containerName="oc" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.762765 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.765974 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.773295 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fss2f" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.813791 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-z96r7"] Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.885632 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-db-sync-config-data\") pod \"glance-db-sync-z96r7\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.885747 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdcxv\" (UniqueName: \"kubernetes.io/projected/a29fab98-30cc-441b-a7ad-17257e3f75a6-kube-api-access-zdcxv\") pod \"glance-db-sync-z96r7\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.885841 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-combined-ca-bundle\") pod \"glance-db-sync-z96r7\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.885888 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-config-data\") pod \"glance-db-sync-z96r7\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.987559 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-db-sync-config-data\") pod \"glance-db-sync-z96r7\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.987690 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdcxv\" (UniqueName: \"kubernetes.io/projected/a29fab98-30cc-441b-a7ad-17257e3f75a6-kube-api-access-zdcxv\") pod \"glance-db-sync-z96r7\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.987859 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-combined-ca-bundle\") pod \"glance-db-sync-z96r7\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.987974 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-config-data\") pod 
\"glance-db-sync-z96r7\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:08 crc kubenswrapper[4794]: I0310 11:16:08.995446 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-combined-ca-bundle\") pod \"glance-db-sync-z96r7\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:09 crc kubenswrapper[4794]: I0310 11:16:09.011178 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-config-data\") pod \"glance-db-sync-z96r7\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:09 crc kubenswrapper[4794]: I0310 11:16:09.012119 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-db-sync-config-data\") pod \"glance-db-sync-z96r7\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:09 crc kubenswrapper[4794]: I0310 11:16:09.020695 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdcxv\" (UniqueName: \"kubernetes.io/projected/a29fab98-30cc-441b-a7ad-17257e3f75a6-kube-api-access-zdcxv\") pod \"glance-db-sync-z96r7\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:09 crc kubenswrapper[4794]: I0310 11:16:09.119131 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:09 crc kubenswrapper[4794]: I0310 11:16:09.670270 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-z96r7"] Mar 10 11:16:10 crc kubenswrapper[4794]: I0310 11:16:10.166479 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z96r7" event={"ID":"a29fab98-30cc-441b-a7ad-17257e3f75a6","Type":"ContainerStarted","Data":"a4c15521e37e3d4c576d9019a9c3ebe7a1ce39effdd721eb45e095893eea26a8"} Mar 10 11:16:11 crc kubenswrapper[4794]: I0310 11:16:11.175668 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z96r7" event={"ID":"a29fab98-30cc-441b-a7ad-17257e3f75a6","Type":"ContainerStarted","Data":"8b51557da52d041a9e6ea7d9dd6c8758de33c61c9d4e6bb19394f4eeed2df9ad"} Mar 10 11:16:11 crc kubenswrapper[4794]: I0310 11:16:11.195887 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-z96r7" podStartSLOduration=3.195867411 podStartE2EDuration="3.195867411s" podCreationTimestamp="2026-03-10 11:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:16:11.189800632 +0000 UTC m=+5519.945971450" watchObservedRunningTime="2026-03-10 11:16:11.195867411 +0000 UTC m=+5519.952038229" Mar 10 11:16:14 crc kubenswrapper[4794]: I0310 11:16:13.999934 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:16:14 crc kubenswrapper[4794]: E0310 11:16:14.000926 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:16:14 crc kubenswrapper[4794]: I0310 11:16:14.222050 4794 generic.go:334] "Generic (PLEG): container finished" podID="a29fab98-30cc-441b-a7ad-17257e3f75a6" containerID="8b51557da52d041a9e6ea7d9dd6c8758de33c61c9d4e6bb19394f4eeed2df9ad" exitCode=0 Mar 10 11:16:14 crc kubenswrapper[4794]: I0310 11:16:14.222117 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z96r7" event={"ID":"a29fab98-30cc-441b-a7ad-17257e3f75a6","Type":"ContainerDied","Data":"8b51557da52d041a9e6ea7d9dd6c8758de33c61c9d4e6bb19394f4eeed2df9ad"} Mar 10 11:16:15 crc kubenswrapper[4794]: I0310 11:16:15.757081 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:15 crc kubenswrapper[4794]: I0310 11:16:15.935604 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-config-data\") pod \"a29fab98-30cc-441b-a7ad-17257e3f75a6\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " Mar 10 11:16:15 crc kubenswrapper[4794]: I0310 11:16:15.935823 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-combined-ca-bundle\") pod \"a29fab98-30cc-441b-a7ad-17257e3f75a6\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " Mar 10 11:16:15 crc kubenswrapper[4794]: I0310 11:16:15.936040 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-db-sync-config-data\") pod \"a29fab98-30cc-441b-a7ad-17257e3f75a6\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " Mar 10 11:16:15 crc kubenswrapper[4794]: I0310 11:16:15.936095 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdcxv\" (UniqueName: \"kubernetes.io/projected/a29fab98-30cc-441b-a7ad-17257e3f75a6-kube-api-access-zdcxv\") pod \"a29fab98-30cc-441b-a7ad-17257e3f75a6\" (UID: \"a29fab98-30cc-441b-a7ad-17257e3f75a6\") " Mar 10 11:16:15 crc kubenswrapper[4794]: I0310 11:16:15.944645 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a29fab98-30cc-441b-a7ad-17257e3f75a6-kube-api-access-zdcxv" (OuterVolumeSpecName: "kube-api-access-zdcxv") pod "a29fab98-30cc-441b-a7ad-17257e3f75a6" (UID: "a29fab98-30cc-441b-a7ad-17257e3f75a6"). InnerVolumeSpecName "kube-api-access-zdcxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:16:15 crc kubenswrapper[4794]: I0310 11:16:15.948811 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a29fab98-30cc-441b-a7ad-17257e3f75a6" (UID: "a29fab98-30cc-441b-a7ad-17257e3f75a6"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:16:15 crc kubenswrapper[4794]: I0310 11:16:15.988302 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a29fab98-30cc-441b-a7ad-17257e3f75a6" (UID: "a29fab98-30cc-441b-a7ad-17257e3f75a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.012087 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-config-data" (OuterVolumeSpecName: "config-data") pod "a29fab98-30cc-441b-a7ad-17257e3f75a6" (UID: "a29fab98-30cc-441b-a7ad-17257e3f75a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.038720 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.038874 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.038904 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a29fab98-30cc-441b-a7ad-17257e3f75a6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.038968 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdcxv\" (UniqueName: \"kubernetes.io/projected/a29fab98-30cc-441b-a7ad-17257e3f75a6-kube-api-access-zdcxv\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.249002 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z96r7" event={"ID":"a29fab98-30cc-441b-a7ad-17257e3f75a6","Type":"ContainerDied","Data":"a4c15521e37e3d4c576d9019a9c3ebe7a1ce39effdd721eb45e095893eea26a8"} Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.249065 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4c15521e37e3d4c576d9019a9c3ebe7a1ce39effdd721eb45e095893eea26a8" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.249079 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-z96r7" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.652697 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76f7f8d69f-2s985"] Mar 10 11:16:16 crc kubenswrapper[4794]: E0310 11:16:16.653290 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29fab98-30cc-441b-a7ad-17257e3f75a6" containerName="glance-db-sync" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.653306 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29fab98-30cc-441b-a7ad-17257e3f75a6" containerName="glance-db-sync" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.653474 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29fab98-30cc-441b-a7ad-17257e3f75a6" containerName="glance-db-sync" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.654337 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.683552 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f7f8d69f-2s985"] Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.747477 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.748936 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.752792 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.752981 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.753096 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.753191 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fss2f" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.759954 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.855282 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-config\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.855333 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.855368 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d50415b5-e20d-4aca-bd71-f1eb44c10950-ceph\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0" Mar 
10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.855426 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-ovsdbserver-nb\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.855454 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-dns-svc\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.855468 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g7gm\" (UniqueName: \"kubernetes.io/projected/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-kube-api-access-2g7gm\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.855512 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-ovsdbserver-sb\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.855529 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d50415b5-e20d-4aca-bd71-f1eb44c10950-logs\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.855572 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k58tt\" (UniqueName: \"kubernetes.io/projected/d50415b5-e20d-4aca-bd71-f1eb44c10950-kube-api-access-k58tt\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.855595 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-config-data\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.855628 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-scripts\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0" Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.855654 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d50415b5-e20d-4aca-bd71-f1eb44c10950-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.882959 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.893327 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.898769 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.902015 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957413 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d50415b5-e20d-4aca-bd71-f1eb44c10950-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957468 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aef4159b-4e10-43db-a895-01ecf2c19b61-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957489 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aef4159b-4e10-43db-a895-01ecf2c19b61-logs\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957508 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-config\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957523 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957545 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957560 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d50415b5-e20d-4aca-bd71-f1eb44c10950-ceph\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957616 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-ovsdbserver-nb\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957637 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957653 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-dns-svc\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957670 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g7gm\" (UniqueName: \"kubernetes.io/projected/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-kube-api-access-2g7gm\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957690 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aef4159b-4e10-43db-a895-01ecf2c19b61-ceph\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957712 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957731 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-ovsdbserver-sb\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957746 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d50415b5-e20d-4aca-bd71-f1eb44c10950-logs\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957786 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnkwz\" (UniqueName: \"kubernetes.io/projected/aef4159b-4e10-43db-a895-01ecf2c19b61-kube-api-access-cnkwz\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957814 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k58tt\" (UniqueName: \"kubernetes.io/projected/d50415b5-e20d-4aca-bd71-f1eb44c10950-kube-api-access-k58tt\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957830 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-config-data\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.957858 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-scripts\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.958067 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d50415b5-e20d-4aca-bd71-f1eb44c10950-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.958770 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-dns-svc\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.959284 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-config\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.959870 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d50415b5-e20d-4aca-bd71-f1eb44c10950-logs\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.959929 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-ovsdbserver-sb\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.960451 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-ovsdbserver-nb\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.963690 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.964974 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-scripts\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.965457 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-config-data\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.967389 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d50415b5-e20d-4aca-bd71-f1eb44c10950-ceph\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:16 crc kubenswrapper[4794]: I0310 11:16:16.997120 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k58tt\" (UniqueName: \"kubernetes.io/projected/d50415b5-e20d-4aca-bd71-f1eb44c10950-kube-api-access-k58tt\") pod \"glance-default-external-api-0\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.001564 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g7gm\" (UniqueName: \"kubernetes.io/projected/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-kube-api-access-2g7gm\") pod \"dnsmasq-dns-76f7f8d69f-2s985\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " pod="openstack/dnsmasq-dns-76f7f8d69f-2s985"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.059384 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aef4159b-4e10-43db-a895-01ecf2c19b61-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.059641 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aef4159b-4e10-43db-a895-01ecf2c19b61-logs\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.059720 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.059887 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.059962 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aef4159b-4e10-43db-a895-01ecf2c19b61-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.059969 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aef4159b-4e10-43db-a895-01ecf2c19b61-ceph\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.060038 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aef4159b-4e10-43db-a895-01ecf2c19b61-logs\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.060063 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.060163 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnkwz\" (UniqueName: \"kubernetes.io/projected/aef4159b-4e10-43db-a895-01ecf2c19b61-kube-api-access-cnkwz\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.063643 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.063773 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.063785 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.068102 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aef4159b-4e10-43db-a895-01ecf2c19b61-ceph\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.075802 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnkwz\" (UniqueName: \"kubernetes.io/projected/aef4159b-4e10-43db-a895-01ecf2c19b61-kube-api-access-cnkwz\") pod \"glance-default-internal-api-0\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.113560 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.238768 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.268312 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f7f8d69f-2s985"
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.608501 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.663377 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.733152 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f7f8d69f-2s985"]
Mar 10 11:16:17 crc kubenswrapper[4794]: I0310 11:16:17.891074 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 11:16:17 crc kubenswrapper[4794]: W0310 11:16:17.898914 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaef4159b_4e10_43db_a895_01ecf2c19b61.slice/crio-ee396cfa39fb25567fcb571a8c7f8acae05aa1b1a09b2e7513f9f0d1d8920e17 WatchSource:0}: Error finding container ee396cfa39fb25567fcb571a8c7f8acae05aa1b1a09b2e7513f9f0d1d8920e17: Status 404 returned error can't find the container with id ee396cfa39fb25567fcb571a8c7f8acae05aa1b1a09b2e7513f9f0d1d8920e17
Mar 10 11:16:18 crc kubenswrapper[4794]: I0310 11:16:18.291368 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aef4159b-4e10-43db-a895-01ecf2c19b61","Type":"ContainerStarted","Data":"ee396cfa39fb25567fcb571a8c7f8acae05aa1b1a09b2e7513f9f0d1d8920e17"}
Mar 10 11:16:18 crc kubenswrapper[4794]: I0310 11:16:18.299720 4794 generic.go:334] "Generic (PLEG): container finished" podID="5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21" containerID="668ade1cad142e4840f5d744f2555b3126d4fdabf0172120dd69776c0f34849c" exitCode=0
Mar 10 11:16:18 crc kubenswrapper[4794]: I0310 11:16:18.299828 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" event={"ID":"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21","Type":"ContainerDied","Data":"668ade1cad142e4840f5d744f2555b3126d4fdabf0172120dd69776c0f34849c"}
Mar 10 11:16:18 crc kubenswrapper[4794]: I0310 11:16:18.299857 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" event={"ID":"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21","Type":"ContainerStarted","Data":"300d455d31d716a76d6cf4dee067b83ccaf8dbac159077d4a66d1843f5398b45"}
Mar 10 11:16:18 crc kubenswrapper[4794]: I0310 11:16:18.301247 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d50415b5-e20d-4aca-bd71-f1eb44c10950","Type":"ContainerStarted","Data":"2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34"}
Mar 10 11:16:18 crc kubenswrapper[4794]: I0310 11:16:18.301417 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d50415b5-e20d-4aca-bd71-f1eb44c10950","Type":"ContainerStarted","Data":"f4cff7229708d85f00ec6936c4cd2f9107fc5e9559a12c47f095a10ae92ae9e2"}
Mar 10 11:16:19 crc kubenswrapper[4794]: I0310 11:16:19.311067 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d50415b5-e20d-4aca-bd71-f1eb44c10950","Type":"ContainerStarted","Data":"e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17"}
Mar 10 11:16:19 crc kubenswrapper[4794]: I0310 11:16:19.311260 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d50415b5-e20d-4aca-bd71-f1eb44c10950" containerName="glance-httpd" containerID="cri-o://e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17" gracePeriod=30
Mar 10 11:16:19 crc kubenswrapper[4794]: I0310 11:16:19.311258 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d50415b5-e20d-4aca-bd71-f1eb44c10950" containerName="glance-log" containerID="cri-o://2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34" gracePeriod=30
Mar 10 11:16:19 crc kubenswrapper[4794]: I0310 11:16:19.317932 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aef4159b-4e10-43db-a895-01ecf2c19b61","Type":"ContainerStarted","Data":"1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50"}
Mar 10 11:16:19 crc kubenswrapper[4794]: I0310 11:16:19.318023 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aef4159b-4e10-43db-a895-01ecf2c19b61","Type":"ContainerStarted","Data":"1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7"}
Mar 10 11:16:19 crc kubenswrapper[4794]: I0310 11:16:19.326688 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" event={"ID":"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21","Type":"ContainerStarted","Data":"df5b7c3de89a7de22db06361467e37303d68f8464626231d377eab6f107e907e"}
Mar 10 11:16:19 crc kubenswrapper[4794]: I0310 11:16:19.326848 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76f7f8d69f-2s985"
Mar 10 11:16:19 crc kubenswrapper[4794]: I0310 11:16:19.344921 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.34490331 podStartE2EDuration="3.34490331s" podCreationTimestamp="2026-03-10 11:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:16:19.340055069 +0000 UTC m=+5528.096225907" watchObservedRunningTime="2026-03-10 11:16:19.34490331 +0000 UTC m=+5528.101074118"
Mar 10 11:16:19 crc kubenswrapper[4794]: I0310 11:16:19.367690 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" podStartSLOduration=3.367675339 podStartE2EDuration="3.367675339s" podCreationTimestamp="2026-03-10 11:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:16:19.36387402 +0000 UTC m=+5528.120044878" watchObservedRunningTime="2026-03-10 11:16:19.367675339 +0000 UTC m=+5528.123846157"
Mar 10 11:16:19 crc kubenswrapper[4794]: I0310 11:16:19.382020 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.382003614 podStartE2EDuration="3.382003614s" podCreationTimestamp="2026-03-10 11:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:16:19.381138738 +0000 UTC m=+5528.137309556" watchObservedRunningTime="2026-03-10 11:16:19.382003614 +0000 UTC m=+5528.138174432"
Mar 10 11:16:19 crc kubenswrapper[4794]: I0310 11:16:19.402609 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 11:16:19 crc kubenswrapper[4794]: I0310 11:16:19.923548 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.122298 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-config-data\") pod \"d50415b5-e20d-4aca-bd71-f1eb44c10950\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") "
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.122826 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-scripts\") pod \"d50415b5-e20d-4aca-bd71-f1eb44c10950\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") "
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.122954 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d50415b5-e20d-4aca-bd71-f1eb44c10950-logs\") pod \"d50415b5-e20d-4aca-bd71-f1eb44c10950\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") "
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.123138 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-combined-ca-bundle\") pod \"d50415b5-e20d-4aca-bd71-f1eb44c10950\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") "
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.123263 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d50415b5-e20d-4aca-bd71-f1eb44c10950-ceph\") pod \"d50415b5-e20d-4aca-bd71-f1eb44c10950\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") "
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.123330 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d50415b5-e20d-4aca-bd71-f1eb44c10950-logs" (OuterVolumeSpecName: "logs") pod "d50415b5-e20d-4aca-bd71-f1eb44c10950" (UID: "d50415b5-e20d-4aca-bd71-f1eb44c10950"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.123479 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k58tt\" (UniqueName: \"kubernetes.io/projected/d50415b5-e20d-4aca-bd71-f1eb44c10950-kube-api-access-k58tt\") pod \"d50415b5-e20d-4aca-bd71-f1eb44c10950\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") "
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.123620 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d50415b5-e20d-4aca-bd71-f1eb44c10950-httpd-run\") pod \"d50415b5-e20d-4aca-bd71-f1eb44c10950\" (UID: \"d50415b5-e20d-4aca-bd71-f1eb44c10950\") "
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.123829 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d50415b5-e20d-4aca-bd71-f1eb44c10950-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d50415b5-e20d-4aca-bd71-f1eb44c10950" (UID: "d50415b5-e20d-4aca-bd71-f1eb44c10950"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.124264 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d50415b5-e20d-4aca-bd71-f1eb44c10950-logs\") on node \"crc\" DevicePath \"\""
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.124374 4794 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d50415b5-e20d-4aca-bd71-f1eb44c10950-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.128750 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d50415b5-e20d-4aca-bd71-f1eb44c10950-kube-api-access-k58tt" (OuterVolumeSpecName: "kube-api-access-k58tt") pod "d50415b5-e20d-4aca-bd71-f1eb44c10950" (UID: "d50415b5-e20d-4aca-bd71-f1eb44c10950"). InnerVolumeSpecName "kube-api-access-k58tt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.133510 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d50415b5-e20d-4aca-bd71-f1eb44c10950-ceph" (OuterVolumeSpecName: "ceph") pod "d50415b5-e20d-4aca-bd71-f1eb44c10950" (UID: "d50415b5-e20d-4aca-bd71-f1eb44c10950"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.133618 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-scripts" (OuterVolumeSpecName: "scripts") pod "d50415b5-e20d-4aca-bd71-f1eb44c10950" (UID: "d50415b5-e20d-4aca-bd71-f1eb44c10950"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.147363 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d50415b5-e20d-4aca-bd71-f1eb44c10950" (UID: "d50415b5-e20d-4aca-bd71-f1eb44c10950"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.176712 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-config-data" (OuterVolumeSpecName: "config-data") pod "d50415b5-e20d-4aca-bd71-f1eb44c10950" (UID: "d50415b5-e20d-4aca-bd71-f1eb44c10950"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.225368 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.225406 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.225418 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50415b5-e20d-4aca-bd71-f1eb44c10950-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.225433 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d50415b5-e20d-4aca-bd71-f1eb44c10950-ceph\") on node \"crc\" DevicePath \"\""
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.225445 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k58tt\" (UniqueName: \"kubernetes.io/projected/d50415b5-e20d-4aca-bd71-f1eb44c10950-kube-api-access-k58tt\") on node \"crc\" DevicePath \"\""
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.334702 4794 generic.go:334] "Generic (PLEG): container finished" podID="d50415b5-e20d-4aca-bd71-f1eb44c10950" containerID="e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17" exitCode=0
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.334730 4794 generic.go:334] "Generic (PLEG): container finished" podID="d50415b5-e20d-4aca-bd71-f1eb44c10950" containerID="2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34" exitCode=143
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.335527 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.337548 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d50415b5-e20d-4aca-bd71-f1eb44c10950","Type":"ContainerDied","Data":"e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17"}
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.337596 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d50415b5-e20d-4aca-bd71-f1eb44c10950","Type":"ContainerDied","Data":"2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34"}
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.337609 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d50415b5-e20d-4aca-bd71-f1eb44c10950","Type":"ContainerDied","Data":"f4cff7229708d85f00ec6936c4cd2f9107fc5e9559a12c47f095a10ae92ae9e2"}
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.337624 4794 scope.go:117] "RemoveContainer" containerID="e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.361478 4794 scope.go:117] "RemoveContainer" containerID="2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.377614 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.390162 4794 scope.go:117] "RemoveContainer" containerID="e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17"
Mar 10 11:16:20 crc kubenswrapper[4794]: E0310 11:16:20.390862 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17\": container with ID starting with e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17 not found: ID does not exist" containerID="e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.390918 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17"} err="failed to get container status \"e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17\": rpc error: code = NotFound desc = could not find container \"e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17\": container with ID starting with e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17 not found: ID does not exist"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.393339 4794 scope.go:117] "RemoveContainer" containerID="2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34"
Mar 10 11:16:20 crc kubenswrapper[4794]: E0310 11:16:20.398016 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34\": container with ID starting with 2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34 not found: ID does not exist" containerID="2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.398070 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34"} err="failed to get container status \"2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34\": rpc error: code = NotFound desc = could not find container \"2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34\": container with ID starting with 2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34 not found: ID does not exist"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.398101 4794 scope.go:117] "RemoveContainer" containerID="e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.402702 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17"} err="failed to get container status \"e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17\": rpc error: code = NotFound desc = could not find container \"e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17\": container with ID starting with e04225e1ee7e6dce3e39f97d89044acb0f2b29bb24430e647802b7d3926cad17 not found: ID does not exist"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.402744 4794 scope.go:117] "RemoveContainer" containerID="2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.403142 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34"} err="failed to get container status \"2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34\": rpc error: code = NotFound desc = could not find container \"2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34\": container with ID starting with 2dba7d37bd1042ed3880b1923b9e1e36981694f2d4f78a8bb8c039e8dcd6ac34 not found: ID does not exist"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.412035 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.426665 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 11:16:20 crc kubenswrapper[4794]: E0310 11:16:20.427254 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d50415b5-e20d-4aca-bd71-f1eb44c10950" containerName="glance-httpd"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.427891 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50415b5-e20d-4aca-bd71-f1eb44c10950" containerName="glance-httpd"
Mar 10 11:16:20 crc kubenswrapper[4794]: E0310 11:16:20.428013 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d50415b5-e20d-4aca-bd71-f1eb44c10950" containerName="glance-log"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.428089 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50415b5-e20d-4aca-bd71-f1eb44c10950" containerName="glance-log"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.428415 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d50415b5-e20d-4aca-bd71-f1eb44c10950" containerName="glance-log"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.428525 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d50415b5-e20d-4aca-bd71-f1eb44c10950" containerName="glance-httpd"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.430415 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.435649 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.448217 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.533837 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.533978 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c9a8037c-9090-4a68-92a2-21b0e4a142b4-ceph\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.534027 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-scripts\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.534059 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-config-data\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.534105 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a8037c-9090-4a68-92a2-21b0e4a142b4-logs\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.534127 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9a8037c-9090-4a68-92a2-21b0e4a142b4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.534165 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cbkc\" (UniqueName: \"kubernetes.io/projected/c9a8037c-9090-4a68-92a2-21b0e4a142b4-kube-api-access-8cbkc\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.636314 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c9a8037c-9090-4a68-92a2-21b0e4a142b4-ceph\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.636456 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-scripts\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.636536 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-config-data\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.636638 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a8037c-9090-4a68-92a2-21b0e4a142b4-logs\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.636706 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9a8037c-9090-4a68-92a2-21b0e4a142b4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.636794 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cbkc\" (UniqueName: \"kubernetes.io/projected/c9a8037c-9090-4a68-92a2-21b0e4a142b4-kube-api-access-8cbkc\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.636930 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.637532 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9a8037c-9090-4a68-92a2-21b0e4a142b4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.637803 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a8037c-9090-4a68-92a2-21b0e4a142b4-logs\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.642112 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c9a8037c-9090-4a68-92a2-21b0e4a142b4-ceph\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.642699 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-scripts\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.644704 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.650433 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-config-data\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.656969 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cbkc\" (UniqueName: \"kubernetes.io/projected/c9a8037c-9090-4a68-92a2-21b0e4a142b4-kube-api-access-8cbkc\") pod \"glance-default-external-api-0\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " pod="openstack/glance-default-external-api-0"
Mar 10 11:16:20 crc kubenswrapper[4794]: I0310 11:16:20.750654 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.323070 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.343480 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="aef4159b-4e10-43db-a895-01ecf2c19b61" containerName="glance-log" containerID="cri-o://1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7" gracePeriod=30
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.344092 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="aef4159b-4e10-43db-a895-01ecf2c19b61" containerName="glance-httpd" containerID="cri-o://1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50" gracePeriod=30
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.921800 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.965296 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aef4159b-4e10-43db-a895-01ecf2c19b61-logs\") pod \"aef4159b-4e10-43db-a895-01ecf2c19b61\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") "
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.965389 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aef4159b-4e10-43db-a895-01ecf2c19b61-ceph\") pod \"aef4159b-4e10-43db-a895-01ecf2c19b61\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") "
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.965822 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aef4159b-4e10-43db-a895-01ecf2c19b61-logs" (OuterVolumeSpecName: "logs") pod "aef4159b-4e10-43db-a895-01ecf2c19b61" (UID: "aef4159b-4e10-43db-a895-01ecf2c19b61"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.968717 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-config-data\") pod \"aef4159b-4e10-43db-a895-01ecf2c19b61\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") "
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.968776 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-combined-ca-bundle\") pod \"aef4159b-4e10-43db-a895-01ecf2c19b61\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") "
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.968856 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-scripts\") pod \"aef4159b-4e10-43db-a895-01ecf2c19b61\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") "
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.968903 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnkwz\" (UniqueName: \"kubernetes.io/projected/aef4159b-4e10-43db-a895-01ecf2c19b61-kube-api-access-cnkwz\") pod \"aef4159b-4e10-43db-a895-01ecf2c19b61\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") "
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.968928 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aef4159b-4e10-43db-a895-01ecf2c19b61-httpd-run\") pod \"aef4159b-4e10-43db-a895-01ecf2c19b61\" (UID: \"aef4159b-4e10-43db-a895-01ecf2c19b61\") "
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.969622 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aef4159b-4e10-43db-a895-01ecf2c19b61-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "aef4159b-4e10-43db-a895-01ecf2c19b61" (UID: "aef4159b-4e10-43db-a895-01ecf2c19b61"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.969878 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aef4159b-4e10-43db-a895-01ecf2c19b61-logs\") on node \"crc\" DevicePath \"\""
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.969897 4794 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aef4159b-4e10-43db-a895-01ecf2c19b61-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.973268 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-scripts" (OuterVolumeSpecName: "scripts") pod "aef4159b-4e10-43db-a895-01ecf2c19b61" (UID: "aef4159b-4e10-43db-a895-01ecf2c19b61"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.973848 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef4159b-4e10-43db-a895-01ecf2c19b61-kube-api-access-cnkwz" (OuterVolumeSpecName: "kube-api-access-cnkwz") pod "aef4159b-4e10-43db-a895-01ecf2c19b61" (UID: "aef4159b-4e10-43db-a895-01ecf2c19b61"). InnerVolumeSpecName "kube-api-access-cnkwz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 11:16:21 crc kubenswrapper[4794]: I0310 11:16:21.975137 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef4159b-4e10-43db-a895-01ecf2c19b61-ceph" (OuterVolumeSpecName: "ceph") pod "aef4159b-4e10-43db-a895-01ecf2c19b61" (UID: "aef4159b-4e10-43db-a895-01ecf2c19b61"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.010524 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aef4159b-4e10-43db-a895-01ecf2c19b61" (UID: "aef4159b-4e10-43db-a895-01ecf2c19b61"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.015383 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d50415b5-e20d-4aca-bd71-f1eb44c10950" path="/var/lib/kubelet/pods/d50415b5-e20d-4aca-bd71-f1eb44c10950/volumes"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.018646 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-config-data" (OuterVolumeSpecName: "config-data") pod "aef4159b-4e10-43db-a895-01ecf2c19b61" (UID: "aef4159b-4e10-43db-a895-01ecf2c19b61"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.071229 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aef4159b-4e10-43db-a895-01ecf2c19b61-ceph\") on node \"crc\" DevicePath \"\""
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.071269 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.071282 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.071297 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef4159b-4e10-43db-a895-01ecf2c19b61-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.071307 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnkwz\" (UniqueName: \"kubernetes.io/projected/aef4159b-4e10-43db-a895-01ecf2c19b61-kube-api-access-cnkwz\") on node \"crc\" DevicePath \"\""
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.355038 4794 generic.go:334] "Generic (PLEG): container finished" podID="aef4159b-4e10-43db-a895-01ecf2c19b61" containerID="1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50" exitCode=0
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.355382 4794 generic.go:334] "Generic (PLEG): container finished" podID="aef4159b-4e10-43db-a895-01ecf2c19b61" containerID="1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7" exitCode=143
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.355123 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aef4159b-4e10-43db-a895-01ecf2c19b61","Type":"ContainerDied","Data":"1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50"}
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.355454 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aef4159b-4e10-43db-a895-01ecf2c19b61","Type":"ContainerDied","Data":"1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7"}
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.355467 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aef4159b-4e10-43db-a895-01ecf2c19b61","Type":"ContainerDied","Data":"ee396cfa39fb25567fcb571a8c7f8acae05aa1b1a09b2e7513f9f0d1d8920e17"}
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.355244 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.355484 4794 scope.go:117] "RemoveContainer" containerID="1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.359677 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c9a8037c-9090-4a68-92a2-21b0e4a142b4","Type":"ContainerStarted","Data":"c4c053008c62120f41214b5b89fa8a72183957fc43951b46ce7668624899f2f5"}
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.359704 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c9a8037c-9090-4a68-92a2-21b0e4a142b4","Type":"ContainerStarted","Data":"06d5449fef5964f97f56dfee7a05b70a0811ea7c2f5746f960a1b3295a9103b2"}
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.380500 4794 scope.go:117] "RemoveContainer" containerID="1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.405492 4794 scope.go:117] "RemoveContainer" containerID="1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50"
Mar 10 11:16:22 crc kubenswrapper[4794]: E0310 11:16:22.405958 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50\": container with ID starting with 1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50 not found: ID does not exist" containerID="1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.406008 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50"} err="failed to get container status \"1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50\": rpc error: code = NotFound desc = could not find container \"1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50\": container with ID starting with 1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50 not found: ID does not exist"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.406043 4794 scope.go:117] "RemoveContainer" containerID="1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7"
Mar 10 11:16:22 crc kubenswrapper[4794]: E0310 11:16:22.406473 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7\": container with ID starting with 1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7 not found: ID does not exist" containerID="1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.406523 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7"} err="failed to get container status \"1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7\": rpc error: code = NotFound desc = could not find container \"1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7\": container with ID starting with 1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7 not found: ID does not exist"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.406548 4794 scope.go:117] "RemoveContainer" containerID="1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.406801 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50"} err="failed to get container status \"1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50\": rpc error: code = NotFound desc = could not find container \"1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50\": container with ID starting with 1e48f541c6c74f381cbcaef2e2716d3e4419adf23dcca2cf6d36b784b8ee4c50 not found: ID does not exist"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.406822 4794 scope.go:117] "RemoveContainer" containerID="1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.407062 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7"} err="failed to get container status \"1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7\": rpc error: code = NotFound desc = could not find container \"1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7\": container with ID starting with 1fc0121f5aff9ef418671c50b2b9607a06a8211303f0d9313956b4ca1d8a76d7 not found: ID does not exist"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.411407 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.423221 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.442174 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 11:16:22 crc kubenswrapper[4794]: E0310 11:16:22.442676 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef4159b-4e10-43db-a895-01ecf2c19b61" containerName="glance-log"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.442702 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef4159b-4e10-43db-a895-01ecf2c19b61" containerName="glance-log"
Mar 10 11:16:22 crc kubenswrapper[4794]: E0310 11:16:22.442736 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef4159b-4e10-43db-a895-01ecf2c19b61" containerName="glance-httpd"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.442745 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef4159b-4e10-43db-a895-01ecf2c19b61" containerName="glance-httpd"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.442957 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="aef4159b-4e10-43db-a895-01ecf2c19b61" containerName="glance-httpd"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.442980 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="aef4159b-4e10-43db-a895-01ecf2c19b61" containerName="glance-log"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.444268 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.449659 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.449793 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.479643 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.479714 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kvcs\" (UniqueName: \"kubernetes.io/projected/58e94276-b957-49b1-b4db-874a151bfd3a-kube-api-access-5kvcs\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.479832 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.479902 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58e94276-b957-49b1-b4db-874a151bfd3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.479975 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/58e94276-b957-49b1-b4db-874a151bfd3a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.480026 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e94276-b957-49b1-b4db-874a151bfd3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.480069 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.582029 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e94276-b957-49b1-b4db-874a151bfd3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.582100 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.583155 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.583218 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kvcs\" (UniqueName: \"kubernetes.io/projected/58e94276-b957-49b1-b4db-874a151bfd3a-kube-api-access-5kvcs\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.583381 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.583490 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58e94276-b957-49b1-b4db-874a151bfd3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.582902 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e94276-b957-49b1-b4db-874a151bfd3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.583573 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/58e94276-b957-49b1-b4db-874a151bfd3a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.583820 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58e94276-b957-49b1-b4db-874a151bfd3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.589284 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0"
Mar 10 11:16:22 crc kubenswrapper[4794]:
I0310 11:16:22.589596 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.592873 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/58e94276-b957-49b1-b4db-874a151bfd3a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.602023 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kvcs\" (UniqueName: \"kubernetes.io/projected/58e94276-b957-49b1-b4db-874a151bfd3a-kube-api-access-5kvcs\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.602068 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:16:22 crc kubenswrapper[4794]: I0310 11:16:22.793044 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 11:16:23 crc kubenswrapper[4794]: I0310 11:16:23.343465 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 11:16:23 crc kubenswrapper[4794]: W0310 11:16:23.349076 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58e94276_b957_49b1_b4db_874a151bfd3a.slice/crio-12f3210cc6d0e47b42ffde6234908028d5d3111657f62698cda68bd331fae702 WatchSource:0}: Error finding container 12f3210cc6d0e47b42ffde6234908028d5d3111657f62698cda68bd331fae702: Status 404 returned error can't find the container with id 12f3210cc6d0e47b42ffde6234908028d5d3111657f62698cda68bd331fae702 Mar 10 11:16:23 crc kubenswrapper[4794]: I0310 11:16:23.371884 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e94276-b957-49b1-b4db-874a151bfd3a","Type":"ContainerStarted","Data":"12f3210cc6d0e47b42ffde6234908028d5d3111657f62698cda68bd331fae702"} Mar 10 11:16:23 crc kubenswrapper[4794]: I0310 11:16:23.376804 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c9a8037c-9090-4a68-92a2-21b0e4a142b4","Type":"ContainerStarted","Data":"df0b72b90c3e4b239a686b0ecec91563d34fd8961b41e357fbe0cb85abd935f8"} Mar 10 11:16:23 crc kubenswrapper[4794]: I0310 11:16:23.406690 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.406666963 podStartE2EDuration="3.406666963s" podCreationTimestamp="2026-03-10 11:16:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:16:23.404216837 +0000 UTC m=+5532.160387695" watchObservedRunningTime="2026-03-10 11:16:23.406666963 +0000 UTC m=+5532.162837801" 
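
The kubenswrapper entries throughout this log are klog-formatted structured records: a severity letter fused with the date (I0310, E0310, W0310), the wall-clock time with microseconds, the emitting PID (4794), the source file and line (kubelet.go:2453, operation_generator.go:637, ...), then a quoted message followed by key=value pairs. Below is a minimal, self-contained sketch of how k8s.io/klog/v2 produces lines of this shape; it is not kubelet source, and the pod reference, event value, and container ID in it are illustrative placeholders, not values from this log.

package main

import (
	"errors"
	"flag"

	"k8s.io/klog/v2"
)

func main() {
	// Register klog's flags (-v, -logtostderr, ...) and flush buffers on exit.
	klog.InitFlags(nil)
	flag.Parse()
	defer klog.Flush()

	// InfoS emits: I<MMDD> <hh:mm:ss.micros> <pid> <file>:<line>] "msg" k=v ...
	// klog.KRef renders as namespace/name, the same shape as the
	// pod="openstack/..." fields above. (In real kubelet logs the event field
	// carries a PLEG event struct; a plain string stands in for it here.)
	klog.InfoS("SyncLoop (PLEG): event for pod",
		"pod", klog.KRef("openstack", "example-pod-0"),
		"event", "ContainerStarted")

	// ErrorS produces the E-prefixed lines and appends err="...", as in the
	// "ContainerStatus from runtime service failed" entries above.
	klog.ErrorS(errors.New("rpc error: code = NotFound"),
		"ContainerStatus from runtime service failed",
		"containerID", "feedface0000") // hypothetical container ID
}

When following a single pod through a dump like this, filtering the unit log by pod name (for example, journalctl -u kubelet piped through grep for glance-default-internal-api-0) reconstructs the sequence visible here: SyncLoop ADD, volume attach/mount, sandbox creation, ContainerStarted events, then startup and readiness probes.
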
Mar 10 11:16:24 crc kubenswrapper[4794]: I0310 11:16:24.031017 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aef4159b-4e10-43db-a895-01ecf2c19b61" path="/var/lib/kubelet/pods/aef4159b-4e10-43db-a895-01ecf2c19b61/volumes" Mar 10 11:16:24 crc kubenswrapper[4794]: I0310 11:16:24.394530 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e94276-b957-49b1-b4db-874a151bfd3a","Type":"ContainerStarted","Data":"0311c0be9eec89c14e0736e3a96f5851e5d816408c26015edbbe888b94c0cbcb"} Mar 10 11:16:25 crc kubenswrapper[4794]: I0310 11:16:25.411929 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e94276-b957-49b1-b4db-874a151bfd3a","Type":"ContainerStarted","Data":"73001e259d7cad5584fd2258090c25f4dd39102ca089343a21d440080cc36471"} Mar 10 11:16:25 crc kubenswrapper[4794]: I0310 11:16:25.446392 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.446368244 podStartE2EDuration="3.446368244s" podCreationTimestamp="2026-03-10 11:16:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:16:25.445948091 +0000 UTC m=+5534.202118949" watchObservedRunningTime="2026-03-10 11:16:25.446368244 +0000 UTC m=+5534.202539092" Mar 10 11:16:25 crc kubenswrapper[4794]: I0310 11:16:25.999112 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:16:25 crc kubenswrapper[4794]: E0310 11:16:25.999681 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:16:27 crc kubenswrapper[4794]: I0310 11:16:27.272040 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" Mar 10 11:16:27 crc kubenswrapper[4794]: I0310 11:16:27.387061 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bff8cd95-2t5br"] Mar 10 11:16:27 crc kubenswrapper[4794]: I0310 11:16:27.387319 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bff8cd95-2t5br" podUID="ee972ab7-a06d-4966-b74f-b7ab83336c5c" containerName="dnsmasq-dns" containerID="cri-o://cbd98f6a40d8e166d172365aad5e50af304540f1cec541537547518b4bc1da6d" gracePeriod=10 Mar 10 11:16:27 crc kubenswrapper[4794]: E0310 11:16:27.584828 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee972ab7_a06d_4966_b74f_b7ab83336c5c.slice/crio-cbd98f6a40d8e166d172365aad5e50af304540f1cec541537547518b4bc1da6d.scope\": RecentStats: unable to find data in memory cache]" Mar 10 11:16:27 crc kubenswrapper[4794]: I0310 11:16:27.905992 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:16:27 crc kubenswrapper[4794]: I0310 11:16:27.979703 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-ovsdbserver-nb\") pod \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " Mar 10 11:16:27 crc kubenswrapper[4794]: I0310 11:16:27.979820 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-dns-svc\") pod \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " Mar 10 11:16:27 crc kubenswrapper[4794]: I0310 11:16:27.979882 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdbgd\" (UniqueName: \"kubernetes.io/projected/ee972ab7-a06d-4966-b74f-b7ab83336c5c-kube-api-access-pdbgd\") pod \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " Mar 10 11:16:27 crc kubenswrapper[4794]: I0310 11:16:27.979927 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-ovsdbserver-sb\") pod \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " Mar 10 11:16:27 crc kubenswrapper[4794]: I0310 11:16:27.979947 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-config\") pod \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\" (UID: \"ee972ab7-a06d-4966-b74f-b7ab83336c5c\") " Mar 10 11:16:27 crc kubenswrapper[4794]: I0310 11:16:27.985466 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee972ab7-a06d-4966-b74f-b7ab83336c5c-kube-api-access-pdbgd" (OuterVolumeSpecName: "kube-api-access-pdbgd") pod "ee972ab7-a06d-4966-b74f-b7ab83336c5c" (UID: "ee972ab7-a06d-4966-b74f-b7ab83336c5c"). InnerVolumeSpecName "kube-api-access-pdbgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.030829 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee972ab7-a06d-4966-b74f-b7ab83336c5c" (UID: "ee972ab7-a06d-4966-b74f-b7ab83336c5c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.032270 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee972ab7-a06d-4966-b74f-b7ab83336c5c" (UID: "ee972ab7-a06d-4966-b74f-b7ab83336c5c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.034680 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee972ab7-a06d-4966-b74f-b7ab83336c5c" (UID: "ee972ab7-a06d-4966-b74f-b7ab83336c5c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.036516 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-config" (OuterVolumeSpecName: "config") pod "ee972ab7-a06d-4966-b74f-b7ab83336c5c" (UID: "ee972ab7-a06d-4966-b74f-b7ab83336c5c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.082189 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdbgd\" (UniqueName: \"kubernetes.io/projected/ee972ab7-a06d-4966-b74f-b7ab83336c5c-kube-api-access-pdbgd\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.082220 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.082232 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.082242 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.082250 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee972ab7-a06d-4966-b74f-b7ab83336c5c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.446106 4794 generic.go:334] "Generic (PLEG): container finished" podID="ee972ab7-a06d-4966-b74f-b7ab83336c5c" containerID="cbd98f6a40d8e166d172365aad5e50af304540f1cec541537547518b4bc1da6d" exitCode=0 Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.446159 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bff8cd95-2t5br" event={"ID":"ee972ab7-a06d-4966-b74f-b7ab83336c5c","Type":"ContainerDied","Data":"cbd98f6a40d8e166d172365aad5e50af304540f1cec541537547518b4bc1da6d"} Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.446192 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bff8cd95-2t5br" event={"ID":"ee972ab7-a06d-4966-b74f-b7ab83336c5c","Type":"ContainerDied","Data":"4130f25a88c9631f93363534700d2f05010b5c1da5e173ae6ece98141d18d10a"} Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.446211 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bff8cd95-2t5br" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.446251 4794 scope.go:117] "RemoveContainer" containerID="cbd98f6a40d8e166d172365aad5e50af304540f1cec541537547518b4bc1da6d" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.490170 4794 scope.go:117] "RemoveContainer" containerID="1cec9374ae96c49ac1f4862ac7c3f27ee54028d1c65452f1d88fc5333b0fee73" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.512648 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bff8cd95-2t5br"] Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.520247 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bff8cd95-2t5br"] Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.549809 4794 scope.go:117] "RemoveContainer" containerID="cbd98f6a40d8e166d172365aad5e50af304540f1cec541537547518b4bc1da6d" Mar 10 11:16:28 crc kubenswrapper[4794]: E0310 11:16:28.550287 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbd98f6a40d8e166d172365aad5e50af304540f1cec541537547518b4bc1da6d\": container with ID starting with cbd98f6a40d8e166d172365aad5e50af304540f1cec541537547518b4bc1da6d not found: ID does not exist" containerID="cbd98f6a40d8e166d172365aad5e50af304540f1cec541537547518b4bc1da6d" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.550328 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbd98f6a40d8e166d172365aad5e50af304540f1cec541537547518b4bc1da6d"} err="failed to get container status \"cbd98f6a40d8e166d172365aad5e50af304540f1cec541537547518b4bc1da6d\": rpc error: code = NotFound desc = could not find container \"cbd98f6a40d8e166d172365aad5e50af304540f1cec541537547518b4bc1da6d\": container with ID starting with cbd98f6a40d8e166d172365aad5e50af304540f1cec541537547518b4bc1da6d not found: ID does not exist" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.550369 4794 scope.go:117] "RemoveContainer" containerID="1cec9374ae96c49ac1f4862ac7c3f27ee54028d1c65452f1d88fc5333b0fee73" Mar 10 11:16:28 crc kubenswrapper[4794]: E0310 11:16:28.550872 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cec9374ae96c49ac1f4862ac7c3f27ee54028d1c65452f1d88fc5333b0fee73\": container with ID starting with 1cec9374ae96c49ac1f4862ac7c3f27ee54028d1c65452f1d88fc5333b0fee73 not found: ID does not exist" containerID="1cec9374ae96c49ac1f4862ac7c3f27ee54028d1c65452f1d88fc5333b0fee73" Mar 10 11:16:28 crc kubenswrapper[4794]: I0310 11:16:28.550897 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cec9374ae96c49ac1f4862ac7c3f27ee54028d1c65452f1d88fc5333b0fee73"} err="failed to get container status \"1cec9374ae96c49ac1f4862ac7c3f27ee54028d1c65452f1d88fc5333b0fee73\": rpc error: code = NotFound desc = could not find container \"1cec9374ae96c49ac1f4862ac7c3f27ee54028d1c65452f1d88fc5333b0fee73\": container with ID starting with 1cec9374ae96c49ac1f4862ac7c3f27ee54028d1c65452f1d88fc5333b0fee73 not found: ID does not exist" Mar 10 11:16:30 crc kubenswrapper[4794]: I0310 11:16:30.017691 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee972ab7-a06d-4966-b74f-b7ab83336c5c" path="/var/lib/kubelet/pods/ee972ab7-a06d-4966-b74f-b7ab83336c5c/volumes" Mar 10 11:16:30 crc kubenswrapper[4794]: I0310 11:16:30.751641 4794 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 11:16:30 crc kubenswrapper[4794]: I0310 11:16:30.751721 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 11:16:30 crc kubenswrapper[4794]: I0310 11:16:30.798395 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 11:16:30 crc kubenswrapper[4794]: I0310 11:16:30.820973 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 11:16:31 crc kubenswrapper[4794]: I0310 11:16:31.482654 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 11:16:31 crc kubenswrapper[4794]: I0310 11:16:31.482710 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 11:16:32 crc kubenswrapper[4794]: I0310 11:16:32.794147 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 11:16:32 crc kubenswrapper[4794]: I0310 11:16:32.794515 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 11:16:32 crc kubenswrapper[4794]: I0310 11:16:32.823633 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 11:16:32 crc kubenswrapper[4794]: I0310 11:16:32.866355 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 11:16:33 crc kubenswrapper[4794]: I0310 11:16:33.404407 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 11:16:33 crc kubenswrapper[4794]: I0310 11:16:33.452824 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 11:16:33 crc kubenswrapper[4794]: I0310 11:16:33.503955 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 11:16:33 crc kubenswrapper[4794]: I0310 11:16:33.504412 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 11:16:35 crc kubenswrapper[4794]: I0310 11:16:35.350270 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 11:16:35 crc kubenswrapper[4794]: I0310 11:16:35.354548 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 11:16:39 crc kubenswrapper[4794]: I0310 11:16:38.999784 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:16:39 crc kubenswrapper[4794]: E0310 11:16:39.000540 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:16:41 crc 
kubenswrapper[4794]: I0310 11:16:41.411714 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-789kv"] Mar 10 11:16:41 crc kubenswrapper[4794]: E0310 11:16:41.412404 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee972ab7-a06d-4966-b74f-b7ab83336c5c" containerName="dnsmasq-dns" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.412419 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee972ab7-a06d-4966-b74f-b7ab83336c5c" containerName="dnsmasq-dns" Mar 10 11:16:41 crc kubenswrapper[4794]: E0310 11:16:41.412444 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee972ab7-a06d-4966-b74f-b7ab83336c5c" containerName="init" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.412452 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee972ab7-a06d-4966-b74f-b7ab83336c5c" containerName="init" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.412640 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee972ab7-a06d-4966-b74f-b7ab83336c5c" containerName="dnsmasq-dns" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.413319 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-789kv" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.438086 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhzm4\" (UniqueName: \"kubernetes.io/projected/78311413-38d5-422f-8153-57eb3eed4494-kube-api-access-vhzm4\") pod \"placement-db-create-789kv\" (UID: \"78311413-38d5-422f-8153-57eb3eed4494\") " pod="openstack/placement-db-create-789kv" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.438145 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78311413-38d5-422f-8153-57eb3eed4494-operator-scripts\") pod \"placement-db-create-789kv\" (UID: \"78311413-38d5-422f-8153-57eb3eed4494\") " pod="openstack/placement-db-create-789kv" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.450208 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-789kv"] Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.458762 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6f51-account-create-update-fmxtb"] Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.460008 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6f51-account-create-update-fmxtb" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.469370 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.477538 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f51-account-create-update-fmxtb"] Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.540486 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dgcc\" (UniqueName: \"kubernetes.io/projected/d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6-kube-api-access-9dgcc\") pod \"placement-6f51-account-create-update-fmxtb\" (UID: \"d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6\") " pod="openstack/placement-6f51-account-create-update-fmxtb" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.540636 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhzm4\" (UniqueName: \"kubernetes.io/projected/78311413-38d5-422f-8153-57eb3eed4494-kube-api-access-vhzm4\") pod \"placement-db-create-789kv\" (UID: \"78311413-38d5-422f-8153-57eb3eed4494\") " pod="openstack/placement-db-create-789kv" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.540697 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78311413-38d5-422f-8153-57eb3eed4494-operator-scripts\") pod \"placement-db-create-789kv\" (UID: \"78311413-38d5-422f-8153-57eb3eed4494\") " pod="openstack/placement-db-create-789kv" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.541401 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6-operator-scripts\") pod \"placement-6f51-account-create-update-fmxtb\" (UID: \"d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6\") " pod="openstack/placement-6f51-account-create-update-fmxtb" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.542178 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78311413-38d5-422f-8153-57eb3eed4494-operator-scripts\") pod \"placement-db-create-789kv\" (UID: \"78311413-38d5-422f-8153-57eb3eed4494\") " pod="openstack/placement-db-create-789kv" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.558416 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhzm4\" (UniqueName: \"kubernetes.io/projected/78311413-38d5-422f-8153-57eb3eed4494-kube-api-access-vhzm4\") pod \"placement-db-create-789kv\" (UID: \"78311413-38d5-422f-8153-57eb3eed4494\") " pod="openstack/placement-db-create-789kv" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.642488 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dgcc\" (UniqueName: \"kubernetes.io/projected/d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6-kube-api-access-9dgcc\") pod \"placement-6f51-account-create-update-fmxtb\" (UID: \"d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6\") " pod="openstack/placement-6f51-account-create-update-fmxtb" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.642710 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6-operator-scripts\") pod \"placement-6f51-account-create-update-fmxtb\" (UID: \"d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6\") " pod="openstack/placement-6f51-account-create-update-fmxtb" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.643965 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6-operator-scripts\") pod \"placement-6f51-account-create-update-fmxtb\" (UID: \"d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6\") " pod="openstack/placement-6f51-account-create-update-fmxtb" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.661525 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dgcc\" (UniqueName: \"kubernetes.io/projected/d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6-kube-api-access-9dgcc\") pod \"placement-6f51-account-create-update-fmxtb\" (UID: \"d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6\") " pod="openstack/placement-6f51-account-create-update-fmxtb" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.741571 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-789kv" Mar 10 11:16:41 crc kubenswrapper[4794]: I0310 11:16:41.782327 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f51-account-create-update-fmxtb" Mar 10 11:16:42 crc kubenswrapper[4794]: I0310 11:16:42.260179 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-789kv"] Mar 10 11:16:42 crc kubenswrapper[4794]: I0310 11:16:42.289672 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f51-account-create-update-fmxtb"] Mar 10 11:16:42 crc kubenswrapper[4794]: W0310 11:16:42.311782 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd43bfe5e_a75c_45e1_b0f0_5d5654ecd8b6.slice/crio-be2b8bc73cda445b943707a48b9d84014873c3e3f13be6804ecb5779e8b8f225 WatchSource:0}: Error finding container be2b8bc73cda445b943707a48b9d84014873c3e3f13be6804ecb5779e8b8f225: Status 404 returned error can't find the container with id be2b8bc73cda445b943707a48b9d84014873c3e3f13be6804ecb5779e8b8f225 Mar 10 11:16:42 crc kubenswrapper[4794]: I0310 11:16:42.512490 4794 scope.go:117] "RemoveContainer" containerID="cb39e8056a47c3aad427bfa16fa4633242ace230c4fee6f0e3feec2436b38082" Mar 10 11:16:42 crc kubenswrapper[4794]: I0310 11:16:42.616971 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f51-account-create-update-fmxtb" event={"ID":"d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6","Type":"ContainerStarted","Data":"2a0243d78f2df57b859dc77fe6752ee13edf53e83f10cd4e0616c56259649ee2"} Mar 10 11:16:42 crc kubenswrapper[4794]: I0310 11:16:42.617013 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f51-account-create-update-fmxtb" event={"ID":"d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6","Type":"ContainerStarted","Data":"be2b8bc73cda445b943707a48b9d84014873c3e3f13be6804ecb5779e8b8f225"} Mar 10 11:16:42 crc kubenswrapper[4794]: I0310 11:16:42.619312 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-789kv" event={"ID":"78311413-38d5-422f-8153-57eb3eed4494","Type":"ContainerStarted","Data":"5ee176fc2bcf33fedd90521d107fe2ef16ba0f4fe4f2c020bbed70829eace224"} Mar 10 11:16:42 crc kubenswrapper[4794]: I0310 11:16:42.619380 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-789kv" event={"ID":"78311413-38d5-422f-8153-57eb3eed4494","Type":"ContainerStarted","Data":"87998ffe591af399a60b2c37e2ceb8425ae86bd7243fd5f9087d690ffde04862"} Mar 10 11:16:42 crc kubenswrapper[4794]: I0310 11:16:42.637635 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6f51-account-create-update-fmxtb" podStartSLOduration=1.6376172979999999 podStartE2EDuration="1.637617298s" podCreationTimestamp="2026-03-10 11:16:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:16:42.630124174 +0000 UTC m=+5551.386295012" watchObservedRunningTime="2026-03-10 11:16:42.637617298 +0000 UTC m=+5551.393788126" Mar 10 11:16:42 crc kubenswrapper[4794]: I0310 11:16:42.660033 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-789kv" podStartSLOduration=1.660009884 podStartE2EDuration="1.660009884s" podCreationTimestamp="2026-03-10 11:16:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:16:42.644089368 +0000 UTC m=+5551.400260186" watchObservedRunningTime="2026-03-10 11:16:42.660009884 +0000 UTC m=+5551.416180712" Mar 10 11:16:43 crc kubenswrapper[4794]: I0310 11:16:43.634209 4794 generic.go:334] "Generic (PLEG): container finished" podID="d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6" containerID="2a0243d78f2df57b859dc77fe6752ee13edf53e83f10cd4e0616c56259649ee2" exitCode=0 Mar 10 11:16:43 crc kubenswrapper[4794]: I0310 11:16:43.634300 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f51-account-create-update-fmxtb" event={"ID":"d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6","Type":"ContainerDied","Data":"2a0243d78f2df57b859dc77fe6752ee13edf53e83f10cd4e0616c56259649ee2"} Mar 10 11:16:43 crc kubenswrapper[4794]: I0310 11:16:43.637499 4794 generic.go:334] "Generic (PLEG): container finished" podID="78311413-38d5-422f-8153-57eb3eed4494" containerID="5ee176fc2bcf33fedd90521d107fe2ef16ba0f4fe4f2c020bbed70829eace224" exitCode=0 Mar 10 11:16:43 crc kubenswrapper[4794]: I0310 11:16:43.637560 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-789kv" event={"ID":"78311413-38d5-422f-8153-57eb3eed4494","Type":"ContainerDied","Data":"5ee176fc2bcf33fedd90521d107fe2ef16ba0f4fe4f2c020bbed70829eace224"} Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.116053 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f51-account-create-update-fmxtb" Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.122839 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-789kv" Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.219396 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dgcc\" (UniqueName: \"kubernetes.io/projected/d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6-kube-api-access-9dgcc\") pod \"d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6\" (UID: \"d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6\") " Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.219448 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78311413-38d5-422f-8153-57eb3eed4494-operator-scripts\") pod \"78311413-38d5-422f-8153-57eb3eed4494\" (UID: \"78311413-38d5-422f-8153-57eb3eed4494\") " Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.219479 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhzm4\" (UniqueName: \"kubernetes.io/projected/78311413-38d5-422f-8153-57eb3eed4494-kube-api-access-vhzm4\") pod \"78311413-38d5-422f-8153-57eb3eed4494\" (UID: \"78311413-38d5-422f-8153-57eb3eed4494\") " Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.219574 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6-operator-scripts\") pod \"d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6\" (UID: \"d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6\") " Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.220620 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78311413-38d5-422f-8153-57eb3eed4494-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "78311413-38d5-422f-8153-57eb3eed4494" (UID: "78311413-38d5-422f-8153-57eb3eed4494"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.220687 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6" (UID: "d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.227937 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6-kube-api-access-9dgcc" (OuterVolumeSpecName: "kube-api-access-9dgcc") pod "d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6" (UID: "d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6"). InnerVolumeSpecName "kube-api-access-9dgcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.229592 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78311413-38d5-422f-8153-57eb3eed4494-kube-api-access-vhzm4" (OuterVolumeSpecName: "kube-api-access-vhzm4") pod "78311413-38d5-422f-8153-57eb3eed4494" (UID: "78311413-38d5-422f-8153-57eb3eed4494"). InnerVolumeSpecName "kube-api-access-vhzm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.320624 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.320655 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dgcc\" (UniqueName: \"kubernetes.io/projected/d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6-kube-api-access-9dgcc\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.320666 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78311413-38d5-422f-8153-57eb3eed4494-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.320675 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhzm4\" (UniqueName: \"kubernetes.io/projected/78311413-38d5-422f-8153-57eb3eed4494-kube-api-access-vhzm4\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.670964 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f51-account-create-update-fmxtb" event={"ID":"d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6","Type":"ContainerDied","Data":"be2b8bc73cda445b943707a48b9d84014873c3e3f13be6804ecb5779e8b8f225"} Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.671036 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be2b8bc73cda445b943707a48b9d84014873c3e3f13be6804ecb5779e8b8f225" Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.670983 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f51-account-create-update-fmxtb" Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.677238 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-789kv" event={"ID":"78311413-38d5-422f-8153-57eb3eed4494","Type":"ContainerDied","Data":"87998ffe591af399a60b2c37e2ceb8425ae86bd7243fd5f9087d690ffde04862"} Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.677288 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87998ffe591af399a60b2c37e2ceb8425ae86bd7243fd5f9087d690ffde04862" Mar 10 11:16:45 crc kubenswrapper[4794]: I0310 11:16:45.677392 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-789kv" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.849879 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jgrbz"] Mar 10 11:16:46 crc kubenswrapper[4794]: E0310 11:16:46.850255 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6" containerName="mariadb-account-create-update" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.850270 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6" containerName="mariadb-account-create-update" Mar 10 11:16:46 crc kubenswrapper[4794]: E0310 11:16:46.850301 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78311413-38d5-422f-8153-57eb3eed4494" containerName="mariadb-database-create" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.850308 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="78311413-38d5-422f-8153-57eb3eed4494" containerName="mariadb-database-create" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.850468 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6" containerName="mariadb-account-create-update" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.850481 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="78311413-38d5-422f-8153-57eb3eed4494" containerName="mariadb-database-create" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.851078 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.854110 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.854409 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-stj8g" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.854607 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.901302 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f9b86ff59-7wf9d"] Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.902709 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.911095 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jgrbz"] Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.918560 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f9b86ff59-7wf9d"] Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.953124 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-logs\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.953207 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-scripts\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.953242 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-combined-ca-bundle\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.953285 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgr9x\" (UniqueName: \"kubernetes.io/projected/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-kube-api-access-rgr9x\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:46 crc kubenswrapper[4794]: I0310 11:16:46.953324 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-config-data\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.058247 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgr9x\" (UniqueName: \"kubernetes.io/projected/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-kube-api-access-rgr9x\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.058306 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2l8z\" (UniqueName: \"kubernetes.io/projected/66e0ef86-58d0-4ad0-9336-236773558c09-kube-api-access-m2l8z\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.058384 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-config-data\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 
11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.058429 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-logs\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.058517 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-ovsdbserver-nb\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.058544 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-scripts\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.058586 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-config\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.058623 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-combined-ca-bundle\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.058661 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-ovsdbserver-sb\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.058679 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-dns-svc\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.060131 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-logs\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.063272 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-config-data\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.064293 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-scripts\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.064390 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-combined-ca-bundle\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.079313 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgr9x\" (UniqueName: \"kubernetes.io/projected/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-kube-api-access-rgr9x\") pod \"placement-db-sync-jgrbz\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.159980 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-ovsdbserver-nb\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.160028 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-config\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.160078 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-ovsdbserver-sb\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.160093 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-dns-svc\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.160131 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2l8z\" (UniqueName: \"kubernetes.io/projected/66e0ef86-58d0-4ad0-9336-236773558c09-kube-api-access-m2l8z\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.161493 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-dns-svc\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.161664 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-config\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: 
\"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.161986 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-ovsdbserver-sb\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.162353 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-ovsdbserver-nb\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.190246 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2l8z\" (UniqueName: \"kubernetes.io/projected/66e0ef86-58d0-4ad0-9336-236773558c09-kube-api-access-m2l8z\") pod \"dnsmasq-dns-7f9b86ff59-7wf9d\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.225762 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.236366 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.708585 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jgrbz"] Mar 10 11:16:47 crc kubenswrapper[4794]: I0310 11:16:47.797513 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f9b86ff59-7wf9d"] Mar 10 11:16:48 crc kubenswrapper[4794]: I0310 11:16:48.711771 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jgrbz" event={"ID":"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad","Type":"ContainerStarted","Data":"3aa8acc367871bb87e19d6efb9c07640890b28e221aa9bf63add21509165b115"} Mar 10 11:16:48 crc kubenswrapper[4794]: I0310 11:16:48.712320 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jgrbz" event={"ID":"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad","Type":"ContainerStarted","Data":"bce619c48423b5a5919bfe44a9968ea4ae02a385c16056a3f726939f70618c5b"} Mar 10 11:16:48 crc kubenswrapper[4794]: I0310 11:16:48.713498 4794 generic.go:334] "Generic (PLEG): container finished" podID="66e0ef86-58d0-4ad0-9336-236773558c09" containerID="6a5fc212dd02823c2d523319c01cb9f0da24406d839dc3bea585a7e3e298b0eb" exitCode=0 Mar 10 11:16:48 crc kubenswrapper[4794]: I0310 11:16:48.713563 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" event={"ID":"66e0ef86-58d0-4ad0-9336-236773558c09","Type":"ContainerDied","Data":"6a5fc212dd02823c2d523319c01cb9f0da24406d839dc3bea585a7e3e298b0eb"} Mar 10 11:16:48 crc kubenswrapper[4794]: I0310 11:16:48.713594 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" event={"ID":"66e0ef86-58d0-4ad0-9336-236773558c09","Type":"ContainerStarted","Data":"8e9b8bfde6b7888cfe29e815d0647cb6174b55db53f109ce46395b02e3dc5e63"} Mar 10 11:16:48 crc kubenswrapper[4794]: I0310 11:16:48.761317 4794 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/placement-db-sync-jgrbz" podStartSLOduration=2.761290432 podStartE2EDuration="2.761290432s" podCreationTimestamp="2026-03-10 11:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:16:48.743828178 +0000 UTC m=+5557.499998996" watchObservedRunningTime="2026-03-10 11:16:48.761290432 +0000 UTC m=+5557.517461280" Mar 10 11:16:49 crc kubenswrapper[4794]: I0310 11:16:49.728544 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" event={"ID":"66e0ef86-58d0-4ad0-9336-236773558c09","Type":"ContainerStarted","Data":"d3a20ab9001d55da97233357f587e7c43f82752707564eaac01064851811a183"} Mar 10 11:16:49 crc kubenswrapper[4794]: I0310 11:16:49.728649 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:49 crc kubenswrapper[4794]: I0310 11:16:49.731062 4794 generic.go:334] "Generic (PLEG): container finished" podID="a5acb60a-9ac5-4cda-ab04-5542bd65e4ad" containerID="3aa8acc367871bb87e19d6efb9c07640890b28e221aa9bf63add21509165b115" exitCode=0 Mar 10 11:16:49 crc kubenswrapper[4794]: I0310 11:16:49.731261 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jgrbz" event={"ID":"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad","Type":"ContainerDied","Data":"3aa8acc367871bb87e19d6efb9c07640890b28e221aa9bf63add21509165b115"} Mar 10 11:16:49 crc kubenswrapper[4794]: I0310 11:16:49.760002 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" podStartSLOduration=3.75998088 podStartE2EDuration="3.75998088s" podCreationTimestamp="2026-03-10 11:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:16:49.75260795 +0000 UTC m=+5558.508778818" watchObservedRunningTime="2026-03-10 11:16:49.75998088 +0000 UTC m=+5558.516151698" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.171205 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.339578 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgr9x\" (UniqueName: \"kubernetes.io/projected/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-kube-api-access-rgr9x\") pod \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.339660 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-combined-ca-bundle\") pod \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.339777 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-config-data\") pod \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.339817 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-logs\") pod \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.339891 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-scripts\") pod \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\" (UID: \"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad\") " Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.340497 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-logs" (OuterVolumeSpecName: "logs") pod "a5acb60a-9ac5-4cda-ab04-5542bd65e4ad" (UID: "a5acb60a-9ac5-4cda-ab04-5542bd65e4ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.341422 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-logs\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.345813 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-scripts" (OuterVolumeSpecName: "scripts") pod "a5acb60a-9ac5-4cda-ab04-5542bd65e4ad" (UID: "a5acb60a-9ac5-4cda-ab04-5542bd65e4ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.346065 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-kube-api-access-rgr9x" (OuterVolumeSpecName: "kube-api-access-rgr9x") pod "a5acb60a-9ac5-4cda-ab04-5542bd65e4ad" (UID: "a5acb60a-9ac5-4cda-ab04-5542bd65e4ad"). InnerVolumeSpecName "kube-api-access-rgr9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.379287 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5acb60a-9ac5-4cda-ab04-5542bd65e4ad" (UID: "a5acb60a-9ac5-4cda-ab04-5542bd65e4ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.383832 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-config-data" (OuterVolumeSpecName: "config-data") pod "a5acb60a-9ac5-4cda-ab04-5542bd65e4ad" (UID: "a5acb60a-9ac5-4cda-ab04-5542bd65e4ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.443153 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgr9x\" (UniqueName: \"kubernetes.io/projected/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-kube-api-access-rgr9x\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.443194 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.443208 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.443220 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.754104 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jgrbz" event={"ID":"a5acb60a-9ac5-4cda-ab04-5542bd65e4ad","Type":"ContainerDied","Data":"bce619c48423b5a5919bfe44a9968ea4ae02a385c16056a3f726939f70618c5b"} Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.754406 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bce619c48423b5a5919bfe44a9968ea4ae02a385c16056a3f726939f70618c5b" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.754195 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jgrbz" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.863112 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7459f47788-bz25p"] Mar 10 11:16:51 crc kubenswrapper[4794]: E0310 11:16:51.863534 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5acb60a-9ac5-4cda-ab04-5542bd65e4ad" containerName="placement-db-sync" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.863554 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5acb60a-9ac5-4cda-ab04-5542bd65e4ad" containerName="placement-db-sync" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.863790 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5acb60a-9ac5-4cda-ab04-5542bd65e4ad" containerName="placement-db-sync" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.865008 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.866930 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.868764 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-stj8g" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.871721 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 11:16:51 crc kubenswrapper[4794]: I0310 11:16:51.914014 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7459f47788-bz25p"] Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.054205 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cebda460-fe0a-407c-bdd4-db77b6d6c14e-combined-ca-bundle\") pod \"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.054255 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cebda460-fe0a-407c-bdd4-db77b6d6c14e-logs\") pod \"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.054289 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cebda460-fe0a-407c-bdd4-db77b6d6c14e-scripts\") pod \"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.054347 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cebda460-fe0a-407c-bdd4-db77b6d6c14e-config-data\") pod \"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.054367 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2d45\" (UniqueName: \"kubernetes.io/projected/cebda460-fe0a-407c-bdd4-db77b6d6c14e-kube-api-access-j2d45\") pod 
\"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.155433 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cebda460-fe0a-407c-bdd4-db77b6d6c14e-combined-ca-bundle\") pod \"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.155496 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cebda460-fe0a-407c-bdd4-db77b6d6c14e-logs\") pod \"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.155536 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cebda460-fe0a-407c-bdd4-db77b6d6c14e-scripts\") pod \"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.155593 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cebda460-fe0a-407c-bdd4-db77b6d6c14e-config-data\") pod \"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.155618 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2d45\" (UniqueName: \"kubernetes.io/projected/cebda460-fe0a-407c-bdd4-db77b6d6c14e-kube-api-access-j2d45\") pod \"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.156276 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cebda460-fe0a-407c-bdd4-db77b6d6c14e-logs\") pod \"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.161017 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cebda460-fe0a-407c-bdd4-db77b6d6c14e-scripts\") pod \"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.163485 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cebda460-fe0a-407c-bdd4-db77b6d6c14e-combined-ca-bundle\") pod \"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.169567 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cebda460-fe0a-407c-bdd4-db77b6d6c14e-config-data\") pod \"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.186708 
4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2d45\" (UniqueName: \"kubernetes.io/projected/cebda460-fe0a-407c-bdd4-db77b6d6c14e-kube-api-access-j2d45\") pod \"placement-7459f47788-bz25p\" (UID: \"cebda460-fe0a-407c-bdd4-db77b6d6c14e\") " pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.218956 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.740582 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7459f47788-bz25p"] Mar 10 11:16:52 crc kubenswrapper[4794]: W0310 11:16:52.748770 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcebda460_fe0a_407c_bdd4_db77b6d6c14e.slice/crio-b9efa85e26d171b44fcfc6167e0fd1630ea09eee9122b990401aa77ca0364673 WatchSource:0}: Error finding container b9efa85e26d171b44fcfc6167e0fd1630ea09eee9122b990401aa77ca0364673: Status 404 returned error can't find the container with id b9efa85e26d171b44fcfc6167e0fd1630ea09eee9122b990401aa77ca0364673 Mar 10 11:16:52 crc kubenswrapper[4794]: I0310 11:16:52.789545 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7459f47788-bz25p" event={"ID":"cebda460-fe0a-407c-bdd4-db77b6d6c14e","Type":"ContainerStarted","Data":"b9efa85e26d171b44fcfc6167e0fd1630ea09eee9122b990401aa77ca0364673"} Mar 10 11:16:53 crc kubenswrapper[4794]: I0310 11:16:52.999970 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:16:53 crc kubenswrapper[4794]: E0310 11:16:53.000251 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:16:53 crc kubenswrapper[4794]: I0310 11:16:53.802155 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7459f47788-bz25p" event={"ID":"cebda460-fe0a-407c-bdd4-db77b6d6c14e","Type":"ContainerStarted","Data":"128ffa6f0a9f369e78931d67b8b3e322fdcae931a8c7ae984faa966f7da7afdb"} Mar 10 11:16:53 crc kubenswrapper[4794]: I0310 11:16:53.802535 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7459f47788-bz25p" event={"ID":"cebda460-fe0a-407c-bdd4-db77b6d6c14e","Type":"ContainerStarted","Data":"d9df888acaf40db2b4405f6a981a2b1cff1731cb009eeb8c6ad4281dec1ff07b"} Mar 10 11:16:53 crc kubenswrapper[4794]: I0310 11:16:53.802585 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:53 crc kubenswrapper[4794]: I0310 11:16:53.802607 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7459f47788-bz25p" Mar 10 11:16:53 crc kubenswrapper[4794]: I0310 11:16:53.834768 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7459f47788-bz25p" podStartSLOduration=2.834750487 podStartE2EDuration="2.834750487s" podCreationTimestamp="2026-03-10 11:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:16:53.820559525 +0000 UTC m=+5562.576730343" watchObservedRunningTime="2026-03-10 11:16:53.834750487 +0000 UTC m=+5562.590921305" Mar 10 11:16:57 crc kubenswrapper[4794]: I0310 11:16:57.237591 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:16:57 crc kubenswrapper[4794]: I0310 11:16:57.342063 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f7f8d69f-2s985"] Mar 10 11:16:57 crc kubenswrapper[4794]: I0310 11:16:57.343039 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" podUID="5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21" containerName="dnsmasq-dns" containerID="cri-o://df5b7c3de89a7de22db06361467e37303d68f8464626231d377eab6f107e907e" gracePeriod=10 Mar 10 11:16:57 crc kubenswrapper[4794]: I0310 11:16:57.871523 4794 generic.go:334] "Generic (PLEG): container finished" podID="5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21" containerID="df5b7c3de89a7de22db06361467e37303d68f8464626231d377eab6f107e907e" exitCode=0 Mar 10 11:16:57 crc kubenswrapper[4794]: I0310 11:16:57.871810 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" event={"ID":"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21","Type":"ContainerDied","Data":"df5b7c3de89a7de22db06361467e37303d68f8464626231d377eab6f107e907e"} Mar 10 11:16:57 crc kubenswrapper[4794]: I0310 11:16:57.871837 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" event={"ID":"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21","Type":"ContainerDied","Data":"300d455d31d716a76d6cf4dee067b83ccaf8dbac159077d4a66d1843f5398b45"} Mar 10 11:16:57 crc kubenswrapper[4794]: I0310 11:16:57.871864 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="300d455d31d716a76d6cf4dee067b83ccaf8dbac159077d4a66d1843f5398b45" Mar 10 11:16:57 crc kubenswrapper[4794]: I0310 11:16:57.888727 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.070815 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g7gm\" (UniqueName: \"kubernetes.io/projected/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-kube-api-access-2g7gm\") pod \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.070895 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-config\") pod \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.070952 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-ovsdbserver-sb\") pod \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.071803 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-ovsdbserver-nb\") pod \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.071904 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-dns-svc\") pod \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\" (UID: \"5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21\") " Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.077578 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-kube-api-access-2g7gm" (OuterVolumeSpecName: "kube-api-access-2g7gm") pod "5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21" (UID: "5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21"). InnerVolumeSpecName "kube-api-access-2g7gm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.112274 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21" (UID: "5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.121177 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21" (UID: "5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.125863 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-config" (OuterVolumeSpecName: "config") pod "5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21" (UID: "5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.127856 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21" (UID: "5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.173267 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.173293 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g7gm\" (UniqueName: \"kubernetes.io/projected/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-kube-api-access-2g7gm\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.173302 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.173310 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.173319 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.885832 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f7f8d69f-2s985" Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.950230 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f7f8d69f-2s985"] Mar 10 11:16:58 crc kubenswrapper[4794]: I0310 11:16:58.966873 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76f7f8d69f-2s985"] Mar 10 11:17:00 crc kubenswrapper[4794]: I0310 11:17:00.017065 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21" path="/var/lib/kubelet/pods/5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21/volumes" Mar 10 11:17:08 crc kubenswrapper[4794]: I0310 11:17:08.000029 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:17:08 crc kubenswrapper[4794]: E0310 11:17:08.001079 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:17:22 crc kubenswrapper[4794]: I0310 11:17:22.008898 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:17:22 crc kubenswrapper[4794]: E0310 11:17:22.009892 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:17:23 crc kubenswrapper[4794]: I0310 11:17:23.274571 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7459f47788-bz25p" Mar 10 11:17:23 crc kubenswrapper[4794]: I0310 11:17:23.275808 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7459f47788-bz25p" Mar 10 11:17:36 crc kubenswrapper[4794]: I0310 11:17:36.999247 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:17:37 crc kubenswrapper[4794]: E0310 11:17:36.999970 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:17:42 crc kubenswrapper[4794]: I0310 11:17:42.658054 4794 scope.go:117] "RemoveContainer" containerID="24cd0b134f71c7f37fdb9b7d6615d5c3345af2426be56aab801e6ad3e27ba750" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.479565 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xb889"] Mar 10 11:17:44 crc kubenswrapper[4794]: E0310 11:17:44.480815 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21" containerName="init" 
Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.484357 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21" containerName="init" Mar 10 11:17:44 crc kubenswrapper[4794]: E0310 11:17:44.484462 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21" containerName="dnsmasq-dns" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.484537 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21" containerName="dnsmasq-dns" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.484766 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ffdf5bc-7b4a-4714-8cbf-4e27c16a4f21" containerName="dnsmasq-dns" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.485450 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xb889" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.497096 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xb889"] Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.558293 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4nk4\" (UniqueName: \"kubernetes.io/projected/1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d-kube-api-access-t4nk4\") pod \"nova-api-db-create-xb889\" (UID: \"1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d\") " pod="openstack/nova-api-db-create-xb889" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.558381 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d-operator-scripts\") pod \"nova-api-db-create-xb889\" (UID: \"1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d\") " pod="openstack/nova-api-db-create-xb889" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.573579 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-k47gx"] Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.574573 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-k47gx" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.584342 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-k47gx"] Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.594904 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c39c-account-create-update-4gr8p"] Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.595883 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c39c-account-create-update-4gr8p" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.598141 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.612637 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c39c-account-create-update-4gr8p"] Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.661775 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d-operator-scripts\") pod \"nova-api-db-create-xb889\" (UID: \"1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d\") " pod="openstack/nova-api-db-create-xb889" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.662709 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4nk4\" (UniqueName: \"kubernetes.io/projected/1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d-kube-api-access-t4nk4\") pod \"nova-api-db-create-xb889\" (UID: \"1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d\") " pod="openstack/nova-api-db-create-xb889" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.663045 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d-operator-scripts\") pod \"nova-api-db-create-xb889\" (UID: \"1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d\") " pod="openstack/nova-api-db-create-xb889" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.674864 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-j9xxz"] Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.676152 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-j9xxz" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.697038 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-j9xxz"] Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.700567 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4nk4\" (UniqueName: \"kubernetes.io/projected/1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d-kube-api-access-t4nk4\") pod \"nova-api-db-create-xb889\" (UID: \"1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d\") " pod="openstack/nova-api-db-create-xb889" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.764217 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d7a332-73d9-4294-9cdd-5c9fde7561bc-operator-scripts\") pod \"nova-cell0-db-create-k47gx\" (UID: \"f8d7a332-73d9-4294-9cdd-5c9fde7561bc\") " pod="openstack/nova-cell0-db-create-k47gx" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.764261 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q4b7\" (UniqueName: \"kubernetes.io/projected/f8d7a332-73d9-4294-9cdd-5c9fde7561bc-kube-api-access-6q4b7\") pod \"nova-cell0-db-create-k47gx\" (UID: \"f8d7a332-73d9-4294-9cdd-5c9fde7561bc\") " pod="openstack/nova-cell0-db-create-k47gx" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.764314 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nprf5\" (UniqueName: \"kubernetes.io/projected/72480687-9623-4a72-9714-16a4ac7143f2-kube-api-access-nprf5\") pod \"nova-api-c39c-account-create-update-4gr8p\" (UID: \"72480687-9623-4a72-9714-16a4ac7143f2\") " pod="openstack/nova-api-c39c-account-create-update-4gr8p" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.764346 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72480687-9623-4a72-9714-16a4ac7143f2-operator-scripts\") pod \"nova-api-c39c-account-create-update-4gr8p\" (UID: \"72480687-9623-4a72-9714-16a4ac7143f2\") " pod="openstack/nova-api-c39c-account-create-update-4gr8p" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.780891 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7991-account-create-update-rhvh4"] Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.781851 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7991-account-create-update-rhvh4" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.784463 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.796266 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7991-account-create-update-rhvh4"] Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.800475 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xb889" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.865871 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55a3b4ff-cf31-4fa5-8482-17824a6b6d6f-operator-scripts\") pod \"nova-cell1-db-create-j9xxz\" (UID: \"55a3b4ff-cf31-4fa5-8482-17824a6b6d6f\") " pod="openstack/nova-cell1-db-create-j9xxz" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.867017 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4bjs\" (UniqueName: \"kubernetes.io/projected/55a3b4ff-cf31-4fa5-8482-17824a6b6d6f-kube-api-access-z4bjs\") pod \"nova-cell1-db-create-j9xxz\" (UID: \"55a3b4ff-cf31-4fa5-8482-17824a6b6d6f\") " pod="openstack/nova-cell1-db-create-j9xxz" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.867070 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d7a332-73d9-4294-9cdd-5c9fde7561bc-operator-scripts\") pod \"nova-cell0-db-create-k47gx\" (UID: \"f8d7a332-73d9-4294-9cdd-5c9fde7561bc\") " pod="openstack/nova-cell0-db-create-k47gx" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.867109 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q4b7\" (UniqueName: \"kubernetes.io/projected/f8d7a332-73d9-4294-9cdd-5c9fde7561bc-kube-api-access-6q4b7\") pod \"nova-cell0-db-create-k47gx\" (UID: \"f8d7a332-73d9-4294-9cdd-5c9fde7561bc\") " pod="openstack/nova-cell0-db-create-k47gx" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.867280 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nprf5\" (UniqueName: \"kubernetes.io/projected/72480687-9623-4a72-9714-16a4ac7143f2-kube-api-access-nprf5\") pod \"nova-api-c39c-account-create-update-4gr8p\" (UID: \"72480687-9623-4a72-9714-16a4ac7143f2\") " pod="openstack/nova-api-c39c-account-create-update-4gr8p" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.867371 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72480687-9623-4a72-9714-16a4ac7143f2-operator-scripts\") pod \"nova-api-c39c-account-create-update-4gr8p\" (UID: \"72480687-9623-4a72-9714-16a4ac7143f2\") " pod="openstack/nova-api-c39c-account-create-update-4gr8p" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.868646 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72480687-9623-4a72-9714-16a4ac7143f2-operator-scripts\") pod \"nova-api-c39c-account-create-update-4gr8p\" (UID: \"72480687-9623-4a72-9714-16a4ac7143f2\") " pod="openstack/nova-api-c39c-account-create-update-4gr8p" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.869518 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d7a332-73d9-4294-9cdd-5c9fde7561bc-operator-scripts\") pod \"nova-cell0-db-create-k47gx\" (UID: \"f8d7a332-73d9-4294-9cdd-5c9fde7561bc\") " pod="openstack/nova-cell0-db-create-k47gx" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.893775 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nprf5\" (UniqueName: 
\"kubernetes.io/projected/72480687-9623-4a72-9714-16a4ac7143f2-kube-api-access-nprf5\") pod \"nova-api-c39c-account-create-update-4gr8p\" (UID: \"72480687-9623-4a72-9714-16a4ac7143f2\") " pod="openstack/nova-api-c39c-account-create-update-4gr8p" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.898273 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q4b7\" (UniqueName: \"kubernetes.io/projected/f8d7a332-73d9-4294-9cdd-5c9fde7561bc-kube-api-access-6q4b7\") pod \"nova-cell0-db-create-k47gx\" (UID: \"f8d7a332-73d9-4294-9cdd-5c9fde7561bc\") " pod="openstack/nova-cell0-db-create-k47gx" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.912786 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c39c-account-create-update-4gr8p" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.969079 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d48c817c-bed9-4483-b4af-83b6cf201c8e-operator-scripts\") pod \"nova-cell0-7991-account-create-update-rhvh4\" (UID: \"d48c817c-bed9-4483-b4af-83b6cf201c8e\") " pod="openstack/nova-cell0-7991-account-create-update-rhvh4" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.969142 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55a3b4ff-cf31-4fa5-8482-17824a6b6d6f-operator-scripts\") pod \"nova-cell1-db-create-j9xxz\" (UID: \"55a3b4ff-cf31-4fa5-8482-17824a6b6d6f\") " pod="openstack/nova-cell1-db-create-j9xxz" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.969224 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4bjs\" (UniqueName: \"kubernetes.io/projected/55a3b4ff-cf31-4fa5-8482-17824a6b6d6f-kube-api-access-z4bjs\") pod \"nova-cell1-db-create-j9xxz\" (UID: \"55a3b4ff-cf31-4fa5-8482-17824a6b6d6f\") " pod="openstack/nova-cell1-db-create-j9xxz" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.969312 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6jbc\" (UniqueName: \"kubernetes.io/projected/d48c817c-bed9-4483-b4af-83b6cf201c8e-kube-api-access-k6jbc\") pod \"nova-cell0-7991-account-create-update-rhvh4\" (UID: \"d48c817c-bed9-4483-b4af-83b6cf201c8e\") " pod="openstack/nova-cell0-7991-account-create-update-rhvh4" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.990432 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-269d-account-create-update-ld4wn"] Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.991758 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-269d-account-create-update-ld4wn" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.994270 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.997043 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55a3b4ff-cf31-4fa5-8482-17824a6b6d6f-operator-scripts\") pod \"nova-cell1-db-create-j9xxz\" (UID: \"55a3b4ff-cf31-4fa5-8482-17824a6b6d6f\") " pod="openstack/nova-cell1-db-create-j9xxz" Mar 10 11:17:44 crc kubenswrapper[4794]: I0310 11:17:44.997990 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4bjs\" (UniqueName: \"kubernetes.io/projected/55a3b4ff-cf31-4fa5-8482-17824a6b6d6f-kube-api-access-z4bjs\") pod \"nova-cell1-db-create-j9xxz\" (UID: \"55a3b4ff-cf31-4fa5-8482-17824a6b6d6f\") " pod="openstack/nova-cell1-db-create-j9xxz" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.003522 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-269d-account-create-update-ld4wn"] Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.052530 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lmrpm"] Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.054681 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.064588 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmrpm"] Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.070259 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6jbc\" (UniqueName: \"kubernetes.io/projected/d48c817c-bed9-4483-b4af-83b6cf201c8e-kube-api-access-k6jbc\") pod \"nova-cell0-7991-account-create-update-rhvh4\" (UID: \"d48c817c-bed9-4483-b4af-83b6cf201c8e\") " pod="openstack/nova-cell0-7991-account-create-update-rhvh4" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.070324 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d48c817c-bed9-4483-b4af-83b6cf201c8e-operator-scripts\") pod \"nova-cell0-7991-account-create-update-rhvh4\" (UID: \"d48c817c-bed9-4483-b4af-83b6cf201c8e\") " pod="openstack/nova-cell0-7991-account-create-update-rhvh4" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.071019 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d48c817c-bed9-4483-b4af-83b6cf201c8e-operator-scripts\") pod \"nova-cell0-7991-account-create-update-rhvh4\" (UID: \"d48c817c-bed9-4483-b4af-83b6cf201c8e\") " pod="openstack/nova-cell0-7991-account-create-update-rhvh4" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.086430 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6jbc\" (UniqueName: \"kubernetes.io/projected/d48c817c-bed9-4483-b4af-83b6cf201c8e-kube-api-access-k6jbc\") pod \"nova-cell0-7991-account-create-update-rhvh4\" (UID: \"d48c817c-bed9-4483-b4af-83b6cf201c8e\") " pod="openstack/nova-cell0-7991-account-create-update-rhvh4" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.096796 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7991-account-create-update-rhvh4" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.171471 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d995cba-d0f5-433c-bb91-a7b20e3a055d-operator-scripts\") pod \"nova-cell1-269d-account-create-update-ld4wn\" (UID: \"3d995cba-d0f5-433c-bb91-a7b20e3a055d\") " pod="openstack/nova-cell1-269d-account-create-update-ld4wn" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.171534 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4540a698-7f45-4e9f-b38e-11102a1ee435-utilities\") pod \"redhat-operators-lmrpm\" (UID: \"4540a698-7f45-4e9f-b38e-11102a1ee435\") " pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.171589 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r88w\" (UniqueName: \"kubernetes.io/projected/4540a698-7f45-4e9f-b38e-11102a1ee435-kube-api-access-9r88w\") pod \"redhat-operators-lmrpm\" (UID: \"4540a698-7f45-4e9f-b38e-11102a1ee435\") " pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.171618 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26hsw\" (UniqueName: \"kubernetes.io/projected/3d995cba-d0f5-433c-bb91-a7b20e3a055d-kube-api-access-26hsw\") pod \"nova-cell1-269d-account-create-update-ld4wn\" (UID: \"3d995cba-d0f5-433c-bb91-a7b20e3a055d\") " pod="openstack/nova-cell1-269d-account-create-update-ld4wn" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.171645 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4540a698-7f45-4e9f-b38e-11102a1ee435-catalog-content\") pod \"redhat-operators-lmrpm\" (UID: \"4540a698-7f45-4e9f-b38e-11102a1ee435\") " pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.192164 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-k47gx" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.262137 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xb889"] Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.275091 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4540a698-7f45-4e9f-b38e-11102a1ee435-utilities\") pod \"redhat-operators-lmrpm\" (UID: \"4540a698-7f45-4e9f-b38e-11102a1ee435\") " pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.275161 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r88w\" (UniqueName: \"kubernetes.io/projected/4540a698-7f45-4e9f-b38e-11102a1ee435-kube-api-access-9r88w\") pod \"redhat-operators-lmrpm\" (UID: \"4540a698-7f45-4e9f-b38e-11102a1ee435\") " pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.275200 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26hsw\" (UniqueName: \"kubernetes.io/projected/3d995cba-d0f5-433c-bb91-a7b20e3a055d-kube-api-access-26hsw\") pod \"nova-cell1-269d-account-create-update-ld4wn\" (UID: \"3d995cba-d0f5-433c-bb91-a7b20e3a055d\") " pod="openstack/nova-cell1-269d-account-create-update-ld4wn" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.275231 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4540a698-7f45-4e9f-b38e-11102a1ee435-catalog-content\") pod \"redhat-operators-lmrpm\" (UID: \"4540a698-7f45-4e9f-b38e-11102a1ee435\") " pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.275295 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d995cba-d0f5-433c-bb91-a7b20e3a055d-operator-scripts\") pod \"nova-cell1-269d-account-create-update-ld4wn\" (UID: \"3d995cba-d0f5-433c-bb91-a7b20e3a055d\") " pod="openstack/nova-cell1-269d-account-create-update-ld4wn" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.275949 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4540a698-7f45-4e9f-b38e-11102a1ee435-utilities\") pod \"redhat-operators-lmrpm\" (UID: \"4540a698-7f45-4e9f-b38e-11102a1ee435\") " pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.276213 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4540a698-7f45-4e9f-b38e-11102a1ee435-catalog-content\") pod \"redhat-operators-lmrpm\" (UID: \"4540a698-7f45-4e9f-b38e-11102a1ee435\") " pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.276396 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d995cba-d0f5-433c-bb91-a7b20e3a055d-operator-scripts\") pod \"nova-cell1-269d-account-create-update-ld4wn\" (UID: \"3d995cba-d0f5-433c-bb91-a7b20e3a055d\") " pod="openstack/nova-cell1-269d-account-create-update-ld4wn" Mar 10 11:17:45 crc kubenswrapper[4794]: W0310 11:17:45.280922 4794 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e45f8c3_9a9c_4446_9d3b_0a3ca2a3959d.slice/crio-3dc54ab2ddc701d30a3ef1ec9a5cc10b7576b34df07b64b439d7ad09dcea990d WatchSource:0}: Error finding container 3dc54ab2ddc701d30a3ef1ec9a5cc10b7576b34df07b64b439d7ad09dcea990d: Status 404 returned error can't find the container with id 3dc54ab2ddc701d30a3ef1ec9a5cc10b7576b34df07b64b439d7ad09dcea990d Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.297926 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-j9xxz" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.304845 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r88w\" (UniqueName: \"kubernetes.io/projected/4540a698-7f45-4e9f-b38e-11102a1ee435-kube-api-access-9r88w\") pod \"redhat-operators-lmrpm\" (UID: \"4540a698-7f45-4e9f-b38e-11102a1ee435\") " pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.313087 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26hsw\" (UniqueName: \"kubernetes.io/projected/3d995cba-d0f5-433c-bb91-a7b20e3a055d-kube-api-access-26hsw\") pod \"nova-cell1-269d-account-create-update-ld4wn\" (UID: \"3d995cba-d0f5-433c-bb91-a7b20e3a055d\") " pod="openstack/nova-cell1-269d-account-create-update-ld4wn" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.314382 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-269d-account-create-update-ld4wn" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.381990 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.383525 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xb889" event={"ID":"1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d","Type":"ContainerStarted","Data":"3dc54ab2ddc701d30a3ef1ec9a5cc10b7576b34df07b64b439d7ad09dcea990d"} Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.542079 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c39c-account-create-update-4gr8p"] Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.639282 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7prv7"] Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.641177 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.663779 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7prv7"] Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.678037 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7991-account-create-update-rhvh4"] Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.686323 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec62e66-7b8d-4386-84cc-cde2d34e6282-utilities\") pod \"redhat-marketplace-7prv7\" (UID: \"dec62e66-7b8d-4386-84cc-cde2d34e6282\") " pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.686442 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgkw8\" (UniqueName: \"kubernetes.io/projected/dec62e66-7b8d-4386-84cc-cde2d34e6282-kube-api-access-xgkw8\") pod \"redhat-marketplace-7prv7\" (UID: \"dec62e66-7b8d-4386-84cc-cde2d34e6282\") " pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.686505 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec62e66-7b8d-4386-84cc-cde2d34e6282-catalog-content\") pod \"redhat-marketplace-7prv7\" (UID: \"dec62e66-7b8d-4386-84cc-cde2d34e6282\") " pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.789054 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgkw8\" (UniqueName: \"kubernetes.io/projected/dec62e66-7b8d-4386-84cc-cde2d34e6282-kube-api-access-xgkw8\") pod \"redhat-marketplace-7prv7\" (UID: \"dec62e66-7b8d-4386-84cc-cde2d34e6282\") " pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.789149 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec62e66-7b8d-4386-84cc-cde2d34e6282-catalog-content\") pod \"redhat-marketplace-7prv7\" (UID: \"dec62e66-7b8d-4386-84cc-cde2d34e6282\") " pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.789183 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec62e66-7b8d-4386-84cc-cde2d34e6282-utilities\") pod \"redhat-marketplace-7prv7\" (UID: \"dec62e66-7b8d-4386-84cc-cde2d34e6282\") " pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.789611 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec62e66-7b8d-4386-84cc-cde2d34e6282-utilities\") pod \"redhat-marketplace-7prv7\" (UID: \"dec62e66-7b8d-4386-84cc-cde2d34e6282\") " pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.790113 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec62e66-7b8d-4386-84cc-cde2d34e6282-catalog-content\") pod \"redhat-marketplace-7prv7\" (UID: 
\"dec62e66-7b8d-4386-84cc-cde2d34e6282\") " pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.795207 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-k47gx"] Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.833401 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgkw8\" (UniqueName: \"kubernetes.io/projected/dec62e66-7b8d-4386-84cc-cde2d34e6282-kube-api-access-xgkw8\") pod \"redhat-marketplace-7prv7\" (UID: \"dec62e66-7b8d-4386-84cc-cde2d34e6282\") " pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.913769 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-j9xxz"] Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.942221 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmrpm"] Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.963554 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-269d-account-create-update-ld4wn"] Mar 10 11:17:45 crc kubenswrapper[4794]: I0310 11:17:45.980866 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.392018 4794 generic.go:334] "Generic (PLEG): container finished" podID="72480687-9623-4a72-9714-16a4ac7143f2" containerID="6b09461dda48becbbd59e698f79d7c35bcad430638877567314883f2a3979d99" exitCode=0 Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.392099 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c39c-account-create-update-4gr8p" event={"ID":"72480687-9623-4a72-9714-16a4ac7143f2","Type":"ContainerDied","Data":"6b09461dda48becbbd59e698f79d7c35bcad430638877567314883f2a3979d99"} Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.392375 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c39c-account-create-update-4gr8p" event={"ID":"72480687-9623-4a72-9714-16a4ac7143f2","Type":"ContainerStarted","Data":"4111c99a3227db2187f7c21c171d92c6cf6bf967c8494d28a1c96fded2029216"} Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.394231 4794 generic.go:334] "Generic (PLEG): container finished" podID="1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d" containerID="4975fc730fbc2ec9e2e46c6ac387b74769008e53ba5af3bdf14a35383647b75e" exitCode=0 Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.394282 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xb889" event={"ID":"1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d","Type":"ContainerDied","Data":"4975fc730fbc2ec9e2e46c6ac387b74769008e53ba5af3bdf14a35383647b75e"} Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.396946 4794 generic.go:334] "Generic (PLEG): container finished" podID="4540a698-7f45-4e9f-b38e-11102a1ee435" containerID="7a5f364dca4b20e76e6d8025af628cf70e88f3276bded150c0b61e0159ccb561" exitCode=0 Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.396999 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmrpm" event={"ID":"4540a698-7f45-4e9f-b38e-11102a1ee435","Type":"ContainerDied","Data":"7a5f364dca4b20e76e6d8025af628cf70e88f3276bded150c0b61e0159ccb561"} Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.397019 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-lmrpm" event={"ID":"4540a698-7f45-4e9f-b38e-11102a1ee435","Type":"ContainerStarted","Data":"b24d588d1786bde53923101d1e378ecca63ffd15056f52efb4220fc6baaab64f"} Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.400787 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-269d-account-create-update-ld4wn" event={"ID":"3d995cba-d0f5-433c-bb91-a7b20e3a055d","Type":"ContainerStarted","Data":"9adfab5f90e9c45df62adb600b920a1f53fc2bf231d031997e441a12504b5b8b"} Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.400815 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-269d-account-create-update-ld4wn" event={"ID":"3d995cba-d0f5-433c-bb91-a7b20e3a055d","Type":"ContainerStarted","Data":"32ccccea62573298362465e4ef52379d50ba2d39cee57c317beee82e329f7de8"} Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.402799 4794 generic.go:334] "Generic (PLEG): container finished" podID="d48c817c-bed9-4483-b4af-83b6cf201c8e" containerID="3d4ca50092d0be9e95a29c530920908827707f1a7bef1a1178f00ec55ae70682" exitCode=0 Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.402876 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7991-account-create-update-rhvh4" event={"ID":"d48c817c-bed9-4483-b4af-83b6cf201c8e","Type":"ContainerDied","Data":"3d4ca50092d0be9e95a29c530920908827707f1a7bef1a1178f00ec55ae70682"} Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.402913 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7991-account-create-update-rhvh4" event={"ID":"d48c817c-bed9-4483-b4af-83b6cf201c8e","Type":"ContainerStarted","Data":"ab5464ab8e62621cb7ddff7c2c733484a2a599d79b581fb4fc7ae86d977cb140"} Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.405538 4794 generic.go:334] "Generic (PLEG): container finished" podID="f8d7a332-73d9-4294-9cdd-5c9fde7561bc" containerID="5075d75ab9e59f54be1a73af28ebc58af063daad4d65fb8e4ffe7ca6622a936c" exitCode=0 Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.405580 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k47gx" event={"ID":"f8d7a332-73d9-4294-9cdd-5c9fde7561bc","Type":"ContainerDied","Data":"5075d75ab9e59f54be1a73af28ebc58af063daad4d65fb8e4ffe7ca6622a936c"} Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.405639 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k47gx" event={"ID":"f8d7a332-73d9-4294-9cdd-5c9fde7561bc","Type":"ContainerStarted","Data":"f4bffc9f7ddfecdcab79d911fe5e3e3755d75fca58a9b9499b7ab80c97a03992"} Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.407928 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j9xxz" event={"ID":"55a3b4ff-cf31-4fa5-8482-17824a6b6d6f","Type":"ContainerStarted","Data":"d3d17c41a6c2d8c7d790eacd943449ab5c1b8ed1ad57dec4177444f257a59c81"} Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.407962 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j9xxz" event={"ID":"55a3b4ff-cf31-4fa5-8482-17824a6b6d6f","Type":"ContainerStarted","Data":"6ec5cae5b9a440c643246d088b3abedb766832d4777829e7247ba620efc444c6"} Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.448994 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-269d-account-create-update-ld4wn" podStartSLOduration=2.4489793349999998 podStartE2EDuration="2.448979335s" 
podCreationTimestamp="2026-03-10 11:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:17:46.439930064 +0000 UTC m=+5615.196100882" watchObservedRunningTime="2026-03-10 11:17:46.448979335 +0000 UTC m=+5615.205150153" Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.474299 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-j9xxz" podStartSLOduration=2.474284302 podStartE2EDuration="2.474284302s" podCreationTimestamp="2026-03-10 11:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:17:46.471802495 +0000 UTC m=+5615.227973323" watchObservedRunningTime="2026-03-10 11:17:46.474284302 +0000 UTC m=+5615.230455120" Mar 10 11:17:46 crc kubenswrapper[4794]: I0310 11:17:46.526080 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7prv7"] Mar 10 11:17:46 crc kubenswrapper[4794]: W0310 11:17:46.527992 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddec62e66_7b8d_4386_84cc_cde2d34e6282.slice/crio-55657f14ee4a54a736556aeaed0b3397e173f34c1b3b44696e40b2f8e6737266 WatchSource:0}: Error finding container 55657f14ee4a54a736556aeaed0b3397e173f34c1b3b44696e40b2f8e6737266: Status 404 returned error can't find the container with id 55657f14ee4a54a736556aeaed0b3397e173f34c1b3b44696e40b2f8e6737266 Mar 10 11:17:47 crc kubenswrapper[4794]: I0310 11:17:47.425066 4794 generic.go:334] "Generic (PLEG): container finished" podID="55a3b4ff-cf31-4fa5-8482-17824a6b6d6f" containerID="d3d17c41a6c2d8c7d790eacd943449ab5c1b8ed1ad57dec4177444f257a59c81" exitCode=0 Mar 10 11:17:47 crc kubenswrapper[4794]: I0310 11:17:47.425133 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j9xxz" event={"ID":"55a3b4ff-cf31-4fa5-8482-17824a6b6d6f","Type":"ContainerDied","Data":"d3d17c41a6c2d8c7d790eacd943449ab5c1b8ed1ad57dec4177444f257a59c81"} Mar 10 11:17:47 crc kubenswrapper[4794]: I0310 11:17:47.428832 4794 generic.go:334] "Generic (PLEG): container finished" podID="3d995cba-d0f5-433c-bb91-a7b20e3a055d" containerID="9adfab5f90e9c45df62adb600b920a1f53fc2bf231d031997e441a12504b5b8b" exitCode=0 Mar 10 11:17:47 crc kubenswrapper[4794]: I0310 11:17:47.428925 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-269d-account-create-update-ld4wn" event={"ID":"3d995cba-d0f5-433c-bb91-a7b20e3a055d","Type":"ContainerDied","Data":"9adfab5f90e9c45df62adb600b920a1f53fc2bf231d031997e441a12504b5b8b"} Mar 10 11:17:47 crc kubenswrapper[4794]: I0310 11:17:47.431357 4794 generic.go:334] "Generic (PLEG): container finished" podID="dec62e66-7b8d-4386-84cc-cde2d34e6282" containerID="13cbda9f321a2918115746f5d418d4a20aeed5c03f9776dd66c2ed42240f3992" exitCode=0 Mar 10 11:17:47 crc kubenswrapper[4794]: I0310 11:17:47.431405 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7prv7" event={"ID":"dec62e66-7b8d-4386-84cc-cde2d34e6282","Type":"ContainerDied","Data":"13cbda9f321a2918115746f5d418d4a20aeed5c03f9776dd66c2ed42240f3992"} Mar 10 11:17:47 crc kubenswrapper[4794]: I0310 11:17:47.431470 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7prv7" 
event={"ID":"dec62e66-7b8d-4386-84cc-cde2d34e6282","Type":"ContainerStarted","Data":"55657f14ee4a54a736556aeaed0b3397e173f34c1b3b44696e40b2f8e6737266"} Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.014120 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xb889" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.022871 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7991-account-create-update-rhvh4" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.034502 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c39c-account-create-update-4gr8p" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.060068 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4nk4\" (UniqueName: \"kubernetes.io/projected/1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d-kube-api-access-t4nk4\") pod \"1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d\" (UID: \"1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d\") " Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.060112 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d48c817c-bed9-4483-b4af-83b6cf201c8e-operator-scripts\") pod \"d48c817c-bed9-4483-b4af-83b6cf201c8e\" (UID: \"d48c817c-bed9-4483-b4af-83b6cf201c8e\") " Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.060245 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72480687-9623-4a72-9714-16a4ac7143f2-operator-scripts\") pod \"72480687-9623-4a72-9714-16a4ac7143f2\" (UID: \"72480687-9623-4a72-9714-16a4ac7143f2\") " Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.060284 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d-operator-scripts\") pod \"1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d\" (UID: \"1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d\") " Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.060343 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6jbc\" (UniqueName: \"kubernetes.io/projected/d48c817c-bed9-4483-b4af-83b6cf201c8e-kube-api-access-k6jbc\") pod \"d48c817c-bed9-4483-b4af-83b6cf201c8e\" (UID: \"d48c817c-bed9-4483-b4af-83b6cf201c8e\") " Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.060418 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nprf5\" (UniqueName: \"kubernetes.io/projected/72480687-9623-4a72-9714-16a4ac7143f2-kube-api-access-nprf5\") pod \"72480687-9623-4a72-9714-16a4ac7143f2\" (UID: \"72480687-9623-4a72-9714-16a4ac7143f2\") " Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.075060 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72480687-9623-4a72-9714-16a4ac7143f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72480687-9623-4a72-9714-16a4ac7143f2" (UID: "72480687-9623-4a72-9714-16a4ac7143f2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.075095 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48c817c-bed9-4483-b4af-83b6cf201c8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d48c817c-bed9-4483-b4af-83b6cf201c8e" (UID: "d48c817c-bed9-4483-b4af-83b6cf201c8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.075514 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d" (UID: "1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.079547 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72480687-9623-4a72-9714-16a4ac7143f2-kube-api-access-nprf5" (OuterVolumeSpecName: "kube-api-access-nprf5") pod "72480687-9623-4a72-9714-16a4ac7143f2" (UID: "72480687-9623-4a72-9714-16a4ac7143f2"). InnerVolumeSpecName "kube-api-access-nprf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.081806 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d48c817c-bed9-4483-b4af-83b6cf201c8e-kube-api-access-k6jbc" (OuterVolumeSpecName: "kube-api-access-k6jbc") pod "d48c817c-bed9-4483-b4af-83b6cf201c8e" (UID: "d48c817c-bed9-4483-b4af-83b6cf201c8e"). InnerVolumeSpecName "kube-api-access-k6jbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.082033 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d-kube-api-access-t4nk4" (OuterVolumeSpecName: "kube-api-access-t4nk4") pod "1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d" (UID: "1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d"). InnerVolumeSpecName "kube-api-access-t4nk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.133969 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-k47gx" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.161628 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q4b7\" (UniqueName: \"kubernetes.io/projected/f8d7a332-73d9-4294-9cdd-5c9fde7561bc-kube-api-access-6q4b7\") pod \"f8d7a332-73d9-4294-9cdd-5c9fde7561bc\" (UID: \"f8d7a332-73d9-4294-9cdd-5c9fde7561bc\") " Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.161703 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d7a332-73d9-4294-9cdd-5c9fde7561bc-operator-scripts\") pod \"f8d7a332-73d9-4294-9cdd-5c9fde7561bc\" (UID: \"f8d7a332-73d9-4294-9cdd-5c9fde7561bc\") " Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.162048 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4nk4\" (UniqueName: \"kubernetes.io/projected/1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d-kube-api-access-t4nk4\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.162066 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d48c817c-bed9-4483-b4af-83b6cf201c8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.162075 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72480687-9623-4a72-9714-16a4ac7143f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.162084 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.162093 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6jbc\" (UniqueName: \"kubernetes.io/projected/d48c817c-bed9-4483-b4af-83b6cf201c8e-kube-api-access-k6jbc\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.162103 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nprf5\" (UniqueName: \"kubernetes.io/projected/72480687-9623-4a72-9714-16a4ac7143f2-kube-api-access-nprf5\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.162498 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d7a332-73d9-4294-9cdd-5c9fde7561bc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8d7a332-73d9-4294-9cdd-5c9fde7561bc" (UID: "f8d7a332-73d9-4294-9cdd-5c9fde7561bc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.165843 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8d7a332-73d9-4294-9cdd-5c9fde7561bc-kube-api-access-6q4b7" (OuterVolumeSpecName: "kube-api-access-6q4b7") pod "f8d7a332-73d9-4294-9cdd-5c9fde7561bc" (UID: "f8d7a332-73d9-4294-9cdd-5c9fde7561bc"). InnerVolumeSpecName "kube-api-access-6q4b7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.263664 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q4b7\" (UniqueName: \"kubernetes.io/projected/f8d7a332-73d9-4294-9cdd-5c9fde7561bc-kube-api-access-6q4b7\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.263695 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d7a332-73d9-4294-9cdd-5c9fde7561bc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.448037 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmrpm" event={"ID":"4540a698-7f45-4e9f-b38e-11102a1ee435","Type":"ContainerStarted","Data":"4e38abd9e72adaecae1d730b4891373103d29029a4ecd0dd7be468adc6328eca"} Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.455495 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7991-account-create-update-rhvh4" event={"ID":"d48c817c-bed9-4483-b4af-83b6cf201c8e","Type":"ContainerDied","Data":"ab5464ab8e62621cb7ddff7c2c733484a2a599d79b581fb4fc7ae86d977cb140"} Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.455620 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab5464ab8e62621cb7ddff7c2c733484a2a599d79b581fb4fc7ae86d977cb140" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.455620 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7991-account-create-update-rhvh4" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.457252 4794 generic.go:334] "Generic (PLEG): container finished" podID="dec62e66-7b8d-4386-84cc-cde2d34e6282" containerID="3d6527ddf385801ec1ef99ddfee19e887628b8bf9f1585d2be9e70d7924a9ed0" exitCode=0 Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.457305 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7prv7" event={"ID":"dec62e66-7b8d-4386-84cc-cde2d34e6282","Type":"ContainerDied","Data":"3d6527ddf385801ec1ef99ddfee19e887628b8bf9f1585d2be9e70d7924a9ed0"} Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.468312 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k47gx" event={"ID":"f8d7a332-73d9-4294-9cdd-5c9fde7561bc","Type":"ContainerDied","Data":"f4bffc9f7ddfecdcab79d911fe5e3e3755d75fca58a9b9499b7ab80c97a03992"} Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.468387 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4bffc9f7ddfecdcab79d911fe5e3e3755d75fca58a9b9499b7ab80c97a03992" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.468388 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-k47gx" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.474998 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c39c-account-create-update-4gr8p" event={"ID":"72480687-9623-4a72-9714-16a4ac7143f2","Type":"ContainerDied","Data":"4111c99a3227db2187f7c21c171d92c6cf6bf967c8494d28a1c96fded2029216"} Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.475031 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4111c99a3227db2187f7c21c171d92c6cf6bf967c8494d28a1c96fded2029216" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.475102 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c39c-account-create-update-4gr8p" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.482695 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xb889" Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.482926 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xb889" event={"ID":"1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d","Type":"ContainerDied","Data":"3dc54ab2ddc701d30a3ef1ec9a5cc10b7576b34df07b64b439d7ad09dcea990d"} Mar 10 11:17:48 crc kubenswrapper[4794]: I0310 11:17:48.482988 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dc54ab2ddc701d30a3ef1ec9a5cc10b7576b34df07b64b439d7ad09dcea990d" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.035575 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-269d-account-create-update-ld4wn" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.042140 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-j9xxz" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.078907 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26hsw\" (UniqueName: \"kubernetes.io/projected/3d995cba-d0f5-433c-bb91-a7b20e3a055d-kube-api-access-26hsw\") pod \"3d995cba-d0f5-433c-bb91-a7b20e3a055d\" (UID: \"3d995cba-d0f5-433c-bb91-a7b20e3a055d\") " Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.078970 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55a3b4ff-cf31-4fa5-8482-17824a6b6d6f-operator-scripts\") pod \"55a3b4ff-cf31-4fa5-8482-17824a6b6d6f\" (UID: \"55a3b4ff-cf31-4fa5-8482-17824a6b6d6f\") " Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.079054 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4bjs\" (UniqueName: \"kubernetes.io/projected/55a3b4ff-cf31-4fa5-8482-17824a6b6d6f-kube-api-access-z4bjs\") pod \"55a3b4ff-cf31-4fa5-8482-17824a6b6d6f\" (UID: \"55a3b4ff-cf31-4fa5-8482-17824a6b6d6f\") " Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.079077 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d995cba-d0f5-433c-bb91-a7b20e3a055d-operator-scripts\") pod \"3d995cba-d0f5-433c-bb91-a7b20e3a055d\" (UID: \"3d995cba-d0f5-433c-bb91-a7b20e3a055d\") " Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.081860 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d995cba-d0f5-433c-bb91-a7b20e3a055d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d995cba-d0f5-433c-bb91-a7b20e3a055d" (UID: "3d995cba-d0f5-433c-bb91-a7b20e3a055d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.082253 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55a3b4ff-cf31-4fa5-8482-17824a6b6d6f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55a3b4ff-cf31-4fa5-8482-17824a6b6d6f" (UID: "55a3b4ff-cf31-4fa5-8482-17824a6b6d6f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.086324 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d995cba-d0f5-433c-bb91-a7b20e3a055d-kube-api-access-26hsw" (OuterVolumeSpecName: "kube-api-access-26hsw") pod "3d995cba-d0f5-433c-bb91-a7b20e3a055d" (UID: "3d995cba-d0f5-433c-bb91-a7b20e3a055d"). InnerVolumeSpecName "kube-api-access-26hsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.087641 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a3b4ff-cf31-4fa5-8482-17824a6b6d6f-kube-api-access-z4bjs" (OuterVolumeSpecName: "kube-api-access-z4bjs") pod "55a3b4ff-cf31-4fa5-8482-17824a6b6d6f" (UID: "55a3b4ff-cf31-4fa5-8482-17824a6b6d6f"). InnerVolumeSpecName "kube-api-access-z4bjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.184311 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26hsw\" (UniqueName: \"kubernetes.io/projected/3d995cba-d0f5-433c-bb91-a7b20e3a055d-kube-api-access-26hsw\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.184366 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55a3b4ff-cf31-4fa5-8482-17824a6b6d6f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.184381 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4bjs\" (UniqueName: \"kubernetes.io/projected/55a3b4ff-cf31-4fa5-8482-17824a6b6d6f-kube-api-access-z4bjs\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.184393 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d995cba-d0f5-433c-bb91-a7b20e3a055d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.502723 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j9xxz" event={"ID":"55a3b4ff-cf31-4fa5-8482-17824a6b6d6f","Type":"ContainerDied","Data":"6ec5cae5b9a440c643246d088b3abedb766832d4777829e7247ba620efc444c6"} Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.502979 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ec5cae5b9a440c643246d088b3abedb766832d4777829e7247ba620efc444c6" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.502755 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-j9xxz" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.504868 4794 generic.go:334] "Generic (PLEG): container finished" podID="4540a698-7f45-4e9f-b38e-11102a1ee435" containerID="4e38abd9e72adaecae1d730b4891373103d29029a4ecd0dd7be468adc6328eca" exitCode=0 Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.504904 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmrpm" event={"ID":"4540a698-7f45-4e9f-b38e-11102a1ee435","Type":"ContainerDied","Data":"4e38abd9e72adaecae1d730b4891373103d29029a4ecd0dd7be468adc6328eca"} Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.507355 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-269d-account-create-update-ld4wn" event={"ID":"3d995cba-d0f5-433c-bb91-a7b20e3a055d","Type":"ContainerDied","Data":"32ccccea62573298362465e4ef52379d50ba2d39cee57c317beee82e329f7de8"} Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.507384 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32ccccea62573298362465e4ef52379d50ba2d39cee57c317beee82e329f7de8" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.507431 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-269d-account-create-update-ld4wn" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.511528 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7prv7" event={"ID":"dec62e66-7b8d-4386-84cc-cde2d34e6282","Type":"ContainerStarted","Data":"c5c0082c49665d7e8131bdb5552996b501b20f084295eb9db1250624056dd091"} Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.545903 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7prv7" podStartSLOduration=3.068098858 podStartE2EDuration="4.545888694s" podCreationTimestamp="2026-03-10 11:17:45 +0000 UTC" firstStartedPulling="2026-03-10 11:17:47.433902814 +0000 UTC m=+5616.190073672" lastFinishedPulling="2026-03-10 11:17:48.91169269 +0000 UTC m=+5617.667863508" observedRunningTime="2026-03-10 11:17:49.540853677 +0000 UTC m=+5618.297024495" watchObservedRunningTime="2026-03-10 11:17:49.545888694 +0000 UTC m=+5618.302059502" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.980308 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-d82lf"] Mar 10 11:17:49 crc kubenswrapper[4794]: E0310 11:17:49.980639 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d" containerName="mariadb-database-create" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.980654 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d" containerName="mariadb-database-create" Mar 10 11:17:49 crc kubenswrapper[4794]: E0310 11:17:49.980672 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72480687-9623-4a72-9714-16a4ac7143f2" containerName="mariadb-account-create-update" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.980680 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="72480687-9623-4a72-9714-16a4ac7143f2" containerName="mariadb-account-create-update" Mar 10 11:17:49 crc kubenswrapper[4794]: E0310 11:17:49.980692 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a3b4ff-cf31-4fa5-8482-17824a6b6d6f" containerName="mariadb-database-create" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.980699 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a3b4ff-cf31-4fa5-8482-17824a6b6d6f" containerName="mariadb-database-create" Mar 10 11:17:49 crc kubenswrapper[4794]: E0310 11:17:49.980710 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d7a332-73d9-4294-9cdd-5c9fde7561bc" containerName="mariadb-database-create" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.980716 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d7a332-73d9-4294-9cdd-5c9fde7561bc" containerName="mariadb-database-create" Mar 10 11:17:49 crc kubenswrapper[4794]: E0310 11:17:49.980734 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d995cba-d0f5-433c-bb91-a7b20e3a055d" containerName="mariadb-account-create-update" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.980741 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d995cba-d0f5-433c-bb91-a7b20e3a055d" containerName="mariadb-account-create-update" Mar 10 11:17:49 crc kubenswrapper[4794]: E0310 11:17:49.980758 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48c817c-bed9-4483-b4af-83b6cf201c8e" containerName="mariadb-account-create-update" Mar 10 11:17:49 crc 
kubenswrapper[4794]: I0310 11:17:49.980764 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48c817c-bed9-4483-b4af-83b6cf201c8e" containerName="mariadb-account-create-update" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.980900 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a3b4ff-cf31-4fa5-8482-17824a6b6d6f" containerName="mariadb-database-create" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.980907 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d" containerName="mariadb-database-create" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.980920 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d995cba-d0f5-433c-bb91-a7b20e3a055d" containerName="mariadb-account-create-update" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.980933 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d48c817c-bed9-4483-b4af-83b6cf201c8e" containerName="mariadb-account-create-update" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.980944 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d7a332-73d9-4294-9cdd-5c9fde7561bc" containerName="mariadb-database-create" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.980956 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="72480687-9623-4a72-9714-16a4ac7143f2" containerName="mariadb-account-create-update" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.981509 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.984617 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.984871 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-lq26n" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.985051 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 10 11:17:49 crc kubenswrapper[4794]: I0310 11:17:49.992693 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-d82lf"] Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.100225 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-config-data\") pod \"nova-cell0-conductor-db-sync-d82lf\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.101117 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-d82lf\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.101324 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-scripts\") pod \"nova-cell0-conductor-db-sync-d82lf\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " 
pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.102885 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t57d9\" (UniqueName: \"kubernetes.io/projected/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-kube-api-access-t57d9\") pod \"nova-cell0-conductor-db-sync-d82lf\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.204458 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-config-data\") pod \"nova-cell0-conductor-db-sync-d82lf\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.204767 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-d82lf\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.204803 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-scripts\") pod \"nova-cell0-conductor-db-sync-d82lf\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.204879 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t57d9\" (UniqueName: \"kubernetes.io/projected/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-kube-api-access-t57d9\") pod \"nova-cell0-conductor-db-sync-d82lf\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.209538 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-config-data\") pod \"nova-cell0-conductor-db-sync-d82lf\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.212707 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-scripts\") pod \"nova-cell0-conductor-db-sync-d82lf\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.213977 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-d82lf\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.227819 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t57d9\" (UniqueName: \"kubernetes.io/projected/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-kube-api-access-t57d9\") pod \"nova-cell0-conductor-db-sync-d82lf\" (UID: 
\"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.312087 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.525808 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmrpm" event={"ID":"4540a698-7f45-4e9f-b38e-11102a1ee435","Type":"ContainerStarted","Data":"e7e9bcad8ccbc8cdd3dca02ff28379dc1f14ab20eb194ea258f4ec6f18be51b6"} Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.543939 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lmrpm" podStartSLOduration=2.003534462 podStartE2EDuration="5.543917761s" podCreationTimestamp="2026-03-10 11:17:45 +0000 UTC" firstStartedPulling="2026-03-10 11:17:46.398529345 +0000 UTC m=+5615.154700163" lastFinishedPulling="2026-03-10 11:17:49.938912644 +0000 UTC m=+5618.695083462" observedRunningTime="2026-03-10 11:17:50.543596431 +0000 UTC m=+5619.299767249" watchObservedRunningTime="2026-03-10 11:17:50.543917761 +0000 UTC m=+5619.300088589" Mar 10 11:17:50 crc kubenswrapper[4794]: W0310 11:17:50.780736 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb00adea_5b3d_4e1e_a32c_b9378e9aa75e.slice/crio-0ac299a7d905051f116b7aaead6ab72138cbe1424674d20248312f8a994fd392 WatchSource:0}: Error finding container 0ac299a7d905051f116b7aaead6ab72138cbe1424674d20248312f8a994fd392: Status 404 returned error can't find the container with id 0ac299a7d905051f116b7aaead6ab72138cbe1424674d20248312f8a994fd392 Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.789381 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-d82lf"] Mar 10 11:17:50 crc kubenswrapper[4794]: I0310 11:17:50.999687 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:17:51 crc kubenswrapper[4794]: E0310 11:17:51.000166 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:17:51 crc kubenswrapper[4794]: I0310 11:17:51.537382 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-d82lf" event={"ID":"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e","Type":"ContainerStarted","Data":"dc08acba3012e6ff76d1ff5183e98caa3586ffe0e5900aaef669b064c0118c40"} Mar 10 11:17:51 crc kubenswrapper[4794]: I0310 11:17:51.537713 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-d82lf" event={"ID":"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e","Type":"ContainerStarted","Data":"0ac299a7d905051f116b7aaead6ab72138cbe1424674d20248312f8a994fd392"} Mar 10 11:17:51 crc kubenswrapper[4794]: I0310 11:17:51.558229 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-d82lf" podStartSLOduration=2.558214173 podStartE2EDuration="2.558214173s" podCreationTimestamp="2026-03-10 
11:17:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:17:51.552816745 +0000 UTC m=+5620.308987563" watchObservedRunningTime="2026-03-10 11:17:51.558214173 +0000 UTC m=+5620.314384991" Mar 10 11:17:55 crc kubenswrapper[4794]: I0310 11:17:55.382498 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:17:55 crc kubenswrapper[4794]: I0310 11:17:55.382920 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:17:55 crc kubenswrapper[4794]: I0310 11:17:55.981800 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:55 crc kubenswrapper[4794]: I0310 11:17:55.982362 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:56 crc kubenswrapper[4794]: I0310 11:17:56.053832 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:56 crc kubenswrapper[4794]: I0310 11:17:56.441595 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lmrpm" podUID="4540a698-7f45-4e9f-b38e-11102a1ee435" containerName="registry-server" probeResult="failure" output=< Mar 10 11:17:56 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 11:17:56 crc kubenswrapper[4794]: > Mar 10 11:17:56 crc kubenswrapper[4794]: I0310 11:17:56.640631 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:56 crc kubenswrapper[4794]: I0310 11:17:56.716953 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7prv7"] Mar 10 11:17:58 crc kubenswrapper[4794]: I0310 11:17:58.598409 4794 generic.go:334] "Generic (PLEG): container finished" podID="bb00adea-5b3d-4e1e-a32c-b9378e9aa75e" containerID="dc08acba3012e6ff76d1ff5183e98caa3586ffe0e5900aaef669b064c0118c40" exitCode=0 Mar 10 11:17:58 crc kubenswrapper[4794]: I0310 11:17:58.598449 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-d82lf" event={"ID":"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e","Type":"ContainerDied","Data":"dc08acba3012e6ff76d1ff5183e98caa3586ffe0e5900aaef669b064c0118c40"} Mar 10 11:17:58 crc kubenswrapper[4794]: I0310 11:17:58.598779 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7prv7" podUID="dec62e66-7b8d-4386-84cc-cde2d34e6282" containerName="registry-server" containerID="cri-o://c5c0082c49665d7e8131bdb5552996b501b20f084295eb9db1250624056dd091" gracePeriod=2 Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.153790 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.218155 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec62e66-7b8d-4386-84cc-cde2d34e6282-catalog-content\") pod \"dec62e66-7b8d-4386-84cc-cde2d34e6282\" (UID: \"dec62e66-7b8d-4386-84cc-cde2d34e6282\") " Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.218245 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec62e66-7b8d-4386-84cc-cde2d34e6282-utilities\") pod \"dec62e66-7b8d-4386-84cc-cde2d34e6282\" (UID: \"dec62e66-7b8d-4386-84cc-cde2d34e6282\") " Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.218276 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgkw8\" (UniqueName: \"kubernetes.io/projected/dec62e66-7b8d-4386-84cc-cde2d34e6282-kube-api-access-xgkw8\") pod \"dec62e66-7b8d-4386-84cc-cde2d34e6282\" (UID: \"dec62e66-7b8d-4386-84cc-cde2d34e6282\") " Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.219347 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec62e66-7b8d-4386-84cc-cde2d34e6282-utilities" (OuterVolumeSpecName: "utilities") pod "dec62e66-7b8d-4386-84cc-cde2d34e6282" (UID: "dec62e66-7b8d-4386-84cc-cde2d34e6282"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.225019 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec62e66-7b8d-4386-84cc-cde2d34e6282-kube-api-access-xgkw8" (OuterVolumeSpecName: "kube-api-access-xgkw8") pod "dec62e66-7b8d-4386-84cc-cde2d34e6282" (UID: "dec62e66-7b8d-4386-84cc-cde2d34e6282"). InnerVolumeSpecName "kube-api-access-xgkw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.242399 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec62e66-7b8d-4386-84cc-cde2d34e6282-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dec62e66-7b8d-4386-84cc-cde2d34e6282" (UID: "dec62e66-7b8d-4386-84cc-cde2d34e6282"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.320008 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec62e66-7b8d-4386-84cc-cde2d34e6282-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.320048 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec62e66-7b8d-4386-84cc-cde2d34e6282-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.320061 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgkw8\" (UniqueName: \"kubernetes.io/projected/dec62e66-7b8d-4386-84cc-cde2d34e6282-kube-api-access-xgkw8\") on node \"crc\" DevicePath \"\"" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.610635 4794 generic.go:334] "Generic (PLEG): container finished" podID="dec62e66-7b8d-4386-84cc-cde2d34e6282" containerID="c5c0082c49665d7e8131bdb5552996b501b20f084295eb9db1250624056dd091" exitCode=0 Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.610694 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7prv7" event={"ID":"dec62e66-7b8d-4386-84cc-cde2d34e6282","Type":"ContainerDied","Data":"c5c0082c49665d7e8131bdb5552996b501b20f084295eb9db1250624056dd091"} Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.610732 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7prv7" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.612185 4794 scope.go:117] "RemoveContainer" containerID="c5c0082c49665d7e8131bdb5552996b501b20f084295eb9db1250624056dd091" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.612155 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7prv7" event={"ID":"dec62e66-7b8d-4386-84cc-cde2d34e6282","Type":"ContainerDied","Data":"55657f14ee4a54a736556aeaed0b3397e173f34c1b3b44696e40b2f8e6737266"} Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.669116 4794 scope.go:117] "RemoveContainer" containerID="3d6527ddf385801ec1ef99ddfee19e887628b8bf9f1585d2be9e70d7924a9ed0" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.678631 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7prv7"] Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.692161 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7prv7"] Mar 10 11:17:59 crc kubenswrapper[4794]: E0310 11:17:59.719290 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddec62e66_7b8d_4386_84cc_cde2d34e6282.slice\": RecentStats: unable to find data in memory cache]" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.721043 4794 scope.go:117] "RemoveContainer" containerID="13cbda9f321a2918115746f5d418d4a20aeed5c03f9776dd66c2ed42240f3992" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.759093 4794 scope.go:117] "RemoveContainer" containerID="c5c0082c49665d7e8131bdb5552996b501b20f084295eb9db1250624056dd091" Mar 10 11:17:59 crc kubenswrapper[4794]: E0310 11:17:59.760286 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c5c0082c49665d7e8131bdb5552996b501b20f084295eb9db1250624056dd091\": container with ID starting with c5c0082c49665d7e8131bdb5552996b501b20f084295eb9db1250624056dd091 not found: ID does not exist" containerID="c5c0082c49665d7e8131bdb5552996b501b20f084295eb9db1250624056dd091" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.760355 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5c0082c49665d7e8131bdb5552996b501b20f084295eb9db1250624056dd091"} err="failed to get container status \"c5c0082c49665d7e8131bdb5552996b501b20f084295eb9db1250624056dd091\": rpc error: code = NotFound desc = could not find container \"c5c0082c49665d7e8131bdb5552996b501b20f084295eb9db1250624056dd091\": container with ID starting with c5c0082c49665d7e8131bdb5552996b501b20f084295eb9db1250624056dd091 not found: ID does not exist" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.760387 4794 scope.go:117] "RemoveContainer" containerID="3d6527ddf385801ec1ef99ddfee19e887628b8bf9f1585d2be9e70d7924a9ed0" Mar 10 11:17:59 crc kubenswrapper[4794]: E0310 11:17:59.760756 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d6527ddf385801ec1ef99ddfee19e887628b8bf9f1585d2be9e70d7924a9ed0\": container with ID starting with 3d6527ddf385801ec1ef99ddfee19e887628b8bf9f1585d2be9e70d7924a9ed0 not found: ID does not exist" containerID="3d6527ddf385801ec1ef99ddfee19e887628b8bf9f1585d2be9e70d7924a9ed0" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.760791 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6527ddf385801ec1ef99ddfee19e887628b8bf9f1585d2be9e70d7924a9ed0"} err="failed to get container status \"3d6527ddf385801ec1ef99ddfee19e887628b8bf9f1585d2be9e70d7924a9ed0\": rpc error: code = NotFound desc = could not find container \"3d6527ddf385801ec1ef99ddfee19e887628b8bf9f1585d2be9e70d7924a9ed0\": container with ID starting with 3d6527ddf385801ec1ef99ddfee19e887628b8bf9f1585d2be9e70d7924a9ed0 not found: ID does not exist" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.760812 4794 scope.go:117] "RemoveContainer" containerID="13cbda9f321a2918115746f5d418d4a20aeed5c03f9776dd66c2ed42240f3992" Mar 10 11:17:59 crc kubenswrapper[4794]: E0310 11:17:59.761143 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13cbda9f321a2918115746f5d418d4a20aeed5c03f9776dd66c2ed42240f3992\": container with ID starting with 13cbda9f321a2918115746f5d418d4a20aeed5c03f9776dd66c2ed42240f3992 not found: ID does not exist" containerID="13cbda9f321a2918115746f5d418d4a20aeed5c03f9776dd66c2ed42240f3992" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.761180 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13cbda9f321a2918115746f5d418d4a20aeed5c03f9776dd66c2ed42240f3992"} err="failed to get container status \"13cbda9f321a2918115746f5d418d4a20aeed5c03f9776dd66c2ed42240f3992\": rpc error: code = NotFound desc = could not find container \"13cbda9f321a2918115746f5d418d4a20aeed5c03f9776dd66c2ed42240f3992\": container with ID starting with 13cbda9f321a2918115746f5d418d4a20aeed5c03f9776dd66c2ed42240f3992 not found: ID does not exist" Mar 10 11:17:59 crc kubenswrapper[4794]: I0310 11:17:59.958118 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.019724 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec62e66-7b8d-4386-84cc-cde2d34e6282" path="/var/lib/kubelet/pods/dec62e66-7b8d-4386-84cc-cde2d34e6282/volumes" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.037620 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-scripts\") pod \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.037770 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-combined-ca-bundle\") pod \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.037857 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-config-data\") pod \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.037892 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t57d9\" (UniqueName: \"kubernetes.io/projected/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-kube-api-access-t57d9\") pod \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\" (UID: \"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e\") " Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.055641 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-scripts" (OuterVolumeSpecName: "scripts") pod "bb00adea-5b3d-4e1e-a32c-b9378e9aa75e" (UID: "bb00adea-5b3d-4e1e-a32c-b9378e9aa75e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.057462 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-kube-api-access-t57d9" (OuterVolumeSpecName: "kube-api-access-t57d9") pod "bb00adea-5b3d-4e1e-a32c-b9378e9aa75e" (UID: "bb00adea-5b3d-4e1e-a32c-b9378e9aa75e"). InnerVolumeSpecName "kube-api-access-t57d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.058628 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb00adea-5b3d-4e1e-a32c-b9378e9aa75e" (UID: "bb00adea-5b3d-4e1e-a32c-b9378e9aa75e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.065603 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-config-data" (OuterVolumeSpecName: "config-data") pod "bb00adea-5b3d-4e1e-a32c-b9378e9aa75e" (UID: "bb00adea-5b3d-4e1e-a32c-b9378e9aa75e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.140623 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.140676 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.140699 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.140762 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t57d9\" (UniqueName: \"kubernetes.io/projected/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e-kube-api-access-t57d9\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.148616 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552358-j69qp"] Mar 10 11:18:00 crc kubenswrapper[4794]: E0310 11:18:00.149080 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec62e66-7b8d-4386-84cc-cde2d34e6282" containerName="extract-content" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.149101 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec62e66-7b8d-4386-84cc-cde2d34e6282" containerName="extract-content" Mar 10 11:18:00 crc kubenswrapper[4794]: E0310 11:18:00.149123 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec62e66-7b8d-4386-84cc-cde2d34e6282" containerName="registry-server" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.149133 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec62e66-7b8d-4386-84cc-cde2d34e6282" containerName="registry-server" Mar 10 11:18:00 crc kubenswrapper[4794]: E0310 11:18:00.149153 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec62e66-7b8d-4386-84cc-cde2d34e6282" containerName="extract-utilities" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.149162 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec62e66-7b8d-4386-84cc-cde2d34e6282" containerName="extract-utilities" Mar 10 11:18:00 crc kubenswrapper[4794]: E0310 11:18:00.149177 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb00adea-5b3d-4e1e-a32c-b9378e9aa75e" containerName="nova-cell0-conductor-db-sync" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.149184 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb00adea-5b3d-4e1e-a32c-b9378e9aa75e" containerName="nova-cell0-conductor-db-sync" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.149407 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb00adea-5b3d-4e1e-a32c-b9378e9aa75e" containerName="nova-cell0-conductor-db-sync" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.149421 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec62e66-7b8d-4386-84cc-cde2d34e6282" containerName="registry-server" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.150110 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552358-j69qp" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.152434 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.152463 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.154560 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.167074 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552358-j69qp"] Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.242302 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v86k\" (UniqueName: \"kubernetes.io/projected/aadfef82-5fa2-476a-aa14-65e884ed00b1-kube-api-access-9v86k\") pod \"auto-csr-approver-29552358-j69qp\" (UID: \"aadfef82-5fa2-476a-aa14-65e884ed00b1\") " pod="openshift-infra/auto-csr-approver-29552358-j69qp" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.344008 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v86k\" (UniqueName: \"kubernetes.io/projected/aadfef82-5fa2-476a-aa14-65e884ed00b1-kube-api-access-9v86k\") pod \"auto-csr-approver-29552358-j69qp\" (UID: \"aadfef82-5fa2-476a-aa14-65e884ed00b1\") " pod="openshift-infra/auto-csr-approver-29552358-j69qp" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.369140 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v86k\" (UniqueName: \"kubernetes.io/projected/aadfef82-5fa2-476a-aa14-65e884ed00b1-kube-api-access-9v86k\") pod \"auto-csr-approver-29552358-j69qp\" (UID: \"aadfef82-5fa2-476a-aa14-65e884ed00b1\") " pod="openshift-infra/auto-csr-approver-29552358-j69qp" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.489263 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552358-j69qp" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.626292 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-d82lf" event={"ID":"bb00adea-5b3d-4e1e-a32c-b9378e9aa75e","Type":"ContainerDied","Data":"0ac299a7d905051f116b7aaead6ab72138cbe1424674d20248312f8a994fd392"} Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.626327 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-d82lf" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.626349 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ac299a7d905051f116b7aaead6ab72138cbe1424674d20248312f8a994fd392" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.730708 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.732127 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.734774 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-lq26n" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.737458 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.737890 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.854210 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b88005f-6aaa-488f-90d5-b789ccced7ec-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8b88005f-6aaa-488f-90d5-b789ccced7ec\") " pod="openstack/nova-cell0-conductor-0" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.854575 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghbnz\" (UniqueName: \"kubernetes.io/projected/8b88005f-6aaa-488f-90d5-b789ccced7ec-kube-api-access-ghbnz\") pod \"nova-cell0-conductor-0\" (UID: \"8b88005f-6aaa-488f-90d5-b789ccced7ec\") " pod="openstack/nova-cell0-conductor-0" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.854615 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b88005f-6aaa-488f-90d5-b789ccced7ec-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8b88005f-6aaa-488f-90d5-b789ccced7ec\") " pod="openstack/nova-cell0-conductor-0" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.955845 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b88005f-6aaa-488f-90d5-b789ccced7ec-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8b88005f-6aaa-488f-90d5-b789ccced7ec\") " pod="openstack/nova-cell0-conductor-0" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.956101 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b88005f-6aaa-488f-90d5-b789ccced7ec-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8b88005f-6aaa-488f-90d5-b789ccced7ec\") " pod="openstack/nova-cell0-conductor-0" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.956172 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghbnz\" (UniqueName: \"kubernetes.io/projected/8b88005f-6aaa-488f-90d5-b789ccced7ec-kube-api-access-ghbnz\") pod \"nova-cell0-conductor-0\" (UID: \"8b88005f-6aaa-488f-90d5-b789ccced7ec\") " pod="openstack/nova-cell0-conductor-0" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.961265 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b88005f-6aaa-488f-90d5-b789ccced7ec-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8b88005f-6aaa-488f-90d5-b789ccced7ec\") " pod="openstack/nova-cell0-conductor-0" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.965262 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b88005f-6aaa-488f-90d5-b789ccced7ec-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"8b88005f-6aaa-488f-90d5-b789ccced7ec\") " pod="openstack/nova-cell0-conductor-0" Mar 10 11:18:00 crc kubenswrapper[4794]: I0310 11:18:00.975241 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghbnz\" (UniqueName: \"kubernetes.io/projected/8b88005f-6aaa-488f-90d5-b789ccced7ec-kube-api-access-ghbnz\") pod \"nova-cell0-conductor-0\" (UID: \"8b88005f-6aaa-488f-90d5-b789ccced7ec\") " pod="openstack/nova-cell0-conductor-0" Mar 10 11:18:01 crc kubenswrapper[4794]: W0310 11:18:01.026033 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaadfef82_5fa2_476a_aa14_65e884ed00b1.slice/crio-f93b804ccc25966eedcbae4b0665ed69c1413fb9403ac9fa38590a0e8c5db863 WatchSource:0}: Error finding container f93b804ccc25966eedcbae4b0665ed69c1413fb9403ac9fa38590a0e8c5db863: Status 404 returned error can't find the container with id f93b804ccc25966eedcbae4b0665ed69c1413fb9403ac9fa38590a0e8c5db863 Mar 10 11:18:01 crc kubenswrapper[4794]: I0310 11:18:01.030021 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552358-j69qp"] Mar 10 11:18:01 crc kubenswrapper[4794]: I0310 11:18:01.097634 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 11:18:01 crc kubenswrapper[4794]: I0310 11:18:01.534536 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 11:18:01 crc kubenswrapper[4794]: W0310 11:18:01.539674 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b88005f_6aaa_488f_90d5_b789ccced7ec.slice/crio-1ad96500eb6db7b567ab2a2b39bd91f964776aab1226b92b557ed81cf2d6d728 WatchSource:0}: Error finding container 1ad96500eb6db7b567ab2a2b39bd91f964776aab1226b92b557ed81cf2d6d728: Status 404 returned error can't find the container with id 1ad96500eb6db7b567ab2a2b39bd91f964776aab1226b92b557ed81cf2d6d728 Mar 10 11:18:01 crc kubenswrapper[4794]: I0310 11:18:01.635962 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552358-j69qp" event={"ID":"aadfef82-5fa2-476a-aa14-65e884ed00b1","Type":"ContainerStarted","Data":"f93b804ccc25966eedcbae4b0665ed69c1413fb9403ac9fa38590a0e8c5db863"} Mar 10 11:18:01 crc kubenswrapper[4794]: I0310 11:18:01.637264 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8b88005f-6aaa-488f-90d5-b789ccced7ec","Type":"ContainerStarted","Data":"1ad96500eb6db7b567ab2a2b39bd91f964776aab1226b92b557ed81cf2d6d728"} Mar 10 11:18:02 crc kubenswrapper[4794]: I0310 11:18:02.652621 4794 generic.go:334] "Generic (PLEG): container finished" podID="aadfef82-5fa2-476a-aa14-65e884ed00b1" containerID="cf8f62ca227888d5977c62623b0915a91525b21e29ecbbe6e6a719fd3611e51c" exitCode=0 Mar 10 11:18:02 crc kubenswrapper[4794]: I0310 11:18:02.653099 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552358-j69qp" event={"ID":"aadfef82-5fa2-476a-aa14-65e884ed00b1","Type":"ContainerDied","Data":"cf8f62ca227888d5977c62623b0915a91525b21e29ecbbe6e6a719fd3611e51c"} Mar 10 11:18:02 crc kubenswrapper[4794]: I0310 11:18:02.655776 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"8b88005f-6aaa-488f-90d5-b789ccced7ec","Type":"ContainerStarted","Data":"773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861"} Mar 10 11:18:02 crc kubenswrapper[4794]: I0310 11:18:02.656035 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 10 11:18:02 crc kubenswrapper[4794]: I0310 11:18:02.703143 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.703117927 podStartE2EDuration="2.703117927s" podCreationTimestamp="2026-03-10 11:18:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:02.69032829 +0000 UTC m=+5631.446499148" watchObservedRunningTime="2026-03-10 11:18:02.703117927 +0000 UTC m=+5631.459288775" Mar 10 11:18:04 crc kubenswrapper[4794]: I0310 11:18:03.999802 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:18:04 crc kubenswrapper[4794]: E0310 11:18:04.000455 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:18:04 crc kubenswrapper[4794]: I0310 11:18:04.024023 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552358-j69qp" Mar 10 11:18:04 crc kubenswrapper[4794]: I0310 11:18:04.114464 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v86k\" (UniqueName: \"kubernetes.io/projected/aadfef82-5fa2-476a-aa14-65e884ed00b1-kube-api-access-9v86k\") pod \"aadfef82-5fa2-476a-aa14-65e884ed00b1\" (UID: \"aadfef82-5fa2-476a-aa14-65e884ed00b1\") " Mar 10 11:18:04 crc kubenswrapper[4794]: I0310 11:18:04.121633 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aadfef82-5fa2-476a-aa14-65e884ed00b1-kube-api-access-9v86k" (OuterVolumeSpecName: "kube-api-access-9v86k") pod "aadfef82-5fa2-476a-aa14-65e884ed00b1" (UID: "aadfef82-5fa2-476a-aa14-65e884ed00b1"). InnerVolumeSpecName "kube-api-access-9v86k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:18:04 crc kubenswrapper[4794]: I0310 11:18:04.216936 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v86k\" (UniqueName: \"kubernetes.io/projected/aadfef82-5fa2-476a-aa14-65e884ed00b1-kube-api-access-9v86k\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:04 crc kubenswrapper[4794]: I0310 11:18:04.685812 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552358-j69qp" event={"ID":"aadfef82-5fa2-476a-aa14-65e884ed00b1","Type":"ContainerDied","Data":"f93b804ccc25966eedcbae4b0665ed69c1413fb9403ac9fa38590a0e8c5db863"} Mar 10 11:18:04 crc kubenswrapper[4794]: I0310 11:18:04.685886 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f93b804ccc25966eedcbae4b0665ed69c1413fb9403ac9fa38590a0e8c5db863" Mar 10 11:18:04 crc kubenswrapper[4794]: I0310 11:18:04.685943 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552358-j69qp" Mar 10 11:18:05 crc kubenswrapper[4794]: I0310 11:18:05.110963 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552352-5jscq"] Mar 10 11:18:05 crc kubenswrapper[4794]: I0310 11:18:05.120627 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552352-5jscq"] Mar 10 11:18:05 crc kubenswrapper[4794]: I0310 11:18:05.437217 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:18:05 crc kubenswrapper[4794]: I0310 11:18:05.506586 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lmrpm" Mar 10 11:18:05 crc kubenswrapper[4794]: I0310 11:18:05.686212 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lmrpm"] Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.016510 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007279e7-7980-459f-8eba-f63a2dab9526" path="/var/lib/kubelet/pods/007279e7-7980-459f-8eba-f63a2dab9526/volumes" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.146206 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.722087 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-4sqhj"] Mar 10 11:18:06 crc kubenswrapper[4794]: E0310 11:18:06.722443 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aadfef82-5fa2-476a-aa14-65e884ed00b1" containerName="oc" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.722459 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="aadfef82-5fa2-476a-aa14-65e884ed00b1" containerName="oc" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.722646 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="aadfef82-5fa2-476a-aa14-65e884ed00b1" containerName="oc" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.723178 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4sqhj" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.723428 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lmrpm" podUID="4540a698-7f45-4e9f-b38e-11102a1ee435" containerName="registry-server" containerID="cri-o://e7e9bcad8ccbc8cdd3dca02ff28379dc1f14ab20eb194ea258f4ec6f18be51b6" gracePeriod=2 Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.725686 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.730209 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.752067 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4sqhj"] Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.779186 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-config-data\") pod \"nova-cell0-cell-mapping-4sqhj\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") " pod="openstack/nova-cell0-cell-mapping-4sqhj" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.779229 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-scripts\") pod \"nova-cell0-cell-mapping-4sqhj\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") " pod="openstack/nova-cell0-cell-mapping-4sqhj" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.779257 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4sqhj\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") " pod="openstack/nova-cell0-cell-mapping-4sqhj" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.779317 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmlzl\" (UniqueName: \"kubernetes.io/projected/0ebea665-f36f-45ef-95a5-bdeacd279dd3-kube-api-access-vmlzl\") pod \"nova-cell0-cell-mapping-4sqhj\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") " pod="openstack/nova-cell0-cell-mapping-4sqhj" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.883239 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmlzl\" (UniqueName: \"kubernetes.io/projected/0ebea665-f36f-45ef-95a5-bdeacd279dd3-kube-api-access-vmlzl\") pod \"nova-cell0-cell-mapping-4sqhj\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") " pod="openstack/nova-cell0-cell-mapping-4sqhj" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.883360 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-config-data\") pod \"nova-cell0-cell-mapping-4sqhj\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") " pod="openstack/nova-cell0-cell-mapping-4sqhj" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.883382 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-scripts\") pod \"nova-cell0-cell-mapping-4sqhj\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") " pod="openstack/nova-cell0-cell-mapping-4sqhj" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.883405 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4sqhj\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") " pod="openstack/nova-cell0-cell-mapping-4sqhj" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.888174 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4sqhj\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") " pod="openstack/nova-cell0-cell-mapping-4sqhj" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.897965 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-config-data\") pod \"nova-cell0-cell-mapping-4sqhj\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") " pod="openstack/nova-cell0-cell-mapping-4sqhj" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.901831 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-scripts\") pod \"nova-cell0-cell-mapping-4sqhj\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") " pod="openstack/nova-cell0-cell-mapping-4sqhj" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.917935 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmlzl\" (UniqueName: \"kubernetes.io/projected/0ebea665-f36f-45ef-95a5-bdeacd279dd3-kube-api-access-vmlzl\") pod \"nova-cell0-cell-mapping-4sqhj\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") " pod="openstack/nova-cell0-cell-mapping-4sqhj" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.952293 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.955493 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.958305 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 11:18:06 crc kubenswrapper[4794]: I0310 11:18:06.983452 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:06.999949 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.002673 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.014476 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.042690 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.046323 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4sqhj" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.074054 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.076893 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.093561 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e92b7354-097a-42e3-af6b-0d66414c4d4d-config-data\") pod \"nova-api-0\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") " pod="openstack/nova-api-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.093596 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92b7354-097a-42e3-af6b-0d66414c4d4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") " pod="openstack/nova-api-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.093710 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf2624e-ffa0-4251-8661-f688c4699f5a-config-data\") pod \"nova-scheduler-0\" (UID: \"abf2624e-ffa0-4251-8661-f688c4699f5a\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.093742 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k5jc\" (UniqueName: \"kubernetes.io/projected/e92b7354-097a-42e3-af6b-0d66414c4d4d-kube-api-access-4k5jc\") pod \"nova-api-0\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") " pod="openstack/nova-api-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.093765 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qf6s\" (UniqueName: \"kubernetes.io/projected/abf2624e-ffa0-4251-8661-f688c4699f5a-kube-api-access-8qf6s\") pod \"nova-scheduler-0\" (UID: \"abf2624e-ffa0-4251-8661-f688c4699f5a\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.093787 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf2624e-ffa0-4251-8661-f688c4699f5a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf2624e-ffa0-4251-8661-f688c4699f5a\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.093810 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e92b7354-097a-42e3-af6b-0d66414c4d4d-logs\") pod \"nova-api-0\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") " pod="openstack/nova-api-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.112657 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.149397 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.194835 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k5jc\" (UniqueName: 
\"kubernetes.io/projected/e92b7354-097a-42e3-af6b-0d66414c4d4d-kube-api-access-4k5jc\") pod \"nova-api-0\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") " pod="openstack/nova-api-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.194886 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qf6s\" (UniqueName: \"kubernetes.io/projected/abf2624e-ffa0-4251-8661-f688c4699f5a-kube-api-access-8qf6s\") pod \"nova-scheduler-0\" (UID: \"abf2624e-ffa0-4251-8661-f688c4699f5a\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.194912 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw448\" (UniqueName: \"kubernetes.io/projected/8d41b2d6-74f1-44b7-982d-56983819f28a-kube-api-access-tw448\") pod \"nova-metadata-0\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") " pod="openstack/nova-metadata-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.194935 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf2624e-ffa0-4251-8661-f688c4699f5a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf2624e-ffa0-4251-8661-f688c4699f5a\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.194965 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e92b7354-097a-42e3-af6b-0d66414c4d4d-logs\") pod \"nova-api-0\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") " pod="openstack/nova-api-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.194986 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e92b7354-097a-42e3-af6b-0d66414c4d4d-config-data\") pod \"nova-api-0\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") " pod="openstack/nova-api-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.195004 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92b7354-097a-42e3-af6b-0d66414c4d4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") " pod="openstack/nova-api-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.195051 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41b2d6-74f1-44b7-982d-56983819f28a-config-data\") pod \"nova-metadata-0\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") " pod="openstack/nova-metadata-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.195068 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41b2d6-74f1-44b7-982d-56983819f28a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") " pod="openstack/nova-metadata-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.195106 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41b2d6-74f1-44b7-982d-56983819f28a-logs\") pod \"nova-metadata-0\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") " pod="openstack/nova-metadata-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.195150 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf2624e-ffa0-4251-8661-f688c4699f5a-config-data\") pod \"nova-scheduler-0\" (UID: \"abf2624e-ffa0-4251-8661-f688c4699f5a\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.195897 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e92b7354-097a-42e3-af6b-0d66414c4d4d-logs\") pod \"nova-api-0\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") " pod="openstack/nova-api-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.198633 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9dd655fcf-vd7zz"] Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.200090 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.200978 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf2624e-ffa0-4251-8661-f688c4699f5a-config-data\") pod \"nova-scheduler-0\" (UID: \"abf2624e-ffa0-4251-8661-f688c4699f5a\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.201106 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e92b7354-097a-42e3-af6b-0d66414c4d4d-config-data\") pod \"nova-api-0\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") " pod="openstack/nova-api-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.202926 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92b7354-097a-42e3-af6b-0d66414c4d4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") " pod="openstack/nova-api-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.203569 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf2624e-ffa0-4251-8661-f688c4699f5a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf2624e-ffa0-4251-8661-f688c4699f5a\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.223794 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.226360 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.230155 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.235262 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qf6s\" (UniqueName: \"kubernetes.io/projected/abf2624e-ffa0-4251-8661-f688c4699f5a-kube-api-access-8qf6s\") pod \"nova-scheduler-0\" (UID: \"abf2624e-ffa0-4251-8661-f688c4699f5a\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.244033 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9dd655fcf-vd7zz"] Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.258945 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k5jc\" (UniqueName: \"kubernetes.io/projected/e92b7354-097a-42e3-af6b-0d66414c4d4d-kube-api-access-4k5jc\") pod \"nova-api-0\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") " pod="openstack/nova-api-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.264417 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.306139 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hw4l\" (UniqueName: \"kubernetes.io/projected/647826ea-505e-495e-8fdc-004e7e432df8-kube-api-access-6hw4l\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.306253 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7rhx\" (UniqueName: \"kubernetes.io/projected/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-kube-api-access-b7rhx\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.306306 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw448\" (UniqueName: \"kubernetes.io/projected/8d41b2d6-74f1-44b7-982d-56983819f28a-kube-api-access-tw448\") pod \"nova-metadata-0\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") " pod="openstack/nova-metadata-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.306377 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-dns-svc\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.306412 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-config\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.306429 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-ovsdbserver-sb\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.306453 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.306542 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41b2d6-74f1-44b7-982d-56983819f28a-config-data\") pod \"nova-metadata-0\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") " pod="openstack/nova-metadata-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.306570 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41b2d6-74f1-44b7-982d-56983819f28a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") " pod="openstack/nova-metadata-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.306637 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.306672 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41b2d6-74f1-44b7-982d-56983819f28a-logs\") pod \"nova-metadata-0\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") " pod="openstack/nova-metadata-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.306710 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-ovsdbserver-nb\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.307929 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41b2d6-74f1-44b7-982d-56983819f28a-logs\") pod \"nova-metadata-0\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") " pod="openstack/nova-metadata-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.312127 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41b2d6-74f1-44b7-982d-56983819f28a-config-data\") pod \"nova-metadata-0\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") " pod="openstack/nova-metadata-0" Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.313545 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41b2d6-74f1-44b7-982d-56983819f28a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") " pod="openstack/nova-metadata-0" Mar 10 11:18:07 crc 
kubenswrapper[4794]: I0310 11:18:07.324669 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw448\" (UniqueName: \"kubernetes.io/projected/8d41b2d6-74f1-44b7-982d-56983819f28a-kube-api-access-tw448\") pod \"nova-metadata-0\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") " pod="openstack/nova-metadata-0"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.328130 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.357031 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.408076 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hw4l\" (UniqueName: \"kubernetes.io/projected/647826ea-505e-495e-8fdc-004e7e432df8-kube-api-access-6hw4l\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.408146 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7rhx\" (UniqueName: \"kubernetes.io/projected/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-kube-api-access-b7rhx\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.408186 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-dns-svc\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.408206 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-config\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.408221 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-ovsdbserver-sb\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.408240 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.408302 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.408347 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-ovsdbserver-nb\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.409400 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-ovsdbserver-nb\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.409689 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-config\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.409696 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-dns-svc\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.409776 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-ovsdbserver-sb\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.411447 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.412840 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.425170 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7rhx\" (UniqueName: \"kubernetes.io/projected/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-kube-api-access-b7rhx\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.428134 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hw4l\" (UniqueName: \"kubernetes.io/projected/647826ea-505e-495e-8fdc-004e7e432df8-kube-api-access-6hw4l\") pod \"dnsmasq-dns-9dd655fcf-vd7zz\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.449989 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmrpm"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.510156 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4540a698-7f45-4e9f-b38e-11102a1ee435-catalog-content\") pod \"4540a698-7f45-4e9f-b38e-11102a1ee435\" (UID: \"4540a698-7f45-4e9f-b38e-11102a1ee435\") "
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.510524 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4540a698-7f45-4e9f-b38e-11102a1ee435-utilities\") pod \"4540a698-7f45-4e9f-b38e-11102a1ee435\" (UID: \"4540a698-7f45-4e9f-b38e-11102a1ee435\") "
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.510639 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r88w\" (UniqueName: \"kubernetes.io/projected/4540a698-7f45-4e9f-b38e-11102a1ee435-kube-api-access-9r88w\") pod \"4540a698-7f45-4e9f-b38e-11102a1ee435\" (UID: \"4540a698-7f45-4e9f-b38e-11102a1ee435\") "
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.513981 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4540a698-7f45-4e9f-b38e-11102a1ee435-utilities" (OuterVolumeSpecName: "utilities") pod "4540a698-7f45-4e9f-b38e-11102a1ee435" (UID: "4540a698-7f45-4e9f-b38e-11102a1ee435"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.515535 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4540a698-7f45-4e9f-b38e-11102a1ee435-kube-api-access-9r88w" (OuterVolumeSpecName: "kube-api-access-9r88w") pod "4540a698-7f45-4e9f-b38e-11102a1ee435" (UID: "4540a698-7f45-4e9f-b38e-11102a1ee435"). InnerVolumeSpecName "kube-api-access-9r88w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.525694 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.553827 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.566304 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.612747 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4540a698-7f45-4e9f-b38e-11102a1ee435-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.612991 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r88w\" (UniqueName: \"kubernetes.io/projected/4540a698-7f45-4e9f-b38e-11102a1ee435-kube-api-access-9r88w\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.666176 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.675479 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4sqhj"]
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.713469 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4540a698-7f45-4e9f-b38e-11102a1ee435-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4540a698-7f45-4e9f-b38e-11102a1ee435" (UID: "4540a698-7f45-4e9f-b38e-11102a1ee435"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.716154 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4540a698-7f45-4e9f-b38e-11102a1ee435-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.762809 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmrpm" event={"ID":"4540a698-7f45-4e9f-b38e-11102a1ee435","Type":"ContainerDied","Data":"e7e9bcad8ccbc8cdd3dca02ff28379dc1f14ab20eb194ea258f4ec6f18be51b6"}
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.762916 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmrpm"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.763070 4794 scope.go:117] "RemoveContainer" containerID="e7e9bcad8ccbc8cdd3dca02ff28379dc1f14ab20eb194ea258f4ec6f18be51b6"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.762783 4794 generic.go:334] "Generic (PLEG): container finished" podID="4540a698-7f45-4e9f-b38e-11102a1ee435" containerID="e7e9bcad8ccbc8cdd3dca02ff28379dc1f14ab20eb194ea258f4ec6f18be51b6" exitCode=0
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.763218 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmrpm" event={"ID":"4540a698-7f45-4e9f-b38e-11102a1ee435","Type":"ContainerDied","Data":"b24d588d1786bde53923101d1e378ecca63ffd15056f52efb4220fc6baaab64f"}
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.781865 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4sqhj" event={"ID":"0ebea665-f36f-45ef-95a5-bdeacd279dd3","Type":"ContainerStarted","Data":"996695bb86c50f92acd572f0f5ce4522a21cf69c7ff1e13ca9b035e914549000"}
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.787545 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e92b7354-097a-42e3-af6b-0d66414c4d4d","Type":"ContainerStarted","Data":"75b2ccadea49ed375af09026efd83e3bdae4d6c9244311a668c12dcee5798673"}
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.828985 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lmrpm"]
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.834242 4794 scope.go:117] "RemoveContainer" containerID="4e38abd9e72adaecae1d730b4891373103d29029a4ecd0dd7be468adc6328eca"
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.847988 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lmrpm"]
Mar 10 11:18:07 crc kubenswrapper[4794]: W0310 11:18:07.853429 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabf2624e_ffa0_4251_8661_f688c4699f5a.slice/crio-92871da306b7772d02d6a671c8a71cd46cabcd0c79d2b6719ef5355a4be73490 WatchSource:0}: Error finding container 92871da306b7772d02d6a671c8a71cd46cabcd0c79d2b6719ef5355a4be73490: Status 404 returned error can't find the container with id 92871da306b7772d02d6a671c8a71cd46cabcd0c79d2b6719ef5355a4be73490
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.871693 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 11:18:07 crc kubenswrapper[4794]: I0310 11:18:07.872659 4794 scope.go:117] "RemoveContainer" containerID="7a5f364dca4b20e76e6d8025af628cf70e88f3276bded150c0b61e0159ccb561"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.015471 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4540a698-7f45-4e9f-b38e-11102a1ee435" path="/var/lib/kubelet/pods/4540a698-7f45-4e9f-b38e-11102a1ee435/volumes"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.029899 4794 scope.go:117] "RemoveContainer" containerID="e7e9bcad8ccbc8cdd3dca02ff28379dc1f14ab20eb194ea258f4ec6f18be51b6"
Mar 10 11:18:08 crc kubenswrapper[4794]: E0310 11:18:08.030569 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7e9bcad8ccbc8cdd3dca02ff28379dc1f14ab20eb194ea258f4ec6f18be51b6\": container with ID starting with e7e9bcad8ccbc8cdd3dca02ff28379dc1f14ab20eb194ea258f4ec6f18be51b6 not found: ID does not exist" containerID="e7e9bcad8ccbc8cdd3dca02ff28379dc1f14ab20eb194ea258f4ec6f18be51b6"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.030593 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7e9bcad8ccbc8cdd3dca02ff28379dc1f14ab20eb194ea258f4ec6f18be51b6"} err="failed to get container status \"e7e9bcad8ccbc8cdd3dca02ff28379dc1f14ab20eb194ea258f4ec6f18be51b6\": rpc error: code = NotFound desc = could not find container \"e7e9bcad8ccbc8cdd3dca02ff28379dc1f14ab20eb194ea258f4ec6f18be51b6\": container with ID starting with e7e9bcad8ccbc8cdd3dca02ff28379dc1f14ab20eb194ea258f4ec6f18be51b6 not found: ID does not exist"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.030612 4794 scope.go:117] "RemoveContainer" containerID="4e38abd9e72adaecae1d730b4891373103d29029a4ecd0dd7be468adc6328eca"
Mar 10 11:18:08 crc kubenswrapper[4794]: E0310 11:18:08.035656 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e38abd9e72adaecae1d730b4891373103d29029a4ecd0dd7be468adc6328eca\": container with ID starting with 4e38abd9e72adaecae1d730b4891373103d29029a4ecd0dd7be468adc6328eca not found: ID does not exist" containerID="4e38abd9e72adaecae1d730b4891373103d29029a4ecd0dd7be468adc6328eca"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.035696 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e38abd9e72adaecae1d730b4891373103d29029a4ecd0dd7be468adc6328eca"} err="failed to get container status \"4e38abd9e72adaecae1d730b4891373103d29029a4ecd0dd7be468adc6328eca\": rpc error: code = NotFound desc = could not find container \"4e38abd9e72adaecae1d730b4891373103d29029a4ecd0dd7be468adc6328eca\": container with ID starting with 4e38abd9e72adaecae1d730b4891373103d29029a4ecd0dd7be468adc6328eca not found: ID does not exist"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.035721 4794 scope.go:117] "RemoveContainer" containerID="7a5f364dca4b20e76e6d8025af628cf70e88f3276bded150c0b61e0159ccb561"
Mar 10 11:18:08 crc kubenswrapper[4794]: E0310 11:18:08.039717 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a5f364dca4b20e76e6d8025af628cf70e88f3276bded150c0b61e0159ccb561\": container with ID starting with 7a5f364dca4b20e76e6d8025af628cf70e88f3276bded150c0b61e0159ccb561 not found: ID does not exist" containerID="7a5f364dca4b20e76e6d8025af628cf70e88f3276bded150c0b61e0159ccb561"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.039758 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a5f364dca4b20e76e6d8025af628cf70e88f3276bded150c0b61e0159ccb561"} err="failed to get container status \"7a5f364dca4b20e76e6d8025af628cf70e88f3276bded150c0b61e0159ccb561\": rpc error: code = NotFound desc = could not find container \"7a5f364dca4b20e76e6d8025af628cf70e88f3276bded150c0b61e0159ccb561\": container with ID starting with 7a5f364dca4b20e76e6d8025af628cf70e88f3276bded150c0b61e0159ccb561 not found: ID does not exist"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.140694 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccrj5"]
Mar 10 11:18:08 crc kubenswrapper[4794]: E0310 11:18:08.141530 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4540a698-7f45-4e9f-b38e-11102a1ee435" containerName="extract-utilities"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.141546 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4540a698-7f45-4e9f-b38e-11102a1ee435" containerName="extract-utilities"
Mar 10 11:18:08 crc kubenswrapper[4794]: E0310 11:18:08.141566 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4540a698-7f45-4e9f-b38e-11102a1ee435" containerName="registry-server"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.141573 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4540a698-7f45-4e9f-b38e-11102a1ee435" containerName="registry-server"
Mar 10 11:18:08 crc kubenswrapper[4794]: E0310 11:18:08.141583 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4540a698-7f45-4e9f-b38e-11102a1ee435" containerName="extract-content"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.141590 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4540a698-7f45-4e9f-b38e-11102a1ee435" containerName="extract-content"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.141795 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4540a698-7f45-4e9f-b38e-11102a1ee435" containerName="registry-server"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.142617 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.144464 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.144888 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.150579 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.158710 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccrj5"]
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.223103 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9dd655fcf-vd7zz"]
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.226085 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl6qd\" (UniqueName: \"kubernetes.io/projected/78550cc2-d68e-4b15-98f8-281fb85642df-kube-api-access-xl6qd\") pod \"nova-cell1-conductor-db-sync-ccrj5\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") " pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.226123 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-scripts\") pod \"nova-cell1-conductor-db-sync-ccrj5\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") " pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.226214 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-config-data\") pod \"nova-cell1-conductor-db-sync-ccrj5\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") " pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.226242 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ccrj5\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") " pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.303657 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.329503 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl6qd\" (UniqueName: \"kubernetes.io/projected/78550cc2-d68e-4b15-98f8-281fb85642df-kube-api-access-xl6qd\") pod \"nova-cell1-conductor-db-sync-ccrj5\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") " pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.329624 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-scripts\") pod \"nova-cell1-conductor-db-sync-ccrj5\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") " pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.329764 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-config-data\") pod \"nova-cell1-conductor-db-sync-ccrj5\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") " pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.329938 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ccrj5\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") " pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.333431 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-config-data\") pod \"nova-cell1-conductor-db-sync-ccrj5\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") " pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.334130 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ccrj5\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") " pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.334389 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-scripts\") pod \"nova-cell1-conductor-db-sync-ccrj5\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") " pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.345489 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl6qd\" (UniqueName: \"kubernetes.io/projected/78550cc2-d68e-4b15-98f8-281fb85642df-kube-api-access-xl6qd\") pod \"nova-cell1-conductor-db-sync-ccrj5\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") " pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.466931 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.798718 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e92b7354-097a-42e3-af6b-0d66414c4d4d","Type":"ContainerStarted","Data":"d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6"}
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.798977 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e92b7354-097a-42e3-af6b-0d66414c4d4d","Type":"ContainerStarted","Data":"5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582"}
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.800719 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf2624e-ffa0-4251-8661-f688c4699f5a","Type":"ContainerStarted","Data":"9f0c3f5bb5cac7658eec685d5eee8faa11d60ba5d34ea562347590f6fb17a493"}
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.800744 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf2624e-ffa0-4251-8661-f688c4699f5a","Type":"ContainerStarted","Data":"92871da306b7772d02d6a671c8a71cd46cabcd0c79d2b6719ef5355a4be73490"}
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.802718 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41b2d6-74f1-44b7-982d-56983819f28a","Type":"ContainerStarted","Data":"01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890"}
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.802745 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41b2d6-74f1-44b7-982d-56983819f28a","Type":"ContainerStarted","Data":"e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726"}
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.802757 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41b2d6-74f1-44b7-982d-56983819f28a","Type":"ContainerStarted","Data":"14943685051ca7a320d24e99fe954e86d5b4fd77aad8d49bb501c493cfa612de"}
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.805018 4794 generic.go:334] "Generic (PLEG): container finished" podID="647826ea-505e-495e-8fdc-004e7e432df8" containerID="9f44948ec3a32d4d69ec6573f79079ea88779de41621c1c32f05b724d05edae9" exitCode=0
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.805088 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" event={"ID":"647826ea-505e-495e-8fdc-004e7e432df8","Type":"ContainerDied","Data":"9f44948ec3a32d4d69ec6573f79079ea88779de41621c1c32f05b724d05edae9"}
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.805118 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" event={"ID":"647826ea-505e-495e-8fdc-004e7e432df8","Type":"ContainerStarted","Data":"b337d0f6423286d6e6834e2bce14382a5cac1f18dc201e9dd181192bced486e2"}
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.808737 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4sqhj" event={"ID":"0ebea665-f36f-45ef-95a5-bdeacd279dd3","Type":"ContainerStarted","Data":"2892665528e4b37193ed63e0bb5ff16567ce35d3f8ea7280dc41b117ce794dcf"}
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.810321 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23","Type":"ContainerStarted","Data":"bd16b4f53a35e5576da7d1a0f331f057b122c2a8f1b9a4d858b904c8e01ec28d"}
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.810371 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23","Type":"ContainerStarted","Data":"2b2b77342842d19af067333b539aaf3cc2f04b030dc04c5d740e9ff3322481f0"}
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.823459 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.823439118 podStartE2EDuration="2.823439118s" podCreationTimestamp="2026-03-10 11:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:08.81546161 +0000 UTC m=+5637.571632418" watchObservedRunningTime="2026-03-10 11:18:08.823439118 +0000 UTC m=+5637.579609936"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.844368 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.844347709 podStartE2EDuration="1.844347709s" podCreationTimestamp="2026-03-10 11:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:08.836113863 +0000 UTC m=+5637.592284681" watchObservedRunningTime="2026-03-10 11:18:08.844347709 +0000 UTC m=+5637.600518527"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.861904 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.8618824849999998 podStartE2EDuration="2.861882485s" podCreationTimestamp="2026-03-10 11:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:08.85210319 +0000 UTC m=+5637.608274008" watchObservedRunningTime="2026-03-10 11:18:08.861882485 +0000 UTC m=+5637.618053313"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.905244 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.905225623 podStartE2EDuration="1.905225623s" podCreationTimestamp="2026-03-10 11:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:08.904365226 +0000 UTC m=+5637.660536044" watchObservedRunningTime="2026-03-10 11:18:08.905225623 +0000 UTC m=+5637.661396441"
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.933999 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccrj5"]
Mar 10 11:18:08 crc kubenswrapper[4794]: I0310 11:18:08.936873 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-4sqhj" podStartSLOduration=2.9368490080000003 podStartE2EDuration="2.936849008s" podCreationTimestamp="2026-03-10 11:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:08.92245287 +0000 UTC m=+5637.678623688" watchObservedRunningTime="2026-03-10 11:18:08.936849008 +0000 UTC m=+5637.693019836"
Mar 10 11:18:09 crc kubenswrapper[4794]: I0310 11:18:09.817961 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccrj5" event={"ID":"78550cc2-d68e-4b15-98f8-281fb85642df","Type":"ContainerStarted","Data":"5ef4cdc386b2a6e49916e7674d285fa07042c3f40995f8a61f9371bb22c25ee2"}
Mar 10 11:18:09 crc kubenswrapper[4794]: I0310 11:18:09.818200 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccrj5" event={"ID":"78550cc2-d68e-4b15-98f8-281fb85642df","Type":"ContainerStarted","Data":"38089fc025072b7acbdad2b462110d593ac303fd729e8467b17ddbb5348fb762"}
Mar 10 11:18:09 crc kubenswrapper[4794]: I0310 11:18:09.822960 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" event={"ID":"647826ea-505e-495e-8fdc-004e7e432df8","Type":"ContainerStarted","Data":"b847f84fb6dceb7e782c14ea46054ce699b41e5d821d4cf993ce4e3449c9b7af"}
Mar 10 11:18:09 crc kubenswrapper[4794]: I0310 11:18:09.823042 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz"
Mar 10 11:18:09 crc kubenswrapper[4794]: I0310 11:18:09.836955 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-ccrj5" podStartSLOduration=1.836938886 podStartE2EDuration="1.836938886s" podCreationTimestamp="2026-03-10 11:18:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:09.83224428 +0000 UTC m=+5638.588415098" watchObservedRunningTime="2026-03-10 11:18:09.836938886 +0000 UTC m=+5638.593109704"
Mar 10 11:18:09 crc kubenswrapper[4794]: I0310 11:18:09.869229 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" podStartSLOduration=2.869202481 podStartE2EDuration="2.869202481s" podCreationTimestamp="2026-03-10 11:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:09.854185063 +0000 UTC m=+5638.610355881" watchObservedRunningTime="2026-03-10 11:18:09.869202481 +0000 UTC m=+5638.625373339"
Mar 10 11:18:11 crc kubenswrapper[4794]: I0310 11:18:11.852537 4794 generic.go:334] "Generic (PLEG): container finished" podID="78550cc2-d68e-4b15-98f8-281fb85642df" containerID="5ef4cdc386b2a6e49916e7674d285fa07042c3f40995f8a61f9371bb22c25ee2" exitCode=0
Mar 10 11:18:11 crc kubenswrapper[4794]: I0310 11:18:11.853197 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccrj5" event={"ID":"78550cc2-d68e-4b15-98f8-281fb85642df","Type":"ContainerDied","Data":"5ef4cdc386b2a6e49916e7674d285fa07042c3f40995f8a61f9371bb22c25ee2"}
Mar 10 11:18:12 crc kubenswrapper[4794]: I0310 11:18:12.357940 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 10 11:18:12 crc kubenswrapper[4794]: I0310 11:18:12.526775 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 10 11:18:12 crc kubenswrapper[4794]: I0310 11:18:12.526825 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 10 11:18:12 crc kubenswrapper[4794]: I0310 11:18:12.567707 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 11:18:12 crc kubenswrapper[4794]: I0310 11:18:12.869836 4794 generic.go:334] "Generic (PLEG): container finished" podID="0ebea665-f36f-45ef-95a5-bdeacd279dd3" containerID="2892665528e4b37193ed63e0bb5ff16567ce35d3f8ea7280dc41b117ce794dcf" exitCode=0
Mar 10 11:18:12 crc kubenswrapper[4794]: I0310 11:18:12.870171 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4sqhj" event={"ID":"0ebea665-f36f-45ef-95a5-bdeacd279dd3","Type":"ContainerDied","Data":"2892665528e4b37193ed63e0bb5ff16567ce35d3f8ea7280dc41b117ce794dcf"}
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.249753 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.329375 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-scripts\") pod \"78550cc2-d68e-4b15-98f8-281fb85642df\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") "
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.329522 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl6qd\" (UniqueName: \"kubernetes.io/projected/78550cc2-d68e-4b15-98f8-281fb85642df-kube-api-access-xl6qd\") pod \"78550cc2-d68e-4b15-98f8-281fb85642df\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") "
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.329555 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-config-data\") pod \"78550cc2-d68e-4b15-98f8-281fb85642df\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") "
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.329638 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-combined-ca-bundle\") pod \"78550cc2-d68e-4b15-98f8-281fb85642df\" (UID: \"78550cc2-d68e-4b15-98f8-281fb85642df\") "
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.335734 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78550cc2-d68e-4b15-98f8-281fb85642df-kube-api-access-xl6qd" (OuterVolumeSpecName: "kube-api-access-xl6qd") pod "78550cc2-d68e-4b15-98f8-281fb85642df" (UID: "78550cc2-d68e-4b15-98f8-281fb85642df"). InnerVolumeSpecName "kube-api-access-xl6qd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.336051 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-scripts" (OuterVolumeSpecName: "scripts") pod "78550cc2-d68e-4b15-98f8-281fb85642df" (UID: "78550cc2-d68e-4b15-98f8-281fb85642df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.370596 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78550cc2-d68e-4b15-98f8-281fb85642df" (UID: "78550cc2-d68e-4b15-98f8-281fb85642df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.377991 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-config-data" (OuterVolumeSpecName: "config-data") pod "78550cc2-d68e-4b15-98f8-281fb85642df" (UID: "78550cc2-d68e-4b15-98f8-281fb85642df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.432295 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl6qd\" (UniqueName: \"kubernetes.io/projected/78550cc2-d68e-4b15-98f8-281fb85642df-kube-api-access-xl6qd\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.432503 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.432590 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.432668 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78550cc2-d68e-4b15-98f8-281fb85642df-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.885650 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccrj5"
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.888551 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccrj5" event={"ID":"78550cc2-d68e-4b15-98f8-281fb85642df","Type":"ContainerDied","Data":"38089fc025072b7acbdad2b462110d593ac303fd729e8467b17ddbb5348fb762"}
Mar 10 11:18:13 crc kubenswrapper[4794]: I0310 11:18:13.888612 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38089fc025072b7acbdad2b462110d593ac303fd729e8467b17ddbb5348fb762"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.020584 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 10 11:18:14 crc kubenswrapper[4794]: E0310 11:18:14.020920 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78550cc2-d68e-4b15-98f8-281fb85642df" containerName="nova-cell1-conductor-db-sync"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.020941 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="78550cc2-d68e-4b15-98f8-281fb85642df" containerName="nova-cell1-conductor-db-sync"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.021166 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="78550cc2-d68e-4b15-98f8-281fb85642df" containerName="nova-cell1-conductor-db-sync"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.021904 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.021994 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.024819 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.153919 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69d4f681-c12f-43fa-8ec0-95e90bff92a8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.154022 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d4f681-c12f-43fa-8ec0-95e90bff92a8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.154535 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc7dd\" (UniqueName: \"kubernetes.io/projected/69d4f681-c12f-43fa-8ec0-95e90bff92a8-kube-api-access-gc7dd\") pod \"nova-cell1-conductor-0\" (UID: \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.256717 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69d4f681-c12f-43fa-8ec0-95e90bff92a8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.256846 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d4f681-c12f-43fa-8ec0-95e90bff92a8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.256917 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc7dd\" (UniqueName: \"kubernetes.io/projected/69d4f681-c12f-43fa-8ec0-95e90bff92a8-kube-api-access-gc7dd\") pod \"nova-cell1-conductor-0\" (UID: \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.262362 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69d4f681-c12f-43fa-8ec0-95e90bff92a8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.263399 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d4f681-c12f-43fa-8ec0-95e90bff92a8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.284238 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc7dd\" (UniqueName: \"kubernetes.io/projected/69d4f681-c12f-43fa-8ec0-95e90bff92a8-kube-api-access-gc7dd\") pod \"nova-cell1-conductor-0\" (UID: \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.340755 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4sqhj"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.348845 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.504477 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-config-data\") pod \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") "
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.505156 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-scripts\") pod \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") "
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.505231 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmlzl\" (UniqueName: \"kubernetes.io/projected/0ebea665-f36f-45ef-95a5-bdeacd279dd3-kube-api-access-vmlzl\") pod \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") "
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.505413 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-combined-ca-bundle\") pod \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\" (UID: \"0ebea665-f36f-45ef-95a5-bdeacd279dd3\") "
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.509100 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-scripts" (OuterVolumeSpecName: "scripts") pod "0ebea665-f36f-45ef-95a5-bdeacd279dd3" (UID: "0ebea665-f36f-45ef-95a5-bdeacd279dd3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.510495 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ebea665-f36f-45ef-95a5-bdeacd279dd3-kube-api-access-vmlzl" (OuterVolumeSpecName: "kube-api-access-vmlzl") pod "0ebea665-f36f-45ef-95a5-bdeacd279dd3" (UID: "0ebea665-f36f-45ef-95a5-bdeacd279dd3"). InnerVolumeSpecName "kube-api-access-vmlzl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.540754 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-config-data" (OuterVolumeSpecName: "config-data") pod "0ebea665-f36f-45ef-95a5-bdeacd279dd3" (UID: "0ebea665-f36f-45ef-95a5-bdeacd279dd3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.544683 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ebea665-f36f-45ef-95a5-bdeacd279dd3" (UID: "0ebea665-f36f-45ef-95a5-bdeacd279dd3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.608085 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.608345 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmlzl\" (UniqueName: \"kubernetes.io/projected/0ebea665-f36f-45ef-95a5-bdeacd279dd3-kube-api-access-vmlzl\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.608422 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.608494 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ebea665-f36f-45ef-95a5-bdeacd279dd3-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.675092 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 10 11:18:14 crc kubenswrapper[4794]: W0310 11:18:14.685880 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69d4f681_c12f_43fa_8ec0_95e90bff92a8.slice/crio-9ecf37b1a495c4f494c7287c49a87edccaa876c69bfb040fcba8ba38bf7f9224 WatchSource:0}: Error finding container 9ecf37b1a495c4f494c7287c49a87edccaa876c69bfb040fcba8ba38bf7f9224: Status 404 returned error can't find the container with id 9ecf37b1a495c4f494c7287c49a87edccaa876c69bfb040fcba8ba38bf7f9224
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.898017 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4sqhj"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.898314 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4sqhj" event={"ID":"0ebea665-f36f-45ef-95a5-bdeacd279dd3","Type":"ContainerDied","Data":"996695bb86c50f92acd572f0f5ce4522a21cf69c7ff1e13ca9b035e914549000"}
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.898396 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="996695bb86c50f92acd572f0f5ce4522a21cf69c7ff1e13ca9b035e914549000"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.900447 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69d4f681-c12f-43fa-8ec0-95e90bff92a8","Type":"ContainerStarted","Data":"130a3de147a7e7545561c7fc8bf64d08fa7458da93bcbf14a7d765d68e6e1f56"}
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.900479 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69d4f681-c12f-43fa-8ec0-95e90bff92a8","Type":"ContainerStarted","Data":"9ecf37b1a495c4f494c7287c49a87edccaa876c69bfb040fcba8ba38bf7f9224"}
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.902022 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.935746 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.935717099 podStartE2EDuration="1.935717099s" podCreationTimestamp="2026-03-10 11:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:14.919762903 +0000 UTC m=+5643.675933771" watchObservedRunningTime="2026-03-10 11:18:14.935717099 +0000 UTC m=+5643.691887927"
Mar 10 11:18:14 crc kubenswrapper[4794]: I0310 11:18:14.999228 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157"
Mar 10 11:18:14 crc kubenswrapper[4794]: E0310 11:18:14.999645 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.147414 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.147736 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e92b7354-097a-42e3-af6b-0d66414c4d4d" containerName="nova-api-log" containerID="cri-o://5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582" gracePeriod=30
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.148239 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e92b7354-097a-42e3-af6b-0d66414c4d4d" containerName="nova-api-api" containerID="cri-o://d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6" gracePeriod=30
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.166056 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.166268 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="abf2624e-ffa0-4251-8661-f688c4699f5a" containerName="nova-scheduler-scheduler" containerID="cri-o://9f0c3f5bb5cac7658eec685d5eee8faa11d60ba5d34ea562347590f6fb17a493" gracePeriod=30
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.219782 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.220042 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8d41b2d6-74f1-44b7-982d-56983819f28a" containerName="nova-metadata-log" containerID="cri-o://e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726" gracePeriod=30
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.220079 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8d41b2d6-74f1-44b7-982d-56983819f28a" containerName="nova-metadata-metadata" containerID="cri-o://01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890" gracePeriod=30
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.664750 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.743185 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.830531 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41b2d6-74f1-44b7-982d-56983819f28a-combined-ca-bundle\") pod \"8d41b2d6-74f1-44b7-982d-56983819f28a\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") "
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.830585 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41b2d6-74f1-44b7-982d-56983819f28a-config-data\") pod \"8d41b2d6-74f1-44b7-982d-56983819f28a\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") "
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.830715 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e92b7354-097a-42e3-af6b-0d66414c4d4d-config-data\") pod \"e92b7354-097a-42e3-af6b-0d66414c4d4d\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") "
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.830755 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw448\" (UniqueName: \"kubernetes.io/projected/8d41b2d6-74f1-44b7-982d-56983819f28a-kube-api-access-tw448\") pod \"8d41b2d6-74f1-44b7-982d-56983819f28a\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") "
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.830790 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41b2d6-74f1-44b7-982d-56983819f28a-logs\") pod \"8d41b2d6-74f1-44b7-982d-56983819f28a\" (UID: \"8d41b2d6-74f1-44b7-982d-56983819f28a\") "
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.830822 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k5jc\" (UniqueName: \"kubernetes.io/projected/e92b7354-097a-42e3-af6b-0d66414c4d4d-kube-api-access-4k5jc\") pod \"e92b7354-097a-42e3-af6b-0d66414c4d4d\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") "
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.830876 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92b7354-097a-42e3-af6b-0d66414c4d4d-combined-ca-bundle\") pod \"e92b7354-097a-42e3-af6b-0d66414c4d4d\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") "
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.830953 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e92b7354-097a-42e3-af6b-0d66414c4d4d-logs\") pod \"e92b7354-097a-42e3-af6b-0d66414c4d4d\" (UID: \"e92b7354-097a-42e3-af6b-0d66414c4d4d\") "
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.831686 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d41b2d6-74f1-44b7-982d-56983819f28a-logs" (OuterVolumeSpecName: "logs") pod "8d41b2d6-74f1-44b7-982d-56983819f28a" (UID: "8d41b2d6-74f1-44b7-982d-56983819f28a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.831860 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e92b7354-097a-42e3-af6b-0d66414c4d4d-logs" (OuterVolumeSpecName: "logs") pod "e92b7354-097a-42e3-af6b-0d66414c4d4d" (UID: "e92b7354-097a-42e3-af6b-0d66414c4d4d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.835194 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e92b7354-097a-42e3-af6b-0d66414c4d4d-kube-api-access-4k5jc" (OuterVolumeSpecName: "kube-api-access-4k5jc") pod "e92b7354-097a-42e3-af6b-0d66414c4d4d" (UID: "e92b7354-097a-42e3-af6b-0d66414c4d4d"). InnerVolumeSpecName "kube-api-access-4k5jc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.838609 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d41b2d6-74f1-44b7-982d-56983819f28a-kube-api-access-tw448" (OuterVolumeSpecName: "kube-api-access-tw448") pod "8d41b2d6-74f1-44b7-982d-56983819f28a" (UID: "8d41b2d6-74f1-44b7-982d-56983819f28a"). InnerVolumeSpecName "kube-api-access-tw448". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.853517 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92b7354-097a-42e3-af6b-0d66414c4d4d-config-data" (OuterVolumeSpecName: "config-data") pod "e92b7354-097a-42e3-af6b-0d66414c4d4d" (UID: "e92b7354-097a-42e3-af6b-0d66414c4d4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.853883 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d41b2d6-74f1-44b7-982d-56983819f28a-config-data" (OuterVolumeSpecName: "config-data") pod "8d41b2d6-74f1-44b7-982d-56983819f28a" (UID: "8d41b2d6-74f1-44b7-982d-56983819f28a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.857239 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d41b2d6-74f1-44b7-982d-56983819f28a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d41b2d6-74f1-44b7-982d-56983819f28a" (UID: "8d41b2d6-74f1-44b7-982d-56983819f28a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.865684 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92b7354-097a-42e3-af6b-0d66414c4d4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e92b7354-097a-42e3-af6b-0d66414c4d4d" (UID: "e92b7354-097a-42e3-af6b-0d66414c4d4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.911475 4794 generic.go:334] "Generic (PLEG): container finished" podID="e92b7354-097a-42e3-af6b-0d66414c4d4d" containerID="d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6" exitCode=0
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.911511 4794 generic.go:334] "Generic (PLEG): container finished" podID="e92b7354-097a-42e3-af6b-0d66414c4d4d" containerID="5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582" exitCode=143
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.911564 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e92b7354-097a-42e3-af6b-0d66414c4d4d","Type":"ContainerDied","Data":"d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6"}
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.911594 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e92b7354-097a-42e3-af6b-0d66414c4d4d","Type":"ContainerDied","Data":"5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582"}
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.911607 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e92b7354-097a-42e3-af6b-0d66414c4d4d","Type":"ContainerDied","Data":"75b2ccadea49ed375af09026efd83e3bdae4d6c9244311a668c12dcee5798673"}
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.911625 4794 scope.go:117] "RemoveContainer" containerID="d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.911746 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.918357 4794 generic.go:334] "Generic (PLEG): container finished" podID="8d41b2d6-74f1-44b7-982d-56983819f28a" containerID="01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890" exitCode=0
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.918402 4794 generic.go:334] "Generic (PLEG): container finished" podID="8d41b2d6-74f1-44b7-982d-56983819f28a" containerID="e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726" exitCode=143
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.918554 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.918640 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41b2d6-74f1-44b7-982d-56983819f28a","Type":"ContainerDied","Data":"01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890"}
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.918677 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41b2d6-74f1-44b7-982d-56983819f28a","Type":"ContainerDied","Data":"e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726"}
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.918687 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41b2d6-74f1-44b7-982d-56983819f28a","Type":"ContainerDied","Data":"14943685051ca7a320d24e99fe954e86d5b4fd77aad8d49bb501c493cfa612de"}
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.934228 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e92b7354-097a-42e3-af6b-0d66414c4d4d-logs\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.934255 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41b2d6-74f1-44b7-982d-56983819f28a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.934266 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41b2d6-74f1-44b7-982d-56983819f28a-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.934277 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e92b7354-097a-42e3-af6b-0d66414c4d4d-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.934286 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw448\" (UniqueName: \"kubernetes.io/projected/8d41b2d6-74f1-44b7-982d-56983819f28a-kube-api-access-tw448\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.934295 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41b2d6-74f1-44b7-982d-56983819f28a-logs\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.934305 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4k5jc\" (UniqueName: \"kubernetes.io/projected/e92b7354-097a-42e3-af6b-0d66414c4d4d-kube-api-access-4k5jc\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.934314 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92b7354-097a-42e3-af6b-0d66414c4d4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.944056 4794 scope.go:117] "RemoveContainer" containerID="5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.965690 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.979933 4794 scope.go:117] "RemoveContainer" containerID="d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6"
Mar 10 11:18:15 crc kubenswrapper[4794]: E0310 11:18:15.980431 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6\": container with ID starting with d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6 not found: ID does not exist" containerID="d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.980460 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6"} err="failed to get container status \"d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6\": rpc error: code = NotFound desc = could not find container \"d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6\": container with ID starting with d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6 not found: ID does not exist"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.980482 4794 scope.go:117] "RemoveContainer" containerID="5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.980904 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 10 11:18:15 crc kubenswrapper[4794]: E0310 11:18:15.981194 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582\": container with ID starting with 5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582 not found: ID does not exist" containerID="5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.981229 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582"} err="failed to get container status \"5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582\": rpc error: code = NotFound desc = could not find container \"5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582\": container with ID starting with 5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582 not found: ID does not exist"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.981245 4794 scope.go:117] "RemoveContainer" containerID="d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.981637 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6"} err="failed to get container status \"d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6\": rpc error: code = NotFound desc = could not find container \"d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6\": container with ID starting with d696cd902eee6738838e1d4611ea33381aa5fd16b8a19fcd0247b346782038e6 not found: ID does not exist"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.981652 4794 scope.go:117] "RemoveContainer" containerID="5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582"
Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.981862 4794
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582"} err="failed to get container status \"5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582\": rpc error: code = NotFound desc = could not find container \"5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582\": container with ID starting with 5e829d653787d159944b5fcd451c12dd75ac1133cc579a0dc350c03d0ba14582 not found: ID does not exist" Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.981877 4794 scope.go:117] "RemoveContainer" containerID="01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890" Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.991966 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 11:18:15 crc kubenswrapper[4794]: E0310 11:18:15.992424 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e92b7354-097a-42e3-af6b-0d66414c4d4d" containerName="nova-api-api" Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.992440 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92b7354-097a-42e3-af6b-0d66414c4d4d" containerName="nova-api-api" Mar 10 11:18:15 crc kubenswrapper[4794]: E0310 11:18:15.992467 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d41b2d6-74f1-44b7-982d-56983819f28a" containerName="nova-metadata-log" Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.992473 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d41b2d6-74f1-44b7-982d-56983819f28a" containerName="nova-metadata-log" Mar 10 11:18:15 crc kubenswrapper[4794]: E0310 11:18:15.992483 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d41b2d6-74f1-44b7-982d-56983819f28a" containerName="nova-metadata-metadata" Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.992489 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d41b2d6-74f1-44b7-982d-56983819f28a" containerName="nova-metadata-metadata" Mar 10 11:18:15 crc kubenswrapper[4794]: E0310 11:18:15.992502 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e92b7354-097a-42e3-af6b-0d66414c4d4d" containerName="nova-api-log" Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.992508 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92b7354-097a-42e3-af6b-0d66414c4d4d" containerName="nova-api-log" Mar 10 11:18:15 crc kubenswrapper[4794]: E0310 11:18:15.992529 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebea665-f36f-45ef-95a5-bdeacd279dd3" containerName="nova-manage" Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.992537 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebea665-f36f-45ef-95a5-bdeacd279dd3" containerName="nova-manage" Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.992675 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d41b2d6-74f1-44b7-982d-56983819f28a" containerName="nova-metadata-metadata" Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.992696 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e92b7354-097a-42e3-af6b-0d66414c4d4d" containerName="nova-api-api" Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.992706 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ebea665-f36f-45ef-95a5-bdeacd279dd3" containerName="nova-manage" Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.992713 4794 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="8d41b2d6-74f1-44b7-982d-56983819f28a" containerName="nova-metadata-log" Mar 10 11:18:15 crc kubenswrapper[4794]: I0310 11:18:15.992723 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e92b7354-097a-42e3-af6b-0d66414c4d4d" containerName="nova-api-log" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.015135 4794 scope.go:117] "RemoveContainer" containerID="e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.015737 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.018861 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.055171 4794 scope.go:117] "RemoveContainer" containerID="01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890" Mar 10 11:18:16 crc kubenswrapper[4794]: E0310 11:18:16.055569 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890\": container with ID starting with 01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890 not found: ID does not exist" containerID="01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.055609 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890"} err="failed to get container status \"01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890\": rpc error: code = NotFound desc = could not find container \"01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890\": container with ID starting with 01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890 not found: ID does not exist" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.055660 4794 scope.go:117] "RemoveContainer" containerID="e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726" Mar 10 11:18:16 crc kubenswrapper[4794]: E0310 11:18:16.056016 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726\": container with ID starting with e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726 not found: ID does not exist" containerID="e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.056043 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726"} err="failed to get container status \"e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726\": rpc error: code = NotFound desc = could not find container \"e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726\": container with ID starting with e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726 not found: ID does not exist" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.056064 4794 scope.go:117] "RemoveContainer" containerID="01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.056280 4794 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e92b7354-097a-42e3-af6b-0d66414c4d4d" path="/var/lib/kubelet/pods/e92b7354-097a-42e3-af6b-0d66414c4d4d/volumes" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.056396 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890"} err="failed to get container status \"01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890\": rpc error: code = NotFound desc = could not find container \"01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890\": container with ID starting with 01ed2fda3f4c80509970f19aeebb1c3bb26a19bb4d92ebc2e186c1f118ac7890 not found: ID does not exist" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.056438 4794 scope.go:117] "RemoveContainer" containerID="e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.057491 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.057531 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.057553 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.057685 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726"} err="failed to get container status \"e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726\": rpc error: code = NotFound desc = could not find container \"e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726\": container with ID starting with e21d79c43231f2be99e22493b1f42ab3cfd085caafe4ac3ba1a0b9d42ac3c726 not found: ID does not exist" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.065179 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.066636 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.068114 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.076236 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.145116 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33361ebe-bd4e-4b9a-9753-f1ff87f27959-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " pod="openstack/nova-api-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.145157 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h27s\" (UniqueName: \"kubernetes.io/projected/33361ebe-bd4e-4b9a-9753-f1ff87f27959-kube-api-access-7h27s\") pod \"nova-api-0\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " pod="openstack/nova-api-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.145205 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33361ebe-bd4e-4b9a-9753-f1ff87f27959-config-data\") pod \"nova-api-0\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " pod="openstack/nova-api-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.145364 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33361ebe-bd4e-4b9a-9753-f1ff87f27959-logs\") pod \"nova-api-0\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " pod="openstack/nova-api-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.247759 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33361ebe-bd4e-4b9a-9753-f1ff87f27959-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " pod="openstack/nova-api-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.247800 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h27s\" (UniqueName: \"kubernetes.io/projected/33361ebe-bd4e-4b9a-9753-f1ff87f27959-kube-api-access-7h27s\") pod \"nova-api-0\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " pod="openstack/nova-api-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.247839 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33361ebe-bd4e-4b9a-9753-f1ff87f27959-config-data\") pod \"nova-api-0\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " pod="openstack/nova-api-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.247865 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed09c21-26a5-44b3-a0fb-1fb40f42448e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " pod="openstack/nova-metadata-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.247908 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33361ebe-bd4e-4b9a-9753-f1ff87f27959-logs\") pod 
\"nova-api-0\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " pod="openstack/nova-api-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.247959 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4gwk\" (UniqueName: \"kubernetes.io/projected/aed09c21-26a5-44b3-a0fb-1fb40f42448e-kube-api-access-p4gwk\") pod \"nova-metadata-0\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " pod="openstack/nova-metadata-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.247998 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed09c21-26a5-44b3-a0fb-1fb40f42448e-config-data\") pod \"nova-metadata-0\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " pod="openstack/nova-metadata-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.248023 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aed09c21-26a5-44b3-a0fb-1fb40f42448e-logs\") pod \"nova-metadata-0\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " pod="openstack/nova-metadata-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.248470 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33361ebe-bd4e-4b9a-9753-f1ff87f27959-logs\") pod \"nova-api-0\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " pod="openstack/nova-api-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.253886 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33361ebe-bd4e-4b9a-9753-f1ff87f27959-config-data\") pod \"nova-api-0\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " pod="openstack/nova-api-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.255358 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33361ebe-bd4e-4b9a-9753-f1ff87f27959-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " pod="openstack/nova-api-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.274714 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h27s\" (UniqueName: \"kubernetes.io/projected/33361ebe-bd4e-4b9a-9753-f1ff87f27959-kube-api-access-7h27s\") pod \"nova-api-0\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " pod="openstack/nova-api-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.349620 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4gwk\" (UniqueName: \"kubernetes.io/projected/aed09c21-26a5-44b3-a0fb-1fb40f42448e-kube-api-access-p4gwk\") pod \"nova-metadata-0\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " pod="openstack/nova-metadata-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.349709 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed09c21-26a5-44b3-a0fb-1fb40f42448e-config-data\") pod \"nova-metadata-0\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " pod="openstack/nova-metadata-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.349749 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/aed09c21-26a5-44b3-a0fb-1fb40f42448e-logs\") pod \"nova-metadata-0\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " pod="openstack/nova-metadata-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.349811 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed09c21-26a5-44b3-a0fb-1fb40f42448e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " pod="openstack/nova-metadata-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.350249 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aed09c21-26a5-44b3-a0fb-1fb40f42448e-logs\") pod \"nova-metadata-0\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " pod="openstack/nova-metadata-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.354628 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed09c21-26a5-44b3-a0fb-1fb40f42448e-config-data\") pod \"nova-metadata-0\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " pod="openstack/nova-metadata-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.356318 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed09c21-26a5-44b3-a0fb-1fb40f42448e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " pod="openstack/nova-metadata-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.357627 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.366468 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4gwk\" (UniqueName: \"kubernetes.io/projected/aed09c21-26a5-44b3-a0fb-1fb40f42448e-kube-api-access-p4gwk\") pod \"nova-metadata-0\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " pod="openstack/nova-metadata-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.380475 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.900765 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 11:18:16 crc kubenswrapper[4794]: W0310 11:18:16.906661 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33361ebe_bd4e_4b9a_9753_f1ff87f27959.slice/crio-89d9db83a5df2558dd498732948dc5fa84d84a40e55ed84d966be33a6989ec55 WatchSource:0}: Error finding container 89d9db83a5df2558dd498732948dc5fa84d84a40e55ed84d966be33a6989ec55: Status 404 returned error can't find the container with id 89d9db83a5df2558dd498732948dc5fa84d84a40e55ed84d966be33a6989ec55 Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.944002 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33361ebe-bd4e-4b9a-9753-f1ff87f27959","Type":"ContainerStarted","Data":"89d9db83a5df2558dd498732948dc5fa84d84a40e55ed84d966be33a6989ec55"} Mar 10 11:18:16 crc kubenswrapper[4794]: I0310 11:18:16.982437 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:18:17 crc kubenswrapper[4794]: I0310 11:18:17.556596 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" Mar 10 11:18:17 crc kubenswrapper[4794]: I0310 11:18:17.567854 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:18:17 crc kubenswrapper[4794]: I0310 11:18:17.583970 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:18:17 crc kubenswrapper[4794]: I0310 11:18:17.692923 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f9b86ff59-7wf9d"] Mar 10 11:18:17 crc kubenswrapper[4794]: I0310 11:18:17.693166 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" podUID="66e0ef86-58d0-4ad0-9336-236773558c09" containerName="dnsmasq-dns" containerID="cri-o://d3a20ab9001d55da97233357f587e7c43f82752707564eaac01064851811a183" gracePeriod=10 Mar 10 11:18:17 crc kubenswrapper[4794]: I0310 11:18:17.958881 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aed09c21-26a5-44b3-a0fb-1fb40f42448e","Type":"ContainerStarted","Data":"6b050bf1ace311f075b59692212e4f8cb7de41381b3240134cb0ed62afc4c6bd"} Mar 10 11:18:17 crc kubenswrapper[4794]: I0310 11:18:17.958938 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aed09c21-26a5-44b3-a0fb-1fb40f42448e","Type":"ContainerStarted","Data":"5271312361a1bf1ed6f58b9b58c96ceb50b111fcc6843eaaf57561b41806311b"} Mar 10 11:18:17 crc kubenswrapper[4794]: I0310 11:18:17.958953 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aed09c21-26a5-44b3-a0fb-1fb40f42448e","Type":"ContainerStarted","Data":"33f551e4afe3fd1a14290114dd98caef8fa15376d4ba4ffff86af81aa19f21a3"} Mar 10 11:18:17 crc kubenswrapper[4794]: I0310 11:18:17.963318 4794 generic.go:334] "Generic (PLEG): container finished" podID="66e0ef86-58d0-4ad0-9336-236773558c09" containerID="d3a20ab9001d55da97233357f587e7c43f82752707564eaac01064851811a183" exitCode=0 Mar 10 11:18:17 crc kubenswrapper[4794]: I0310 11:18:17.963374 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" event={"ID":"66e0ef86-58d0-4ad0-9336-236773558c09","Type":"ContainerDied","Data":"d3a20ab9001d55da97233357f587e7c43f82752707564eaac01064851811a183"} Mar 10 11:18:17 crc kubenswrapper[4794]: I0310 11:18:17.966050 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33361ebe-bd4e-4b9a-9753-f1ff87f27959","Type":"ContainerStarted","Data":"2d4426b59d32b121886d26555ee294263928c122129176e03bd543b7b8799cc0"} Mar 10 11:18:17 crc kubenswrapper[4794]: I0310 11:18:17.966107 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33361ebe-bd4e-4b9a-9753-f1ff87f27959","Type":"ContainerStarted","Data":"5e4b840434241251d6a3ba3525ecf5233d3a201bbfea5f4ffa9ff8d84f629895"} Mar 10 11:18:17 crc kubenswrapper[4794]: I0310 11:18:17.976567 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.976552983 podStartE2EDuration="2.976552983s" podCreationTimestamp="2026-03-10 11:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:17.975873202 +0000 UTC m=+5646.732044040" watchObservedRunningTime="2026-03-10 11:18:17.976552983 +0000 UTC m=+5646.732723801" Mar 10 11:18:17 crc kubenswrapper[4794]: I0310 11:18:17.980923 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.016647 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.01663216 podStartE2EDuration="3.01663216s" podCreationTimestamp="2026-03-10 11:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:17.997154454 +0000 UTC m=+5646.753325282" watchObservedRunningTime="2026-03-10 11:18:18.01663216 +0000 UTC m=+5646.772802978" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.027303 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d41b2d6-74f1-44b7-982d-56983819f28a" path="/var/lib/kubelet/pods/8d41b2d6-74f1-44b7-982d-56983819f28a/volumes" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.198418 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.295811 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-dns-svc\") pod \"66e0ef86-58d0-4ad0-9336-236773558c09\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.295901 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-ovsdbserver-sb\") pod \"66e0ef86-58d0-4ad0-9336-236773558c09\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.296033 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-config\") pod \"66e0ef86-58d0-4ad0-9336-236773558c09\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.296118 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2l8z\" (UniqueName: \"kubernetes.io/projected/66e0ef86-58d0-4ad0-9336-236773558c09-kube-api-access-m2l8z\") pod \"66e0ef86-58d0-4ad0-9336-236773558c09\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.296811 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-ovsdbserver-nb\") pod \"66e0ef86-58d0-4ad0-9336-236773558c09\" (UID: \"66e0ef86-58d0-4ad0-9336-236773558c09\") " Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.301421 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66e0ef86-58d0-4ad0-9336-236773558c09-kube-api-access-m2l8z" (OuterVolumeSpecName: "kube-api-access-m2l8z") pod "66e0ef86-58d0-4ad0-9336-236773558c09" (UID: "66e0ef86-58d0-4ad0-9336-236773558c09"). InnerVolumeSpecName "kube-api-access-m2l8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.338079 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "66e0ef86-58d0-4ad0-9336-236773558c09" (UID: "66e0ef86-58d0-4ad0-9336-236773558c09"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.341054 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "66e0ef86-58d0-4ad0-9336-236773558c09" (UID: "66e0ef86-58d0-4ad0-9336-236773558c09"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.352231 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-config" (OuterVolumeSpecName: "config") pod "66e0ef86-58d0-4ad0-9336-236773558c09" (UID: "66e0ef86-58d0-4ad0-9336-236773558c09"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.363553 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "66e0ef86-58d0-4ad0-9336-236773558c09" (UID: "66e0ef86-58d0-4ad0-9336-236773558c09"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.398893 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.399168 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.399253 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.399356 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2l8z\" (UniqueName: \"kubernetes.io/projected/66e0ef86-58d0-4ad0-9336-236773558c09-kube-api-access-m2l8z\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.399437 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66e0ef86-58d0-4ad0-9336-236773558c09-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.985064 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" event={"ID":"66e0ef86-58d0-4ad0-9336-236773558c09","Type":"ContainerDied","Data":"8e9b8bfde6b7888cfe29e815d0647cb6174b55db53f109ce46395b02e3dc5e63"} Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.985177 4794 scope.go:117] "RemoveContainer" containerID="d3a20ab9001d55da97233357f587e7c43f82752707564eaac01064851811a183" Mar 10 11:18:18 crc kubenswrapper[4794]: I0310 11:18:18.985322 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9b86ff59-7wf9d" Mar 10 11:18:19 crc kubenswrapper[4794]: I0310 11:18:19.042807 4794 scope.go:117] "RemoveContainer" containerID="6a5fc212dd02823c2d523319c01cb9f0da24406d839dc3bea585a7e3e298b0eb" Mar 10 11:18:19 crc kubenswrapper[4794]: I0310 11:18:19.048387 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f9b86ff59-7wf9d"] Mar 10 11:18:19 crc kubenswrapper[4794]: I0310 11:18:19.061799 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f9b86ff59-7wf9d"] Mar 10 11:18:19 crc kubenswrapper[4794]: I0310 11:18:19.374794 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 10 11:18:19 crc kubenswrapper[4794]: I0310 11:18:19.876733 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-gz7lg"] Mar 10 11:18:19 crc kubenswrapper[4794]: E0310 11:18:19.877075 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e0ef86-58d0-4ad0-9336-236773558c09" containerName="dnsmasq-dns" Mar 10 11:18:19 crc kubenswrapper[4794]: I0310 11:18:19.877086 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e0ef86-58d0-4ad0-9336-236773558c09" containerName="dnsmasq-dns" Mar 10 11:18:19 crc kubenswrapper[4794]: E0310 11:18:19.877114 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e0ef86-58d0-4ad0-9336-236773558c09" containerName="init" Mar 10 11:18:19 crc kubenswrapper[4794]: I0310 11:18:19.877120 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e0ef86-58d0-4ad0-9336-236773558c09" containerName="init" Mar 10 11:18:19 crc kubenswrapper[4794]: I0310 11:18:19.877274 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e0ef86-58d0-4ad0-9336-236773558c09" containerName="dnsmasq-dns" Mar 10 11:18:19 crc kubenswrapper[4794]: I0310 11:18:19.879938 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:19 crc kubenswrapper[4794]: I0310 11:18:19.886171 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 10 11:18:19 crc kubenswrapper[4794]: I0310 11:18:19.886414 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 10 11:18:19 crc kubenswrapper[4794]: I0310 11:18:19.912571 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gz7lg"] Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.021453 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66e0ef86-58d0-4ad0-9336-236773558c09" path="/var/lib/kubelet/pods/66e0ef86-58d0-4ad0-9336-236773558c09/volumes" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.026001 4794 generic.go:334] "Generic (PLEG): container finished" podID="abf2624e-ffa0-4251-8661-f688c4699f5a" containerID="9f0c3f5bb5cac7658eec685d5eee8faa11d60ba5d34ea562347590f6fb17a493" exitCode=0 Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.026103 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf2624e-ffa0-4251-8661-f688c4699f5a","Type":"ContainerDied","Data":"9f0c3f5bb5cac7658eec685d5eee8faa11d60ba5d34ea562347590f6fb17a493"} Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.026135 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf2624e-ffa0-4251-8661-f688c4699f5a","Type":"ContainerDied","Data":"92871da306b7772d02d6a671c8a71cd46cabcd0c79d2b6719ef5355a4be73490"} Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.026147 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92871da306b7772d02d6a671c8a71cd46cabcd0c79d2b6719ef5355a4be73490" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.050207 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gz7lg\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.050256 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfb9b\" (UniqueName: \"kubernetes.io/projected/f5ddb993-25ab-4586-8570-ed2365d197f8-kube-api-access-tfb9b\") pod \"nova-cell1-cell-mapping-gz7lg\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.050390 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-config-data\") pod \"nova-cell1-cell-mapping-gz7lg\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.050409 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-scripts\") pod \"nova-cell1-cell-mapping-gz7lg\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:20 
crc kubenswrapper[4794]: I0310 11:18:20.062536 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.151154 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qf6s\" (UniqueName: \"kubernetes.io/projected/abf2624e-ffa0-4251-8661-f688c4699f5a-kube-api-access-8qf6s\") pod \"abf2624e-ffa0-4251-8661-f688c4699f5a\" (UID: \"abf2624e-ffa0-4251-8661-f688c4699f5a\") " Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.151275 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf2624e-ffa0-4251-8661-f688c4699f5a-combined-ca-bundle\") pod \"abf2624e-ffa0-4251-8661-f688c4699f5a\" (UID: \"abf2624e-ffa0-4251-8661-f688c4699f5a\") " Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.151364 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf2624e-ffa0-4251-8661-f688c4699f5a-config-data\") pod \"abf2624e-ffa0-4251-8661-f688c4699f5a\" (UID: \"abf2624e-ffa0-4251-8661-f688c4699f5a\") " Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.152259 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-config-data\") pod \"nova-cell1-cell-mapping-gz7lg\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.152287 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-scripts\") pod \"nova-cell1-cell-mapping-gz7lg\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.152322 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gz7lg\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.152363 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfb9b\" (UniqueName: \"kubernetes.io/projected/f5ddb993-25ab-4586-8570-ed2365d197f8-kube-api-access-tfb9b\") pod \"nova-cell1-cell-mapping-gz7lg\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.158524 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf2624e-ffa0-4251-8661-f688c4699f5a-kube-api-access-8qf6s" (OuterVolumeSpecName: "kube-api-access-8qf6s") pod "abf2624e-ffa0-4251-8661-f688c4699f5a" (UID: "abf2624e-ffa0-4251-8661-f688c4699f5a"). InnerVolumeSpecName "kube-api-access-8qf6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.178372 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-scripts\") pod \"nova-cell1-cell-mapping-gz7lg\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.185588 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-config-data\") pod \"nova-cell1-cell-mapping-gz7lg\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.193001 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gz7lg\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.202385 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfb9b\" (UniqueName: \"kubernetes.io/projected/f5ddb993-25ab-4586-8570-ed2365d197f8-kube-api-access-tfb9b\") pod \"nova-cell1-cell-mapping-gz7lg\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.215003 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf2624e-ffa0-4251-8661-f688c4699f5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abf2624e-ffa0-4251-8661-f688c4699f5a" (UID: "abf2624e-ffa0-4251-8661-f688c4699f5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.233540 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf2624e-ffa0-4251-8661-f688c4699f5a-config-data" (OuterVolumeSpecName: "config-data") pod "abf2624e-ffa0-4251-8661-f688c4699f5a" (UID: "abf2624e-ffa0-4251-8661-f688c4699f5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.253736 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf2624e-ffa0-4251-8661-f688c4699f5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.253771 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf2624e-ffa0-4251-8661-f688c4699f5a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.253780 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qf6s\" (UniqueName: \"kubernetes.io/projected/abf2624e-ffa0-4251-8661-f688c4699f5a-kube-api-access-8qf6s\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.283780 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:20 crc kubenswrapper[4794]: I0310 11:18:20.700385 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gz7lg"] Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.046468 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.046497 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gz7lg" event={"ID":"f5ddb993-25ab-4586-8570-ed2365d197f8","Type":"ContainerStarted","Data":"3ca83dd317eca7e463fd21ea015fe795c89d895fc3025b08a9ab2f17569254f2"} Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.046557 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gz7lg" event={"ID":"f5ddb993-25ab-4586-8570-ed2365d197f8","Type":"ContainerStarted","Data":"c4791636c166075c04b8fb53d3f045ad22f64055adfa0c2798a1dd8263e298d0"} Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.069690 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-gz7lg" podStartSLOduration=2.069672954 podStartE2EDuration="2.069672954s" podCreationTimestamp="2026-03-10 11:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:21.066068732 +0000 UTC m=+5649.822239560" watchObservedRunningTime="2026-03-10 11:18:21.069672954 +0000 UTC m=+5649.825843772" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.098129 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.127411 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.137161 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:18:21 crc kubenswrapper[4794]: E0310 11:18:21.138057 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf2624e-ffa0-4251-8661-f688c4699f5a" containerName="nova-scheduler-scheduler" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.138249 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf2624e-ffa0-4251-8661-f688c4699f5a" containerName="nova-scheduler-scheduler" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.138816 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf2624e-ffa0-4251-8661-f688c4699f5a" containerName="nova-scheduler-scheduler" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.140061 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.143744 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.147956 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.286185 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a5e67b3-0cde-4385-995a-3ad279f877d5-config-data\") pod \"nova-scheduler-0\" (UID: \"9a5e67b3-0cde-4385-995a-3ad279f877d5\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.286296 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5e67b3-0cde-4385-995a-3ad279f877d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9a5e67b3-0cde-4385-995a-3ad279f877d5\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.286343 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6phc8\" (UniqueName: \"kubernetes.io/projected/9a5e67b3-0cde-4385-995a-3ad279f877d5-kube-api-access-6phc8\") pod \"nova-scheduler-0\" (UID: \"9a5e67b3-0cde-4385-995a-3ad279f877d5\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.381862 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.382257 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.387810 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a5e67b3-0cde-4385-995a-3ad279f877d5-config-data\") pod \"nova-scheduler-0\" (UID: \"9a5e67b3-0cde-4385-995a-3ad279f877d5\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.387946 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5e67b3-0cde-4385-995a-3ad279f877d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9a5e67b3-0cde-4385-995a-3ad279f877d5\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.387993 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6phc8\" (UniqueName: \"kubernetes.io/projected/9a5e67b3-0cde-4385-995a-3ad279f877d5-kube-api-access-6phc8\") pod \"nova-scheduler-0\" (UID: \"9a5e67b3-0cde-4385-995a-3ad279f877d5\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.391959 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5e67b3-0cde-4385-995a-3ad279f877d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9a5e67b3-0cde-4385-995a-3ad279f877d5\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.393208 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9a5e67b3-0cde-4385-995a-3ad279f877d5-config-data\") pod \"nova-scheduler-0\" (UID: \"9a5e67b3-0cde-4385-995a-3ad279f877d5\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.426504 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6phc8\" (UniqueName: \"kubernetes.io/projected/9a5e67b3-0cde-4385-995a-3ad279f877d5-kube-api-access-6phc8\") pod \"nova-scheduler-0\" (UID: \"9a5e67b3-0cde-4385-995a-3ad279f877d5\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.464569 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 11:18:21 crc kubenswrapper[4794]: I0310 11:18:21.991863 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:18:22 crc kubenswrapper[4794]: I0310 11:18:22.031927 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abf2624e-ffa0-4251-8661-f688c4699f5a" path="/var/lib/kubelet/pods/abf2624e-ffa0-4251-8661-f688c4699f5a/volumes" Mar 10 11:18:22 crc kubenswrapper[4794]: I0310 11:18:22.073579 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9a5e67b3-0cde-4385-995a-3ad279f877d5","Type":"ContainerStarted","Data":"41faef868a18eb281871fd8a4c77e0115358bd237ae92d0e87e630842b4768f8"} Mar 10 11:18:23 crc kubenswrapper[4794]: I0310 11:18:23.153964 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9a5e67b3-0cde-4385-995a-3ad279f877d5","Type":"ContainerStarted","Data":"f5b69ba2fb7667ac48d4e86c5c08fe00bcd404263356077dc4c6a60439c3074d"} Mar 10 11:18:23 crc kubenswrapper[4794]: I0310 11:18:23.195717 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.1956925 podStartE2EDuration="2.1956925s" podCreationTimestamp="2026-03-10 11:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:23.184521423 +0000 UTC m=+5651.940692231" watchObservedRunningTime="2026-03-10 11:18:23.1956925 +0000 UTC m=+5651.951863318" Mar 10 11:18:25 crc kubenswrapper[4794]: I0310 11:18:25.999440 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:18:26 crc kubenswrapper[4794]: E0310 11:18:26.000402 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:18:26 crc kubenswrapper[4794]: I0310 11:18:26.200204 4794 generic.go:334] "Generic (PLEG): container finished" podID="f5ddb993-25ab-4586-8570-ed2365d197f8" containerID="3ca83dd317eca7e463fd21ea015fe795c89d895fc3025b08a9ab2f17569254f2" exitCode=0 Mar 10 11:18:26 crc kubenswrapper[4794]: I0310 11:18:26.200256 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gz7lg" event={"ID":"f5ddb993-25ab-4586-8570-ed2365d197f8","Type":"ContainerDied","Data":"3ca83dd317eca7e463fd21ea015fe795c89d895fc3025b08a9ab2f17569254f2"} Mar 10 11:18:26 crc 
kubenswrapper[4794]: I0310 11:18:26.358378 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 11:18:26 crc kubenswrapper[4794]: I0310 11:18:26.358724 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 11:18:26 crc kubenswrapper[4794]: I0310 11:18:26.382000 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 11:18:26 crc kubenswrapper[4794]: I0310 11:18:26.382122 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 11:18:26 crc kubenswrapper[4794]: I0310 11:18:26.465407 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.522542 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="33361ebe-bd4e-4b9a-9753-f1ff87f27959" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.105:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.522735 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aed09c21-26a5-44b3-a0fb-1fb40f42448e" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.106:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.522760 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="33361ebe-bd4e-4b9a-9753-f1ff87f27959" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.105:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.522777 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aed09c21-26a5-44b3-a0fb-1fb40f42448e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.106:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.531641 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.632290 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-config-data\") pod \"f5ddb993-25ab-4586-8570-ed2365d197f8\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.632366 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-scripts\") pod \"f5ddb993-25ab-4586-8570-ed2365d197f8\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.632468 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfb9b\" (UniqueName: \"kubernetes.io/projected/f5ddb993-25ab-4586-8570-ed2365d197f8-kube-api-access-tfb9b\") pod \"f5ddb993-25ab-4586-8570-ed2365d197f8\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.632509 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-combined-ca-bundle\") pod \"f5ddb993-25ab-4586-8570-ed2365d197f8\" (UID: \"f5ddb993-25ab-4586-8570-ed2365d197f8\") " Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.637795 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-scripts" (OuterVolumeSpecName: "scripts") pod "f5ddb993-25ab-4586-8570-ed2365d197f8" (UID: "f5ddb993-25ab-4586-8570-ed2365d197f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.638293 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5ddb993-25ab-4586-8570-ed2365d197f8-kube-api-access-tfb9b" (OuterVolumeSpecName: "kube-api-access-tfb9b") pod "f5ddb993-25ab-4586-8570-ed2365d197f8" (UID: "f5ddb993-25ab-4586-8570-ed2365d197f8"). InnerVolumeSpecName "kube-api-access-tfb9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.659300 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5ddb993-25ab-4586-8570-ed2365d197f8" (UID: "f5ddb993-25ab-4586-8570-ed2365d197f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.672649 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-config-data" (OuterVolumeSpecName: "config-data") pod "f5ddb993-25ab-4586-8570-ed2365d197f8" (UID: "f5ddb993-25ab-4586-8570-ed2365d197f8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.734078 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.734103 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.734113 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfb9b\" (UniqueName: \"kubernetes.io/projected/f5ddb993-25ab-4586-8570-ed2365d197f8-kube-api-access-tfb9b\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:27 crc kubenswrapper[4794]: I0310 11:18:27.734124 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5ddb993-25ab-4586-8570-ed2365d197f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:28 crc kubenswrapper[4794]: I0310 11:18:28.238442 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gz7lg" event={"ID":"f5ddb993-25ab-4586-8570-ed2365d197f8","Type":"ContainerDied","Data":"c4791636c166075c04b8fb53d3f045ad22f64055adfa0c2798a1dd8263e298d0"} Mar 10 11:18:28 crc kubenswrapper[4794]: I0310 11:18:28.238507 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4791636c166075c04b8fb53d3f045ad22f64055adfa0c2798a1dd8263e298d0" Mar 10 11:18:28 crc kubenswrapper[4794]: I0310 11:18:28.238523 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gz7lg" Mar 10 11:18:28 crc kubenswrapper[4794]: I0310 11:18:28.436830 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 11:18:28 crc kubenswrapper[4794]: I0310 11:18:28.437384 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="33361ebe-bd4e-4b9a-9753-f1ff87f27959" containerName="nova-api-log" containerID="cri-o://5e4b840434241251d6a3ba3525ecf5233d3a201bbfea5f4ffa9ff8d84f629895" gracePeriod=30 Mar 10 11:18:28 crc kubenswrapper[4794]: I0310 11:18:28.437898 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="33361ebe-bd4e-4b9a-9753-f1ff87f27959" containerName="nova-api-api" containerID="cri-o://2d4426b59d32b121886d26555ee294263928c122129176e03bd543b7b8799cc0" gracePeriod=30 Mar 10 11:18:28 crc kubenswrapper[4794]: I0310 11:18:28.452411 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:18:28 crc kubenswrapper[4794]: I0310 11:18:28.456264 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9a5e67b3-0cde-4385-995a-3ad279f877d5" containerName="nova-scheduler-scheduler" containerID="cri-o://f5b69ba2fb7667ac48d4e86c5c08fe00bcd404263356077dc4c6a60439c3074d" gracePeriod=30 Mar 10 11:18:28 crc kubenswrapper[4794]: I0310 11:18:28.471641 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:18:28 crc kubenswrapper[4794]: I0310 11:18:28.471819 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aed09c21-26a5-44b3-a0fb-1fb40f42448e" 
containerName="nova-metadata-log" containerID="cri-o://5271312361a1bf1ed6f58b9b58c96ceb50b111fcc6843eaaf57561b41806311b" gracePeriod=30 Mar 10 11:18:28 crc kubenswrapper[4794]: I0310 11:18:28.472116 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aed09c21-26a5-44b3-a0fb-1fb40f42448e" containerName="nova-metadata-metadata" containerID="cri-o://6b050bf1ace311f075b59692212e4f8cb7de41381b3240134cb0ed62afc4c6bd" gracePeriod=30 Mar 10 11:18:29 crc kubenswrapper[4794]: I0310 11:18:29.250102 4794 generic.go:334] "Generic (PLEG): container finished" podID="aed09c21-26a5-44b3-a0fb-1fb40f42448e" containerID="5271312361a1bf1ed6f58b9b58c96ceb50b111fcc6843eaaf57561b41806311b" exitCode=143 Mar 10 11:18:29 crc kubenswrapper[4794]: I0310 11:18:29.250195 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aed09c21-26a5-44b3-a0fb-1fb40f42448e","Type":"ContainerDied","Data":"5271312361a1bf1ed6f58b9b58c96ceb50b111fcc6843eaaf57561b41806311b"} Mar 10 11:18:29 crc kubenswrapper[4794]: I0310 11:18:29.252313 4794 generic.go:334] "Generic (PLEG): container finished" podID="33361ebe-bd4e-4b9a-9753-f1ff87f27959" containerID="5e4b840434241251d6a3ba3525ecf5233d3a201bbfea5f4ffa9ff8d84f629895" exitCode=143 Mar 10 11:18:29 crc kubenswrapper[4794]: I0310 11:18:29.252373 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33361ebe-bd4e-4b9a-9753-f1ff87f27959","Type":"ContainerDied","Data":"5e4b840434241251d6a3ba3525ecf5233d3a201bbfea5f4ffa9ff8d84f629895"} Mar 10 11:18:29 crc kubenswrapper[4794]: I0310 11:18:29.719360 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 11:18:29 crc kubenswrapper[4794]: I0310 11:18:29.875503 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6phc8\" (UniqueName: \"kubernetes.io/projected/9a5e67b3-0cde-4385-995a-3ad279f877d5-kube-api-access-6phc8\") pod \"9a5e67b3-0cde-4385-995a-3ad279f877d5\" (UID: \"9a5e67b3-0cde-4385-995a-3ad279f877d5\") " Mar 10 11:18:29 crc kubenswrapper[4794]: I0310 11:18:29.875654 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a5e67b3-0cde-4385-995a-3ad279f877d5-config-data\") pod \"9a5e67b3-0cde-4385-995a-3ad279f877d5\" (UID: \"9a5e67b3-0cde-4385-995a-3ad279f877d5\") " Mar 10 11:18:29 crc kubenswrapper[4794]: I0310 11:18:29.875682 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5e67b3-0cde-4385-995a-3ad279f877d5-combined-ca-bundle\") pod \"9a5e67b3-0cde-4385-995a-3ad279f877d5\" (UID: \"9a5e67b3-0cde-4385-995a-3ad279f877d5\") " Mar 10 11:18:29 crc kubenswrapper[4794]: I0310 11:18:29.881768 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a5e67b3-0cde-4385-995a-3ad279f877d5-kube-api-access-6phc8" (OuterVolumeSpecName: "kube-api-access-6phc8") pod "9a5e67b3-0cde-4385-995a-3ad279f877d5" (UID: "9a5e67b3-0cde-4385-995a-3ad279f877d5"). InnerVolumeSpecName "kube-api-access-6phc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:18:29 crc kubenswrapper[4794]: I0310 11:18:29.898473 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a5e67b3-0cde-4385-995a-3ad279f877d5-config-data" (OuterVolumeSpecName: "config-data") pod "9a5e67b3-0cde-4385-995a-3ad279f877d5" (UID: "9a5e67b3-0cde-4385-995a-3ad279f877d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:18:29 crc kubenswrapper[4794]: I0310 11:18:29.903178 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a5e67b3-0cde-4385-995a-3ad279f877d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a5e67b3-0cde-4385-995a-3ad279f877d5" (UID: "9a5e67b3-0cde-4385-995a-3ad279f877d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:18:29 crc kubenswrapper[4794]: I0310 11:18:29.978364 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a5e67b3-0cde-4385-995a-3ad279f877d5-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:29 crc kubenswrapper[4794]: I0310 11:18:29.978406 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5e67b3-0cde-4385-995a-3ad279f877d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:29 crc kubenswrapper[4794]: I0310 11:18:29.978422 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6phc8\" (UniqueName: \"kubernetes.io/projected/9a5e67b3-0cde-4385-995a-3ad279f877d5-kube-api-access-6phc8\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.267263 4794 generic.go:334] "Generic (PLEG): container finished" podID="9a5e67b3-0cde-4385-995a-3ad279f877d5" containerID="f5b69ba2fb7667ac48d4e86c5c08fe00bcd404263356077dc4c6a60439c3074d" exitCode=0 Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.267330 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9a5e67b3-0cde-4385-995a-3ad279f877d5","Type":"ContainerDied","Data":"f5b69ba2fb7667ac48d4e86c5c08fe00bcd404263356077dc4c6a60439c3074d"} Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.267408 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9a5e67b3-0cde-4385-995a-3ad279f877d5","Type":"ContainerDied","Data":"41faef868a18eb281871fd8a4c77e0115358bd237ae92d0e87e630842b4768f8"} Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.267432 4794 scope.go:117] "RemoveContainer" containerID="f5b69ba2fb7667ac48d4e86c5c08fe00bcd404263356077dc4c6a60439c3074d" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.267449 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.298941 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.304870 4794 scope.go:117] "RemoveContainer" containerID="f5b69ba2fb7667ac48d4e86c5c08fe00bcd404263356077dc4c6a60439c3074d" Mar 10 11:18:30 crc kubenswrapper[4794]: E0310 11:18:30.305624 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5b69ba2fb7667ac48d4e86c5c08fe00bcd404263356077dc4c6a60439c3074d\": container with ID starting with f5b69ba2fb7667ac48d4e86c5c08fe00bcd404263356077dc4c6a60439c3074d not found: ID does not exist" containerID="f5b69ba2fb7667ac48d4e86c5c08fe00bcd404263356077dc4c6a60439c3074d" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.305676 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b69ba2fb7667ac48d4e86c5c08fe00bcd404263356077dc4c6a60439c3074d"} err="failed to get container status \"f5b69ba2fb7667ac48d4e86c5c08fe00bcd404263356077dc4c6a60439c3074d\": rpc error: code = NotFound desc = could not find container \"f5b69ba2fb7667ac48d4e86c5c08fe00bcd404263356077dc4c6a60439c3074d\": container with ID starting with f5b69ba2fb7667ac48d4e86c5c08fe00bcd404263356077dc4c6a60439c3074d not found: ID does not exist" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.331730 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.343630 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:18:30 crc kubenswrapper[4794]: E0310 11:18:30.344568 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ddb993-25ab-4586-8570-ed2365d197f8" containerName="nova-manage" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.344613 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ddb993-25ab-4586-8570-ed2365d197f8" containerName="nova-manage" Mar 10 11:18:30 crc kubenswrapper[4794]: E0310 11:18:30.344648 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5e67b3-0cde-4385-995a-3ad279f877d5" containerName="nova-scheduler-scheduler" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.344667 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5e67b3-0cde-4385-995a-3ad279f877d5" containerName="nova-scheduler-scheduler" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.345125 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5ddb993-25ab-4586-8570-ed2365d197f8" containerName="nova-manage" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.345223 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a5e67b3-0cde-4385-995a-3ad279f877d5" containerName="nova-scheduler-scheduler" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.346598 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.349820 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.359930 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.488701 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52vw8\" (UniqueName: \"kubernetes.io/projected/660cfab8-47f8-4194-a9fc-9075fdb441ab-kube-api-access-52vw8\") pod \"nova-scheduler-0\" (UID: \"660cfab8-47f8-4194-a9fc-9075fdb441ab\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.488737 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660cfab8-47f8-4194-a9fc-9075fdb441ab-config-data\") pod \"nova-scheduler-0\" (UID: \"660cfab8-47f8-4194-a9fc-9075fdb441ab\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.489046 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660cfab8-47f8-4194-a9fc-9075fdb441ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"660cfab8-47f8-4194-a9fc-9075fdb441ab\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.591245 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52vw8\" (UniqueName: \"kubernetes.io/projected/660cfab8-47f8-4194-a9fc-9075fdb441ab-kube-api-access-52vw8\") pod \"nova-scheduler-0\" (UID: \"660cfab8-47f8-4194-a9fc-9075fdb441ab\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.591592 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660cfab8-47f8-4194-a9fc-9075fdb441ab-config-data\") pod \"nova-scheduler-0\" (UID: \"660cfab8-47f8-4194-a9fc-9075fdb441ab\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.591623 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660cfab8-47f8-4194-a9fc-9075fdb441ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"660cfab8-47f8-4194-a9fc-9075fdb441ab\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.595299 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660cfab8-47f8-4194-a9fc-9075fdb441ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"660cfab8-47f8-4194-a9fc-9075fdb441ab\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.595675 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660cfab8-47f8-4194-a9fc-9075fdb441ab-config-data\") pod \"nova-scheduler-0\" (UID: \"660cfab8-47f8-4194-a9fc-9075fdb441ab\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.622550 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52vw8\" (UniqueName: 
\"kubernetes.io/projected/660cfab8-47f8-4194-a9fc-9075fdb441ab-kube-api-access-52vw8\") pod \"nova-scheduler-0\" (UID: \"660cfab8-47f8-4194-a9fc-9075fdb441ab\") " pod="openstack/nova-scheduler-0" Mar 10 11:18:30 crc kubenswrapper[4794]: I0310 11:18:30.684875 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 11:18:31 crc kubenswrapper[4794]: I0310 11:18:31.197962 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:18:31 crc kubenswrapper[4794]: W0310 11:18:31.199600 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod660cfab8_47f8_4194_a9fc_9075fdb441ab.slice/crio-7cb4cba420f47bfcbdc9a27ed62eeb6101247c76e2db1c6bb117684b7482d297 WatchSource:0}: Error finding container 7cb4cba420f47bfcbdc9a27ed62eeb6101247c76e2db1c6bb117684b7482d297: Status 404 returned error can't find the container with id 7cb4cba420f47bfcbdc9a27ed62eeb6101247c76e2db1c6bb117684b7482d297 Mar 10 11:18:31 crc kubenswrapper[4794]: I0310 11:18:31.280160 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"660cfab8-47f8-4194-a9fc-9075fdb441ab","Type":"ContainerStarted","Data":"7cb4cba420f47bfcbdc9a27ed62eeb6101247c76e2db1c6bb117684b7482d297"} Mar 10 11:18:32 crc kubenswrapper[4794]: I0310 11:18:32.019618 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a5e67b3-0cde-4385-995a-3ad279f877d5" path="/var/lib/kubelet/pods/9a5e67b3-0cde-4385-995a-3ad279f877d5/volumes" Mar 10 11:18:32 crc kubenswrapper[4794]: I0310 11:18:32.290620 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"660cfab8-47f8-4194-a9fc-9075fdb441ab","Type":"ContainerStarted","Data":"ef2bdf7435fa5ad4f2d88d430c0bbec0318b87aca7ad75f918c56d0c5c08e7f3"} Mar 10 11:18:32 crc kubenswrapper[4794]: I0310 11:18:32.294020 4794 generic.go:334] "Generic (PLEG): container finished" podID="aed09c21-26a5-44b3-a0fb-1fb40f42448e" containerID="6b050bf1ace311f075b59692212e4f8cb7de41381b3240134cb0ed62afc4c6bd" exitCode=0 Mar 10 11:18:32 crc kubenswrapper[4794]: I0310 11:18:32.294063 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aed09c21-26a5-44b3-a0fb-1fb40f42448e","Type":"ContainerDied","Data":"6b050bf1ace311f075b59692212e4f8cb7de41381b3240134cb0ed62afc4c6bd"} Mar 10 11:18:32 crc kubenswrapper[4794]: I0310 11:18:32.313785 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.313763675 podStartE2EDuration="2.313763675s" podCreationTimestamp="2026-03-10 11:18:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:32.309250684 +0000 UTC m=+5661.065421502" watchObservedRunningTime="2026-03-10 11:18:32.313763675 +0000 UTC m=+5661.069934493" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.025155 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.099667 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed09c21-26a5-44b3-a0fb-1fb40f42448e-config-data\") pod \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.144493 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed09c21-26a5-44b3-a0fb-1fb40f42448e-config-data" (OuterVolumeSpecName: "config-data") pod "aed09c21-26a5-44b3-a0fb-1fb40f42448e" (UID: "aed09c21-26a5-44b3-a0fb-1fb40f42448e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.202162 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed09c21-26a5-44b3-a0fb-1fb40f42448e-combined-ca-bundle\") pod \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.202263 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aed09c21-26a5-44b3-a0fb-1fb40f42448e-logs\") pod \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.202322 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4gwk\" (UniqueName: \"kubernetes.io/projected/aed09c21-26a5-44b3-a0fb-1fb40f42448e-kube-api-access-p4gwk\") pod \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\" (UID: \"aed09c21-26a5-44b3-a0fb-1fb40f42448e\") " Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.202859 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aed09c21-26a5-44b3-a0fb-1fb40f42448e-logs" (OuterVolumeSpecName: "logs") pod "aed09c21-26a5-44b3-a0fb-1fb40f42448e" (UID: "aed09c21-26a5-44b3-a0fb-1fb40f42448e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.203667 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aed09c21-26a5-44b3-a0fb-1fb40f42448e-logs\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.203686 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed09c21-26a5-44b3-a0fb-1fb40f42448e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.208480 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aed09c21-26a5-44b3-a0fb-1fb40f42448e-kube-api-access-p4gwk" (OuterVolumeSpecName: "kube-api-access-p4gwk") pod "aed09c21-26a5-44b3-a0fb-1fb40f42448e" (UID: "aed09c21-26a5-44b3-a0fb-1fb40f42448e"). InnerVolumeSpecName "kube-api-access-p4gwk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.235858 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed09c21-26a5-44b3-a0fb-1fb40f42448e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aed09c21-26a5-44b3-a0fb-1fb40f42448e" (UID: "aed09c21-26a5-44b3-a0fb-1fb40f42448e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.305572 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4gwk\" (UniqueName: \"kubernetes.io/projected/aed09c21-26a5-44b3-a0fb-1fb40f42448e-kube-api-access-p4gwk\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.305612 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed09c21-26a5-44b3-a0fb-1fb40f42448e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.311307 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aed09c21-26a5-44b3-a0fb-1fb40f42448e","Type":"ContainerDied","Data":"33f551e4afe3fd1a14290114dd98caef8fa15376d4ba4ffff86af81aa19f21a3"} Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.311407 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.311415 4794 scope.go:117] "RemoveContainer" containerID="6b050bf1ace311f075b59692212e4f8cb7de41381b3240134cb0ed62afc4c6bd" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.317243 4794 generic.go:334] "Generic (PLEG): container finished" podID="33361ebe-bd4e-4b9a-9753-f1ff87f27959" containerID="2d4426b59d32b121886d26555ee294263928c122129176e03bd543b7b8799cc0" exitCode=0 Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.318381 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33361ebe-bd4e-4b9a-9753-f1ff87f27959","Type":"ContainerDied","Data":"2d4426b59d32b121886d26555ee294263928c122129176e03bd543b7b8799cc0"} Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.324081 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.339166 4794 scope.go:117] "RemoveContainer" containerID="5271312361a1bf1ed6f58b9b58c96ceb50b111fcc6843eaaf57561b41806311b" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.386932 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.404917 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.416803 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:18:33 crc kubenswrapper[4794]: E0310 11:18:33.417160 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33361ebe-bd4e-4b9a-9753-f1ff87f27959" containerName="nova-api-api" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.417171 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="33361ebe-bd4e-4b9a-9753-f1ff87f27959" containerName="nova-api-api" Mar 10 11:18:33 crc kubenswrapper[4794]: E0310 11:18:33.417187 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33361ebe-bd4e-4b9a-9753-f1ff87f27959" containerName="nova-api-log" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.417195 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="33361ebe-bd4e-4b9a-9753-f1ff87f27959" containerName="nova-api-log" Mar 10 11:18:33 crc kubenswrapper[4794]: E0310 11:18:33.417223 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed09c21-26a5-44b3-a0fb-1fb40f42448e" containerName="nova-metadata-metadata" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.417231 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed09c21-26a5-44b3-a0fb-1fb40f42448e" containerName="nova-metadata-metadata" Mar 10 11:18:33 crc kubenswrapper[4794]: E0310 11:18:33.417243 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed09c21-26a5-44b3-a0fb-1fb40f42448e" containerName="nova-metadata-log" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.417251 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed09c21-26a5-44b3-a0fb-1fb40f42448e" containerName="nova-metadata-log" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.417441 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="33361ebe-bd4e-4b9a-9753-f1ff87f27959" containerName="nova-api-log" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.417460 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed09c21-26a5-44b3-a0fb-1fb40f42448e" containerName="nova-metadata-metadata" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.417470 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="33361ebe-bd4e-4b9a-9753-f1ff87f27959" containerName="nova-api-api" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.417483 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed09c21-26a5-44b3-a0fb-1fb40f42448e" containerName="nova-metadata-log" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.418478 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.424688 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.426141 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.509667 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33361ebe-bd4e-4b9a-9753-f1ff87f27959-combined-ca-bundle\") pod \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.509911 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33361ebe-bd4e-4b9a-9753-f1ff87f27959-config-data\") pod \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.510014 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33361ebe-bd4e-4b9a-9753-f1ff87f27959-logs\") pod \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.510065 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h27s\" (UniqueName: \"kubernetes.io/projected/33361ebe-bd4e-4b9a-9753-f1ff87f27959-kube-api-access-7h27s\") pod \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\" (UID: \"33361ebe-bd4e-4b9a-9753-f1ff87f27959\") " Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.510545 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dwtb\" (UniqueName: \"kubernetes.io/projected/e3cc18ff-b557-4d49-8580-733877f288a5-kube-api-access-8dwtb\") pod \"nova-metadata-0\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.510623 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3cc18ff-b557-4d49-8580-733877f288a5-logs\") pod \"nova-metadata-0\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.510650 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3cc18ff-b557-4d49-8580-733877f288a5-config-data\") pod \"nova-metadata-0\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.510816 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3cc18ff-b557-4d49-8580-733877f288a5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.511373 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33361ebe-bd4e-4b9a-9753-f1ff87f27959-logs" (OuterVolumeSpecName: "logs") pod 
"33361ebe-bd4e-4b9a-9753-f1ff87f27959" (UID: "33361ebe-bd4e-4b9a-9753-f1ff87f27959"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.514178 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33361ebe-bd4e-4b9a-9753-f1ff87f27959-kube-api-access-7h27s" (OuterVolumeSpecName: "kube-api-access-7h27s") pod "33361ebe-bd4e-4b9a-9753-f1ff87f27959" (UID: "33361ebe-bd4e-4b9a-9753-f1ff87f27959"). InnerVolumeSpecName "kube-api-access-7h27s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.547155 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33361ebe-bd4e-4b9a-9753-f1ff87f27959-config-data" (OuterVolumeSpecName: "config-data") pod "33361ebe-bd4e-4b9a-9753-f1ff87f27959" (UID: "33361ebe-bd4e-4b9a-9753-f1ff87f27959"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.547968 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33361ebe-bd4e-4b9a-9753-f1ff87f27959-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33361ebe-bd4e-4b9a-9753-f1ff87f27959" (UID: "33361ebe-bd4e-4b9a-9753-f1ff87f27959"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.612100 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3cc18ff-b557-4d49-8580-733877f288a5-logs\") pod \"nova-metadata-0\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.612148 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3cc18ff-b557-4d49-8580-733877f288a5-config-data\") pod \"nova-metadata-0\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.612226 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3cc18ff-b557-4d49-8580-733877f288a5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.612275 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dwtb\" (UniqueName: \"kubernetes.io/projected/e3cc18ff-b557-4d49-8580-733877f288a5-kube-api-access-8dwtb\") pod \"nova-metadata-0\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.612325 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33361ebe-bd4e-4b9a-9753-f1ff87f27959-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.612354 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33361ebe-bd4e-4b9a-9753-f1ff87f27959-logs\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.612363 4794 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-7h27s\" (UniqueName: \"kubernetes.io/projected/33361ebe-bd4e-4b9a-9753-f1ff87f27959-kube-api-access-7h27s\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.612373 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33361ebe-bd4e-4b9a-9753-f1ff87f27959-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.612919 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3cc18ff-b557-4d49-8580-733877f288a5-logs\") pod \"nova-metadata-0\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.618220 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3cc18ff-b557-4d49-8580-733877f288a5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.619625 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3cc18ff-b557-4d49-8580-733877f288a5-config-data\") pod \"nova-metadata-0\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.641670 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dwtb\" (UniqueName: \"kubernetes.io/projected/e3cc18ff-b557-4d49-8580-733877f288a5-kube-api-access-8dwtb\") pod \"nova-metadata-0\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " pod="openstack/nova-metadata-0" Mar 10 11:18:33 crc kubenswrapper[4794]: I0310 11:18:33.734125 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.009924 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aed09c21-26a5-44b3-a0fb-1fb40f42448e" path="/var/lib/kubelet/pods/aed09c21-26a5-44b3-a0fb-1fb40f42448e/volumes" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.253302 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:18:34 crc kubenswrapper[4794]: W0310 11:18:34.255232 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3cc18ff_b557_4d49_8580_733877f288a5.slice/crio-6fd9da60e1cf3efdfcd8233aad0e963797daf47d7e749eb628b9f113e0e27d1a WatchSource:0}: Error finding container 6fd9da60e1cf3efdfcd8233aad0e963797daf47d7e749eb628b9f113e0e27d1a: Status 404 returned error can't find the container with id 6fd9da60e1cf3efdfcd8233aad0e963797daf47d7e749eb628b9f113e0e27d1a Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.329534 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3cc18ff-b557-4d49-8580-733877f288a5","Type":"ContainerStarted","Data":"6fd9da60e1cf3efdfcd8233aad0e963797daf47d7e749eb628b9f113e0e27d1a"} Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.333570 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33361ebe-bd4e-4b9a-9753-f1ff87f27959","Type":"ContainerDied","Data":"89d9db83a5df2558dd498732948dc5fa84d84a40e55ed84d966be33a6989ec55"} Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.333667 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.333825 4794 scope.go:117] "RemoveContainer" containerID="2d4426b59d32b121886d26555ee294263928c122129176e03bd543b7b8799cc0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.376727 4794 scope.go:117] "RemoveContainer" containerID="5e4b840434241251d6a3ba3525ecf5233d3a201bbfea5f4ffa9ff8d84f629895" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.406455 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.437703 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.437976 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.442295 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.452469 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.462714 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.640708 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " pod="openstack/nova-api-0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.641016 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-config-data\") pod \"nova-api-0\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " pod="openstack/nova-api-0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.641042 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn5br\" (UniqueName: \"kubernetes.io/projected/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-kube-api-access-zn5br\") pod \"nova-api-0\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " pod="openstack/nova-api-0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.641820 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-logs\") pod \"nova-api-0\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " pod="openstack/nova-api-0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.743686 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-logs\") pod \"nova-api-0\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " pod="openstack/nova-api-0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.744242 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " pod="openstack/nova-api-0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.744890 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-config-data\") pod \"nova-api-0\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " pod="openstack/nova-api-0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.744917 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn5br\" (UniqueName: \"kubernetes.io/projected/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-kube-api-access-zn5br\") pod \"nova-api-0\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " pod="openstack/nova-api-0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.744178 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-logs\") pod \"nova-api-0\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " 
pod="openstack/nova-api-0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.748536 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-config-data\") pod \"nova-api-0\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " pod="openstack/nova-api-0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.748931 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " pod="openstack/nova-api-0" Mar 10 11:18:34 crc kubenswrapper[4794]: I0310 11:18:34.765272 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn5br\" (UniqueName: \"kubernetes.io/projected/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-kube-api-access-zn5br\") pod \"nova-api-0\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " pod="openstack/nova-api-0" Mar 10 11:18:35 crc kubenswrapper[4794]: I0310 11:18:35.064591 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 11:18:35 crc kubenswrapper[4794]: I0310 11:18:35.349872 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3cc18ff-b557-4d49-8580-733877f288a5","Type":"ContainerStarted","Data":"8307f7ba39567058c2c2afb0493f5efb4f54bf8b9d792b4ddc4fd0580cc86401"} Mar 10 11:18:35 crc kubenswrapper[4794]: I0310 11:18:35.350232 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3cc18ff-b557-4d49-8580-733877f288a5","Type":"ContainerStarted","Data":"bd17680b7381005b9d0606808844a3a815638a759a845329a6aaa7cde1297f88"} Mar 10 11:18:35 crc kubenswrapper[4794]: I0310 11:18:35.383994 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.383970883 podStartE2EDuration="2.383970883s" podCreationTimestamp="2026-03-10 11:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:35.367849961 +0000 UTC m=+5664.124020790" watchObservedRunningTime="2026-03-10 11:18:35.383970883 +0000 UTC m=+5664.140141701" Mar 10 11:18:35 crc kubenswrapper[4794]: I0310 11:18:35.562060 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 11:18:35 crc kubenswrapper[4794]: W0310 11:18:35.565720 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc1aed0b_186c_45d8_947b_072cb1a2ce0f.slice/crio-58a9c0c40105ab6738a509df6d9323044f391977b5333c7749fa48c9d6434b7a WatchSource:0}: Error finding container 58a9c0c40105ab6738a509df6d9323044f391977b5333c7749fa48c9d6434b7a: Status 404 returned error can't find the container with id 58a9c0c40105ab6738a509df6d9323044f391977b5333c7749fa48c9d6434b7a Mar 10 11:18:35 crc kubenswrapper[4794]: I0310 11:18:35.685065 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 11:18:36 crc kubenswrapper[4794]: I0310 11:18:36.013890 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33361ebe-bd4e-4b9a-9753-f1ff87f27959" path="/var/lib/kubelet/pods/33361ebe-bd4e-4b9a-9753-f1ff87f27959/volumes" Mar 10 11:18:36 crc kubenswrapper[4794]: I0310 
11:18:36.369209 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc1aed0b-186c-45d8-947b-072cb1a2ce0f","Type":"ContainerStarted","Data":"41c78b00714fd6d0f603309b957b6ac5d885afe04aa32a0e7b8f992d63d515c9"} Mar 10 11:18:36 crc kubenswrapper[4794]: I0310 11:18:36.369300 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc1aed0b-186c-45d8-947b-072cb1a2ce0f","Type":"ContainerStarted","Data":"58a9c0c40105ab6738a509df6d9323044f391977b5333c7749fa48c9d6434b7a"} Mar 10 11:18:37 crc kubenswrapper[4794]: I0310 11:18:37.376995 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc1aed0b-186c-45d8-947b-072cb1a2ce0f","Type":"ContainerStarted","Data":"93486363bcda7b62107d7e1e0b7afd58567e06ee4d14343e41ead3774822918b"} Mar 10 11:18:37 crc kubenswrapper[4794]: I0310 11:18:37.393621 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.393601718 podStartE2EDuration="3.393601718s" podCreationTimestamp="2026-03-10 11:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:37.392264627 +0000 UTC m=+5666.148435445" watchObservedRunningTime="2026-03-10 11:18:37.393601718 +0000 UTC m=+5666.149772546" Mar 10 11:18:38 crc kubenswrapper[4794]: I0310 11:18:38.735685 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 11:18:38 crc kubenswrapper[4794]: I0310 11:18:38.735780 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 11:18:39 crc kubenswrapper[4794]: I0310 11:18:39.999362 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:18:40 crc kubenswrapper[4794]: E0310 11:18:39.999795 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:18:40 crc kubenswrapper[4794]: I0310 11:18:40.685520 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 11:18:40 crc kubenswrapper[4794]: I0310 11:18:40.733474 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 11:18:41 crc kubenswrapper[4794]: I0310 11:18:41.468445 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 11:18:42 crc kubenswrapper[4794]: I0310 11:18:42.707735 4794 scope.go:117] "RemoveContainer" containerID="1e230fd967502ae8d4656b2a5f6d05681b7bec15ea475101de690e31149cbaa2" Mar 10 11:18:42 crc kubenswrapper[4794]: I0310 11:18:42.765193 4794 scope.go:117] "RemoveContainer" containerID="d63c99472b5229335c5be601564544dbe2c58765c09f451d9337f60f910194b5" Mar 10 11:18:43 crc kubenswrapper[4794]: I0310 11:18:43.735853 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 11:18:43 crc kubenswrapper[4794]: I0310 11:18:43.736260 4794 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 11:18:44 crc kubenswrapper[4794]: I0310 11:18:44.818581 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e3cc18ff-b557-4d49-8580-733877f288a5" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.110:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 11:18:44 crc kubenswrapper[4794]: I0310 11:18:44.819091 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e3cc18ff-b557-4d49-8580-733877f288a5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.110:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 11:18:45 crc kubenswrapper[4794]: I0310 11:18:45.064806 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 11:18:45 crc kubenswrapper[4794]: I0310 11:18:45.064871 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 11:18:46 crc kubenswrapper[4794]: I0310 11:18:46.147633 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.111:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 11:18:46 crc kubenswrapper[4794]: I0310 11:18:46.147719 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.111:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 11:18:53 crc kubenswrapper[4794]: I0310 11:18:53.738701 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 11:18:53 crc kubenswrapper[4794]: I0310 11:18:53.739483 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 11:18:53 crc kubenswrapper[4794]: I0310 11:18:53.742503 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 11:18:53 crc kubenswrapper[4794]: I0310 11:18:53.743224 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 11:18:53 crc kubenswrapper[4794]: I0310 11:18:53.999792 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:18:54 crc kubenswrapper[4794]: I0310 11:18:54.867566 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"14e62759b4835fd8e09988559aeb5396bdb9e62fa6007a87b363f008bdc5ba42"} Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.069276 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.069746 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.070163 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 
11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.070213 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.072426 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.074157 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.271433 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c46849895-wd2pd"] Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.272984 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.291965 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c46849895-wd2pd"] Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.404549 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-dns-svc\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.404617 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-config\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.404647 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-ovsdbserver-sb\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.404695 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-ovsdbserver-nb\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.404823 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dfcq\" (UniqueName: \"kubernetes.io/projected/c950caad-61b1-466a-8ab0-9665214d56cc-kube-api-access-2dfcq\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.506130 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-ovsdbserver-sb\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.506427 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-ovsdbserver-nb\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.506460 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dfcq\" (UniqueName: \"kubernetes.io/projected/c950caad-61b1-466a-8ab0-9665214d56cc-kube-api-access-2dfcq\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.506555 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-dns-svc\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.506592 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-config\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.507242 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-ovsdbserver-nb\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.507284 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-ovsdbserver-sb\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.507781 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-config\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.507902 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-dns-svc\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.523871 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dfcq\" (UniqueName: \"kubernetes.io/projected/c950caad-61b1-466a-8ab0-9665214d56cc-kube-api-access-2dfcq\") pod \"dnsmasq-dns-5c46849895-wd2pd\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:55 crc kubenswrapper[4794]: I0310 11:18:55.597384 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:56 crc kubenswrapper[4794]: I0310 11:18:56.125892 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c46849895-wd2pd"] Mar 10 11:18:56 crc kubenswrapper[4794]: I0310 11:18:56.888904 4794 generic.go:334] "Generic (PLEG): container finished" podID="c950caad-61b1-466a-8ab0-9665214d56cc" containerID="37dd0ebe5a61b16f305cebc351b0ad23c4eeaa10514fa9eb920ac1408f9a34a5" exitCode=0 Mar 10 11:18:56 crc kubenswrapper[4794]: I0310 11:18:56.889631 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c46849895-wd2pd" event={"ID":"c950caad-61b1-466a-8ab0-9665214d56cc","Type":"ContainerDied","Data":"37dd0ebe5a61b16f305cebc351b0ad23c4eeaa10514fa9eb920ac1408f9a34a5"} Mar 10 11:18:56 crc kubenswrapper[4794]: I0310 11:18:56.889735 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c46849895-wd2pd" event={"ID":"c950caad-61b1-466a-8ab0-9665214d56cc","Type":"ContainerStarted","Data":"752e4f9d7f4fb0d1325a9794ffc918dc0b963968d413b76c31c9c49590b44f78"} Mar 10 11:18:57 crc kubenswrapper[4794]: I0310 11:18:57.901185 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c46849895-wd2pd" event={"ID":"c950caad-61b1-466a-8ab0-9665214d56cc","Type":"ContainerStarted","Data":"f78198c9d63abd505e0c7a4c15b70376691efd876016129b87cdd2a92aef7f20"} Mar 10 11:18:57 crc kubenswrapper[4794]: I0310 11:18:57.901535 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:18:57 crc kubenswrapper[4794]: I0310 11:18:57.936879 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c46849895-wd2pd" podStartSLOduration=2.936849708 podStartE2EDuration="2.936849708s" podCreationTimestamp="2026-03-10 11:18:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:18:57.928808778 +0000 UTC m=+5686.684979616" watchObservedRunningTime="2026-03-10 11:18:57.936849708 +0000 UTC m=+5686.693020566" Mar 10 11:19:05 crc kubenswrapper[4794]: I0310 11:19:05.599619 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:19:05 crc kubenswrapper[4794]: I0310 11:19:05.683465 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dd655fcf-vd7zz"] Mar 10 11:19:05 crc kubenswrapper[4794]: I0310 11:19:05.683745 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" podUID="647826ea-505e-495e-8fdc-004e7e432df8" containerName="dnsmasq-dns" containerID="cri-o://b847f84fb6dceb7e782c14ea46054ce699b41e5d821d4cf993ce4e3449c9b7af" gracePeriod=10 Mar 10 11:19:05 crc kubenswrapper[4794]: I0310 11:19:05.996802 4794 generic.go:334] "Generic (PLEG): container finished" podID="647826ea-505e-495e-8fdc-004e7e432df8" containerID="b847f84fb6dceb7e782c14ea46054ce699b41e5d821d4cf993ce4e3449c9b7af" exitCode=0 Mar 10 11:19:05 crc kubenswrapper[4794]: I0310 11:19:05.996942 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" event={"ID":"647826ea-505e-495e-8fdc-004e7e432df8","Type":"ContainerDied","Data":"b847f84fb6dceb7e782c14ea46054ce699b41e5d821d4cf993ce4e3449c9b7af"} Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.161596 4794 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.321987 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hw4l\" (UniqueName: \"kubernetes.io/projected/647826ea-505e-495e-8fdc-004e7e432df8-kube-api-access-6hw4l\") pod \"647826ea-505e-495e-8fdc-004e7e432df8\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.323250 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-dns-svc\") pod \"647826ea-505e-495e-8fdc-004e7e432df8\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.323539 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-ovsdbserver-sb\") pod \"647826ea-505e-495e-8fdc-004e7e432df8\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.323594 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-ovsdbserver-nb\") pod \"647826ea-505e-495e-8fdc-004e7e432df8\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.323625 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-config\") pod \"647826ea-505e-495e-8fdc-004e7e432df8\" (UID: \"647826ea-505e-495e-8fdc-004e7e432df8\") " Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.328474 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647826ea-505e-495e-8fdc-004e7e432df8-kube-api-access-6hw4l" (OuterVolumeSpecName: "kube-api-access-6hw4l") pod "647826ea-505e-495e-8fdc-004e7e432df8" (UID: "647826ea-505e-495e-8fdc-004e7e432df8"). InnerVolumeSpecName "kube-api-access-6hw4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.363234 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "647826ea-505e-495e-8fdc-004e7e432df8" (UID: "647826ea-505e-495e-8fdc-004e7e432df8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.382183 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "647826ea-505e-495e-8fdc-004e7e432df8" (UID: "647826ea-505e-495e-8fdc-004e7e432df8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.385441 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-config" (OuterVolumeSpecName: "config") pod "647826ea-505e-495e-8fdc-004e7e432df8" (UID: "647826ea-505e-495e-8fdc-004e7e432df8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.390545 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "647826ea-505e-495e-8fdc-004e7e432df8" (UID: "647826ea-505e-495e-8fdc-004e7e432df8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.425735 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.425769 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.425782 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.425793 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/647826ea-505e-495e-8fdc-004e7e432df8-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.425807 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hw4l\" (UniqueName: \"kubernetes.io/projected/647826ea-505e-495e-8fdc-004e7e432df8-kube-api-access-6hw4l\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.604740 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-27lf2"] Mar 10 11:19:06 crc kubenswrapper[4794]: E0310 11:19:06.605276 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647826ea-505e-495e-8fdc-004e7e432df8" containerName="init" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.605294 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="647826ea-505e-495e-8fdc-004e7e432df8" containerName="init" Mar 10 11:19:06 crc kubenswrapper[4794]: E0310 11:19:06.605324 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647826ea-505e-495e-8fdc-004e7e432df8" containerName="dnsmasq-dns" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.605360 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="647826ea-505e-495e-8fdc-004e7e432df8" containerName="dnsmasq-dns" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.605572 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="647826ea-505e-495e-8fdc-004e7e432df8" containerName="dnsmasq-dns" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.607184 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.633694 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-27lf2"] Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.735847 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n7fv\" (UniqueName: \"kubernetes.io/projected/a1c7c111-e573-416b-9db2-778ff3318b52-kube-api-access-8n7fv\") pod \"certified-operators-27lf2\" (UID: \"a1c7c111-e573-416b-9db2-778ff3318b52\") " pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.736013 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c7c111-e573-416b-9db2-778ff3318b52-utilities\") pod \"certified-operators-27lf2\" (UID: \"a1c7c111-e573-416b-9db2-778ff3318b52\") " pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.736051 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c7c111-e573-416b-9db2-778ff3318b52-catalog-content\") pod \"certified-operators-27lf2\" (UID: \"a1c7c111-e573-416b-9db2-778ff3318b52\") " pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.837462 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n7fv\" (UniqueName: \"kubernetes.io/projected/a1c7c111-e573-416b-9db2-778ff3318b52-kube-api-access-8n7fv\") pod \"certified-operators-27lf2\" (UID: \"a1c7c111-e573-416b-9db2-778ff3318b52\") " pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.837820 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c7c111-e573-416b-9db2-778ff3318b52-utilities\") pod \"certified-operators-27lf2\" (UID: \"a1c7c111-e573-416b-9db2-778ff3318b52\") " pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.837951 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c7c111-e573-416b-9db2-778ff3318b52-catalog-content\") pod \"certified-operators-27lf2\" (UID: \"a1c7c111-e573-416b-9db2-778ff3318b52\") " pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.838486 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c7c111-e573-416b-9db2-778ff3318b52-catalog-content\") pod \"certified-operators-27lf2\" (UID: \"a1c7c111-e573-416b-9db2-778ff3318b52\") " pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.838484 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c7c111-e573-416b-9db2-778ff3318b52-utilities\") pod \"certified-operators-27lf2\" (UID: \"a1c7c111-e573-416b-9db2-778ff3318b52\") " pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.869756 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8n7fv\" (UniqueName: \"kubernetes.io/projected/a1c7c111-e573-416b-9db2-778ff3318b52-kube-api-access-8n7fv\") pod \"certified-operators-27lf2\" (UID: \"a1c7c111-e573-416b-9db2-778ff3318b52\") " pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:06 crc kubenswrapper[4794]: I0310 11:19:06.939811 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.013235 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" event={"ID":"647826ea-505e-495e-8fdc-004e7e432df8","Type":"ContainerDied","Data":"b337d0f6423286d6e6834e2bce14382a5cac1f18dc201e9dd181192bced486e2"} Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.013312 4794 scope.go:117] "RemoveContainer" containerID="b847f84fb6dceb7e782c14ea46054ce699b41e5d821d4cf993ce4e3449c9b7af" Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.013375 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dd655fcf-vd7zz" Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.061019 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dd655fcf-vd7zz"] Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.064891 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9dd655fcf-vd7zz"] Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.139373 4794 scope.go:117] "RemoveContainer" containerID="9f44948ec3a32d4d69ec6573f79079ea88779de41621c1c32f05b724d05edae9" Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.444175 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-27lf2"] Mar 10 11:19:07 crc kubenswrapper[4794]: W0310 11:19:07.446456 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1c7c111_e573_416b_9db2_778ff3318b52.slice/crio-6bf6879377eac608021884e3f75d8d469df7ae3046879583d432039db166a7e5 WatchSource:0}: Error finding container 6bf6879377eac608021884e3f75d8d469df7ae3046879583d432039db166a7e5: Status 404 returned error can't find the container with id 6bf6879377eac608021884e3f75d8d469df7ae3046879583d432039db166a7e5 Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.838952 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-8qjtb"] Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.840865 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8qjtb" Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.853051 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8qjtb"] Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.946805 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2a16-account-create-update-5q9jn"] Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.948266 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2a16-account-create-update-5q9jn" Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.954087 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.959201 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2a16-account-create-update-5q9jn"] Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.959248 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aecfaacc-1770-4c75-90e5-18bf8f45581d-operator-scripts\") pod \"cinder-db-create-8qjtb\" (UID: \"aecfaacc-1770-4c75-90e5-18bf8f45581d\") " pod="openstack/cinder-db-create-8qjtb" Mar 10 11:19:07 crc kubenswrapper[4794]: I0310 11:19:07.959639 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjvl6\" (UniqueName: \"kubernetes.io/projected/aecfaacc-1770-4c75-90e5-18bf8f45581d-kube-api-access-sjvl6\") pod \"cinder-db-create-8qjtb\" (UID: \"aecfaacc-1770-4c75-90e5-18bf8f45581d\") " pod="openstack/cinder-db-create-8qjtb" Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.015770 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="647826ea-505e-495e-8fdc-004e7e432df8" path="/var/lib/kubelet/pods/647826ea-505e-495e-8fdc-004e7e432df8/volumes" Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.026283 4794 generic.go:334] "Generic (PLEG): container finished" podID="a1c7c111-e573-416b-9db2-778ff3318b52" containerID="dc624ea43fd97645b6b91a8881158813ef0993d81bce478248c36dee9b2a7234" exitCode=0 Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.026346 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27lf2" event={"ID":"a1c7c111-e573-416b-9db2-778ff3318b52","Type":"ContainerDied","Data":"dc624ea43fd97645b6b91a8881158813ef0993d81bce478248c36dee9b2a7234"} Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.026375 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27lf2" event={"ID":"a1c7c111-e573-416b-9db2-778ff3318b52","Type":"ContainerStarted","Data":"6bf6879377eac608021884e3f75d8d469df7ae3046879583d432039db166a7e5"} Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.061469 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b55040-929c-4088-9d23-532663500a6b-operator-scripts\") pod \"cinder-2a16-account-create-update-5q9jn\" (UID: \"e8b55040-929c-4088-9d23-532663500a6b\") " pod="openstack/cinder-2a16-account-create-update-5q9jn" Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.061702 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aecfaacc-1770-4c75-90e5-18bf8f45581d-operator-scripts\") pod \"cinder-db-create-8qjtb\" (UID: \"aecfaacc-1770-4c75-90e5-18bf8f45581d\") " pod="openstack/cinder-db-create-8qjtb" Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.061789 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvl6\" (UniqueName: \"kubernetes.io/projected/aecfaacc-1770-4c75-90e5-18bf8f45581d-kube-api-access-sjvl6\") pod \"cinder-db-create-8qjtb\" (UID: \"aecfaacc-1770-4c75-90e5-18bf8f45581d\") " 
pod="openstack/cinder-db-create-8qjtb" Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.062061 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbkt5\" (UniqueName: \"kubernetes.io/projected/e8b55040-929c-4088-9d23-532663500a6b-kube-api-access-cbkt5\") pod \"cinder-2a16-account-create-update-5q9jn\" (UID: \"e8b55040-929c-4088-9d23-532663500a6b\") " pod="openstack/cinder-2a16-account-create-update-5q9jn" Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.062592 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aecfaacc-1770-4c75-90e5-18bf8f45581d-operator-scripts\") pod \"cinder-db-create-8qjtb\" (UID: \"aecfaacc-1770-4c75-90e5-18bf8f45581d\") " pod="openstack/cinder-db-create-8qjtb" Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.088617 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjvl6\" (UniqueName: \"kubernetes.io/projected/aecfaacc-1770-4c75-90e5-18bf8f45581d-kube-api-access-sjvl6\") pod \"cinder-db-create-8qjtb\" (UID: \"aecfaacc-1770-4c75-90e5-18bf8f45581d\") " pod="openstack/cinder-db-create-8qjtb" Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.155926 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8qjtb" Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.163990 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbkt5\" (UniqueName: \"kubernetes.io/projected/e8b55040-929c-4088-9d23-532663500a6b-kube-api-access-cbkt5\") pod \"cinder-2a16-account-create-update-5q9jn\" (UID: \"e8b55040-929c-4088-9d23-532663500a6b\") " pod="openstack/cinder-2a16-account-create-update-5q9jn" Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.164046 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b55040-929c-4088-9d23-532663500a6b-operator-scripts\") pod \"cinder-2a16-account-create-update-5q9jn\" (UID: \"e8b55040-929c-4088-9d23-532663500a6b\") " pod="openstack/cinder-2a16-account-create-update-5q9jn" Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.167395 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b55040-929c-4088-9d23-532663500a6b-operator-scripts\") pod \"cinder-2a16-account-create-update-5q9jn\" (UID: \"e8b55040-929c-4088-9d23-532663500a6b\") " pod="openstack/cinder-2a16-account-create-update-5q9jn" Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.198896 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbkt5\" (UniqueName: \"kubernetes.io/projected/e8b55040-929c-4088-9d23-532663500a6b-kube-api-access-cbkt5\") pod \"cinder-2a16-account-create-update-5q9jn\" (UID: \"e8b55040-929c-4088-9d23-532663500a6b\") " pod="openstack/cinder-2a16-account-create-update-5q9jn" Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.271000 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2a16-account-create-update-5q9jn" Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.663129 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8qjtb"] Mar 10 11:19:08 crc kubenswrapper[4794]: W0310 11:19:08.667298 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaecfaacc_1770_4c75_90e5_18bf8f45581d.slice/crio-29add1b07488e1b23b3572845aa07cc296021712855ee832bf205ef33a67f923 WatchSource:0}: Error finding container 29add1b07488e1b23b3572845aa07cc296021712855ee832bf205ef33a67f923: Status 404 returned error can't find the container with id 29add1b07488e1b23b3572845aa07cc296021712855ee832bf205ef33a67f923 Mar 10 11:19:08 crc kubenswrapper[4794]: I0310 11:19:08.798890 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2a16-account-create-update-5q9jn"] Mar 10 11:19:08 crc kubenswrapper[4794]: W0310 11:19:08.799255 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8b55040_929c_4088_9d23_532663500a6b.slice/crio-b2cb7f101e9420c2c74f37b8147714cf3bf87346a62942c8508e4ef2d4c02d77 WatchSource:0}: Error finding container b2cb7f101e9420c2c74f37b8147714cf3bf87346a62942c8508e4ef2d4c02d77: Status 404 returned error can't find the container with id b2cb7f101e9420c2c74f37b8147714cf3bf87346a62942c8508e4ef2d4c02d77 Mar 10 11:19:09 crc kubenswrapper[4794]: I0310 11:19:09.035704 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8qjtb" event={"ID":"aecfaacc-1770-4c75-90e5-18bf8f45581d","Type":"ContainerStarted","Data":"583deb61ce6ccbb2d89e6d8fe21d9f4a362de539c089d8c0c89c0fe43854b84e"} Mar 10 11:19:09 crc kubenswrapper[4794]: I0310 11:19:09.036063 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8qjtb" event={"ID":"aecfaacc-1770-4c75-90e5-18bf8f45581d","Type":"ContainerStarted","Data":"29add1b07488e1b23b3572845aa07cc296021712855ee832bf205ef33a67f923"} Mar 10 11:19:09 crc kubenswrapper[4794]: I0310 11:19:09.038694 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27lf2" event={"ID":"a1c7c111-e573-416b-9db2-778ff3318b52","Type":"ContainerStarted","Data":"0c5215ccee936d6918d39ad1ca85b3886d50f164b2dbd2ca8f8197896815ff49"} Mar 10 11:19:09 crc kubenswrapper[4794]: I0310 11:19:09.040930 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2a16-account-create-update-5q9jn" event={"ID":"e8b55040-929c-4088-9d23-532663500a6b","Type":"ContainerStarted","Data":"1a8f8ed5da40dc30e6234ba884c5b3d621a1fd46132212b94d6b48cb58737cf7"} Mar 10 11:19:09 crc kubenswrapper[4794]: I0310 11:19:09.040978 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2a16-account-create-update-5q9jn" event={"ID":"e8b55040-929c-4088-9d23-532663500a6b","Type":"ContainerStarted","Data":"b2cb7f101e9420c2c74f37b8147714cf3bf87346a62942c8508e4ef2d4c02d77"} Mar 10 11:19:09 crc kubenswrapper[4794]: I0310 11:19:09.060468 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-8qjtb" podStartSLOduration=2.0604467 podStartE2EDuration="2.0604467s" podCreationTimestamp="2026-03-10 11:19:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:19:09.053510093 +0000 UTC 
m=+5697.809680921" watchObservedRunningTime="2026-03-10 11:19:09.0604467 +0000 UTC m=+5697.816617528" Mar 10 11:19:09 crc kubenswrapper[4794]: I0310 11:19:09.074215 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2a16-account-create-update-5q9jn" podStartSLOduration=2.074194247 podStartE2EDuration="2.074194247s" podCreationTimestamp="2026-03-10 11:19:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:19:09.069013436 +0000 UTC m=+5697.825184264" watchObservedRunningTime="2026-03-10 11:19:09.074194247 +0000 UTC m=+5697.830365065" Mar 10 11:19:10 crc kubenswrapper[4794]: I0310 11:19:10.058765 4794 generic.go:334] "Generic (PLEG): container finished" podID="e8b55040-929c-4088-9d23-532663500a6b" containerID="1a8f8ed5da40dc30e6234ba884c5b3d621a1fd46132212b94d6b48cb58737cf7" exitCode=0 Mar 10 11:19:10 crc kubenswrapper[4794]: I0310 11:19:10.058887 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2a16-account-create-update-5q9jn" event={"ID":"e8b55040-929c-4088-9d23-532663500a6b","Type":"ContainerDied","Data":"1a8f8ed5da40dc30e6234ba884c5b3d621a1fd46132212b94d6b48cb58737cf7"} Mar 10 11:19:10 crc kubenswrapper[4794]: I0310 11:19:10.065536 4794 generic.go:334] "Generic (PLEG): container finished" podID="aecfaacc-1770-4c75-90e5-18bf8f45581d" containerID="583deb61ce6ccbb2d89e6d8fe21d9f4a362de539c089d8c0c89c0fe43854b84e" exitCode=0 Mar 10 11:19:10 crc kubenswrapper[4794]: I0310 11:19:10.065606 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8qjtb" event={"ID":"aecfaacc-1770-4c75-90e5-18bf8f45581d","Type":"ContainerDied","Data":"583deb61ce6ccbb2d89e6d8fe21d9f4a362de539c089d8c0c89c0fe43854b84e"} Mar 10 11:19:10 crc kubenswrapper[4794]: I0310 11:19:10.083093 4794 generic.go:334] "Generic (PLEG): container finished" podID="a1c7c111-e573-416b-9db2-778ff3318b52" containerID="0c5215ccee936d6918d39ad1ca85b3886d50f164b2dbd2ca8f8197896815ff49" exitCode=0 Mar 10 11:19:10 crc kubenswrapper[4794]: I0310 11:19:10.083150 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27lf2" event={"ID":"a1c7c111-e573-416b-9db2-778ff3318b52","Type":"ContainerDied","Data":"0c5215ccee936d6918d39ad1ca85b3886d50f164b2dbd2ca8f8197896815ff49"} Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.097664 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27lf2" event={"ID":"a1c7c111-e573-416b-9db2-778ff3318b52","Type":"ContainerStarted","Data":"1bbff651183a76774318b9c9c666b9e822820c2e3f389e290fb1396713755554"} Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.124157 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-27lf2" podStartSLOduration=2.632752211 podStartE2EDuration="5.124137217s" podCreationTimestamp="2026-03-10 11:19:06 +0000 UTC" firstStartedPulling="2026-03-10 11:19:08.028739695 +0000 UTC m=+5696.784910513" lastFinishedPulling="2026-03-10 11:19:10.520124651 +0000 UTC m=+5699.276295519" observedRunningTime="2026-03-10 11:19:11.121480535 +0000 UTC m=+5699.877651413" watchObservedRunningTime="2026-03-10 11:19:11.124137217 +0000 UTC m=+5699.880308035" Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.583375 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2a16-account-create-update-5q9jn" Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.590203 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8qjtb" Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.742824 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b55040-929c-4088-9d23-532663500a6b-operator-scripts\") pod \"e8b55040-929c-4088-9d23-532663500a6b\" (UID: \"e8b55040-929c-4088-9d23-532663500a6b\") " Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.743187 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjvl6\" (UniqueName: \"kubernetes.io/projected/aecfaacc-1770-4c75-90e5-18bf8f45581d-kube-api-access-sjvl6\") pod \"aecfaacc-1770-4c75-90e5-18bf8f45581d\" (UID: \"aecfaacc-1770-4c75-90e5-18bf8f45581d\") " Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.743254 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbkt5\" (UniqueName: \"kubernetes.io/projected/e8b55040-929c-4088-9d23-532663500a6b-kube-api-access-cbkt5\") pod \"e8b55040-929c-4088-9d23-532663500a6b\" (UID: \"e8b55040-929c-4088-9d23-532663500a6b\") " Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.743298 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aecfaacc-1770-4c75-90e5-18bf8f45581d-operator-scripts\") pod \"aecfaacc-1770-4c75-90e5-18bf8f45581d\" (UID: \"aecfaacc-1770-4c75-90e5-18bf8f45581d\") " Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.743492 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8b55040-929c-4088-9d23-532663500a6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8b55040-929c-4088-9d23-532663500a6b" (UID: "e8b55040-929c-4088-9d23-532663500a6b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.743735 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b55040-929c-4088-9d23-532663500a6b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.744130 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecfaacc-1770-4c75-90e5-18bf8f45581d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aecfaacc-1770-4c75-90e5-18bf8f45581d" (UID: "aecfaacc-1770-4c75-90e5-18bf8f45581d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.753643 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aecfaacc-1770-4c75-90e5-18bf8f45581d-kube-api-access-sjvl6" (OuterVolumeSpecName: "kube-api-access-sjvl6") pod "aecfaacc-1770-4c75-90e5-18bf8f45581d" (UID: "aecfaacc-1770-4c75-90e5-18bf8f45581d"). InnerVolumeSpecName "kube-api-access-sjvl6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.753703 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8b55040-929c-4088-9d23-532663500a6b-kube-api-access-cbkt5" (OuterVolumeSpecName: "kube-api-access-cbkt5") pod "e8b55040-929c-4088-9d23-532663500a6b" (UID: "e8b55040-929c-4088-9d23-532663500a6b"). InnerVolumeSpecName "kube-api-access-cbkt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.845685 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjvl6\" (UniqueName: \"kubernetes.io/projected/aecfaacc-1770-4c75-90e5-18bf8f45581d-kube-api-access-sjvl6\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.845734 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbkt5\" (UniqueName: \"kubernetes.io/projected/e8b55040-929c-4088-9d23-532663500a6b-kube-api-access-cbkt5\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:11 crc kubenswrapper[4794]: I0310 11:19:11.845753 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aecfaacc-1770-4c75-90e5-18bf8f45581d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:12 crc kubenswrapper[4794]: I0310 11:19:12.114521 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2a16-account-create-update-5q9jn" event={"ID":"e8b55040-929c-4088-9d23-532663500a6b","Type":"ContainerDied","Data":"b2cb7f101e9420c2c74f37b8147714cf3bf87346a62942c8508e4ef2d4c02d77"} Mar 10 11:19:12 crc kubenswrapper[4794]: I0310 11:19:12.114585 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2cb7f101e9420c2c74f37b8147714cf3bf87346a62942c8508e4ef2d4c02d77" Mar 10 11:19:12 crc kubenswrapper[4794]: I0310 11:19:12.114567 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2a16-account-create-update-5q9jn" Mar 10 11:19:12 crc kubenswrapper[4794]: I0310 11:19:12.117394 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8qjtb" Mar 10 11:19:12 crc kubenswrapper[4794]: I0310 11:19:12.117511 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8qjtb" event={"ID":"aecfaacc-1770-4c75-90e5-18bf8f45581d","Type":"ContainerDied","Data":"29add1b07488e1b23b3572845aa07cc296021712855ee832bf205ef33a67f923"} Mar 10 11:19:12 crc kubenswrapper[4794]: I0310 11:19:12.117559 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29add1b07488e1b23b3572845aa07cc296021712855ee832bf205ef33a67f923" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.186591 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-v6gbl"] Mar 10 11:19:13 crc kubenswrapper[4794]: E0310 11:19:13.187160 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aecfaacc-1770-4c75-90e5-18bf8f45581d" containerName="mariadb-database-create" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.187173 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="aecfaacc-1770-4c75-90e5-18bf8f45581d" containerName="mariadb-database-create" Mar 10 11:19:13 crc kubenswrapper[4794]: E0310 11:19:13.187187 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b55040-929c-4088-9d23-532663500a6b" containerName="mariadb-account-create-update" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.187194 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b55040-929c-4088-9d23-532663500a6b" containerName="mariadb-account-create-update" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.187394 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="aecfaacc-1770-4c75-90e5-18bf8f45581d" containerName="mariadb-database-create" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.187406 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8b55040-929c-4088-9d23-532663500a6b" containerName="mariadb-account-create-update" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.188047 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.190978 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.191135 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.192705 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xzrl9" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.196061 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-v6gbl"] Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.283575 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-scripts\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.283625 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68d82833-c203-4aa4-9829-f6392d598df1-etc-machine-id\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.283649 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-combined-ca-bundle\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.283699 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-db-sync-config-data\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.283721 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-config-data\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.283851 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb8wp\" (UniqueName: \"kubernetes.io/projected/68d82833-c203-4aa4-9829-f6392d598df1-kube-api-access-jb8wp\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.385736 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb8wp\" (UniqueName: \"kubernetes.io/projected/68d82833-c203-4aa4-9829-f6392d598df1-kube-api-access-jb8wp\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.385870 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-scripts\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.385897 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68d82833-c203-4aa4-9829-f6392d598df1-etc-machine-id\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.385917 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-combined-ca-bundle\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.385936 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-db-sync-config-data\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.385957 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-config-data\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.387234 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68d82833-c203-4aa4-9829-f6392d598df1-etc-machine-id\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.392990 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-config-data\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.393498 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-scripts\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.394624 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-combined-ca-bundle\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.403928 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-db-sync-config-data\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " 
pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.419891 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb8wp\" (UniqueName: \"kubernetes.io/projected/68d82833-c203-4aa4-9829-f6392d598df1-kube-api-access-jb8wp\") pod \"cinder-db-sync-v6gbl\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:13 crc kubenswrapper[4794]: I0310 11:19:13.510413 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:14 crc kubenswrapper[4794]: I0310 11:19:14.058026 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-v6gbl"] Mar 10 11:19:14 crc kubenswrapper[4794]: I0310 11:19:14.140097 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-v6gbl" event={"ID":"68d82833-c203-4aa4-9829-f6392d598df1","Type":"ContainerStarted","Data":"528d64c792c12ea6772e4305dc42e0dfe5912d85a8194f43fb57e752e0b6a616"} Mar 10 11:19:15 crc kubenswrapper[4794]: I0310 11:19:15.150522 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-v6gbl" event={"ID":"68d82833-c203-4aa4-9829-f6392d598df1","Type":"ContainerStarted","Data":"361b45844d7fae61746e7ecdfd0e55c5c319bc07b9764362342b57f3c845d55b"} Mar 10 11:19:15 crc kubenswrapper[4794]: I0310 11:19:15.176215 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-v6gbl" podStartSLOduration=2.176192228 podStartE2EDuration="2.176192228s" podCreationTimestamp="2026-03-10 11:19:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:19:15.168460537 +0000 UTC m=+5703.924631385" watchObservedRunningTime="2026-03-10 11:19:15.176192228 +0000 UTC m=+5703.932363046" Mar 10 11:19:16 crc kubenswrapper[4794]: I0310 11:19:16.941324 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:16 crc kubenswrapper[4794]: I0310 11:19:16.941708 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:17 crc kubenswrapper[4794]: I0310 11:19:17.016216 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:17 crc kubenswrapper[4794]: I0310 11:19:17.240491 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:17 crc kubenswrapper[4794]: I0310 11:19:17.310846 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-27lf2"] Mar 10 11:19:18 crc kubenswrapper[4794]: I0310 11:19:18.190470 4794 generic.go:334] "Generic (PLEG): container finished" podID="68d82833-c203-4aa4-9829-f6392d598df1" containerID="361b45844d7fae61746e7ecdfd0e55c5c319bc07b9764362342b57f3c845d55b" exitCode=0 Mar 10 11:19:18 crc kubenswrapper[4794]: I0310 11:19:18.190618 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-v6gbl" event={"ID":"68d82833-c203-4aa4-9829-f6392d598df1","Type":"ContainerDied","Data":"361b45844d7fae61746e7ecdfd0e55c5c319bc07b9764362342b57f3c845d55b"} Mar 10 11:19:19 crc kubenswrapper[4794]: I0310 11:19:19.201424 4794 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/certified-operators-27lf2" podUID="a1c7c111-e573-416b-9db2-778ff3318b52" containerName="registry-server" containerID="cri-o://1bbff651183a76774318b9c9c666b9e822820c2e3f389e290fb1396713755554" gracePeriod=2 Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.692852 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.811272 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb8wp\" (UniqueName: \"kubernetes.io/projected/68d82833-c203-4aa4-9829-f6392d598df1-kube-api-access-jb8wp\") pod \"68d82833-c203-4aa4-9829-f6392d598df1\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.811411 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-config-data\") pod \"68d82833-c203-4aa4-9829-f6392d598df1\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.811541 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-combined-ca-bundle\") pod \"68d82833-c203-4aa4-9829-f6392d598df1\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.811605 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-scripts\") pod \"68d82833-c203-4aa4-9829-f6392d598df1\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.811666 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-db-sync-config-data\") pod \"68d82833-c203-4aa4-9829-f6392d598df1\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.811855 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68d82833-c203-4aa4-9829-f6392d598df1-etc-machine-id\") pod \"68d82833-c203-4aa4-9829-f6392d598df1\" (UID: \"68d82833-c203-4aa4-9829-f6392d598df1\") " Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.812713 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68d82833-c203-4aa4-9829-f6392d598df1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "68d82833-c203-4aa4-9829-f6392d598df1" (UID: "68d82833-c203-4aa4-9829-f6392d598df1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.817639 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-scripts" (OuterVolumeSpecName: "scripts") pod "68d82833-c203-4aa4-9829-f6392d598df1" (UID: "68d82833-c203-4aa4-9829-f6392d598df1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.817766 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "68d82833-c203-4aa4-9829-f6392d598df1" (UID: "68d82833-c203-4aa4-9829-f6392d598df1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.818434 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d82833-c203-4aa4-9829-f6392d598df1-kube-api-access-jb8wp" (OuterVolumeSpecName: "kube-api-access-jb8wp") pod "68d82833-c203-4aa4-9829-f6392d598df1" (UID: "68d82833-c203-4aa4-9829-f6392d598df1"). InnerVolumeSpecName "kube-api-access-jb8wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.840155 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68d82833-c203-4aa4-9829-f6392d598df1" (UID: "68d82833-c203-4aa4-9829-f6392d598df1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.899242 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-config-data" (OuterVolumeSpecName: "config-data") pod "68d82833-c203-4aa4-9829-f6392d598df1" (UID: "68d82833-c203-4aa4-9829-f6392d598df1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.914906 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68d82833-c203-4aa4-9829-f6392d598df1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.914938 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb8wp\" (UniqueName: \"kubernetes.io/projected/68d82833-c203-4aa4-9829-f6392d598df1-kube-api-access-jb8wp\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.914952 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.914966 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.914976 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:19.914989 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68d82833-c203-4aa4-9829-f6392d598df1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.257985 4794 generic.go:334] "Generic (PLEG): container finished" podID="a1c7c111-e573-416b-9db2-778ff3318b52" containerID="1bbff651183a76774318b9c9c666b9e822820c2e3f389e290fb1396713755554" exitCode=0 Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.258170 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27lf2" event={"ID":"a1c7c111-e573-416b-9db2-778ff3318b52","Type":"ContainerDied","Data":"1bbff651183a76774318b9c9c666b9e822820c2e3f389e290fb1396713755554"} Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.262219 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-v6gbl" event={"ID":"68d82833-c203-4aa4-9829-f6392d598df1","Type":"ContainerDied","Data":"528d64c792c12ea6772e4305dc42e0dfe5912d85a8194f43fb57e752e0b6a616"} Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.262300 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="528d64c792c12ea6772e4305dc42e0dfe5912d85a8194f43fb57e752e0b6a616" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.262303 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-v6gbl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.550241 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7654548499-nb8kl"] Mar 10 11:19:20 crc kubenswrapper[4794]: E0310 11:19:20.550883 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d82833-c203-4aa4-9829-f6392d598df1" containerName="cinder-db-sync" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.550899 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d82833-c203-4aa4-9829-f6392d598df1" containerName="cinder-db-sync" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.551070 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d82833-c203-4aa4-9829-f6392d598df1" containerName="cinder-db-sync" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.552028 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.569657 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7654548499-nb8kl"] Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.571215 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.650813 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-ovsdbserver-sb\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.650895 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmqdd\" (UniqueName: \"kubernetes.io/projected/692a8a38-92c2-437f-995f-e8595cc09a32-kube-api-access-qmqdd\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.650961 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-ovsdbserver-nb\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.650997 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-dns-svc\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.651037 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-config\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.692581 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 11:19:20 
crc kubenswrapper[4794]: E0310 11:19:20.692966 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c7c111-e573-416b-9db2-778ff3318b52" containerName="registry-server" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.692978 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c7c111-e573-416b-9db2-778ff3318b52" containerName="registry-server" Mar 10 11:19:20 crc kubenswrapper[4794]: E0310 11:19:20.692987 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c7c111-e573-416b-9db2-778ff3318b52" containerName="extract-content" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.692992 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c7c111-e573-416b-9db2-778ff3318b52" containerName="extract-content" Mar 10 11:19:20 crc kubenswrapper[4794]: E0310 11:19:20.693010 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c7c111-e573-416b-9db2-778ff3318b52" containerName="extract-utilities" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.693016 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c7c111-e573-416b-9db2-778ff3318b52" containerName="extract-utilities" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.693168 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1c7c111-e573-416b-9db2-778ff3318b52" containerName="registry-server" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.694136 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.700519 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.701213 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.701316 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.701413 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xzrl9" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.722284 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.751527 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c7c111-e573-416b-9db2-778ff3318b52-catalog-content\") pod \"a1c7c111-e573-416b-9db2-778ff3318b52\" (UID: \"a1c7c111-e573-416b-9db2-778ff3318b52\") " Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.751600 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n7fv\" (UniqueName: \"kubernetes.io/projected/a1c7c111-e573-416b-9db2-778ff3318b52-kube-api-access-8n7fv\") pod \"a1c7c111-e573-416b-9db2-778ff3318b52\" (UID: \"a1c7c111-e573-416b-9db2-778ff3318b52\") " Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.751745 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c7c111-e573-416b-9db2-778ff3318b52-utilities\") pod \"a1c7c111-e573-416b-9db2-778ff3318b52\" (UID: \"a1c7c111-e573-416b-9db2-778ff3318b52\") " Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.751939 4794 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-config\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.751991 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-ovsdbserver-sb\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.752041 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmqdd\" (UniqueName: \"kubernetes.io/projected/692a8a38-92c2-437f-995f-e8595cc09a32-kube-api-access-qmqdd\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.752079 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-ovsdbserver-nb\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.752104 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-dns-svc\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.752866 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-dns-svc\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.753946 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-ovsdbserver-nb\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.754659 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c7c111-e573-416b-9db2-778ff3318b52-utilities" (OuterVolumeSpecName: "utilities") pod "a1c7c111-e573-416b-9db2-778ff3318b52" (UID: "a1c7c111-e573-416b-9db2-778ff3318b52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.762978 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1c7c111-e573-416b-9db2-778ff3318b52-kube-api-access-8n7fv" (OuterVolumeSpecName: "kube-api-access-8n7fv") pod "a1c7c111-e573-416b-9db2-778ff3318b52" (UID: "a1c7c111-e573-416b-9db2-778ff3318b52"). InnerVolumeSpecName "kube-api-access-8n7fv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.764937 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-ovsdbserver-sb\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.765977 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-config\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.793851 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmqdd\" (UniqueName: \"kubernetes.io/projected/692a8a38-92c2-437f-995f-e8595cc09a32-kube-api-access-qmqdd\") pod \"dnsmasq-dns-7654548499-nb8kl\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.857409 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce2424b1-763a-4b70-9f10-2503c5d49d7c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.857511 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-config-data\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.857552 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-289qw\" (UniqueName: \"kubernetes.io/projected/ce2424b1-763a-4b70-9f10-2503c5d49d7c-kube-api-access-289qw\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.857587 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-scripts\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.857655 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.857686 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.857711 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2424b1-763a-4b70-9f10-2503c5d49d7c-logs\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.857763 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c7c111-e573-416b-9db2-778ff3318b52-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.857800 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n7fv\" (UniqueName: \"kubernetes.io/projected/a1c7c111-e573-416b-9db2-778ff3318b52-kube-api-access-8n7fv\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.880439 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c7c111-e573-416b-9db2-778ff3318b52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1c7c111-e573-416b-9db2-778ff3318b52" (UID: "a1c7c111-e573-416b-9db2-778ff3318b52"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.880871 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.961204 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-config-data\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.961263 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-289qw\" (UniqueName: \"kubernetes.io/projected/ce2424b1-763a-4b70-9f10-2503c5d49d7c-kube-api-access-289qw\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.961299 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-scripts\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.961363 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.961390 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.961407 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2424b1-763a-4b70-9f10-2503c5d49d7c-logs\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 
10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.961439 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce2424b1-763a-4b70-9f10-2503c5d49d7c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.961508 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c7c111-e573-416b-9db2-778ff3318b52-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.961550 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce2424b1-763a-4b70-9f10-2503c5d49d7c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.965821 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2424b1-763a-4b70-9f10-2503c5d49d7c-logs\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.974136 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.974581 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.975949 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-scripts\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.978684 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-config-data\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:20 crc kubenswrapper[4794]: I0310 11:19:20.990542 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-289qw\" (UniqueName: \"kubernetes.io/projected/ce2424b1-763a-4b70-9f10-2503c5d49d7c-kube-api-access-289qw\") pod \"cinder-api-0\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " pod="openstack/cinder-api-0" Mar 10 11:19:21 crc kubenswrapper[4794]: I0310 11:19:21.020759 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 11:19:21 crc kubenswrapper[4794]: I0310 11:19:21.274425 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27lf2" event={"ID":"a1c7c111-e573-416b-9db2-778ff3318b52","Type":"ContainerDied","Data":"6bf6879377eac608021884e3f75d8d469df7ae3046879583d432039db166a7e5"} Mar 10 11:19:21 crc kubenswrapper[4794]: I0310 11:19:21.274481 4794 scope.go:117] "RemoveContainer" containerID="1bbff651183a76774318b9c9c666b9e822820c2e3f389e290fb1396713755554" Mar 10 11:19:21 crc kubenswrapper[4794]: I0310 11:19:21.274593 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-27lf2" Mar 10 11:19:21 crc kubenswrapper[4794]: I0310 11:19:21.312783 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-27lf2"] Mar 10 11:19:21 crc kubenswrapper[4794]: I0310 11:19:21.318473 4794 scope.go:117] "RemoveContainer" containerID="0c5215ccee936d6918d39ad1ca85b3886d50f164b2dbd2ca8f8197896815ff49" Mar 10 11:19:21 crc kubenswrapper[4794]: I0310 11:19:21.324529 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-27lf2"] Mar 10 11:19:21 crc kubenswrapper[4794]: I0310 11:19:21.375220 4794 scope.go:117] "RemoveContainer" containerID="dc624ea43fd97645b6b91a8881158813ef0993d81bce478248c36dee9b2a7234" Mar 10 11:19:21 crc kubenswrapper[4794]: I0310 11:19:21.398652 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7654548499-nb8kl"] Mar 10 11:19:21 crc kubenswrapper[4794]: I0310 11:19:21.508558 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 11:19:21 crc kubenswrapper[4794]: E0310 11:19:21.811910 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod692a8a38_92c2_437f_995f_e8595cc09a32.slice/crio-conmon-5f737eee6aa7c3a803b3628c1c63d1d2efb8ca6a307af4fa82979f8362cc39f3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod692a8a38_92c2_437f_995f_e8595cc09a32.slice/crio-5f737eee6aa7c3a803b3628c1c63d1d2efb8ca6a307af4fa82979f8362cc39f3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8b55040_929c_4088_9d23_532663500a6b.slice/crio-b2cb7f101e9420c2c74f37b8147714cf3bf87346a62942c8508e4ef2d4c02d77\": RecentStats: unable to find data in memory cache]" Mar 10 11:19:22 crc kubenswrapper[4794]: I0310 11:19:22.011432 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1c7c111-e573-416b-9db2-778ff3318b52" path="/var/lib/kubelet/pods/a1c7c111-e573-416b-9db2-778ff3318b52/volumes" Mar 10 11:19:22 crc kubenswrapper[4794]: I0310 11:19:22.298845 4794 generic.go:334] "Generic (PLEG): container finished" podID="692a8a38-92c2-437f-995f-e8595cc09a32" containerID="5f737eee6aa7c3a803b3628c1c63d1d2efb8ca6a307af4fa82979f8362cc39f3" exitCode=0 Mar 10 11:19:22 crc kubenswrapper[4794]: I0310 11:19:22.298967 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7654548499-nb8kl" event={"ID":"692a8a38-92c2-437f-995f-e8595cc09a32","Type":"ContainerDied","Data":"5f737eee6aa7c3a803b3628c1c63d1d2efb8ca6a307af4fa82979f8362cc39f3"} Mar 10 11:19:22 crc kubenswrapper[4794]: I0310 11:19:22.299166 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7654548499-nb8kl" event={"ID":"692a8a38-92c2-437f-995f-e8595cc09a32","Type":"ContainerStarted","Data":"63d1ad395c9e293c6fb66138e2ff234eb3bdb74fc8745384c82a167ce43fd4a5"} Mar 10 11:19:22 crc kubenswrapper[4794]: I0310 11:19:22.313649 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce2424b1-763a-4b70-9f10-2503c5d49d7c","Type":"ContainerStarted","Data":"adb37a2d74235ae1c882bd2d614dea306392e358ea6ea989e227f6585a82d41a"} Mar 10 11:19:22 crc kubenswrapper[4794]: I0310 11:19:22.313703 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce2424b1-763a-4b70-9f10-2503c5d49d7c","Type":"ContainerStarted","Data":"0f03c6c9fc5d732739dff4abe5c0a317cd8bd7c79c4f6d0c8fe0834ffa2d0968"} Mar 10 11:19:23 crc kubenswrapper[4794]: I0310 11:19:23.324088 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce2424b1-763a-4b70-9f10-2503c5d49d7c","Type":"ContainerStarted","Data":"fd559e1689f963d616a0e201e644628df3259c2127ce50665eb673cf4ada6a8c"} Mar 10 11:19:23 crc kubenswrapper[4794]: I0310 11:19:23.324485 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 11:19:23 crc kubenswrapper[4794]: I0310 11:19:23.327083 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7654548499-nb8kl" event={"ID":"692a8a38-92c2-437f-995f-e8595cc09a32","Type":"ContainerStarted","Data":"0998a61af3fe096aad73a3b9691001b780ca6e82bb7571816afe3ae1f19fb9bf"} Mar 10 11:19:23 crc kubenswrapper[4794]: I0310 11:19:23.327275 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:23 crc kubenswrapper[4794]: I0310 11:19:23.341872 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.341854924 podStartE2EDuration="3.341854924s" podCreationTimestamp="2026-03-10 11:19:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:19:23.340447691 +0000 UTC m=+5712.096618529" watchObservedRunningTime="2026-03-10 11:19:23.341854924 +0000 UTC m=+5712.098025752" Mar 10 11:19:30 crc kubenswrapper[4794]: I0310 11:19:30.883578 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:19:30 crc kubenswrapper[4794]: I0310 11:19:30.938640 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7654548499-nb8kl" podStartSLOduration=10.938604759 podStartE2EDuration="10.938604759s" podCreationTimestamp="2026-03-10 11:19:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:19:23.366564544 +0000 UTC m=+5712.122735372" watchObservedRunningTime="2026-03-10 11:19:30.938604759 +0000 UTC m=+5719.694775637" Mar 10 11:19:30 crc kubenswrapper[4794]: I0310 11:19:30.988393 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c46849895-wd2pd"] Mar 10 11:19:30 crc kubenswrapper[4794]: I0310 11:19:30.988828 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c46849895-wd2pd" podUID="c950caad-61b1-466a-8ab0-9665214d56cc" containerName="dnsmasq-dns" 
containerID="cri-o://f78198c9d63abd505e0c7a4c15b70376691efd876016129b87cdd2a92aef7f20" gracePeriod=10 Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.439865 4794 generic.go:334] "Generic (PLEG): container finished" podID="c950caad-61b1-466a-8ab0-9665214d56cc" containerID="f78198c9d63abd505e0c7a4c15b70376691efd876016129b87cdd2a92aef7f20" exitCode=0 Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.440013 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c46849895-wd2pd" event={"ID":"c950caad-61b1-466a-8ab0-9665214d56cc","Type":"ContainerDied","Data":"f78198c9d63abd505e0c7a4c15b70376691efd876016129b87cdd2a92aef7f20"} Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.564611 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.667975 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-ovsdbserver-sb\") pod \"c950caad-61b1-466a-8ab0-9665214d56cc\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.668050 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dfcq\" (UniqueName: \"kubernetes.io/projected/c950caad-61b1-466a-8ab0-9665214d56cc-kube-api-access-2dfcq\") pod \"c950caad-61b1-466a-8ab0-9665214d56cc\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.668110 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-config\") pod \"c950caad-61b1-466a-8ab0-9665214d56cc\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.668207 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-ovsdbserver-nb\") pod \"c950caad-61b1-466a-8ab0-9665214d56cc\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.668224 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-dns-svc\") pod \"c950caad-61b1-466a-8ab0-9665214d56cc\" (UID: \"c950caad-61b1-466a-8ab0-9665214d56cc\") " Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.673480 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c950caad-61b1-466a-8ab0-9665214d56cc-kube-api-access-2dfcq" (OuterVolumeSpecName: "kube-api-access-2dfcq") pod "c950caad-61b1-466a-8ab0-9665214d56cc" (UID: "c950caad-61b1-466a-8ab0-9665214d56cc"). InnerVolumeSpecName "kube-api-access-2dfcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.732216 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c950caad-61b1-466a-8ab0-9665214d56cc" (UID: "c950caad-61b1-466a-8ab0-9665214d56cc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.734076 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c950caad-61b1-466a-8ab0-9665214d56cc" (UID: "c950caad-61b1-466a-8ab0-9665214d56cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.742422 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-config" (OuterVolumeSpecName: "config") pod "c950caad-61b1-466a-8ab0-9665214d56cc" (UID: "c950caad-61b1-466a-8ab0-9665214d56cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.771848 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.771871 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.771881 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dfcq\" (UniqueName: \"kubernetes.io/projected/c950caad-61b1-466a-8ab0-9665214d56cc-kube-api-access-2dfcq\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.771890 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.791859 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c950caad-61b1-466a-8ab0-9665214d56cc" (UID: "c950caad-61b1-466a-8ab0-9665214d56cc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:19:31 crc kubenswrapper[4794]: I0310 11:19:31.873717 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c950caad-61b1-466a-8ab0-9665214d56cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:32 crc kubenswrapper[4794]: E0310 11:19:32.031775 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8b55040_929c_4088_9d23_532663500a6b.slice/crio-b2cb7f101e9420c2c74f37b8147714cf3bf87346a62942c8508e4ef2d4c02d77\": RecentStats: unable to find data in memory cache]" Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.452619 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c46849895-wd2pd" event={"ID":"c950caad-61b1-466a-8ab0-9665214d56cc","Type":"ContainerDied","Data":"752e4f9d7f4fb0d1325a9794ffc918dc0b963968d413b76c31c9c49590b44f78"} Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.452666 4794 scope.go:117] "RemoveContainer" containerID="f78198c9d63abd505e0c7a4c15b70376691efd876016129b87cdd2a92aef7f20" Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.452787 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c46849895-wd2pd" Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.478302 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c46849895-wd2pd"] Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.478512 4794 scope.go:117] "RemoveContainer" containerID="37dd0ebe5a61b16f305cebc351b0ad23c4eeaa10514fa9eb920ac1408f9a34a5" Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.487443 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c46849895-wd2pd"] Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.657697 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.657922 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" containerName="nova-api-log" containerID="cri-o://41c78b00714fd6d0f603309b957b6ac5d885afe04aa32a0e7b8f992d63d515c9" gracePeriod=30 Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.658034 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" containerName="nova-api-api" containerID="cri-o://93486363bcda7b62107d7e1e0b7afd58567e06ee4d14343e41ead3774822918b" gracePeriod=30 Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.676100 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.676578 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="660cfab8-47f8-4194-a9fc-9075fdb441ab" containerName="nova-scheduler-scheduler" containerID="cri-o://ef2bdf7435fa5ad4f2d88d430c0bbec0318b87aca7ad75f918c56d0c5c08e7f3" gracePeriod=30 Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.690593 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.690811 4794 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-cell0-conductor-0" podUID="8b88005f-6aaa-488f-90d5-b789ccced7ec" containerName="nova-cell0-conductor-conductor" containerID="cri-o://773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861" gracePeriod=30 Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.701230 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.701498 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e3cc18ff-b557-4d49-8580-733877f288a5" containerName="nova-metadata-log" containerID="cri-o://bd17680b7381005b9d0606808844a3a815638a759a845329a6aaa7cde1297f88" gracePeriod=30 Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.701570 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e3cc18ff-b557-4d49-8580-733877f288a5" containerName="nova-metadata-metadata" containerID="cri-o://8307f7ba39567058c2c2afb0493f5efb4f54bf8b9d792b4ddc4fd0580cc86401" gracePeriod=30 Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.721379 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 11:19:32 crc kubenswrapper[4794]: I0310 11:19:32.721559 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://bd16b4f53a35e5576da7d1a0f331f057b122c2a8f1b9a4d858b904c8e01ec28d" gracePeriod=30 Mar 10 11:19:33 crc kubenswrapper[4794]: I0310 11:19:33.136124 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 10 11:19:33 crc kubenswrapper[4794]: I0310 11:19:33.463406 4794 generic.go:334] "Generic (PLEG): container finished" podID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" containerID="41c78b00714fd6d0f603309b957b6ac5d885afe04aa32a0e7b8f992d63d515c9" exitCode=143 Mar 10 11:19:33 crc kubenswrapper[4794]: I0310 11:19:33.463470 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc1aed0b-186c-45d8-947b-072cb1a2ce0f","Type":"ContainerDied","Data":"41c78b00714fd6d0f603309b957b6ac5d885afe04aa32a0e7b8f992d63d515c9"} Mar 10 11:19:33 crc kubenswrapper[4794]: I0310 11:19:33.464704 4794 generic.go:334] "Generic (PLEG): container finished" podID="e3cc18ff-b557-4d49-8580-733877f288a5" containerID="bd17680b7381005b9d0606808844a3a815638a759a845329a6aaa7cde1297f88" exitCode=143 Mar 10 11:19:33 crc kubenswrapper[4794]: I0310 11:19:33.464744 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3cc18ff-b557-4d49-8580-733877f288a5","Type":"ContainerDied","Data":"bd17680b7381005b9d0606808844a3a815638a759a845329a6aaa7cde1297f88"} Mar 10 11:19:34 crc kubenswrapper[4794]: I0310 11:19:34.018867 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c950caad-61b1-466a-8ab0-9665214d56cc" path="/var/lib/kubelet/pods/c950caad-61b1-466a-8ab0-9665214d56cc/volumes" Mar 10 11:19:34 crc kubenswrapper[4794]: I0310 11:19:34.521476 4794 generic.go:334] "Generic (PLEG): container finished" podID="7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23" containerID="bd16b4f53a35e5576da7d1a0f331f057b122c2a8f1b9a4d858b904c8e01ec28d" exitCode=0 Mar 10 11:19:34 crc kubenswrapper[4794]: I0310 11:19:34.525389 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23","Type":"ContainerDied","Data":"bd16b4f53a35e5576da7d1a0f331f057b122c2a8f1b9a4d858b904c8e01ec28d"} Mar 10 11:19:34 crc kubenswrapper[4794]: I0310 11:19:34.682037 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:19:34 crc kubenswrapper[4794]: I0310 11:19:34.737528 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-combined-ca-bundle\") pod \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\" (UID: \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\") " Mar 10 11:19:34 crc kubenswrapper[4794]: I0310 11:19:34.737734 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7rhx\" (UniqueName: \"kubernetes.io/projected/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-kube-api-access-b7rhx\") pod \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\" (UID: \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\") " Mar 10 11:19:34 crc kubenswrapper[4794]: I0310 11:19:34.737841 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-config-data\") pod \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\" (UID: \"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23\") " Mar 10 11:19:34 crc kubenswrapper[4794]: I0310 11:19:34.759628 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-kube-api-access-b7rhx" (OuterVolumeSpecName: "kube-api-access-b7rhx") pod "7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23" (UID: "7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23"). InnerVolumeSpecName "kube-api-access-b7rhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:19:34 crc kubenswrapper[4794]: I0310 11:19:34.773431 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-config-data" (OuterVolumeSpecName: "config-data") pod "7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23" (UID: "7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:34 crc kubenswrapper[4794]: I0310 11:19:34.777000 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23" (UID: "7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:34 crc kubenswrapper[4794]: I0310 11:19:34.839430 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:34 crc kubenswrapper[4794]: I0310 11:19:34.839463 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7rhx\" (UniqueName: \"kubernetes.io/projected/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-kube-api-access-b7rhx\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:34 crc kubenswrapper[4794]: I0310 11:19:34.839476 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.082680 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.148623 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660cfab8-47f8-4194-a9fc-9075fdb441ab-config-data\") pod \"660cfab8-47f8-4194-a9fc-9075fdb441ab\" (UID: \"660cfab8-47f8-4194-a9fc-9075fdb441ab\") " Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.148727 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52vw8\" (UniqueName: \"kubernetes.io/projected/660cfab8-47f8-4194-a9fc-9075fdb441ab-kube-api-access-52vw8\") pod \"660cfab8-47f8-4194-a9fc-9075fdb441ab\" (UID: \"660cfab8-47f8-4194-a9fc-9075fdb441ab\") " Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.148868 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660cfab8-47f8-4194-a9fc-9075fdb441ab-combined-ca-bundle\") pod \"660cfab8-47f8-4194-a9fc-9075fdb441ab\" (UID: \"660cfab8-47f8-4194-a9fc-9075fdb441ab\") " Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.154856 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/660cfab8-47f8-4194-a9fc-9075fdb441ab-kube-api-access-52vw8" (OuterVolumeSpecName: "kube-api-access-52vw8") pod "660cfab8-47f8-4194-a9fc-9075fdb441ab" (UID: "660cfab8-47f8-4194-a9fc-9075fdb441ab"). InnerVolumeSpecName "kube-api-access-52vw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.173243 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660cfab8-47f8-4194-a9fc-9075fdb441ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "660cfab8-47f8-4194-a9fc-9075fdb441ab" (UID: "660cfab8-47f8-4194-a9fc-9075fdb441ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.177576 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660cfab8-47f8-4194-a9fc-9075fdb441ab-config-data" (OuterVolumeSpecName: "config-data") pod "660cfab8-47f8-4194-a9fc-9075fdb441ab" (UID: "660cfab8-47f8-4194-a9fc-9075fdb441ab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.250748 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660cfab8-47f8-4194-a9fc-9075fdb441ab-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.250814 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52vw8\" (UniqueName: \"kubernetes.io/projected/660cfab8-47f8-4194-a9fc-9075fdb441ab-kube-api-access-52vw8\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.250825 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660cfab8-47f8-4194-a9fc-9075fdb441ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.541773 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.542297 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23","Type":"ContainerDied","Data":"2b2b77342842d19af067333b539aaf3cc2f04b030dc04c5d740e9ff3322481f0"} Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.542415 4794 scope.go:117] "RemoveContainer" containerID="bd16b4f53a35e5576da7d1a0f331f057b122c2a8f1b9a4d858b904c8e01ec28d" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.546128 4794 generic.go:334] "Generic (PLEG): container finished" podID="660cfab8-47f8-4194-a9fc-9075fdb441ab" containerID="ef2bdf7435fa5ad4f2d88d430c0bbec0318b87aca7ad75f918c56d0c5c08e7f3" exitCode=0 Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.546189 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"660cfab8-47f8-4194-a9fc-9075fdb441ab","Type":"ContainerDied","Data":"ef2bdf7435fa5ad4f2d88d430c0bbec0318b87aca7ad75f918c56d0c5c08e7f3"} Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.546228 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"660cfab8-47f8-4194-a9fc-9075fdb441ab","Type":"ContainerDied","Data":"7cb4cba420f47bfcbdc9a27ed62eeb6101247c76e2db1c6bb117684b7482d297"} Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.546299 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.575584 4794 scope.go:117] "RemoveContainer" containerID="ef2bdf7435fa5ad4f2d88d430c0bbec0318b87aca7ad75f918c56d0c5c08e7f3" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.605484 4794 scope.go:117] "RemoveContainer" containerID="ef2bdf7435fa5ad4f2d88d430c0bbec0318b87aca7ad75f918c56d0c5c08e7f3" Mar 10 11:19:35 crc kubenswrapper[4794]: E0310 11:19:35.606028 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef2bdf7435fa5ad4f2d88d430c0bbec0318b87aca7ad75f918c56d0c5c08e7f3\": container with ID starting with ef2bdf7435fa5ad4f2d88d430c0bbec0318b87aca7ad75f918c56d0c5c08e7f3 not found: ID does not exist" containerID="ef2bdf7435fa5ad4f2d88d430c0bbec0318b87aca7ad75f918c56d0c5c08e7f3" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.606091 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2bdf7435fa5ad4f2d88d430c0bbec0318b87aca7ad75f918c56d0c5c08e7f3"} err="failed to get container status \"ef2bdf7435fa5ad4f2d88d430c0bbec0318b87aca7ad75f918c56d0c5c08e7f3\": rpc error: code = NotFound desc = could not find container \"ef2bdf7435fa5ad4f2d88d430c0bbec0318b87aca7ad75f918c56d0c5c08e7f3\": container with ID starting with ef2bdf7435fa5ad4f2d88d430c0bbec0318b87aca7ad75f918c56d0c5c08e7f3 not found: ID does not exist" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.621000 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.657708 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.666406 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.676429 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.684066 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 11:19:35 crc kubenswrapper[4794]: E0310 11:19:35.684545 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c950caad-61b1-466a-8ab0-9665214d56cc" containerName="init" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.684566 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c950caad-61b1-466a-8ab0-9665214d56cc" containerName="init" Mar 10 11:19:35 crc kubenswrapper[4794]: E0310 11:19:35.684585 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="660cfab8-47f8-4194-a9fc-9075fdb441ab" containerName="nova-scheduler-scheduler" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.684592 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="660cfab8-47f8-4194-a9fc-9075fdb441ab" containerName="nova-scheduler-scheduler" Mar 10 11:19:35 crc kubenswrapper[4794]: E0310 11:19:35.684613 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c950caad-61b1-466a-8ab0-9665214d56cc" containerName="dnsmasq-dns" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.684620 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c950caad-61b1-466a-8ab0-9665214d56cc" containerName="dnsmasq-dns" Mar 10 11:19:35 crc kubenswrapper[4794]: E0310 11:19:35.684630 4794 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.684635 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.684791 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="660cfab8-47f8-4194-a9fc-9075fdb441ab" containerName="nova-scheduler-scheduler" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.684812 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c950caad-61b1-466a-8ab0-9665214d56cc" containerName="dnsmasq-dns" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.684826 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.685478 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.687829 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.692419 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.703977 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.705549 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.708146 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.714328 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.759609 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwlkk\" (UniqueName: \"kubernetes.io/projected/9f2758f6-ec0b-40db-a619-65e823f98cc9-kube-api-access-vwlkk\") pod \"nova-scheduler-0\" (UID: \"9f2758f6-ec0b-40db-a619-65e823f98cc9\") " pod="openstack/nova-scheduler-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.759666 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba30853-8729-4d4a-9e44-f4ec5e80c459-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ba30853-8729-4d4a-9e44-f4ec5e80c459\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.759916 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba30853-8729-4d4a-9e44-f4ec5e80c459-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ba30853-8729-4d4a-9e44-f4ec5e80c459\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.759960 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8657f\" (UniqueName: 
\"kubernetes.io/projected/6ba30853-8729-4d4a-9e44-f4ec5e80c459-kube-api-access-8657f\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ba30853-8729-4d4a-9e44-f4ec5e80c459\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.760100 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f2758f6-ec0b-40db-a619-65e823f98cc9-config-data\") pod \"nova-scheduler-0\" (UID: \"9f2758f6-ec0b-40db-a619-65e823f98cc9\") " pod="openstack/nova-scheduler-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.760123 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f2758f6-ec0b-40db-a619-65e823f98cc9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9f2758f6-ec0b-40db-a619-65e823f98cc9\") " pod="openstack/nova-scheduler-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.859091 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.111:8774/\": read tcp 10.217.0.2:60404->10.217.1.111:8774: read: connection reset by peer" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.859753 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.111:8774/\": read tcp 10.217.0.2:60406->10.217.1.111:8774: read: connection reset by peer" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.862194 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba30853-8729-4d4a-9e44-f4ec5e80c459-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ba30853-8729-4d4a-9e44-f4ec5e80c459\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.862235 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8657f\" (UniqueName: \"kubernetes.io/projected/6ba30853-8729-4d4a-9e44-f4ec5e80c459-kube-api-access-8657f\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ba30853-8729-4d4a-9e44-f4ec5e80c459\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.862314 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f2758f6-ec0b-40db-a619-65e823f98cc9-config-data\") pod \"nova-scheduler-0\" (UID: \"9f2758f6-ec0b-40db-a619-65e823f98cc9\") " pod="openstack/nova-scheduler-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.862382 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f2758f6-ec0b-40db-a619-65e823f98cc9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9f2758f6-ec0b-40db-a619-65e823f98cc9\") " pod="openstack/nova-scheduler-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.862436 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwlkk\" (UniqueName: \"kubernetes.io/projected/9f2758f6-ec0b-40db-a619-65e823f98cc9-kube-api-access-vwlkk\") pod \"nova-scheduler-0\" (UID: \"9f2758f6-ec0b-40db-a619-65e823f98cc9\") " 
pod="openstack/nova-scheduler-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.862474 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba30853-8729-4d4a-9e44-f4ec5e80c459-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ba30853-8729-4d4a-9e44-f4ec5e80c459\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.867662 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba30853-8729-4d4a-9e44-f4ec5e80c459-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ba30853-8729-4d4a-9e44-f4ec5e80c459\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.868476 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f2758f6-ec0b-40db-a619-65e823f98cc9-config-data\") pod \"nova-scheduler-0\" (UID: \"9f2758f6-ec0b-40db-a619-65e823f98cc9\") " pod="openstack/nova-scheduler-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.871735 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f2758f6-ec0b-40db-a619-65e823f98cc9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9f2758f6-ec0b-40db-a619-65e823f98cc9\") " pod="openstack/nova-scheduler-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.885757 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba30853-8729-4d4a-9e44-f4ec5e80c459-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ba30853-8729-4d4a-9e44-f4ec5e80c459\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.887919 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8657f\" (UniqueName: \"kubernetes.io/projected/6ba30853-8729-4d4a-9e44-f4ec5e80c459-kube-api-access-8657f\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ba30853-8729-4d4a-9e44-f4ec5e80c459\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.903145 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwlkk\" (UniqueName: \"kubernetes.io/projected/9f2758f6-ec0b-40db-a619-65e823f98cc9-kube-api-access-vwlkk\") pod \"nova-scheduler-0\" (UID: \"9f2758f6-ec0b-40db-a619-65e823f98cc9\") " pod="openstack/nova-scheduler-0" Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.957181 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 11:19:35 crc kubenswrapper[4794]: I0310 11:19:35.957428 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="69d4f681-c12f-43fa-8ec0-95e90bff92a8" containerName="nova-cell1-conductor-conductor" containerID="cri-o://130a3de147a7e7545561c7fc8bf64d08fa7458da93bcbf14a7d765d68e6e1f56" gracePeriod=30 Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.007022 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.025582 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="660cfab8-47f8-4194-a9fc-9075fdb441ab" path="/var/lib/kubelet/pods/660cfab8-47f8-4194-a9fc-9075fdb441ab/volumes" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.026241 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23" path="/var/lib/kubelet/pods/7d69c47b-ce22-4e6a-aa51-bbfb7e5b8f23/volumes" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.031477 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 11:19:36 crc kubenswrapper[4794]: E0310 11:19:36.123929 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 11:19:36 crc kubenswrapper[4794]: E0310 11:19:36.132452 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 11:19:36 crc kubenswrapper[4794]: E0310 11:19:36.135611 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 11:19:36 crc kubenswrapper[4794]: E0310 11:19:36.135684 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="8b88005f-6aaa-488f-90d5-b789ccced7ec" containerName="nova-cell0-conductor-conductor" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.432720 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.557610 4794 generic.go:334] "Generic (PLEG): container finished" podID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" containerID="93486363bcda7b62107d7e1e0b7afd58567e06ee4d14343e41ead3774822918b" exitCode=0 Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.557665 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.557680 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc1aed0b-186c-45d8-947b-072cb1a2ce0f","Type":"ContainerDied","Data":"93486363bcda7b62107d7e1e0b7afd58567e06ee4d14343e41ead3774822918b"} Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.557709 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc1aed0b-186c-45d8-947b-072cb1a2ce0f","Type":"ContainerDied","Data":"58a9c0c40105ab6738a509df6d9323044f391977b5333c7749fa48c9d6434b7a"} Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.557726 4794 scope.go:117] "RemoveContainer" containerID="93486363bcda7b62107d7e1e0b7afd58567e06ee4d14343e41ead3774822918b" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.582949 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn5br\" (UniqueName: \"kubernetes.io/projected/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-kube-api-access-zn5br\") pod \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.583116 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-config-data\") pod \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.583223 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-logs\") pod \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.583252 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-combined-ca-bundle\") pod \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\" (UID: \"fc1aed0b-186c-45d8-947b-072cb1a2ce0f\") " Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.583628 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-logs" (OuterVolumeSpecName: "logs") pod "fc1aed0b-186c-45d8-947b-072cb1a2ce0f" (UID: "fc1aed0b-186c-45d8-947b-072cb1a2ce0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.587453 4794 scope.go:117] "RemoveContainer" containerID="41c78b00714fd6d0f603309b957b6ac5d885afe04aa32a0e7b8f992d63d515c9" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.590583 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-kube-api-access-zn5br" (OuterVolumeSpecName: "kube-api-access-zn5br") pod "fc1aed0b-186c-45d8-947b-072cb1a2ce0f" (UID: "fc1aed0b-186c-45d8-947b-072cb1a2ce0f"). InnerVolumeSpecName "kube-api-access-zn5br". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.599158 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.617671 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-config-data" (OuterVolumeSpecName: "config-data") pod "fc1aed0b-186c-45d8-947b-072cb1a2ce0f" (UID: "fc1aed0b-186c-45d8-947b-072cb1a2ce0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.620118 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc1aed0b-186c-45d8-947b-072cb1a2ce0f" (UID: "fc1aed0b-186c-45d8-947b-072cb1a2ce0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.647639 4794 scope.go:117] "RemoveContainer" containerID="93486363bcda7b62107d7e1e0b7afd58567e06ee4d14343e41ead3774822918b" Mar 10 11:19:36 crc kubenswrapper[4794]: E0310 11:19:36.648118 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93486363bcda7b62107d7e1e0b7afd58567e06ee4d14343e41ead3774822918b\": container with ID starting with 93486363bcda7b62107d7e1e0b7afd58567e06ee4d14343e41ead3774822918b not found: ID does not exist" containerID="93486363bcda7b62107d7e1e0b7afd58567e06ee4d14343e41ead3774822918b" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.648152 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93486363bcda7b62107d7e1e0b7afd58567e06ee4d14343e41ead3774822918b"} err="failed to get container status \"93486363bcda7b62107d7e1e0b7afd58567e06ee4d14343e41ead3774822918b\": rpc error: code = NotFound desc = could not find container \"93486363bcda7b62107d7e1e0b7afd58567e06ee4d14343e41ead3774822918b\": container with ID starting with 93486363bcda7b62107d7e1e0b7afd58567e06ee4d14343e41ead3774822918b not found: ID does not exist" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.648220 4794 scope.go:117] "RemoveContainer" containerID="41c78b00714fd6d0f603309b957b6ac5d885afe04aa32a0e7b8f992d63d515c9" Mar 10 11:19:36 crc kubenswrapper[4794]: E0310 11:19:36.649585 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c78b00714fd6d0f603309b957b6ac5d885afe04aa32a0e7b8f992d63d515c9\": container with ID starting with 41c78b00714fd6d0f603309b957b6ac5d885afe04aa32a0e7b8f992d63d515c9 not found: ID does not exist" containerID="41c78b00714fd6d0f603309b957b6ac5d885afe04aa32a0e7b8f992d63d515c9" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.649631 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c78b00714fd6d0f603309b957b6ac5d885afe04aa32a0e7b8f992d63d515c9"} err="failed to get container status \"41c78b00714fd6d0f603309b957b6ac5d885afe04aa32a0e7b8f992d63d515c9\": rpc error: code = NotFound desc = could not find container \"41c78b00714fd6d0f603309b957b6ac5d885afe04aa32a0e7b8f992d63d515c9\": container with ID starting with 41c78b00714fd6d0f603309b957b6ac5d885afe04aa32a0e7b8f992d63d515c9 not found: ID does 
not exist" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.671934 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.684577 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.684604 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-logs\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.684613 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.684623 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn5br\" (UniqueName: \"kubernetes.io/projected/fc1aed0b-186c-45d8-947b-072cb1a2ce0f-kube-api-access-zn5br\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.893430 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.900875 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.921375 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 11:19:36 crc kubenswrapper[4794]: E0310 11:19:36.921736 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" containerName="nova-api-log" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.921752 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" containerName="nova-api-log" Mar 10 11:19:36 crc kubenswrapper[4794]: E0310 11:19:36.921778 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" containerName="nova-api-api" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.921784 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" containerName="nova-api-api" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.921965 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" containerName="nova-api-api" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.921985 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" containerName="nova-api-log" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.922896 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.924932 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 11:19:36 crc kubenswrapper[4794]: I0310 11:19:36.935950 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 11:19:36 crc kubenswrapper[4794]: E0310 11:19:36.959634 4794 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.65:32978->38.102.83.65:40829: write tcp 38.102.83.65:32978->38.102.83.65:40829: write: broken pipe Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.090825 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abb3b031-7a99-4128-ae47-c28a091f5ee3-config-data\") pod \"nova-api-0\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " pod="openstack/nova-api-0" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.090880 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq926\" (UniqueName: \"kubernetes.io/projected/abb3b031-7a99-4128-ae47-c28a091f5ee3-kube-api-access-sq926\") pod \"nova-api-0\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " pod="openstack/nova-api-0" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.090916 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abb3b031-7a99-4128-ae47-c28a091f5ee3-logs\") pod \"nova-api-0\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " pod="openstack/nova-api-0" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.090938 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb3b031-7a99-4128-ae47-c28a091f5ee3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " pod="openstack/nova-api-0" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.192886 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abb3b031-7a99-4128-ae47-c28a091f5ee3-config-data\") pod \"nova-api-0\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " pod="openstack/nova-api-0" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.193323 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq926\" (UniqueName: \"kubernetes.io/projected/abb3b031-7a99-4128-ae47-c28a091f5ee3-kube-api-access-sq926\") pod \"nova-api-0\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " pod="openstack/nova-api-0" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.193430 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abb3b031-7a99-4128-ae47-c28a091f5ee3-logs\") pod \"nova-api-0\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " pod="openstack/nova-api-0" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.193493 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb3b031-7a99-4128-ae47-c28a091f5ee3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " pod="openstack/nova-api-0" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 
11:19:37.193857 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abb3b031-7a99-4128-ae47-c28a091f5ee3-logs\") pod \"nova-api-0\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " pod="openstack/nova-api-0" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.198313 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb3b031-7a99-4128-ae47-c28a091f5ee3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " pod="openstack/nova-api-0" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.198966 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abb3b031-7a99-4128-ae47-c28a091f5ee3-config-data\") pod \"nova-api-0\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " pod="openstack/nova-api-0" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.219457 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq926\" (UniqueName: \"kubernetes.io/projected/abb3b031-7a99-4128-ae47-c28a091f5ee3-kube-api-access-sq926\") pod \"nova-api-0\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " pod="openstack/nova-api-0" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.234898 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e3cc18ff-b557-4d49-8580-733877f288a5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.110:8775/\": read tcp 10.217.0.2:56828->10.217.1.110:8775: read: connection reset by peer" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.234949 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e3cc18ff-b557-4d49-8580-733877f288a5" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.110:8775/\": read tcp 10.217.0.2:56832->10.217.1.110:8775: read: connection reset by peer" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.254190 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.578787 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f2758f6-ec0b-40db-a619-65e823f98cc9","Type":"ContainerStarted","Data":"848e79df9401a9915b1f9c828f03f2c156d90bb24e2fb89f65dee4ecaa3a387d"} Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.579050 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f2758f6-ec0b-40db-a619-65e823f98cc9","Type":"ContainerStarted","Data":"80385cd692e52efb8e8dda8eacbf34179f4e8c2208f51fce442ca5fdaed2b3cd"} Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.600778 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.60075975 podStartE2EDuration="2.60075975s" podCreationTimestamp="2026-03-10 11:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:19:37.591911715 +0000 UTC m=+5726.348082533" watchObservedRunningTime="2026-03-10 11:19:37.60075975 +0000 UTC m=+5726.356930568" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.615953 4794 generic.go:334] "Generic (PLEG): container finished" podID="e3cc18ff-b557-4d49-8580-733877f288a5" containerID="8307f7ba39567058c2c2afb0493f5efb4f54bf8b9d792b4ddc4fd0580cc86401" exitCode=0 Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.616016 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3cc18ff-b557-4d49-8580-733877f288a5","Type":"ContainerDied","Data":"8307f7ba39567058c2c2afb0493f5efb4f54bf8b9d792b4ddc4fd0580cc86401"} Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.617580 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6ba30853-8729-4d4a-9e44-f4ec5e80c459","Type":"ContainerStarted","Data":"43ad1911286552f989b5a5efe3441f98f7c04e84e2eb88d476459c8fece33e15"} Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.617603 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6ba30853-8729-4d4a-9e44-f4ec5e80c459","Type":"ContainerStarted","Data":"e00742fb25465842bcb2afd3613e9a7ec29c0597aabb1a6f69d8b915fdc7fc75"} Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.634639 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.634614814 podStartE2EDuration="2.634614814s" podCreationTimestamp="2026-03-10 11:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:19:37.63062059 +0000 UTC m=+5726.386791418" watchObservedRunningTime="2026-03-10 11:19:37.634614814 +0000 UTC m=+5726.390785652" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.760105 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.797814 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.915961 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3cc18ff-b557-4d49-8580-733877f288a5-combined-ca-bundle\") pod \"e3cc18ff-b557-4d49-8580-733877f288a5\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.916082 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dwtb\" (UniqueName: \"kubernetes.io/projected/e3cc18ff-b557-4d49-8580-733877f288a5-kube-api-access-8dwtb\") pod \"e3cc18ff-b557-4d49-8580-733877f288a5\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.916126 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3cc18ff-b557-4d49-8580-733877f288a5-config-data\") pod \"e3cc18ff-b557-4d49-8580-733877f288a5\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.916235 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3cc18ff-b557-4d49-8580-733877f288a5-logs\") pod \"e3cc18ff-b557-4d49-8580-733877f288a5\" (UID: \"e3cc18ff-b557-4d49-8580-733877f288a5\") " Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.917182 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3cc18ff-b557-4d49-8580-733877f288a5-logs" (OuterVolumeSpecName: "logs") pod "e3cc18ff-b557-4d49-8580-733877f288a5" (UID: "e3cc18ff-b557-4d49-8580-733877f288a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.930783 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3cc18ff-b557-4d49-8580-733877f288a5-kube-api-access-8dwtb" (OuterVolumeSpecName: "kube-api-access-8dwtb") pod "e3cc18ff-b557-4d49-8580-733877f288a5" (UID: "e3cc18ff-b557-4d49-8580-733877f288a5"). InnerVolumeSpecName "kube-api-access-8dwtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.954017 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3cc18ff-b557-4d49-8580-733877f288a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3cc18ff-b557-4d49-8580-733877f288a5" (UID: "e3cc18ff-b557-4d49-8580-733877f288a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:37 crc kubenswrapper[4794]: I0310 11:19:37.957573 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3cc18ff-b557-4d49-8580-733877f288a5-config-data" (OuterVolumeSpecName: "config-data") pod "e3cc18ff-b557-4d49-8580-733877f288a5" (UID: "e3cc18ff-b557-4d49-8580-733877f288a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.017791 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3cc18ff-b557-4d49-8580-733877f288a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.017828 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dwtb\" (UniqueName: \"kubernetes.io/projected/e3cc18ff-b557-4d49-8580-733877f288a5-kube-api-access-8dwtb\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.017840 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3cc18ff-b557-4d49-8580-733877f288a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.017848 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3cc18ff-b557-4d49-8580-733877f288a5-logs\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.022758 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc1aed0b-186c-45d8-947b-072cb1a2ce0f" path="/var/lib/kubelet/pods/fc1aed0b-186c-45d8-947b-072cb1a2ce0f/volumes" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.212897 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.322363 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69d4f681-c12f-43fa-8ec0-95e90bff92a8-combined-ca-bundle\") pod \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\" (UID: \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\") " Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.322492 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d4f681-c12f-43fa-8ec0-95e90bff92a8-config-data\") pod \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\" (UID: \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\") " Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.322574 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc7dd\" (UniqueName: \"kubernetes.io/projected/69d4f681-c12f-43fa-8ec0-95e90bff92a8-kube-api-access-gc7dd\") pod \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\" (UID: \"69d4f681-c12f-43fa-8ec0-95e90bff92a8\") " Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.325379 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69d4f681-c12f-43fa-8ec0-95e90bff92a8-kube-api-access-gc7dd" (OuterVolumeSpecName: "kube-api-access-gc7dd") pod "69d4f681-c12f-43fa-8ec0-95e90bff92a8" (UID: "69d4f681-c12f-43fa-8ec0-95e90bff92a8"). InnerVolumeSpecName "kube-api-access-gc7dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.345502 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d4f681-c12f-43fa-8ec0-95e90bff92a8-config-data" (OuterVolumeSpecName: "config-data") pod "69d4f681-c12f-43fa-8ec0-95e90bff92a8" (UID: "69d4f681-c12f-43fa-8ec0-95e90bff92a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.345502 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d4f681-c12f-43fa-8ec0-95e90bff92a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69d4f681-c12f-43fa-8ec0-95e90bff92a8" (UID: "69d4f681-c12f-43fa-8ec0-95e90bff92a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.424367 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69d4f681-c12f-43fa-8ec0-95e90bff92a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.424407 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d4f681-c12f-43fa-8ec0-95e90bff92a8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.424420 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc7dd\" (UniqueName: \"kubernetes.io/projected/69d4f681-c12f-43fa-8ec0-95e90bff92a8-kube-api-access-gc7dd\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.626075 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"abb3b031-7a99-4128-ae47-c28a091f5ee3","Type":"ContainerStarted","Data":"e17f2db38884b006e2dc81fc9c71a9946033f8c2efb20884c96af86be4d89c73"} Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.626115 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"abb3b031-7a99-4128-ae47-c28a091f5ee3","Type":"ContainerStarted","Data":"a51a2e470e0b29c1a262060fb3b27c824b3d6976fe8f641d698c8c5ba765a22f"} Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.626126 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"abb3b031-7a99-4128-ae47-c28a091f5ee3","Type":"ContainerStarted","Data":"66f6a0e04045968208a3b56fd362695cf6d437aa4491a2c63d6a1f31dc0c1066"} Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.627680 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3cc18ff-b557-4d49-8580-733877f288a5","Type":"ContainerDied","Data":"6fd9da60e1cf3efdfcd8233aad0e963797daf47d7e749eb628b9f113e0e27d1a"} Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.627714 4794 scope.go:117] "RemoveContainer" containerID="8307f7ba39567058c2c2afb0493f5efb4f54bf8b9d792b4ddc4fd0580cc86401" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.627800 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.629809 4794 generic.go:334] "Generic (PLEG): container finished" podID="69d4f681-c12f-43fa-8ec0-95e90bff92a8" containerID="130a3de147a7e7545561c7fc8bf64d08fa7458da93bcbf14a7d765d68e6e1f56" exitCode=0 Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.630248 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.630268 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69d4f681-c12f-43fa-8ec0-95e90bff92a8","Type":"ContainerDied","Data":"130a3de147a7e7545561c7fc8bf64d08fa7458da93bcbf14a7d765d68e6e1f56"} Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.630306 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69d4f681-c12f-43fa-8ec0-95e90bff92a8","Type":"ContainerDied","Data":"9ecf37b1a495c4f494c7287c49a87edccaa876c69bfb040fcba8ba38bf7f9224"} Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.647719 4794 scope.go:117] "RemoveContainer" containerID="bd17680b7381005b9d0606808844a3a815638a759a845329a6aaa7cde1297f88" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.659083 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.659065403 podStartE2EDuration="2.659065403s" podCreationTimestamp="2026-03-10 11:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:19:38.656743 +0000 UTC m=+5727.412913818" watchObservedRunningTime="2026-03-10 11:19:38.659065403 +0000 UTC m=+5727.415236221" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.670468 4794 scope.go:117] "RemoveContainer" containerID="130a3de147a7e7545561c7fc8bf64d08fa7458da93bcbf14a7d765d68e6e1f56" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.692467 4794 scope.go:117] "RemoveContainer" containerID="130a3de147a7e7545561c7fc8bf64d08fa7458da93bcbf14a7d765d68e6e1f56" Mar 10 11:19:38 crc kubenswrapper[4794]: E0310 11:19:38.693019 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"130a3de147a7e7545561c7fc8bf64d08fa7458da93bcbf14a7d765d68e6e1f56\": container with ID starting with 130a3de147a7e7545561c7fc8bf64d08fa7458da93bcbf14a7d765d68e6e1f56 not found: ID does not exist" containerID="130a3de147a7e7545561c7fc8bf64d08fa7458da93bcbf14a7d765d68e6e1f56" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.693071 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"130a3de147a7e7545561c7fc8bf64d08fa7458da93bcbf14a7d765d68e6e1f56"} err="failed to get container status \"130a3de147a7e7545561c7fc8bf64d08fa7458da93bcbf14a7d765d68e6e1f56\": rpc error: code = NotFound desc = could not find container \"130a3de147a7e7545561c7fc8bf64d08fa7458da93bcbf14a7d765d68e6e1f56\": container with ID starting with 130a3de147a7e7545561c7fc8bf64d08fa7458da93bcbf14a7d765d68e6e1f56 not found: ID does not exist" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.693126 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.704073 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.712601 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.722605 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.726468 4794 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-conductor-0"] Mar 10 11:19:38 crc kubenswrapper[4794]: E0310 11:19:38.726874 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3cc18ff-b557-4d49-8580-733877f288a5" containerName="nova-metadata-metadata" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.726893 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3cc18ff-b557-4d49-8580-733877f288a5" containerName="nova-metadata-metadata" Mar 10 11:19:38 crc kubenswrapper[4794]: E0310 11:19:38.726909 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3cc18ff-b557-4d49-8580-733877f288a5" containerName="nova-metadata-log" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.726917 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3cc18ff-b557-4d49-8580-733877f288a5" containerName="nova-metadata-log" Mar 10 11:19:38 crc kubenswrapper[4794]: E0310 11:19:38.726947 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d4f681-c12f-43fa-8ec0-95e90bff92a8" containerName="nova-cell1-conductor-conductor" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.726954 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d4f681-c12f-43fa-8ec0-95e90bff92a8" containerName="nova-cell1-conductor-conductor" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.727105 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3cc18ff-b557-4d49-8580-733877f288a5" containerName="nova-metadata-metadata" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.727116 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3cc18ff-b557-4d49-8580-733877f288a5" containerName="nova-metadata-log" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.727135 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d4f681-c12f-43fa-8ec0-95e90bff92a8" containerName="nova-cell1-conductor-conductor" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.729442 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.731758 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.760958 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.782822 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.784255 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.786642 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.806432 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.830499 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrvvd\" (UniqueName: \"kubernetes.io/projected/73f45dee-40e0-4370-9ab8-de6d2998fa6b-kube-api-access-nrvvd\") pod \"nova-cell1-conductor-0\" (UID: \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\") " pod="openstack/nova-cell1-conductor-0" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.830618 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f45dee-40e0-4370-9ab8-de6d2998fa6b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\") " pod="openstack/nova-cell1-conductor-0" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.830639 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f45dee-40e0-4370-9ab8-de6d2998fa6b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\") " pod="openstack/nova-cell1-conductor-0" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.931590 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832f4f9e-beca-4825-b367-2efa49512dd8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " pod="openstack/nova-metadata-0" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.931636 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832f4f9e-beca-4825-b367-2efa49512dd8-config-data\") pod \"nova-metadata-0\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " pod="openstack/nova-metadata-0" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.931683 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrvvd\" (UniqueName: \"kubernetes.io/projected/73f45dee-40e0-4370-9ab8-de6d2998fa6b-kube-api-access-nrvvd\") pod \"nova-cell1-conductor-0\" (UID: \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\") " pod="openstack/nova-cell1-conductor-0" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.932020 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/832f4f9e-beca-4825-b367-2efa49512dd8-logs\") pod \"nova-metadata-0\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " pod="openstack/nova-metadata-0" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.932068 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snms8\" (UniqueName: \"kubernetes.io/projected/832f4f9e-beca-4825-b367-2efa49512dd8-kube-api-access-snms8\") pod \"nova-metadata-0\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " pod="openstack/nova-metadata-0" Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.932202 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f45dee-40e0-4370-9ab8-de6d2998fa6b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.932221 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f45dee-40e0-4370-9ab8-de6d2998fa6b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.936469 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f45dee-40e0-4370-9ab8-de6d2998fa6b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.936493 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f45dee-40e0-4370-9ab8-de6d2998fa6b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 11:19:38 crc kubenswrapper[4794]: I0310 11:19:38.969986 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrvvd\" (UniqueName: \"kubernetes.io/projected/73f45dee-40e0-4370-9ab8-de6d2998fa6b-kube-api-access-nrvvd\") pod \"nova-cell1-conductor-0\" (UID: \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\") " pod="openstack/nova-cell1-conductor-0"
Mar 10 11:19:39 crc kubenswrapper[4794]: I0310 11:19:39.033603 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832f4f9e-beca-4825-b367-2efa49512dd8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " pod="openstack/nova-metadata-0"
Mar 10 11:19:39 crc kubenswrapper[4794]: I0310 11:19:39.034019 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832f4f9e-beca-4825-b367-2efa49512dd8-config-data\") pod \"nova-metadata-0\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " pod="openstack/nova-metadata-0"
Mar 10 11:19:39 crc kubenswrapper[4794]: I0310 11:19:39.034231 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/832f4f9e-beca-4825-b367-2efa49512dd8-logs\") pod \"nova-metadata-0\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " pod="openstack/nova-metadata-0"
Mar 10 11:19:39 crc kubenswrapper[4794]: I0310 11:19:39.034268 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snms8\" (UniqueName: \"kubernetes.io/projected/832f4f9e-beca-4825-b367-2efa49512dd8-kube-api-access-snms8\") pod \"nova-metadata-0\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " pod="openstack/nova-metadata-0"
Mar 10 11:19:39 crc kubenswrapper[4794]: I0310 11:19:39.035561 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/832f4f9e-beca-4825-b367-2efa49512dd8-logs\") pod \"nova-metadata-0\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " pod="openstack/nova-metadata-0"
Mar 10 11:19:39 crc kubenswrapper[4794]: I0310 11:19:39.036737 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832f4f9e-beca-4825-b367-2efa49512dd8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " pod="openstack/nova-metadata-0"
Mar 10 11:19:39 crc kubenswrapper[4794]: I0310 11:19:39.037056 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832f4f9e-beca-4825-b367-2efa49512dd8-config-data\") pod \"nova-metadata-0\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " pod="openstack/nova-metadata-0"
Mar 10 11:19:39 crc kubenswrapper[4794]: I0310 11:19:39.050195 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snms8\" (UniqueName: \"kubernetes.io/projected/832f4f9e-beca-4825-b367-2efa49512dd8-kube-api-access-snms8\") pod \"nova-metadata-0\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " pod="openstack/nova-metadata-0"
Mar 10 11:19:39 crc kubenswrapper[4794]: I0310 11:19:39.096976 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 10 11:19:39 crc kubenswrapper[4794]: I0310 11:19:39.108142 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 11:19:39 crc kubenswrapper[4794]: I0310 11:19:39.699245 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 10 11:19:39 crc kubenswrapper[4794]: W0310 11:19:39.704553 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73f45dee_40e0_4370_9ab8_de6d2998fa6b.slice/crio-c45fc6c8bd143d5dcdb58df63e77e2b0bb63c118016aa00e3758a63edd126e2b WatchSource:0}: Error finding container c45fc6c8bd143d5dcdb58df63e77e2b0bb63c118016aa00e3758a63edd126e2b: Status 404 returned error can't find the container with id c45fc6c8bd143d5dcdb58df63e77e2b0bb63c118016aa00e3758a63edd126e2b
Mar 10 11:19:39 crc kubenswrapper[4794]: I0310 11:19:39.836427 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.039596 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69d4f681-c12f-43fa-8ec0-95e90bff92a8" path="/var/lib/kubelet/pods/69d4f681-c12f-43fa-8ec0-95e90bff92a8/volumes"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.040400 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3cc18ff-b557-4d49-8580-733877f288a5" path="/var/lib/kubelet/pods/e3cc18ff-b557-4d49-8580-733877f288a5/volumes"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.137167 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.262303 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b88005f-6aaa-488f-90d5-b789ccced7ec-config-data\") pod \"8b88005f-6aaa-488f-90d5-b789ccced7ec\" (UID: \"8b88005f-6aaa-488f-90d5-b789ccced7ec\") "
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.262402 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghbnz\" (UniqueName: \"kubernetes.io/projected/8b88005f-6aaa-488f-90d5-b789ccced7ec-kube-api-access-ghbnz\") pod \"8b88005f-6aaa-488f-90d5-b789ccced7ec\" (UID: \"8b88005f-6aaa-488f-90d5-b789ccced7ec\") "
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.262470 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b88005f-6aaa-488f-90d5-b789ccced7ec-combined-ca-bundle\") pod \"8b88005f-6aaa-488f-90d5-b789ccced7ec\" (UID: \"8b88005f-6aaa-488f-90d5-b789ccced7ec\") "
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.266830 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b88005f-6aaa-488f-90d5-b789ccced7ec-kube-api-access-ghbnz" (OuterVolumeSpecName: "kube-api-access-ghbnz") pod "8b88005f-6aaa-488f-90d5-b789ccced7ec" (UID: "8b88005f-6aaa-488f-90d5-b789ccced7ec"). InnerVolumeSpecName "kube-api-access-ghbnz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.286644 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b88005f-6aaa-488f-90d5-b789ccced7ec-config-data" (OuterVolumeSpecName: "config-data") pod "8b88005f-6aaa-488f-90d5-b789ccced7ec" (UID: "8b88005f-6aaa-488f-90d5-b789ccced7ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.288314 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b88005f-6aaa-488f-90d5-b789ccced7ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b88005f-6aaa-488f-90d5-b789ccced7ec" (UID: "8b88005f-6aaa-488f-90d5-b789ccced7ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.365706 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b88005f-6aaa-488f-90d5-b789ccced7ec-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.366229 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghbnz\" (UniqueName: \"kubernetes.io/projected/8b88005f-6aaa-488f-90d5-b789ccced7ec-kube-api-access-ghbnz\") on node \"crc\" DevicePath \"\""
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.366246 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b88005f-6aaa-488f-90d5-b789ccced7ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.664615 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"832f4f9e-beca-4825-b367-2efa49512dd8","Type":"ContainerStarted","Data":"57a909fe0d6b139cf0f65ae6c67a2896d2054a7bf59f1fd5b5e1cc0c0b0edb86"}
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.664680 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"832f4f9e-beca-4825-b367-2efa49512dd8","Type":"ContainerStarted","Data":"5674e7f057fe6006193e20972cb4ee463a2d2dfa5116786d28931c2e355a90b0"}
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.664695 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"832f4f9e-beca-4825-b367-2efa49512dd8","Type":"ContainerStarted","Data":"aadb6a1fdfeafa2115209c6473d0ccae06a2cebf82d15173b704e8be12f48d2e"}
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.677777 4794 generic.go:334] "Generic (PLEG): container finished" podID="8b88005f-6aaa-488f-90d5-b789ccced7ec" containerID="773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861" exitCode=0
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.677877 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8b88005f-6aaa-488f-90d5-b789ccced7ec","Type":"ContainerDied","Data":"773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861"}
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.677903 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8b88005f-6aaa-488f-90d5-b789ccced7ec","Type":"ContainerDied","Data":"1ad96500eb6db7b567ab2a2b39bd91f964776aab1226b92b557ed81cf2d6d728"}
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.677925 4794 scope.go:117] "RemoveContainer" containerID="773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.678104 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.706228 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"73f45dee-40e0-4370-9ab8-de6d2998fa6b","Type":"ContainerStarted","Data":"674bd38098e7a6e2fa404f4cfd40b1373d5737d8e1b2ddb4f428c5bbd611fb3a"}
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.706289 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"73f45dee-40e0-4370-9ab8-de6d2998fa6b","Type":"ContainerStarted","Data":"c45fc6c8bd143d5dcdb58df63e77e2b0bb63c118016aa00e3758a63edd126e2b"}
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.707577 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.726845 4794 scope.go:117] "RemoveContainer" containerID="773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.727174 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.727152916 podStartE2EDuration="2.727152916s" podCreationTimestamp="2026-03-10 11:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:19:40.68870229 +0000 UTC m=+5729.444873118" watchObservedRunningTime="2026-03-10 11:19:40.727152916 +0000 UTC m=+5729.483323734"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.728634 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 10 11:19:40 crc kubenswrapper[4794]: E0310 11:19:40.733437 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861\": container with ID starting with 773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861 not found: ID does not exist" containerID="773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.733484 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861"} err="failed to get container status \"773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861\": rpc error: code = NotFound desc = could not find container \"773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861\": container with ID starting with 773ca7b14734c179e1a0443c6ccd3c922e4dcbfb06c2b67fcd8db43590f52861 not found: ID does not exist"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.737738 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.751380 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 10 11:19:40 crc kubenswrapper[4794]: E0310 11:19:40.752063 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b88005f-6aaa-488f-90d5-b789ccced7ec" containerName="nova-cell0-conductor-conductor"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.752096 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b88005f-6aaa-488f-90d5-b789ccced7ec" containerName="nova-cell0-conductor-conductor"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.752410 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b88005f-6aaa-488f-90d5-b789ccced7ec" containerName="nova-cell0-conductor-conductor"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.753221 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.757640 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.766866 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.771107 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.771089144 podStartE2EDuration="2.771089144s" podCreationTimestamp="2026-03-10 11:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:19:40.758083139 +0000 UTC m=+5729.514253957" watchObservedRunningTime="2026-03-10 11:19:40.771089144 +0000 UTC m=+5729.527259962"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.884768 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00faf545-0a0c-474d-9288-169d52a10e12-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"00faf545-0a0c-474d-9288-169d52a10e12\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.884850 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgvd8\" (UniqueName: \"kubernetes.io/projected/00faf545-0a0c-474d-9288-169d52a10e12-kube-api-access-wgvd8\") pod \"nova-cell0-conductor-0\" (UID: \"00faf545-0a0c-474d-9288-169d52a10e12\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.884943 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00faf545-0a0c-474d-9288-169d52a10e12-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"00faf545-0a0c-474d-9288-169d52a10e12\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.986860 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00faf545-0a0c-474d-9288-169d52a10e12-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"00faf545-0a0c-474d-9288-169d52a10e12\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.987147 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgvd8\" (UniqueName: \"kubernetes.io/projected/00faf545-0a0c-474d-9288-169d52a10e12-kube-api-access-wgvd8\") pod \"nova-cell0-conductor-0\" (UID: \"00faf545-0a0c-474d-9288-169d52a10e12\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.987345 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00faf545-0a0c-474d-9288-169d52a10e12-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"00faf545-0a0c-474d-9288-169d52a10e12\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:40 crc kubenswrapper[4794]: I0310 11:19:40.990653 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00faf545-0a0c-474d-9288-169d52a10e12-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"00faf545-0a0c-474d-9288-169d52a10e12\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:41 crc kubenswrapper[4794]: I0310 11:19:41.004051 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00faf545-0a0c-474d-9288-169d52a10e12-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"00faf545-0a0c-474d-9288-169d52a10e12\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:41 crc kubenswrapper[4794]: I0310 11:19:41.004859 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgvd8\" (UniqueName: \"kubernetes.io/projected/00faf545-0a0c-474d-9288-169d52a10e12-kube-api-access-wgvd8\") pod \"nova-cell0-conductor-0\" (UID: \"00faf545-0a0c-474d-9288-169d52a10e12\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:41 crc kubenswrapper[4794]: I0310 11:19:41.013673 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 11:19:41 crc kubenswrapper[4794]: I0310 11:19:41.032812 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 10 11:19:41 crc kubenswrapper[4794]: I0310 11:19:41.075343 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:41 crc kubenswrapper[4794]: W0310 11:19:41.563457 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00faf545_0a0c_474d_9288_169d52a10e12.slice/crio-0880a405e190d356b0fc717784e9f23ac3cfc2486d675dca08319753b31d50f3 WatchSource:0}: Error finding container 0880a405e190d356b0fc717784e9f23ac3cfc2486d675dca08319753b31d50f3: Status 404 returned error can't find the container with id 0880a405e190d356b0fc717784e9f23ac3cfc2486d675dca08319753b31d50f3
Mar 10 11:19:41 crc kubenswrapper[4794]: I0310 11:19:41.567749 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 10 11:19:41 crc kubenswrapper[4794]: I0310 11:19:41.722304 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"00faf545-0a0c-474d-9288-169d52a10e12","Type":"ContainerStarted","Data":"0880a405e190d356b0fc717784e9f23ac3cfc2486d675dca08319753b31d50f3"}
Mar 10 11:19:42 crc kubenswrapper[4794]: I0310 11:19:42.014201 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b88005f-6aaa-488f-90d5-b789ccced7ec" path="/var/lib/kubelet/pods/8b88005f-6aaa-488f-90d5-b789ccced7ec/volumes"
Mar 10 11:19:42 crc kubenswrapper[4794]: E0310 11:19:42.290060 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8b55040_929c_4088_9d23_532663500a6b.slice/crio-b2cb7f101e9420c2c74f37b8147714cf3bf87346a62942c8508e4ef2d4c02d77\": RecentStats: unable to find data in memory cache]"
Mar 10 11:19:42 crc kubenswrapper[4794]: I0310 11:19:42.737527 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"00faf545-0a0c-474d-9288-169d52a10e12","Type":"ContainerStarted","Data":"1e8e6a6f4333691a39f04023fe1e4516264545d01b2c123a1d46f75f22f60eab"}
Mar 10 11:19:42 crc kubenswrapper[4794]: I0310 11:19:42.770005 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.769981495 podStartE2EDuration="2.769981495s" podCreationTimestamp="2026-03-10 11:19:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:19:42.766851268 +0000 UTC m=+5731.523022106" watchObservedRunningTime="2026-03-10 11:19:42.769981495 +0000 UTC m=+5731.526152323"
Mar 10 11:19:43 crc kubenswrapper[4794]: I0310 11:19:43.748925 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:44 crc kubenswrapper[4794]: I0310 11:19:44.108956 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 10 11:19:44 crc kubenswrapper[4794]: I0310 11:19:44.108998 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 10 11:19:46 crc kubenswrapper[4794]: I0310 11:19:46.019150 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 11:19:46 crc kubenswrapper[4794]: I0310 11:19:46.023627 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 11:19:46 crc kubenswrapper[4794]: I0310 11:19:46.032278 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 10 11:19:46 crc kubenswrapper[4794]: I0310 11:19:46.093406 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 10 11:19:46 crc kubenswrapper[4794]: I0310 11:19:46.138065 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 10 11:19:46 crc kubenswrapper[4794]: I0310 11:19:46.795210 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 11:19:46 crc kubenswrapper[4794]: I0310 11:19:46.815730 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 10 11:19:47 crc kubenswrapper[4794]: I0310 11:19:47.254885 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 10 11:19:47 crc kubenswrapper[4794]: I0310 11:19:47.255299 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 10 11:19:48 crc kubenswrapper[4794]: I0310 11:19:48.339588 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="abb3b031-7a99-4128-ae47-c28a091f5ee3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.121:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 11:19:48 crc kubenswrapper[4794]: I0310 11:19:48.339591 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="abb3b031-7a99-4128-ae47-c28a091f5ee3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.121:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 11:19:49 crc kubenswrapper[4794]: I0310 11:19:49.108769 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 10 11:19:49 crc kubenswrapper[4794]: I0310 11:19:49.109062 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 10 11:19:49 crc kubenswrapper[4794]: I0310 11:19:49.121673 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 10 11:19:50 crc kubenswrapper[4794]: I0310 11:19:50.150616 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="832f4f9e-beca-4825-b367-2efa49512dd8" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.123:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 11:19:50 crc kubenswrapper[4794]: I0310 11:19:50.191655 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="832f4f9e-beca-4825-b367-2efa49512dd8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.123:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.829382 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.831623 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.834036 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.840224 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.896418 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-config-data\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.896506 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl96p\" (UniqueName: \"kubernetes.io/projected/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-kube-api-access-rl96p\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.896574 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.896627 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.896658 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.896678 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-scripts\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.997916 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-config-data\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.998011 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl96p\" (UniqueName: \"kubernetes.io/projected/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-kube-api-access-rl96p\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.998085 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.998147 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.998183 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.998208 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-scripts\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:51 crc kubenswrapper[4794]: I0310 11:19:51.998902 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:52 crc kubenswrapper[4794]: I0310 11:19:52.004977 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-scripts\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:52 crc kubenswrapper[4794]: I0310 11:19:52.010156 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-config-data\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:52 crc kubenswrapper[4794]: I0310 11:19:52.010380 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:52 crc kubenswrapper[4794]: I0310 11:19:52.013667 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:52 crc kubenswrapper[4794]: I0310 11:19:52.016914 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl96p\" (UniqueName: \"kubernetes.io/projected/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-kube-api-access-rl96p\") pod \"cinder-scheduler-0\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " pod="openstack/cinder-scheduler-0"
Mar 10 11:19:52 crc kubenswrapper[4794]: I0310 11:19:52.165480 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 10 11:19:52 crc kubenswrapper[4794]: E0310 11:19:52.521071 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8b55040_929c_4088_9d23_532663500a6b.slice/crio-b2cb7f101e9420c2c74f37b8147714cf3bf87346a62942c8508e4ef2d4c02d77\": RecentStats: unable to find data in memory cache]"
Mar 10 11:19:52 crc kubenswrapper[4794]: I0310 11:19:52.682633 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 11:19:52 crc kubenswrapper[4794]: I0310 11:19:52.844001 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f9415088-1f39-4a30-b6e2-1f0421d1dd9f","Type":"ContainerStarted","Data":"f09f4a5c852a74f058a7cbcc2da3d5e37aa37f21319ec92df53de80dba45b1c3"}
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.448984 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.449665 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ce2424b1-763a-4b70-9f10-2503c5d49d7c" containerName="cinder-api" containerID="cri-o://fd559e1689f963d616a0e201e644628df3259c2127ce50665eb673cf4ada6a8c" gracePeriod=30
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.449834 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ce2424b1-763a-4b70-9f10-2503c5d49d7c" containerName="cinder-api-log" containerID="cri-o://adb37a2d74235ae1c882bd2d614dea306392e358ea6ea989e227f6585a82d41a" gracePeriod=30
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.853879 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f9415088-1f39-4a30-b6e2-1f0421d1dd9f","Type":"ContainerStarted","Data":"8223a7484a0c224e84cd0f81a8a2a7f0e59bfc770ed59b29bee05e0336459770"}
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.856295 4794 generic.go:334] "Generic (PLEG): container finished" podID="ce2424b1-763a-4b70-9f10-2503c5d49d7c" containerID="adb37a2d74235ae1c882bd2d614dea306392e358ea6ea989e227f6585a82d41a" exitCode=143
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.856368 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce2424b1-763a-4b70-9f10-2503c5d49d7c","Type":"ContainerDied","Data":"adb37a2d74235ae1c882bd2d614dea306392e358ea6ea989e227f6585a82d41a"}
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.877457 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"]
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.878812 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.884505 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.892528 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.946640 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.946696 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.946718 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.946943 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.947069 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.947115 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.947200 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-sys\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.947240 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5dg6\" (UniqueName: \"kubernetes.io/projected/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-kube-api-access-n5dg6\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.947276 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-dev\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.947294 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-run\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.947314 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.947377 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.947422 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.947439 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.947495 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:53 crc kubenswrapper[4794]: I0310 11:19:53.947530 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.050244 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.053429 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.053511 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.053572 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.053645 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.053711 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.053750 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.053837 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.053922 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.053963 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.054042 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-sys\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.054083 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5dg6\" (UniqueName: \"kubernetes.io/projected/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-kube-api-access-n5dg6\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.054117 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-run\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.054139 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-dev\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.054162 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.054194 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.054326 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.050642 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.054594 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.067621 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.067671 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.067698 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.067813 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.068188 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-dev\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.068307 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-run\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.068481 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-sys\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.077813 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.083469 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.083981 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.085897 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.088698 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.114909 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5dg6\" (UniqueName: \"kubernetes.io/projected/8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b-kube-api-access-n5dg6\") pod \"cinder-volume-volume1-0\" (UID: \"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.206641 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.643907 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"]
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.645963 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.649070 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.653450 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.775594 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07c8f180-62af-49c1-8e1d-3fec16164fee-scripts\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.775662 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07c8f180-62af-49c1-8e1d-3fec16164fee-config-data-custom\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.775708 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.775728 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-run\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.775743 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-sys\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.775759 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-lib-modules\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.775776 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/07c8f180-62af-49c1-8e1d-3fec16164fee-ceph\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.775791 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c8f180-62af-49c1-8e1d-3fec16164fee-config-data\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.775810 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-etc-nvme\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.775844 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.775911 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.776120 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.776167 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvkql\" (UniqueName: \"kubernetes.io/projected/07c8f180-62af-49c1-8e1d-3fec16164fee-kube-api-access-vvkql\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.776197 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c8f180-62af-49c1-8e1d-3fec16164fee-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.776263 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.776296 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-dev\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.867285 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f9415088-1f39-4a30-b6e2-1f0421d1dd9f","Type":"ContainerStarted","Data":"f6b5ae7b564ee3eb0e9d6fcbfce48d6c75d9bcf011234b05cf4926acb5f8a13d"}
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878123 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-run\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878362 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-sys\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878292 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-run\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878480 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-sys\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878435 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-lib-modules\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878550 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/07c8f180-62af-49c1-8e1d-3fec16164fee-ceph\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878575 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c8f180-62af-49c1-8e1d-3fec16164fee-config-data\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878607 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-etc-nvme\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878674 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878706 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878770 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-lib-modules\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0"
Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878828 4794 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878870 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvkql\" (UniqueName: \"kubernetes.io/projected/07c8f180-62af-49c1-8e1d-3fec16164fee-kube-api-access-vvkql\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878901 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c8f180-62af-49c1-8e1d-3fec16164fee-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878968 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878997 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-dev\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.879031 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07c8f180-62af-49c1-8e1d-3fec16164fee-scripts\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.879119 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07c8f180-62af-49c1-8e1d-3fec16164fee-config-data-custom\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.879204 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.879203 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-etc-nvme\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.879276 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-dev\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.879310 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.879405 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.879426 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.879757 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.878976 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/07c8f180-62af-49c1-8e1d-3fec16164fee-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.887563 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c8f180-62af-49c1-8e1d-3fec16164fee-config-data\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.887897 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c8f180-62af-49c1-8e1d-3fec16164fee-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.888495 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07c8f180-62af-49c1-8e1d-3fec16164fee-config-data-custom\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.893152 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/07c8f180-62af-49c1-8e1d-3fec16164fee-ceph\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.893344 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.893302135 podStartE2EDuration="3.893302135s" podCreationTimestamp="2026-03-10 11:19:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:19:54.892989276 +0000 UTC m=+5743.649160094" 
watchObservedRunningTime="2026-03-10 11:19:54.893302135 +0000 UTC m=+5743.649472953" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.897731 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07c8f180-62af-49c1-8e1d-3fec16164fee-scripts\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.898777 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvkql\" (UniqueName: \"kubernetes.io/projected/07c8f180-62af-49c1-8e1d-3fec16164fee-kube-api-access-vvkql\") pod \"cinder-backup-0\" (UID: \"07c8f180-62af-49c1-8e1d-3fec16164fee\") " pod="openstack/cinder-backup-0" Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.912447 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 10 11:19:54 crc kubenswrapper[4794]: I0310 11:19:54.962155 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 10 11:19:55 crc kubenswrapper[4794]: I0310 11:19:55.481179 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 10 11:19:55 crc kubenswrapper[4794]: I0310 11:19:55.880650 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b","Type":"ContainerStarted","Data":"9f675e1a17217255b66be6b048162f52a0168f86036ddf9561e2e421ec25ef63"} Mar 10 11:19:55 crc kubenswrapper[4794]: I0310 11:19:55.882916 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"07c8f180-62af-49c1-8e1d-3fec16164fee","Type":"ContainerStarted","Data":"9e3ca4ed2893fb4c00db2fc7bd210f23dc22bbca58b12701f00c70ba940457a8"} Mar 10 11:19:56 crc kubenswrapper[4794]: I0310 11:19:56.608689 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="ce2424b1-763a-4b70-9f10-2503c5d49d7c" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.118:8776/healthcheck\": read tcp 10.217.0.2:55358->10.217.1.118:8776: read: connection reset by peer" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.006577 4794 generic.go:334] "Generic (PLEG): container finished" podID="ce2424b1-763a-4b70-9f10-2503c5d49d7c" containerID="fd559e1689f963d616a0e201e644628df3259c2127ce50665eb673cf4ada6a8c" exitCode=0 Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.006876 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce2424b1-763a-4b70-9f10-2503c5d49d7c","Type":"ContainerDied","Data":"fd559e1689f963d616a0e201e644628df3259c2127ce50665eb673cf4ada6a8c"} Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.046512 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b","Type":"ContainerStarted","Data":"c22f5c1a53228d574789aca8152419f53404fcacbbfbfdf7161b6bba0bd2d71e"} Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.046560 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b","Type":"ContainerStarted","Data":"6eecc3d9fde68b068182774387f6d13cb839140c7c81c19092495c8f1006b75d"} Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.113866 4794 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.382901268 podStartE2EDuration="4.113852154s" podCreationTimestamp="2026-03-10 11:19:53 +0000 UTC" firstStartedPulling="2026-03-10 11:19:54.935768927 +0000 UTC m=+5743.691939745" lastFinishedPulling="2026-03-10 11:19:55.666719813 +0000 UTC m=+5744.422890631" observedRunningTime="2026-03-10 11:19:57.110772108 +0000 UTC m=+5745.866942926" watchObservedRunningTime="2026-03-10 11:19:57.113852154 +0000 UTC m=+5745.870022972" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.166060 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.260735 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.261995 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.262217 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.266123 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.365071 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.464109 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2424b1-763a-4b70-9f10-2503c5d49d7c-logs\") pod \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.464162 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-289qw\" (UniqueName: \"kubernetes.io/projected/ce2424b1-763a-4b70-9f10-2503c5d49d7c-kube-api-access-289qw\") pod \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.464198 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-config-data\") pod \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.464274 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-scripts\") pod \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.464310 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-combined-ca-bundle\") pod \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.464372 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-config-data-custom\") pod \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\" 
(UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.464453 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce2424b1-763a-4b70-9f10-2503c5d49d7c-etc-machine-id\") pod \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\" (UID: \"ce2424b1-763a-4b70-9f10-2503c5d49d7c\") " Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.464804 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce2424b1-763a-4b70-9f10-2503c5d49d7c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ce2424b1-763a-4b70-9f10-2503c5d49d7c" (UID: "ce2424b1-763a-4b70-9f10-2503c5d49d7c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.466237 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce2424b1-763a-4b70-9f10-2503c5d49d7c-logs" (OuterVolumeSpecName: "logs") pod "ce2424b1-763a-4b70-9f10-2503c5d49d7c" (UID: "ce2424b1-763a-4b70-9f10-2503c5d49d7c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.471556 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-scripts" (OuterVolumeSpecName: "scripts") pod "ce2424b1-763a-4b70-9f10-2503c5d49d7c" (UID: "ce2424b1-763a-4b70-9f10-2503c5d49d7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.473982 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ce2424b1-763a-4b70-9f10-2503c5d49d7c" (UID: "ce2424b1-763a-4b70-9f10-2503c5d49d7c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.480053 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce2424b1-763a-4b70-9f10-2503c5d49d7c-kube-api-access-289qw" (OuterVolumeSpecName: "kube-api-access-289qw") pod "ce2424b1-763a-4b70-9f10-2503c5d49d7c" (UID: "ce2424b1-763a-4b70-9f10-2503c5d49d7c"). InnerVolumeSpecName "kube-api-access-289qw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.525828 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-config-data" (OuterVolumeSpecName: "config-data") pod "ce2424b1-763a-4b70-9f10-2503c5d49d7c" (UID: "ce2424b1-763a-4b70-9f10-2503c5d49d7c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.526965 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce2424b1-763a-4b70-9f10-2503c5d49d7c" (UID: "ce2424b1-763a-4b70-9f10-2503c5d49d7c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.565815 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.566002 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.566014 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.566023 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce2424b1-763a-4b70-9f10-2503c5d49d7c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.566058 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2424b1-763a-4b70-9f10-2503c5d49d7c-logs\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.566067 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-289qw\" (UniqueName: \"kubernetes.io/projected/ce2424b1-763a-4b70-9f10-2503c5d49d7c-kube-api-access-289qw\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:57 crc kubenswrapper[4794]: I0310 11:19:57.566076 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2424b1-763a-4b70-9f10-2503c5d49d7c-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.057625 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce2424b1-763a-4b70-9f10-2503c5d49d7c","Type":"ContainerDied","Data":"0f03c6c9fc5d732739dff4abe5c0a317cd8bd7c79c4f6d0c8fe0834ffa2d0968"} Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.057975 4794 scope.go:117] "RemoveContainer" containerID="fd559e1689f963d616a0e201e644628df3259c2127ce50665eb673cf4ada6a8c" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.057682 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.061736 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"07c8f180-62af-49c1-8e1d-3fec16164fee","Type":"ContainerStarted","Data":"483c78f112f4f6d2b3c29b911a6cc5b65105c0cb143523b27f65a55984f1a89f"} Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.061773 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"07c8f180-62af-49c1-8e1d-3fec16164fee","Type":"ContainerStarted","Data":"a11a2dd73098f25c640c3c0febaea2351941f5e9fd5fa371ddce6c02f855aace"} Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.062306 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.069698 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.092706 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.994877793 podStartE2EDuration="4.092684854s" podCreationTimestamp="2026-03-10 11:19:54 +0000 UTC" firstStartedPulling="2026-03-10 11:19:55.493504703 +0000 UTC m=+5744.249675521" lastFinishedPulling="2026-03-10 11:19:56.591311724 +0000 UTC m=+5745.347482582" observedRunningTime="2026-03-10 11:19:58.086187151 +0000 UTC m=+5746.842357979" watchObservedRunningTime="2026-03-10 11:19:58.092684854 +0000 UTC m=+5746.848855682" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.095982 4794 scope.go:117] "RemoveContainer" containerID="adb37a2d74235ae1c882bd2d614dea306392e358ea6ea989e227f6585a82d41a" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.162153 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.178987 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.193733 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 11:19:58 crc kubenswrapper[4794]: E0310 11:19:58.194185 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce2424b1-763a-4b70-9f10-2503c5d49d7c" containerName="cinder-api" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.194203 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2424b1-763a-4b70-9f10-2503c5d49d7c" containerName="cinder-api" Mar 10 11:19:58 crc kubenswrapper[4794]: E0310 11:19:58.194217 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce2424b1-763a-4b70-9f10-2503c5d49d7c" containerName="cinder-api-log" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.194224 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2424b1-763a-4b70-9f10-2503c5d49d7c" containerName="cinder-api-log" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.194476 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce2424b1-763a-4b70-9f10-2503c5d49d7c" containerName="cinder-api" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.194496 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce2424b1-763a-4b70-9f10-2503c5d49d7c" containerName="cinder-api-log" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.195649 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.198818 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.218878 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.281638 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d52ca6f9-6457-431b-9b61-5af02167bb0c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.281728 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d52ca6f9-6457-431b-9b61-5af02167bb0c-config-data\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.281801 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d52ca6f9-6457-431b-9b61-5af02167bb0c-scripts\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.281833 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d52ca6f9-6457-431b-9b61-5af02167bb0c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.281858 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d52ca6f9-6457-431b-9b61-5af02167bb0c-config-data-custom\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.281881 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95grk\" (UniqueName: \"kubernetes.io/projected/d52ca6f9-6457-431b-9b61-5af02167bb0c-kube-api-access-95grk\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.281895 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d52ca6f9-6457-431b-9b61-5af02167bb0c-logs\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.383948 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d52ca6f9-6457-431b-9b61-5af02167bb0c-config-data\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.384117 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d52ca6f9-6457-431b-9b61-5af02167bb0c-scripts\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.384177 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d52ca6f9-6457-431b-9b61-5af02167bb0c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.384224 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d52ca6f9-6457-431b-9b61-5af02167bb0c-config-data-custom\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.384269 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95grk\" (UniqueName: \"kubernetes.io/projected/d52ca6f9-6457-431b-9b61-5af02167bb0c-kube-api-access-95grk\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.384300 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d52ca6f9-6457-431b-9b61-5af02167bb0c-logs\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.384384 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d52ca6f9-6457-431b-9b61-5af02167bb0c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.384487 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d52ca6f9-6457-431b-9b61-5af02167bb0c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.385059 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d52ca6f9-6457-431b-9b61-5af02167bb0c-logs\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.390164 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d52ca6f9-6457-431b-9b61-5af02167bb0c-scripts\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.397138 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d52ca6f9-6457-431b-9b61-5af02167bb0c-config-data\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.398390 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d52ca6f9-6457-431b-9b61-5af02167bb0c-config-data-custom\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.400213 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d52ca6f9-6457-431b-9b61-5af02167bb0c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.402594 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95grk\" (UniqueName: \"kubernetes.io/projected/d52ca6f9-6457-431b-9b61-5af02167bb0c-kube-api-access-95grk\") pod \"cinder-api-0\" (UID: \"d52ca6f9-6457-431b-9b61-5af02167bb0c\") " pod="openstack/cinder-api-0" Mar 10 11:19:58 crc kubenswrapper[4794]: I0310 11:19:58.525312 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 11:19:59 crc kubenswrapper[4794]: W0310 11:19:59.006187 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd52ca6f9_6457_431b_9b61_5af02167bb0c.slice/crio-381df8363f5db64d410ea0251bf576cd7468f9cb8a15297cb09f1257aa69b7d2 WatchSource:0}: Error finding container 381df8363f5db64d410ea0251bf576cd7468f9cb8a15297cb09f1257aa69b7d2: Status 404 returned error can't find the container with id 381df8363f5db64d410ea0251bf576cd7468f9cb8a15297cb09f1257aa69b7d2 Mar 10 11:19:59 crc kubenswrapper[4794]: I0310 11:19:59.018208 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 11:19:59 crc kubenswrapper[4794]: I0310 11:19:59.071638 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d52ca6f9-6457-431b-9b61-5af02167bb0c","Type":"ContainerStarted","Data":"381df8363f5db64d410ea0251bf576cd7468f9cb8a15297cb09f1257aa69b7d2"} Mar 10 11:19:59 crc kubenswrapper[4794]: I0310 11:19:59.111600 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 11:19:59 crc kubenswrapper[4794]: I0310 11:19:59.116895 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 11:19:59 crc kubenswrapper[4794]: I0310 11:19:59.116957 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 11:19:59 crc kubenswrapper[4794]: I0310 11:19:59.207489 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 10 11:19:59 crc kubenswrapper[4794]: I0310 11:19:59.963573 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 10 11:20:00 crc kubenswrapper[4794]: I0310 11:20:00.016897 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce2424b1-763a-4b70-9f10-2503c5d49d7c" path="/var/lib/kubelet/pods/ce2424b1-763a-4b70-9f10-2503c5d49d7c/volumes" Mar 10 11:20:00 crc kubenswrapper[4794]: I0310 11:20:00.087127 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d52ca6f9-6457-431b-9b61-5af02167bb0c","Type":"ContainerStarted","Data":"fbdd1c053d875a3bc6764c03e211eb185b44e579adf22fed3ee28fd7a2573209"} Mar 10 11:20:00 crc kubenswrapper[4794]: I0310 11:20:00.090058 4794 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 11:20:00 crc kubenswrapper[4794]: I0310 11:20:00.143010 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552360-dc7fk"] Mar 10 11:20:00 crc kubenswrapper[4794]: I0310 11:20:00.144175 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552360-dc7fk" Mar 10 11:20:00 crc kubenswrapper[4794]: I0310 11:20:00.160839 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:20:00 crc kubenswrapper[4794]: I0310 11:20:00.161120 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:20:00 crc kubenswrapper[4794]: I0310 11:20:00.161700 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:20:00 crc kubenswrapper[4794]: I0310 11:20:00.176482 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552360-dc7fk"] Mar 10 11:20:00 crc kubenswrapper[4794]: I0310 11:20:00.225721 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkktn\" (UniqueName: \"kubernetes.io/projected/1112032b-20d2-4c4b-8f2b-3c9712c8cdf8-kube-api-access-kkktn\") pod \"auto-csr-approver-29552360-dc7fk\" (UID: \"1112032b-20d2-4c4b-8f2b-3c9712c8cdf8\") " pod="openshift-infra/auto-csr-approver-29552360-dc7fk" Mar 10 11:20:00 crc kubenswrapper[4794]: I0310 11:20:00.328126 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkktn\" (UniqueName: \"kubernetes.io/projected/1112032b-20d2-4c4b-8f2b-3c9712c8cdf8-kube-api-access-kkktn\") pod \"auto-csr-approver-29552360-dc7fk\" (UID: \"1112032b-20d2-4c4b-8f2b-3c9712c8cdf8\") " pod="openshift-infra/auto-csr-approver-29552360-dc7fk" Mar 10 11:20:00 crc kubenswrapper[4794]: I0310 11:20:00.366352 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkktn\" (UniqueName: \"kubernetes.io/projected/1112032b-20d2-4c4b-8f2b-3c9712c8cdf8-kube-api-access-kkktn\") pod \"auto-csr-approver-29552360-dc7fk\" (UID: \"1112032b-20d2-4c4b-8f2b-3c9712c8cdf8\") " pod="openshift-infra/auto-csr-approver-29552360-dc7fk" Mar 10 11:20:00 crc kubenswrapper[4794]: I0310 11:20:00.484008 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552360-dc7fk" Mar 10 11:20:01 crc kubenswrapper[4794]: I0310 11:20:01.038461 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552360-dc7fk"] Mar 10 11:20:01 crc kubenswrapper[4794]: W0310 11:20:01.041021 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1112032b_20d2_4c4b_8f2b_3c9712c8cdf8.slice/crio-b3b0a17fa1f1819cb5defb53632dd6cb3641eadce76694dd5c93388e9cc9a7db WatchSource:0}: Error finding container b3b0a17fa1f1819cb5defb53632dd6cb3641eadce76694dd5c93388e9cc9a7db: Status 404 returned error can't find the container with id b3b0a17fa1f1819cb5defb53632dd6cb3641eadce76694dd5c93388e9cc9a7db Mar 10 11:20:01 crc kubenswrapper[4794]: I0310 11:20:01.099664 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552360-dc7fk" event={"ID":"1112032b-20d2-4c4b-8f2b-3c9712c8cdf8","Type":"ContainerStarted","Data":"b3b0a17fa1f1819cb5defb53632dd6cb3641eadce76694dd5c93388e9cc9a7db"} Mar 10 11:20:01 crc kubenswrapper[4794]: I0310 11:20:01.104200 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d52ca6f9-6457-431b-9b61-5af02167bb0c","Type":"ContainerStarted","Data":"24c3543b7e42580d9a072bdf20574371cfd8b4d38c8c4a9402ee2e564fab9bb2"} Mar 10 11:20:01 crc kubenswrapper[4794]: I0310 11:20:01.104414 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 11:20:01 crc kubenswrapper[4794]: I0310 11:20:01.127718 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.127691266 podStartE2EDuration="3.127691266s" podCreationTimestamp="2026-03-10 11:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:20:01.12654358 +0000 UTC m=+5749.882714418" watchObservedRunningTime="2026-03-10 11:20:01.127691266 +0000 UTC m=+5749.883862124" Mar 10 11:20:02 crc kubenswrapper[4794]: I0310 11:20:02.388144 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 10 11:20:02 crc kubenswrapper[4794]: I0310 11:20:02.465006 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 11:20:02 crc kubenswrapper[4794]: E0310 11:20:02.743499 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8b55040_929c_4088_9d23_532663500a6b.slice/crio-b2cb7f101e9420c2c74f37b8147714cf3bf87346a62942c8508e4ef2d4c02d77\": RecentStats: unable to find data in memory cache]" Mar 10 11:20:03 crc kubenswrapper[4794]: I0310 11:20:03.129392 4794 generic.go:334] "Generic (PLEG): container finished" podID="1112032b-20d2-4c4b-8f2b-3c9712c8cdf8" containerID="0726f8b9b4d8ca49671d811007c1986ff0185fb4312ebb37ee616f1a79a95fdf" exitCode=0 Mar 10 11:20:03 crc kubenswrapper[4794]: I0310 11:20:03.129779 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552360-dc7fk" event={"ID":"1112032b-20d2-4c4b-8f2b-3c9712c8cdf8","Type":"ContainerDied","Data":"0726f8b9b4d8ca49671d811007c1986ff0185fb4312ebb37ee616f1a79a95fdf"} Mar 10 11:20:03 crc kubenswrapper[4794]: I0310 11:20:03.131254 4794 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f9415088-1f39-4a30-b6e2-1f0421d1dd9f" containerName="probe" containerID="cri-o://f6b5ae7b564ee3eb0e9d6fcbfce48d6c75d9bcf011234b05cf4926acb5f8a13d" gracePeriod=30 Mar 10 11:20:03 crc kubenswrapper[4794]: I0310 11:20:03.131190 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f9415088-1f39-4a30-b6e2-1f0421d1dd9f" containerName="cinder-scheduler" containerID="cri-o://8223a7484a0c224e84cd0f81a8a2a7f0e59bfc770ed59b29bee05e0336459770" gracePeriod=30 Mar 10 11:20:04 crc kubenswrapper[4794]: I0310 11:20:04.148700 4794 generic.go:334] "Generic (PLEG): container finished" podID="f9415088-1f39-4a30-b6e2-1f0421d1dd9f" containerID="f6b5ae7b564ee3eb0e9d6fcbfce48d6c75d9bcf011234b05cf4926acb5f8a13d" exitCode=0 Mar 10 11:20:04 crc kubenswrapper[4794]: I0310 11:20:04.148805 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f9415088-1f39-4a30-b6e2-1f0421d1dd9f","Type":"ContainerDied","Data":"f6b5ae7b564ee3eb0e9d6fcbfce48d6c75d9bcf011234b05cf4926acb5f8a13d"} Mar 10 11:20:04 crc kubenswrapper[4794]: I0310 11:20:04.590757 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 10 11:20:04 crc kubenswrapper[4794]: I0310 11:20:04.636598 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552360-dc7fk" Mar 10 11:20:04 crc kubenswrapper[4794]: I0310 11:20:04.725754 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkktn\" (UniqueName: \"kubernetes.io/projected/1112032b-20d2-4c4b-8f2b-3c9712c8cdf8-kube-api-access-kkktn\") pod \"1112032b-20d2-4c4b-8f2b-3c9712c8cdf8\" (UID: \"1112032b-20d2-4c4b-8f2b-3c9712c8cdf8\") " Mar 10 11:20:04 crc kubenswrapper[4794]: I0310 11:20:04.737538 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1112032b-20d2-4c4b-8f2b-3c9712c8cdf8-kube-api-access-kkktn" (OuterVolumeSpecName: "kube-api-access-kkktn") pod "1112032b-20d2-4c4b-8f2b-3c9712c8cdf8" (UID: "1112032b-20d2-4c4b-8f2b-3c9712c8cdf8"). InnerVolumeSpecName "kube-api-access-kkktn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:20:04 crc kubenswrapper[4794]: I0310 11:20:04.830554 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkktn\" (UniqueName: \"kubernetes.io/projected/1112032b-20d2-4c4b-8f2b-3c9712c8cdf8-kube-api-access-kkktn\") on node \"crc\" DevicePath \"\"" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.157382 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552360-dc7fk" event={"ID":"1112032b-20d2-4c4b-8f2b-3c9712c8cdf8","Type":"ContainerDied","Data":"b3b0a17fa1f1819cb5defb53632dd6cb3641eadce76694dd5c93388e9cc9a7db"} Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.157431 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3b0a17fa1f1819cb5defb53632dd6cb3641eadce76694dd5c93388e9cc9a7db" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.157495 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552360-dc7fk" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.170805 4794 generic.go:334] "Generic (PLEG): container finished" podID="f9415088-1f39-4a30-b6e2-1f0421d1dd9f" containerID="8223a7484a0c224e84cd0f81a8a2a7f0e59bfc770ed59b29bee05e0336459770" exitCode=0 Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.170851 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f9415088-1f39-4a30-b6e2-1f0421d1dd9f","Type":"ContainerDied","Data":"8223a7484a0c224e84cd0f81a8a2a7f0e59bfc770ed59b29bee05e0336459770"} Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.170883 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f9415088-1f39-4a30-b6e2-1f0421d1dd9f","Type":"ContainerDied","Data":"f09f4a5c852a74f058a7cbcc2da3d5e37aa37f21319ec92df53de80dba45b1c3"} Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.170897 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f09f4a5c852a74f058a7cbcc2da3d5e37aa37f21319ec92df53de80dba45b1c3" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.208937 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.227675 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.338468 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-etc-machine-id\") pod \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.338569 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-scripts\") pod \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.338659 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f9415088-1f39-4a30-b6e2-1f0421d1dd9f" (UID: "f9415088-1f39-4a30-b6e2-1f0421d1dd9f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.338724 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl96p\" (UniqueName: \"kubernetes.io/projected/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-kube-api-access-rl96p\") pod \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.339676 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-config-data-custom\") pod \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.339724 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-combined-ca-bundle\") pod \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.339744 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-config-data\") pod \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\" (UID: \"f9415088-1f39-4a30-b6e2-1f0421d1dd9f\") " Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.340154 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.345178 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-kube-api-access-rl96p" (OuterVolumeSpecName: "kube-api-access-rl96p") pod "f9415088-1f39-4a30-b6e2-1f0421d1dd9f" (UID: "f9415088-1f39-4a30-b6e2-1f0421d1dd9f"). InnerVolumeSpecName "kube-api-access-rl96p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.345487 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f9415088-1f39-4a30-b6e2-1f0421d1dd9f" (UID: "f9415088-1f39-4a30-b6e2-1f0421d1dd9f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.356593 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-scripts" (OuterVolumeSpecName: "scripts") pod "f9415088-1f39-4a30-b6e2-1f0421d1dd9f" (UID: "f9415088-1f39-4a30-b6e2-1f0421d1dd9f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.416624 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9415088-1f39-4a30-b6e2-1f0421d1dd9f" (UID: "f9415088-1f39-4a30-b6e2-1f0421d1dd9f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.442587 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.442612 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.442643 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.442654 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl96p\" (UniqueName: \"kubernetes.io/projected/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-kube-api-access-rl96p\") on node \"crc\" DevicePath \"\"" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.461906 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-config-data" (OuterVolumeSpecName: "config-data") pod "f9415088-1f39-4a30-b6e2-1f0421d1dd9f" (UID: "f9415088-1f39-4a30-b6e2-1f0421d1dd9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.544532 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9415088-1f39-4a30-b6e2-1f0421d1dd9f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.733365 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552354-cxv5m"] Mar 10 11:20:05 crc kubenswrapper[4794]: I0310 11:20:05.740960 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552354-cxv5m"] Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.011440 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49befbd9-bbf3-477e-9490-83a2e7e8eac6" path="/var/lib/kubelet/pods/49befbd9-bbf3-477e-9490-83a2e7e8eac6/volumes" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.178550 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.220182 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.243894 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.259566 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 11:20:06 crc kubenswrapper[4794]: E0310 11:20:06.260000 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1112032b-20d2-4c4b-8f2b-3c9712c8cdf8" containerName="oc" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.260021 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1112032b-20d2-4c4b-8f2b-3c9712c8cdf8" containerName="oc" Mar 10 11:20:06 crc kubenswrapper[4794]: E0310 11:20:06.260043 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9415088-1f39-4a30-b6e2-1f0421d1dd9f" containerName="probe" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.260052 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9415088-1f39-4a30-b6e2-1f0421d1dd9f" containerName="probe" Mar 10 11:20:06 crc kubenswrapper[4794]: E0310 11:20:06.260080 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9415088-1f39-4a30-b6e2-1f0421d1dd9f" containerName="cinder-scheduler" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.260089 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9415088-1f39-4a30-b6e2-1f0421d1dd9f" containerName="cinder-scheduler" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.260298 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1112032b-20d2-4c4b-8f2b-3c9712c8cdf8" containerName="oc" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.260318 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9415088-1f39-4a30-b6e2-1f0421d1dd9f" containerName="probe" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.260361 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9415088-1f39-4a30-b6e2-1f0421d1dd9f" containerName="cinder-scheduler" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.261588 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.265140 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.271815 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.357974 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-scripts\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.358111 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-config-data\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.358173 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.358220 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.358452 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.358510 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhtz4\" (UniqueName: \"kubernetes.io/projected/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-kube-api-access-jhtz4\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.459983 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-config-data\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.460049 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.460086 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.460121 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.460139 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhtz4\" (UniqueName: \"kubernetes.io/projected/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-kube-api-access-jhtz4\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.460176 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-scripts\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.460250 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.466139 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.466420 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.477373 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-config-data\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.478965 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-scripts\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.482803 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhtz4\" (UniqueName: \"kubernetes.io/projected/3fa5babe-464d-4dd0-a3be-fb5a0adc54d0-kube-api-access-jhtz4\") pod \"cinder-scheduler-0\" (UID: \"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0\") " pod="openstack/cinder-scheduler-0" Mar 10 
Mar 10 11:20:06 crc kubenswrapper[4794]: I0310 11:20:06.592424 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 10 11:20:07 crc kubenswrapper[4794]: I0310 11:20:07.173414 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 11:20:07 crc kubenswrapper[4794]: I0310 11:20:07.188251 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0","Type":"ContainerStarted","Data":"d12bdfc70a5d2cc3ab9dc6f18e8719bea0d4b0896d9643623f20e91f62c8dd15"}
Mar 10 11:20:08 crc kubenswrapper[4794]: I0310 11:20:08.016026 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9415088-1f39-4a30-b6e2-1f0421d1dd9f" path="/var/lib/kubelet/pods/f9415088-1f39-4a30-b6e2-1f0421d1dd9f/volumes"
Mar 10 11:20:08 crc kubenswrapper[4794]: I0310 11:20:08.199589 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0","Type":"ContainerStarted","Data":"8a2fe056c1b525941149ab079eed984e23cffa43029825519c45d126c0692825"}
Mar 10 11:20:09 crc kubenswrapper[4794]: I0310 11:20:09.213938 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3fa5babe-464d-4dd0-a3be-fb5a0adc54d0","Type":"ContainerStarted","Data":"ceb8e9d12c6af8d1ad75f8b531845d3c2cff17dbe432e4321fa28bfddabad5fa"}
Mar 10 11:20:09 crc kubenswrapper[4794]: I0310 11:20:09.244482 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.244457922 podStartE2EDuration="3.244457922s" podCreationTimestamp="2026-03-10 11:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:20:09.234532083 +0000 UTC m=+5757.990702921" watchObservedRunningTime="2026-03-10 11:20:09.244457922 +0000 UTC m=+5758.000628740"
Mar 10 11:20:10 crc kubenswrapper[4794]: I0310 11:20:10.318441 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 10 11:20:11 crc kubenswrapper[4794]: I0310 11:20:11.593358 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 10 11:20:16 crc kubenswrapper[4794]: I0310 11:20:16.855826 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 10 11:20:43 crc kubenswrapper[4794]: I0310 11:20:43.241762 4794 scope.go:117] "RemoveContainer" containerID="b303cf68544edd697504ba6c368778ffb2e4e4129a74c2e6ccfa9a94953a0d0d"
Mar 10 11:21:22 crc kubenswrapper[4794]: I0310 11:21:22.968110 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 11:21:22 crc kubenswrapper[4794]: I0310 11:21:22.968579 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 11:21:43 crc kubenswrapper[4794]: I0310 11:21:43.388381 4794 scope.go:117] "RemoveContainer" containerID="fafd23b30900529ae35c00b34d2118eb31aa999de34597a0be3c90e018211f69"
Mar 10 11:21:43 crc kubenswrapper[4794]: I0310 11:21:43.435581 4794 scope.go:117] "RemoveContainer" containerID="449a157ed020ee7fa7b9a0ca2b2cf80739ed77686010443ce9ef638877596952"
Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.830084 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dnwr6"]
Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.832124 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnwr6"
Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.839624 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.839854 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-j8l4m"
Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.842070 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnwr6"]
Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.874300 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e395ac11-306a-4cb7-8868-ac0c2108d63b-var-log-ovn\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6"
Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.874362 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd6m9\" (UniqueName: \"kubernetes.io/projected/e395ac11-306a-4cb7-8868-ac0c2108d63b-kube-api-access-sd6m9\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6"
Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.874424 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e395ac11-306a-4cb7-8868-ac0c2108d63b-var-run-ovn\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6"
Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.874449 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e395ac11-306a-4cb7-8868-ac0c2108d63b-scripts\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6"
Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.874597 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e395ac11-306a-4cb7-8868-ac0c2108d63b-var-run\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6"
Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.928986 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-545pd"]
Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.931497 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-545pd"
Need to start a new one" pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.940570 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-545pd"] Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.969621 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.969682 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.976256 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e395ac11-306a-4cb7-8868-ac0c2108d63b-var-run-ovn\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.976539 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e395ac11-306a-4cb7-8868-ac0c2108d63b-scripts\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.976618 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e395ac11-306a-4cb7-8868-ac0c2108d63b-var-run-ovn\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.976859 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/77cb5711-11e8-4897-b227-3579f08b54a6-var-lib\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.976939 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e395ac11-306a-4cb7-8868-ac0c2108d63b-var-run\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.977000 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e395ac11-306a-4cb7-8868-ac0c2108d63b-var-run\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.977081 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/77cb5711-11e8-4897-b227-3579f08b54a6-etc-ovs\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 
11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.977188 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77cb5711-11e8-4897-b227-3579f08b54a6-var-run\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.977350 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcmf8\" (UniqueName: \"kubernetes.io/projected/77cb5711-11e8-4897-b227-3579f08b54a6-kube-api-access-wcmf8\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.977421 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e395ac11-306a-4cb7-8868-ac0c2108d63b-var-log-ovn\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.977480 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77cb5711-11e8-4897-b227-3579f08b54a6-scripts\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.977524 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/77cb5711-11e8-4897-b227-3579f08b54a6-var-log\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.977572 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd6m9\" (UniqueName: \"kubernetes.io/projected/e395ac11-306a-4cb7-8868-ac0c2108d63b-kube-api-access-sd6m9\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.977792 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e395ac11-306a-4cb7-8868-ac0c2108d63b-var-log-ovn\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6" Mar 10 11:21:52 crc kubenswrapper[4794]: I0310 11:21:52.978495 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e395ac11-306a-4cb7-8868-ac0c2108d63b-scripts\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.018209 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd6m9\" (UniqueName: \"kubernetes.io/projected/e395ac11-306a-4cb7-8868-ac0c2108d63b-kube-api-access-sd6m9\") pod \"ovn-controller-dnwr6\" (UID: \"e395ac11-306a-4cb7-8868-ac0c2108d63b\") " pod="openstack/ovn-controller-dnwr6" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.079050 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib\" (UniqueName: \"kubernetes.io/host-path/77cb5711-11e8-4897-b227-3579f08b54a6-var-lib\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.079292 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/77cb5711-11e8-4897-b227-3579f08b54a6-var-lib\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.079364 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/77cb5711-11e8-4897-b227-3579f08b54a6-etc-ovs\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.079407 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77cb5711-11e8-4897-b227-3579f08b54a6-var-run\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.079429 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/77cb5711-11e8-4897-b227-3579f08b54a6-etc-ovs\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.079456 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcmf8\" (UniqueName: \"kubernetes.io/projected/77cb5711-11e8-4897-b227-3579f08b54a6-kube-api-access-wcmf8\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.079539 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77cb5711-11e8-4897-b227-3579f08b54a6-var-run\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.080636 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77cb5711-11e8-4897-b227-3579f08b54a6-scripts\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.082735 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77cb5711-11e8-4897-b227-3579f08b54a6-scripts\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.082793 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/77cb5711-11e8-4897-b227-3579f08b54a6-var-log\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:53 crc 
kubenswrapper[4794]: I0310 11:21:53.082883 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/77cb5711-11e8-4897-b227-3579f08b54a6-var-log\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.106769 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcmf8\" (UniqueName: \"kubernetes.io/projected/77cb5711-11e8-4897-b227-3579f08b54a6-kube-api-access-wcmf8\") pod \"ovn-controller-ovs-545pd\" (UID: \"77cb5711-11e8-4897-b227-3579f08b54a6\") " pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.149713 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnwr6" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.284557 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:53 crc kubenswrapper[4794]: I0310 11:21:53.772570 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnwr6"] Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.203882 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-545pd"] Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.338114 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-h24fx"] Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.339588 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-h24fx" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.372894 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.386022 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-h24fx"] Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.406316 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a0118024-1c72-4da4-8e7b-78190928f285-ovn-rundir\") pod \"ovn-controller-metrics-h24fx\" (UID: \"a0118024-1c72-4da4-8e7b-78190928f285\") " pod="openstack/ovn-controller-metrics-h24fx" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.406438 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96gpl\" (UniqueName: \"kubernetes.io/projected/a0118024-1c72-4da4-8e7b-78190928f285-kube-api-access-96gpl\") pod \"ovn-controller-metrics-h24fx\" (UID: \"a0118024-1c72-4da4-8e7b-78190928f285\") " pod="openstack/ovn-controller-metrics-h24fx" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.411589 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0118024-1c72-4da4-8e7b-78190928f285-config\") pod \"ovn-controller-metrics-h24fx\" (UID: \"a0118024-1c72-4da4-8e7b-78190928f285\") " pod="openstack/ovn-controller-metrics-h24fx" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.411708 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/a0118024-1c72-4da4-8e7b-78190928f285-ovs-rundir\") pod \"ovn-controller-metrics-h24fx\" (UID: \"a0118024-1c72-4da4-8e7b-78190928f285\") " pod="openstack/ovn-controller-metrics-h24fx" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.470827 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-545pd" event={"ID":"77cb5711-11e8-4897-b227-3579f08b54a6","Type":"ContainerStarted","Data":"9e0c6c01e892cbda2ec0c01e1f8e7e6cfb38f11e96c15a4b0d6bdd86fb0c43cc"} Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.472018 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnwr6" event={"ID":"e395ac11-306a-4cb7-8868-ac0c2108d63b","Type":"ContainerStarted","Data":"86a69575b1bf4d10eccd6f7ef1de63d8ab72c01d83e1b10af15cef8f87026668"} Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.472041 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnwr6" event={"ID":"e395ac11-306a-4cb7-8868-ac0c2108d63b","Type":"ContainerStarted","Data":"74e03376b19e864b9d0a5dfc13ac371bd09a8b7aaed0445e08208e873b476f1d"} Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.473100 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-dnwr6" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.499597 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dnwr6" podStartSLOduration=2.499578105 podStartE2EDuration="2.499578105s" podCreationTimestamp="2026-03-10 11:21:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:21:54.498829663 +0000 UTC m=+5863.255000491" watchObservedRunningTime="2026-03-10 11:21:54.499578105 +0000 UTC m=+5863.255748923" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.513778 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a0118024-1c72-4da4-8e7b-78190928f285-ovn-rundir\") pod \"ovn-controller-metrics-h24fx\" (UID: \"a0118024-1c72-4da4-8e7b-78190928f285\") " pod="openstack/ovn-controller-metrics-h24fx" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.513865 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96gpl\" (UniqueName: \"kubernetes.io/projected/a0118024-1c72-4da4-8e7b-78190928f285-kube-api-access-96gpl\") pod \"ovn-controller-metrics-h24fx\" (UID: \"a0118024-1c72-4da4-8e7b-78190928f285\") " pod="openstack/ovn-controller-metrics-h24fx" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.513920 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0118024-1c72-4da4-8e7b-78190928f285-config\") pod \"ovn-controller-metrics-h24fx\" (UID: \"a0118024-1c72-4da4-8e7b-78190928f285\") " pod="openstack/ovn-controller-metrics-h24fx" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.513965 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a0118024-1c72-4da4-8e7b-78190928f285-ovs-rundir\") pod \"ovn-controller-metrics-h24fx\" (UID: \"a0118024-1c72-4da4-8e7b-78190928f285\") " pod="openstack/ovn-controller-metrics-h24fx" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.515471 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" 
(UniqueName: \"kubernetes.io/host-path/a0118024-1c72-4da4-8e7b-78190928f285-ovn-rundir\") pod \"ovn-controller-metrics-h24fx\" (UID: \"a0118024-1c72-4da4-8e7b-78190928f285\") " pod="openstack/ovn-controller-metrics-h24fx" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.516443 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0118024-1c72-4da4-8e7b-78190928f285-config\") pod \"ovn-controller-metrics-h24fx\" (UID: \"a0118024-1c72-4da4-8e7b-78190928f285\") " pod="openstack/ovn-controller-metrics-h24fx" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.516775 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a0118024-1c72-4da4-8e7b-78190928f285-ovs-rundir\") pod \"ovn-controller-metrics-h24fx\" (UID: \"a0118024-1c72-4da4-8e7b-78190928f285\") " pod="openstack/ovn-controller-metrics-h24fx" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.540736 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96gpl\" (UniqueName: \"kubernetes.io/projected/a0118024-1c72-4da4-8e7b-78190928f285-kube-api-access-96gpl\") pod \"ovn-controller-metrics-h24fx\" (UID: \"a0118024-1c72-4da4-8e7b-78190928f285\") " pod="openstack/ovn-controller-metrics-h24fx" Mar 10 11:21:54 crc kubenswrapper[4794]: I0310 11:21:54.698418 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-h24fx" Mar 10 11:21:55 crc kubenswrapper[4794]: W0310 11:21:55.139559 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0118024_1c72_4da4_8e7b_78190928f285.slice/crio-ee1a56139f39c9c53b7f531ea052e262c1d19b5cb4541778776be1ab995ea8fd WatchSource:0}: Error finding container ee1a56139f39c9c53b7f531ea052e262c1d19b5cb4541778776be1ab995ea8fd: Status 404 returned error can't find the container with id ee1a56139f39c9c53b7f531ea052e262c1d19b5cb4541778776be1ab995ea8fd Mar 10 11:21:55 crc kubenswrapper[4794]: I0310 11:21:55.148383 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-h24fx"] Mar 10 11:21:55 crc kubenswrapper[4794]: I0310 11:21:55.505033 4794 generic.go:334] "Generic (PLEG): container finished" podID="77cb5711-11e8-4897-b227-3579f08b54a6" containerID="9be65aae5b6eba389a5648bf616776b027b482216ea8e8951d9d0fef087efe81" exitCode=0 Mar 10 11:21:55 crc kubenswrapper[4794]: I0310 11:21:55.505115 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-545pd" event={"ID":"77cb5711-11e8-4897-b227-3579f08b54a6","Type":"ContainerDied","Data":"9be65aae5b6eba389a5648bf616776b027b482216ea8e8951d9d0fef087efe81"} Mar 10 11:21:55 crc kubenswrapper[4794]: I0310 11:21:55.506948 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-h24fx" event={"ID":"a0118024-1c72-4da4-8e7b-78190928f285","Type":"ContainerStarted","Data":"e85f6d424bc19200acea5484724a24ebaa24817d7b65518b327dc9be684cb4f7"} Mar 10 11:21:55 crc kubenswrapper[4794]: I0310 11:21:55.506989 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-h24fx" event={"ID":"a0118024-1c72-4da4-8e7b-78190928f285","Type":"ContainerStarted","Data":"ee1a56139f39c9c53b7f531ea052e262c1d19b5cb4541778776be1ab995ea8fd"} Mar 10 11:21:55 crc kubenswrapper[4794]: I0310 11:21:55.571418 4794 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/ovn-controller-metrics-h24fx" podStartSLOduration=1.5714000879999999 podStartE2EDuration="1.571400088s" podCreationTimestamp="2026-03-10 11:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:21:55.54831899 +0000 UTC m=+5864.304489838" watchObservedRunningTime="2026-03-10 11:21:55.571400088 +0000 UTC m=+5864.327570906" Mar 10 11:21:56 crc kubenswrapper[4794]: I0310 11:21:56.531955 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-545pd" event={"ID":"77cb5711-11e8-4897-b227-3579f08b54a6","Type":"ContainerStarted","Data":"e0b8255aa70bc7cda13053d232e883e25b10d8b9bdb82da00386422f82fa6766"} Mar 10 11:21:56 crc kubenswrapper[4794]: I0310 11:21:56.532033 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-545pd" event={"ID":"77cb5711-11e8-4897-b227-3579f08b54a6","Type":"ContainerStarted","Data":"8bf9b444625e57c866cd747a927c708411fd23fd3c0bb7e1c19a6c5036e7b750"} Mar 10 11:21:56 crc kubenswrapper[4794]: I0310 11:21:56.556844 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-545pd" podStartSLOduration=4.556818152 podStartE2EDuration="4.556818152s" podCreationTimestamp="2026-03-10 11:21:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:21:56.552932272 +0000 UTC m=+5865.309103130" watchObservedRunningTime="2026-03-10 11:21:56.556818152 +0000 UTC m=+5865.312988990" Mar 10 11:21:57 crc kubenswrapper[4794]: I0310 11:21:57.545081 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:57 crc kubenswrapper[4794]: I0310 11:21:57.545441 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:21:59 crc kubenswrapper[4794]: I0310 11:21:59.070394 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-d5rbx"] Mar 10 11:21:59 crc kubenswrapper[4794]: I0310 11:21:59.092810 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2a13-account-create-update-9pmtz"] Mar 10 11:21:59 crc kubenswrapper[4794]: I0310 11:21:59.107430 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2a13-account-create-update-9pmtz"] Mar 10 11:21:59 crc kubenswrapper[4794]: I0310 11:21:59.122649 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-d5rbx"] Mar 10 11:22:00 crc kubenswrapper[4794]: I0310 11:22:00.016015 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="206a0153-5800-403c-8749-e68d34d36a81" path="/var/lib/kubelet/pods/206a0153-5800-403c-8749-e68d34d36a81/volumes" Mar 10 11:22:00 crc kubenswrapper[4794]: I0310 11:22:00.017691 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2416ded-2ad4-4cef-af44-8c6e944834d8" path="/var/lib/kubelet/pods/b2416ded-2ad4-4cef-af44-8c6e944834d8/volumes" Mar 10 11:22:00 crc kubenswrapper[4794]: I0310 11:22:00.150445 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552362-d8s7g"] Mar 10 11:22:00 crc kubenswrapper[4794]: I0310 11:22:00.153694 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552362-d8s7g" Mar 10 11:22:00 crc kubenswrapper[4794]: I0310 11:22:00.162771 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:22:00 crc kubenswrapper[4794]: I0310 11:22:00.163041 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:22:00 crc kubenswrapper[4794]: I0310 11:22:00.163151 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:22:00 crc kubenswrapper[4794]: I0310 11:22:00.173253 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552362-d8s7g"] Mar 10 11:22:00 crc kubenswrapper[4794]: I0310 11:22:00.257814 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg9fs\" (UniqueName: \"kubernetes.io/projected/756d4872-e20d-4b8a-8c2e-b75813c15334-kube-api-access-jg9fs\") pod \"auto-csr-approver-29552362-d8s7g\" (UID: \"756d4872-e20d-4b8a-8c2e-b75813c15334\") " pod="openshift-infra/auto-csr-approver-29552362-d8s7g" Mar 10 11:22:00 crc kubenswrapper[4794]: I0310 11:22:00.359981 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9fs\" (UniqueName: \"kubernetes.io/projected/756d4872-e20d-4b8a-8c2e-b75813c15334-kube-api-access-jg9fs\") pod \"auto-csr-approver-29552362-d8s7g\" (UID: \"756d4872-e20d-4b8a-8c2e-b75813c15334\") " pod="openshift-infra/auto-csr-approver-29552362-d8s7g" Mar 10 11:22:00 crc kubenswrapper[4794]: I0310 11:22:00.403133 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg9fs\" (UniqueName: \"kubernetes.io/projected/756d4872-e20d-4b8a-8c2e-b75813c15334-kube-api-access-jg9fs\") pod \"auto-csr-approver-29552362-d8s7g\" (UID: \"756d4872-e20d-4b8a-8c2e-b75813c15334\") " pod="openshift-infra/auto-csr-approver-29552362-d8s7g" Mar 10 11:22:00 crc kubenswrapper[4794]: I0310 11:22:00.491639 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552362-d8s7g" Mar 10 11:22:01 crc kubenswrapper[4794]: I0310 11:22:01.040532 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552362-d8s7g"] Mar 10 11:22:01 crc kubenswrapper[4794]: I0310 11:22:01.046943 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 11:22:01 crc kubenswrapper[4794]: I0310 11:22:01.612264 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552362-d8s7g" event={"ID":"756d4872-e20d-4b8a-8c2e-b75813c15334","Type":"ContainerStarted","Data":"ddf264be64ddf0a20f5d5f2863bc17d2fb98d06f079be50f1bbaa21916b3e4ba"} Mar 10 11:22:03 crc kubenswrapper[4794]: I0310 11:22:03.636453 4794 generic.go:334] "Generic (PLEG): container finished" podID="756d4872-e20d-4b8a-8c2e-b75813c15334" containerID="09c3473db17e343c7f1c600d8e6f23efb32b817c496a01ac3b0db591c660983f" exitCode=0 Mar 10 11:22:03 crc kubenswrapper[4794]: I0310 11:22:03.637089 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552362-d8s7g" event={"ID":"756d4872-e20d-4b8a-8c2e-b75813c15334","Type":"ContainerDied","Data":"09c3473db17e343c7f1c600d8e6f23efb32b817c496a01ac3b0db591c660983f"} Mar 10 11:22:05 crc kubenswrapper[4794]: I0310 11:22:05.047455 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-brrxw"] Mar 10 11:22:05 crc kubenswrapper[4794]: I0310 11:22:05.057475 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-brrxw"] Mar 10 11:22:05 crc kubenswrapper[4794]: I0310 11:22:05.059472 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552362-d8s7g" Mar 10 11:22:05 crc kubenswrapper[4794]: I0310 11:22:05.175222 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg9fs\" (UniqueName: \"kubernetes.io/projected/756d4872-e20d-4b8a-8c2e-b75813c15334-kube-api-access-jg9fs\") pod \"756d4872-e20d-4b8a-8c2e-b75813c15334\" (UID: \"756d4872-e20d-4b8a-8c2e-b75813c15334\") " Mar 10 11:22:05 crc kubenswrapper[4794]: I0310 11:22:05.181980 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756d4872-e20d-4b8a-8c2e-b75813c15334-kube-api-access-jg9fs" (OuterVolumeSpecName: "kube-api-access-jg9fs") pod "756d4872-e20d-4b8a-8c2e-b75813c15334" (UID: "756d4872-e20d-4b8a-8c2e-b75813c15334"). InnerVolumeSpecName "kube-api-access-jg9fs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:22:05 crc kubenswrapper[4794]: I0310 11:22:05.277693 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg9fs\" (UniqueName: \"kubernetes.io/projected/756d4872-e20d-4b8a-8c2e-b75813c15334-kube-api-access-jg9fs\") on node \"crc\" DevicePath \"\"" Mar 10 11:22:05 crc kubenswrapper[4794]: I0310 11:22:05.661166 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552362-d8s7g" event={"ID":"756d4872-e20d-4b8a-8c2e-b75813c15334","Type":"ContainerDied","Data":"ddf264be64ddf0a20f5d5f2863bc17d2fb98d06f079be50f1bbaa21916b3e4ba"} Mar 10 11:22:05 crc kubenswrapper[4794]: I0310 11:22:05.661213 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddf264be64ddf0a20f5d5f2863bc17d2fb98d06f079be50f1bbaa21916b3e4ba" Mar 10 11:22:05 crc kubenswrapper[4794]: I0310 11:22:05.661218 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552362-d8s7g" Mar 10 11:22:06 crc kubenswrapper[4794]: I0310 11:22:06.019696 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b3770a1-b9eb-416d-bdcc-af3a4eb0d204" path="/var/lib/kubelet/pods/6b3770a1-b9eb-416d-bdcc-af3a4eb0d204/volumes" Mar 10 11:22:06 crc kubenswrapper[4794]: I0310 11:22:06.133253 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552356-flccj"] Mar 10 11:22:06 crc kubenswrapper[4794]: I0310 11:22:06.141495 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552356-flccj"] Mar 10 11:22:06 crc kubenswrapper[4794]: I0310 11:22:06.861755 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-qzvlg"] Mar 10 11:22:06 crc kubenswrapper[4794]: E0310 11:22:06.862168 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756d4872-e20d-4b8a-8c2e-b75813c15334" containerName="oc" Mar 10 11:22:06 crc kubenswrapper[4794]: I0310 11:22:06.862186 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="756d4872-e20d-4b8a-8c2e-b75813c15334" containerName="oc" Mar 10 11:22:06 crc kubenswrapper[4794]: I0310 11:22:06.862381 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="756d4872-e20d-4b8a-8c2e-b75813c15334" containerName="oc" Mar 10 11:22:06 crc kubenswrapper[4794]: I0310 11:22:06.863048 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-qzvlg" Mar 10 11:22:06 crc kubenswrapper[4794]: I0310 11:22:06.871749 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-qzvlg"] Mar 10 11:22:06 crc kubenswrapper[4794]: I0310 11:22:06.918405 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5rsx\" (UniqueName: \"kubernetes.io/projected/19facb05-a5ec-4544-9a36-3b5973303b73-kube-api-access-q5rsx\") pod \"octavia-db-create-qzvlg\" (UID: \"19facb05-a5ec-4544-9a36-3b5973303b73\") " pod="openstack/octavia-db-create-qzvlg" Mar 10 11:22:06 crc kubenswrapper[4794]: I0310 11:22:06.918846 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19facb05-a5ec-4544-9a36-3b5973303b73-operator-scripts\") pod \"octavia-db-create-qzvlg\" (UID: \"19facb05-a5ec-4544-9a36-3b5973303b73\") " pod="openstack/octavia-db-create-qzvlg" Mar 10 11:22:07 crc kubenswrapper[4794]: I0310 11:22:07.020543 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5rsx\" (UniqueName: \"kubernetes.io/projected/19facb05-a5ec-4544-9a36-3b5973303b73-kube-api-access-q5rsx\") pod \"octavia-db-create-qzvlg\" (UID: \"19facb05-a5ec-4544-9a36-3b5973303b73\") " pod="openstack/octavia-db-create-qzvlg" Mar 10 11:22:07 crc kubenswrapper[4794]: I0310 11:22:07.020642 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19facb05-a5ec-4544-9a36-3b5973303b73-operator-scripts\") pod \"octavia-db-create-qzvlg\" (UID: \"19facb05-a5ec-4544-9a36-3b5973303b73\") " pod="openstack/octavia-db-create-qzvlg" Mar 10 11:22:07 crc kubenswrapper[4794]: I0310 11:22:07.022745 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19facb05-a5ec-4544-9a36-3b5973303b73-operator-scripts\") pod \"octavia-db-create-qzvlg\" (UID: \"19facb05-a5ec-4544-9a36-3b5973303b73\") " pod="openstack/octavia-db-create-qzvlg" Mar 10 11:22:07 crc kubenswrapper[4794]: I0310 11:22:07.039977 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5rsx\" (UniqueName: \"kubernetes.io/projected/19facb05-a5ec-4544-9a36-3b5973303b73-kube-api-access-q5rsx\") pod \"octavia-db-create-qzvlg\" (UID: \"19facb05-a5ec-4544-9a36-3b5973303b73\") " pod="openstack/octavia-db-create-qzvlg" Mar 10 11:22:07 crc kubenswrapper[4794]: I0310 11:22:07.220315 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-qzvlg" Mar 10 11:22:07 crc kubenswrapper[4794]: I0310 11:22:07.733804 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-qzvlg"] Mar 10 11:22:08 crc kubenswrapper[4794]: I0310 11:22:08.018286 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e542d0-c411-44de-beb8-e5cffa10f1a7" path="/var/lib/kubelet/pods/97e542d0-c411-44de-beb8-e5cffa10f1a7/volumes" Mar 10 11:22:08 crc kubenswrapper[4794]: I0310 11:22:08.424192 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-4b48-account-create-update-q6xqr"] Mar 10 11:22:08 crc kubenswrapper[4794]: I0310 11:22:08.425661 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-4b48-account-create-update-q6xqr" Mar 10 11:22:08 crc kubenswrapper[4794]: I0310 11:22:08.428176 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Mar 10 11:22:08 crc kubenswrapper[4794]: I0310 11:22:08.437250 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-4b48-account-create-update-q6xqr"] Mar 10 11:22:08 crc kubenswrapper[4794]: I0310 11:22:08.551854 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm6g5\" (UniqueName: \"kubernetes.io/projected/de94bae2-ff28-4041-935f-3c8e27931e98-kube-api-access-wm6g5\") pod \"octavia-4b48-account-create-update-q6xqr\" (UID: \"de94bae2-ff28-4041-935f-3c8e27931e98\") " pod="openstack/octavia-4b48-account-create-update-q6xqr" Mar 10 11:22:08 crc kubenswrapper[4794]: I0310 11:22:08.552160 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de94bae2-ff28-4041-935f-3c8e27931e98-operator-scripts\") pod \"octavia-4b48-account-create-update-q6xqr\" (UID: \"de94bae2-ff28-4041-935f-3c8e27931e98\") " pod="openstack/octavia-4b48-account-create-update-q6xqr" Mar 10 11:22:08 crc kubenswrapper[4794]: I0310 11:22:08.654002 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de94bae2-ff28-4041-935f-3c8e27931e98-operator-scripts\") pod \"octavia-4b48-account-create-update-q6xqr\" (UID: \"de94bae2-ff28-4041-935f-3c8e27931e98\") " pod="openstack/octavia-4b48-account-create-update-q6xqr" Mar 10 11:22:08 crc kubenswrapper[4794]: I0310 11:22:08.654170 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm6g5\" (UniqueName: \"kubernetes.io/projected/de94bae2-ff28-4041-935f-3c8e27931e98-kube-api-access-wm6g5\") pod \"octavia-4b48-account-create-update-q6xqr\" (UID: \"de94bae2-ff28-4041-935f-3c8e27931e98\") " pod="openstack/octavia-4b48-account-create-update-q6xqr" Mar 10 11:22:08 crc kubenswrapper[4794]: I0310 11:22:08.654696 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de94bae2-ff28-4041-935f-3c8e27931e98-operator-scripts\") pod \"octavia-4b48-account-create-update-q6xqr\" (UID: \"de94bae2-ff28-4041-935f-3c8e27931e98\") " pod="openstack/octavia-4b48-account-create-update-q6xqr" Mar 10 11:22:08 crc kubenswrapper[4794]: I0310 11:22:08.680228 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm6g5\" (UniqueName: \"kubernetes.io/projected/de94bae2-ff28-4041-935f-3c8e27931e98-kube-api-access-wm6g5\") pod \"octavia-4b48-account-create-update-q6xqr\" (UID: \"de94bae2-ff28-4041-935f-3c8e27931e98\") " pod="openstack/octavia-4b48-account-create-update-q6xqr" Mar 10 11:22:08 crc kubenswrapper[4794]: I0310 11:22:08.695000 4794 generic.go:334] "Generic (PLEG): container finished" podID="19facb05-a5ec-4544-9a36-3b5973303b73" containerID="17c48bb8a9ff4ccf504437b9ec1b5a3f9af0e2b5338966f3e576ac71ce107cd5" exitCode=0 Mar 10 11:22:08 crc kubenswrapper[4794]: I0310 11:22:08.695044 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-qzvlg" event={"ID":"19facb05-a5ec-4544-9a36-3b5973303b73","Type":"ContainerDied","Data":"17c48bb8a9ff4ccf504437b9ec1b5a3f9af0e2b5338966f3e576ac71ce107cd5"} Mar 10 11:22:08 
Mar 10 11:22:08 crc kubenswrapper[4794]: I0310 11:22:08.746941 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-4b48-account-create-update-q6xqr"
Mar 10 11:22:09 crc kubenswrapper[4794]: I0310 11:22:09.239245 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-4b48-account-create-update-q6xqr"]
Mar 10 11:22:09 crc kubenswrapper[4794]: W0310 11:22:09.242930 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde94bae2_ff28_4041_935f_3c8e27931e98.slice/crio-2f08fb07cecf464a591c60bcccc222cca1c806467e76ea296a23dd284bdb331b WatchSource:0}: Error finding container 2f08fb07cecf464a591c60bcccc222cca1c806467e76ea296a23dd284bdb331b: Status 404 returned error can't find the container with id 2f08fb07cecf464a591c60bcccc222cca1c806467e76ea296a23dd284bdb331b
Mar 10 11:22:09 crc kubenswrapper[4794]: I0310 11:22:09.713062 4794 generic.go:334] "Generic (PLEG): container finished" podID="de94bae2-ff28-4041-935f-3c8e27931e98" containerID="759f811718ed7a63696b8290009c4141bf27f60b4855aec49dfe381031aa9a86" exitCode=0
Mar 10 11:22:09 crc kubenswrapper[4794]: I0310 11:22:09.713136 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-4b48-account-create-update-q6xqr" event={"ID":"de94bae2-ff28-4041-935f-3c8e27931e98","Type":"ContainerDied","Data":"759f811718ed7a63696b8290009c4141bf27f60b4855aec49dfe381031aa9a86"}
Mar 10 11:22:09 crc kubenswrapper[4794]: I0310 11:22:09.713201 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-4b48-account-create-update-q6xqr" event={"ID":"de94bae2-ff28-4041-935f-3c8e27931e98","Type":"ContainerStarted","Data":"2f08fb07cecf464a591c60bcccc222cca1c806467e76ea296a23dd284bdb331b"}
Mar 10 11:22:10 crc kubenswrapper[4794]: I0310 11:22:10.179601 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-qzvlg"
Mar 10 11:22:10 crc kubenswrapper[4794]: I0310 11:22:10.299113 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5rsx\" (UniqueName: \"kubernetes.io/projected/19facb05-a5ec-4544-9a36-3b5973303b73-kube-api-access-q5rsx\") pod \"19facb05-a5ec-4544-9a36-3b5973303b73\" (UID: \"19facb05-a5ec-4544-9a36-3b5973303b73\") "
Mar 10 11:22:10 crc kubenswrapper[4794]: I0310 11:22:10.299465 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19facb05-a5ec-4544-9a36-3b5973303b73-operator-scripts\") pod \"19facb05-a5ec-4544-9a36-3b5973303b73\" (UID: \"19facb05-a5ec-4544-9a36-3b5973303b73\") "
Mar 10 11:22:10 crc kubenswrapper[4794]: I0310 11:22:10.300464 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19facb05-a5ec-4544-9a36-3b5973303b73-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19facb05-a5ec-4544-9a36-3b5973303b73" (UID: "19facb05-a5ec-4544-9a36-3b5973303b73"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 11:22:10 crc kubenswrapper[4794]: I0310 11:22:10.309824 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19facb05-a5ec-4544-9a36-3b5973303b73-kube-api-access-q5rsx" (OuterVolumeSpecName: "kube-api-access-q5rsx") pod "19facb05-a5ec-4544-9a36-3b5973303b73" (UID: "19facb05-a5ec-4544-9a36-3b5973303b73"). InnerVolumeSpecName "kube-api-access-q5rsx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 11:22:10 crc kubenswrapper[4794]: I0310 11:22:10.402418 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19facb05-a5ec-4544-9a36-3b5973303b73-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 11:22:10 crc kubenswrapper[4794]: I0310 11:22:10.402484 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5rsx\" (UniqueName: \"kubernetes.io/projected/19facb05-a5ec-4544-9a36-3b5973303b73-kube-api-access-q5rsx\") on node \"crc\" DevicePath \"\""
Mar 10 11:22:10 crc kubenswrapper[4794]: I0310 11:22:10.727653 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-qzvlg"
Mar 10 11:22:10 crc kubenswrapper[4794]: I0310 11:22:10.728516 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-qzvlg" event={"ID":"19facb05-a5ec-4544-9a36-3b5973303b73","Type":"ContainerDied","Data":"b0bf0308bfb2d77f665004b87b225c9999944e90dc1f0232e68a4eddd66f076f"}
Mar 10 11:22:10 crc kubenswrapper[4794]: I0310 11:22:10.728563 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0bf0308bfb2d77f665004b87b225c9999944e90dc1f0232e68a4eddd66f076f"
Mar 10 11:22:11 crc kubenswrapper[4794]: I0310 11:22:11.070417 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-4b48-account-create-update-q6xqr"
Mar 10 11:22:11 crc kubenswrapper[4794]: I0310 11:22:11.220584 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm6g5\" (UniqueName: \"kubernetes.io/projected/de94bae2-ff28-4041-935f-3c8e27931e98-kube-api-access-wm6g5\") pod \"de94bae2-ff28-4041-935f-3c8e27931e98\" (UID: \"de94bae2-ff28-4041-935f-3c8e27931e98\") "
Mar 10 11:22:11 crc kubenswrapper[4794]: I0310 11:22:11.220805 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de94bae2-ff28-4041-935f-3c8e27931e98-operator-scripts\") pod \"de94bae2-ff28-4041-935f-3c8e27931e98\" (UID: \"de94bae2-ff28-4041-935f-3c8e27931e98\") "
Mar 10 11:22:11 crc kubenswrapper[4794]: I0310 11:22:11.221411 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de94bae2-ff28-4041-935f-3c8e27931e98-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de94bae2-ff28-4041-935f-3c8e27931e98" (UID: "de94bae2-ff28-4041-935f-3c8e27931e98"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 11:22:11 crc kubenswrapper[4794]: I0310 11:22:11.239227 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de94bae2-ff28-4041-935f-3c8e27931e98-kube-api-access-wm6g5" (OuterVolumeSpecName: "kube-api-access-wm6g5") pod "de94bae2-ff28-4041-935f-3c8e27931e98" (UID: "de94bae2-ff28-4041-935f-3c8e27931e98"). InnerVolumeSpecName "kube-api-access-wm6g5". PluginName "kubernetes.io/projected", VolumeGidValue ""
InnerVolumeSpecName "kube-api-access-wm6g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:22:11 crc kubenswrapper[4794]: I0310 11:22:11.323936 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de94bae2-ff28-4041-935f-3c8e27931e98-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:22:11 crc kubenswrapper[4794]: I0310 11:22:11.324102 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm6g5\" (UniqueName: \"kubernetes.io/projected/de94bae2-ff28-4041-935f-3c8e27931e98-kube-api-access-wm6g5\") on node \"crc\" DevicePath \"\"" Mar 10 11:22:11 crc kubenswrapper[4794]: I0310 11:22:11.744891 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-4b48-account-create-update-q6xqr" event={"ID":"de94bae2-ff28-4041-935f-3c8e27931e98","Type":"ContainerDied","Data":"2f08fb07cecf464a591c60bcccc222cca1c806467e76ea296a23dd284bdb331b"} Mar 10 11:22:11 crc kubenswrapper[4794]: I0310 11:22:11.745815 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f08fb07cecf464a591c60bcccc222cca1c806467e76ea296a23dd284bdb331b" Mar 10 11:22:11 crc kubenswrapper[4794]: I0310 11:22:11.744999 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-4b48-account-create-update-q6xqr" Mar 10 11:22:13 crc kubenswrapper[4794]: I0310 11:22:13.922245 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-lkmqd"] Mar 10 11:22:13 crc kubenswrapper[4794]: E0310 11:22:13.923472 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19facb05-a5ec-4544-9a36-3b5973303b73" containerName="mariadb-database-create" Mar 10 11:22:13 crc kubenswrapper[4794]: I0310 11:22:13.923504 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="19facb05-a5ec-4544-9a36-3b5973303b73" containerName="mariadb-database-create" Mar 10 11:22:13 crc kubenswrapper[4794]: E0310 11:22:13.923563 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de94bae2-ff28-4041-935f-3c8e27931e98" containerName="mariadb-account-create-update" Mar 10 11:22:13 crc kubenswrapper[4794]: I0310 11:22:13.923582 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="de94bae2-ff28-4041-935f-3c8e27931e98" containerName="mariadb-account-create-update" Mar 10 11:22:13 crc kubenswrapper[4794]: I0310 11:22:13.924073 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="de94bae2-ff28-4041-935f-3c8e27931e98" containerName="mariadb-account-create-update" Mar 10 11:22:13 crc kubenswrapper[4794]: I0310 11:22:13.924103 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="19facb05-a5ec-4544-9a36-3b5973303b73" containerName="mariadb-database-create" Mar 10 11:22:13 crc kubenswrapper[4794]: I0310 11:22:13.925584 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-lkmqd" Mar 10 11:22:13 crc kubenswrapper[4794]: I0310 11:22:13.931308 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-lkmqd"] Mar 10 11:22:13 crc kubenswrapper[4794]: I0310 11:22:13.994949 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89pd8\" (UniqueName: \"kubernetes.io/projected/46be95d5-5770-4daa-bba9-cf040d10d55a-kube-api-access-89pd8\") pod \"octavia-persistence-db-create-lkmqd\" (UID: \"46be95d5-5770-4daa-bba9-cf040d10d55a\") " pod="openstack/octavia-persistence-db-create-lkmqd" Mar 10 11:22:13 crc kubenswrapper[4794]: I0310 11:22:13.995115 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46be95d5-5770-4daa-bba9-cf040d10d55a-operator-scripts\") pod \"octavia-persistence-db-create-lkmqd\" (UID: \"46be95d5-5770-4daa-bba9-cf040d10d55a\") " pod="openstack/octavia-persistence-db-create-lkmqd" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.102420 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89pd8\" (UniqueName: \"kubernetes.io/projected/46be95d5-5770-4daa-bba9-cf040d10d55a-kube-api-access-89pd8\") pod \"octavia-persistence-db-create-lkmqd\" (UID: \"46be95d5-5770-4daa-bba9-cf040d10d55a\") " pod="openstack/octavia-persistence-db-create-lkmqd" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.102651 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46be95d5-5770-4daa-bba9-cf040d10d55a-operator-scripts\") pod \"octavia-persistence-db-create-lkmqd\" (UID: \"46be95d5-5770-4daa-bba9-cf040d10d55a\") " pod="openstack/octavia-persistence-db-create-lkmqd" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.106447 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46be95d5-5770-4daa-bba9-cf040d10d55a-operator-scripts\") pod \"octavia-persistence-db-create-lkmqd\" (UID: \"46be95d5-5770-4daa-bba9-cf040d10d55a\") " pod="openstack/octavia-persistence-db-create-lkmqd" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.135929 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89pd8\" (UniqueName: \"kubernetes.io/projected/46be95d5-5770-4daa-bba9-cf040d10d55a-kube-api-access-89pd8\") pod \"octavia-persistence-db-create-lkmqd\" (UID: \"46be95d5-5770-4daa-bba9-cf040d10d55a\") " pod="openstack/octavia-persistence-db-create-lkmqd" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.254698 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-lkmqd" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.421394 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-d138-account-create-update-bwck7"] Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.423184 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-d138-account-create-update-bwck7" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.425344 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.450057 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-d138-account-create-update-bwck7"] Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.512372 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnn8p\" (UniqueName: \"kubernetes.io/projected/959ece64-6ede-48e4-8137-2b8cfce20471-kube-api-access-mnn8p\") pod \"octavia-d138-account-create-update-bwck7\" (UID: \"959ece64-6ede-48e4-8137-2b8cfce20471\") " pod="openstack/octavia-d138-account-create-update-bwck7" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.512536 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/959ece64-6ede-48e4-8137-2b8cfce20471-operator-scripts\") pod \"octavia-d138-account-create-update-bwck7\" (UID: \"959ece64-6ede-48e4-8137-2b8cfce20471\") " pod="openstack/octavia-d138-account-create-update-bwck7" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.616313 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnn8p\" (UniqueName: \"kubernetes.io/projected/959ece64-6ede-48e4-8137-2b8cfce20471-kube-api-access-mnn8p\") pod \"octavia-d138-account-create-update-bwck7\" (UID: \"959ece64-6ede-48e4-8137-2b8cfce20471\") " pod="openstack/octavia-d138-account-create-update-bwck7" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.616476 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/959ece64-6ede-48e4-8137-2b8cfce20471-operator-scripts\") pod \"octavia-d138-account-create-update-bwck7\" (UID: \"959ece64-6ede-48e4-8137-2b8cfce20471\") " pod="openstack/octavia-d138-account-create-update-bwck7" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.617223 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/959ece64-6ede-48e4-8137-2b8cfce20471-operator-scripts\") pod \"octavia-d138-account-create-update-bwck7\" (UID: \"959ece64-6ede-48e4-8137-2b8cfce20471\") " pod="openstack/octavia-d138-account-create-update-bwck7" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.657387 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnn8p\" (UniqueName: \"kubernetes.io/projected/959ece64-6ede-48e4-8137-2b8cfce20471-kube-api-access-mnn8p\") pod \"octavia-d138-account-create-update-bwck7\" (UID: \"959ece64-6ede-48e4-8137-2b8cfce20471\") " pod="openstack/octavia-d138-account-create-update-bwck7" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.754245 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-d138-account-create-update-bwck7" Mar 10 11:22:14 crc kubenswrapper[4794]: I0310 11:22:14.893388 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-lkmqd"] Mar 10 11:22:15 crc kubenswrapper[4794]: I0310 11:22:15.056412 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-d138-account-create-update-bwck7"] Mar 10 11:22:15 crc kubenswrapper[4794]: W0310 11:22:15.060297 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod959ece64_6ede_48e4_8137_2b8cfce20471.slice/crio-14495d374324bce73a9f787a649bb1b0eafb728d24dedf2355fc88e2f014cc56 WatchSource:0}: Error finding container 14495d374324bce73a9f787a649bb1b0eafb728d24dedf2355fc88e2f014cc56: Status 404 returned error can't find the container with id 14495d374324bce73a9f787a649bb1b0eafb728d24dedf2355fc88e2f014cc56 Mar 10 11:22:15 crc kubenswrapper[4794]: I0310 11:22:15.812710 4794 generic.go:334] "Generic (PLEG): container finished" podID="959ece64-6ede-48e4-8137-2b8cfce20471" containerID="9a1bdfc1045b898ae6c793757db86876a6990c9eadc6368c86650d52a20001fb" exitCode=0 Mar 10 11:22:15 crc kubenswrapper[4794]: I0310 11:22:15.812786 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-d138-account-create-update-bwck7" event={"ID":"959ece64-6ede-48e4-8137-2b8cfce20471","Type":"ContainerDied","Data":"9a1bdfc1045b898ae6c793757db86876a6990c9eadc6368c86650d52a20001fb"} Mar 10 11:22:15 crc kubenswrapper[4794]: I0310 11:22:15.813397 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-d138-account-create-update-bwck7" event={"ID":"959ece64-6ede-48e4-8137-2b8cfce20471","Type":"ContainerStarted","Data":"14495d374324bce73a9f787a649bb1b0eafb728d24dedf2355fc88e2f014cc56"} Mar 10 11:22:15 crc kubenswrapper[4794]: I0310 11:22:15.817037 4794 generic.go:334] "Generic (PLEG): container finished" podID="46be95d5-5770-4daa-bba9-cf040d10d55a" containerID="9b899dd0f88d6c7f75831bfba61cbf085a8fa25aac1a5019660fd10149e337d4" exitCode=0 Mar 10 11:22:15 crc kubenswrapper[4794]: I0310 11:22:15.817080 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-lkmqd" event={"ID":"46be95d5-5770-4daa-bba9-cf040d10d55a","Type":"ContainerDied","Data":"9b899dd0f88d6c7f75831bfba61cbf085a8fa25aac1a5019660fd10149e337d4"} Mar 10 11:22:15 crc kubenswrapper[4794]: I0310 11:22:15.817133 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-lkmqd" event={"ID":"46be95d5-5770-4daa-bba9-cf040d10d55a","Type":"ContainerStarted","Data":"8d112d7f33c4f7fb69f9ed48596b5556431b5aa5fe92a4a040d8613167e0182d"} Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.049880 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-69ksj"] Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.058355 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-69ksj"] Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.333399 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-d138-account-create-update-bwck7" Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.333778 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-lkmqd" Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.384006 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnn8p\" (UniqueName: \"kubernetes.io/projected/959ece64-6ede-48e4-8137-2b8cfce20471-kube-api-access-mnn8p\") pod \"959ece64-6ede-48e4-8137-2b8cfce20471\" (UID: \"959ece64-6ede-48e4-8137-2b8cfce20471\") " Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.384094 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46be95d5-5770-4daa-bba9-cf040d10d55a-operator-scripts\") pod \"46be95d5-5770-4daa-bba9-cf040d10d55a\" (UID: \"46be95d5-5770-4daa-bba9-cf040d10d55a\") " Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.384193 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/959ece64-6ede-48e4-8137-2b8cfce20471-operator-scripts\") pod \"959ece64-6ede-48e4-8137-2b8cfce20471\" (UID: \"959ece64-6ede-48e4-8137-2b8cfce20471\") " Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.384324 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89pd8\" (UniqueName: \"kubernetes.io/projected/46be95d5-5770-4daa-bba9-cf040d10d55a-kube-api-access-89pd8\") pod \"46be95d5-5770-4daa-bba9-cf040d10d55a\" (UID: \"46be95d5-5770-4daa-bba9-cf040d10d55a\") " Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.384844 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46be95d5-5770-4daa-bba9-cf040d10d55a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46be95d5-5770-4daa-bba9-cf040d10d55a" (UID: "46be95d5-5770-4daa-bba9-cf040d10d55a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.384957 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/959ece64-6ede-48e4-8137-2b8cfce20471-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "959ece64-6ede-48e4-8137-2b8cfce20471" (UID: "959ece64-6ede-48e4-8137-2b8cfce20471"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.389913 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46be95d5-5770-4daa-bba9-cf040d10d55a-kube-api-access-89pd8" (OuterVolumeSpecName: "kube-api-access-89pd8") pod "46be95d5-5770-4daa-bba9-cf040d10d55a" (UID: "46be95d5-5770-4daa-bba9-cf040d10d55a"). InnerVolumeSpecName "kube-api-access-89pd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.390576 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/959ece64-6ede-48e4-8137-2b8cfce20471-kube-api-access-mnn8p" (OuterVolumeSpecName: "kube-api-access-mnn8p") pod "959ece64-6ede-48e4-8137-2b8cfce20471" (UID: "959ece64-6ede-48e4-8137-2b8cfce20471"). InnerVolumeSpecName "kube-api-access-mnn8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.488094 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89pd8\" (UniqueName: \"kubernetes.io/projected/46be95d5-5770-4daa-bba9-cf040d10d55a-kube-api-access-89pd8\") on node \"crc\" DevicePath \"\"" Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.488150 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnn8p\" (UniqueName: \"kubernetes.io/projected/959ece64-6ede-48e4-8137-2b8cfce20471-kube-api-access-mnn8p\") on node \"crc\" DevicePath \"\"" Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.488170 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46be95d5-5770-4daa-bba9-cf040d10d55a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.488186 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/959ece64-6ede-48e4-8137-2b8cfce20471-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.844022 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-d138-account-create-update-bwck7" event={"ID":"959ece64-6ede-48e4-8137-2b8cfce20471","Type":"ContainerDied","Data":"14495d374324bce73a9f787a649bb1b0eafb728d24dedf2355fc88e2f014cc56"} Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.844103 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14495d374324bce73a9f787a649bb1b0eafb728d24dedf2355fc88e2f014cc56" Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.844046 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-d138-account-create-update-bwck7" Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.846627 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-lkmqd" event={"ID":"46be95d5-5770-4daa-bba9-cf040d10d55a","Type":"ContainerDied","Data":"8d112d7f33c4f7fb69f9ed48596b5556431b5aa5fe92a4a040d8613167e0182d"} Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.846671 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d112d7f33c4f7fb69f9ed48596b5556431b5aa5fe92a4a040d8613167e0182d" Mar 10 11:22:17 crc kubenswrapper[4794]: I0310 11:22:17.846742 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-lkmqd" Mar 10 11:22:18 crc kubenswrapper[4794]: I0310 11:22:18.022055 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a267651-30fa-4587-9da7-25c34d836c67" path="/var/lib/kubelet/pods/4a267651-30fa-4587-9da7-25c34d836c67/volumes" Mar 10 11:22:19 crc kubenswrapper[4794]: I0310 11:22:19.962611 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-5f9d47f567-qfb6h"] Mar 10 11:22:19 crc kubenswrapper[4794]: E0310 11:22:19.963404 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46be95d5-5770-4daa-bba9-cf040d10d55a" containerName="mariadb-database-create" Mar 10 11:22:19 crc kubenswrapper[4794]: I0310 11:22:19.963418 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="46be95d5-5770-4daa-bba9-cf040d10d55a" containerName="mariadb-database-create" Mar 10 11:22:19 crc kubenswrapper[4794]: E0310 11:22:19.963438 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959ece64-6ede-48e4-8137-2b8cfce20471" containerName="mariadb-account-create-update" Mar 10 11:22:19 crc kubenswrapper[4794]: I0310 11:22:19.963444 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="959ece64-6ede-48e4-8137-2b8cfce20471" containerName="mariadb-account-create-update" Mar 10 11:22:19 crc kubenswrapper[4794]: I0310 11:22:19.963602 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="959ece64-6ede-48e4-8137-2b8cfce20471" containerName="mariadb-account-create-update" Mar 10 11:22:19 crc kubenswrapper[4794]: I0310 11:22:19.963629 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="46be95d5-5770-4daa-bba9-cf040d10d55a" containerName="mariadb-database-create" Mar 10 11:22:19 crc kubenswrapper[4794]: I0310 11:22:19.965141 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:19 crc kubenswrapper[4794]: I0310 11:22:19.970111 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-sdmcs" Mar 10 11:22:19 crc kubenswrapper[4794]: I0310 11:22:19.970210 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Mar 10 11:22:19 crc kubenswrapper[4794]: I0310 11:22:19.970638 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Mar 10 11:22:19 crc kubenswrapper[4794]: I0310 11:22:19.996105 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-5f9d47f567-qfb6h"] Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.068088 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d282049-4c70-4e8b-945e-3375dd7a0e95-config-data\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.068155 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/6d282049-4c70-4e8b-945e-3375dd7a0e95-octavia-run\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.068193 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6d282049-4c70-4e8b-945e-3375dd7a0e95-config-data-merged\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.068315 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d282049-4c70-4e8b-945e-3375dd7a0e95-combined-ca-bundle\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.068365 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d282049-4c70-4e8b-945e-3375dd7a0e95-scripts\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.170315 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d282049-4c70-4e8b-945e-3375dd7a0e95-combined-ca-bundle\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.170370 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d282049-4c70-4e8b-945e-3375dd7a0e95-scripts\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc 
kubenswrapper[4794]: I0310 11:22:20.170389 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d282049-4c70-4e8b-945e-3375dd7a0e95-config-data\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.170430 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/6d282049-4c70-4e8b-945e-3375dd7a0e95-octavia-run\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.170457 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6d282049-4c70-4e8b-945e-3375dd7a0e95-config-data-merged\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.170889 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6d282049-4c70-4e8b-945e-3375dd7a0e95-config-data-merged\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.173063 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/6d282049-4c70-4e8b-945e-3375dd7a0e95-octavia-run\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.177682 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d282049-4c70-4e8b-945e-3375dd7a0e95-config-data\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.177726 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d282049-4c70-4e8b-945e-3375dd7a0e95-scripts\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.178212 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d282049-4c70-4e8b-945e-3375dd7a0e95-combined-ca-bundle\") pod \"octavia-api-5f9d47f567-qfb6h\" (UID: \"6d282049-4c70-4e8b-945e-3375dd7a0e95\") " pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.292362 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.864681 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-5f9d47f567-qfb6h"] Mar 10 11:22:20 crc kubenswrapper[4794]: I0310 11:22:20.892409 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5f9d47f567-qfb6h" event={"ID":"6d282049-4c70-4e8b-945e-3375dd7a0e95","Type":"ContainerStarted","Data":"02f7e2a3ead941354e4a8f9e4eb72ff81bd54394373e8a583a176ff7e0e659e3"} Mar 10 11:22:22 crc kubenswrapper[4794]: I0310 11:22:22.968262 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:22:22 crc kubenswrapper[4794]: I0310 11:22:22.968678 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:22:22 crc kubenswrapper[4794]: I0310 11:22:22.968736 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 11:22:22 crc kubenswrapper[4794]: I0310 11:22:22.969625 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14e62759b4835fd8e09988559aeb5396bdb9e62fa6007a87b363f008bdc5ba42"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 11:22:22 crc kubenswrapper[4794]: I0310 11:22:22.969683 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://14e62759b4835fd8e09988559aeb5396bdb9e62fa6007a87b363f008bdc5ba42" gracePeriod=600 Mar 10 11:22:23 crc kubenswrapper[4794]: I0310 11:22:23.924504 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"14e62759b4835fd8e09988559aeb5396bdb9e62fa6007a87b363f008bdc5ba42"} Mar 10 11:22:23 crc kubenswrapper[4794]: I0310 11:22:23.924892 4794 scope.go:117] "RemoveContainer" containerID="993a5336699cfaa4c1135438dd34154cd549d1e3528993849fd4002d26837157" Mar 10 11:22:23 crc kubenswrapper[4794]: I0310 11:22:23.925504 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="14e62759b4835fd8e09988559aeb5396bdb9e62fa6007a87b363f008bdc5ba42" exitCode=0 Mar 10 11:22:23 crc kubenswrapper[4794]: I0310 11:22:23.925969 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77"} Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.189528 4794 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/ovn-controller-dnwr6" podUID="e395ac11-306a-4cb7-8868-ac0c2108d63b" containerName="ovn-controller" probeResult="failure" output=< Mar 10 11:22:28 crc kubenswrapper[4794]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 10 11:22:28 crc kubenswrapper[4794]: > Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.365874 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.367184 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-545pd" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.503645 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dnwr6-config-9mcjh"] Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.504878 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.508100 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.517119 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnwr6-config-9mcjh"] Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.630670 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/211475bd-28bf-4ed2-bc66-7ff9b073e730-additional-scripts\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.630908 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-run-ovn\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.630947 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-log-ovn\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.631011 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt4sf\" (UniqueName: \"kubernetes.io/projected/211475bd-28bf-4ed2-bc66-7ff9b073e730-kube-api-access-pt4sf\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.631092 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-run\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.631135 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/211475bd-28bf-4ed2-bc66-7ff9b073e730-scripts\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.733111 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/211475bd-28bf-4ed2-bc66-7ff9b073e730-additional-scripts\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.733522 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-run-ovn\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.733542 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-log-ovn\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.733564 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt4sf\" (UniqueName: \"kubernetes.io/projected/211475bd-28bf-4ed2-bc66-7ff9b073e730-kube-api-access-pt4sf\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.733614 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-run\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.733653 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/211475bd-28bf-4ed2-bc66-7ff9b073e730-scripts\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.734856 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-log-ovn\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.735127 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-run-ovn\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.735135 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-run\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.735570 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/211475bd-28bf-4ed2-bc66-7ff9b073e730-scripts\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.738586 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/211475bd-28bf-4ed2-bc66-7ff9b073e730-additional-scripts\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.752696 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt4sf\" (UniqueName: \"kubernetes.io/projected/211475bd-28bf-4ed2-bc66-7ff9b073e730-kube-api-access-pt4sf\") pod \"ovn-controller-dnwr6-config-9mcjh\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:28 crc kubenswrapper[4794]: I0310 11:22:28.829318 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:31 crc kubenswrapper[4794]: I0310 11:22:31.325953 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnwr6-config-9mcjh"] Mar 10 11:22:31 crc kubenswrapper[4794]: W0310 11:22:31.334060 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod211475bd_28bf_4ed2_bc66_7ff9b073e730.slice/crio-b35b06fef2cc5d535c1e12952e791bfc4b82b3489219131fae67a0b275311886 WatchSource:0}: Error finding container b35b06fef2cc5d535c1e12952e791bfc4b82b3489219131fae67a0b275311886: Status 404 returned error can't find the container with id b35b06fef2cc5d535c1e12952e791bfc4b82b3489219131fae67a0b275311886 Mar 10 11:22:32 crc kubenswrapper[4794]: I0310 11:22:32.022986 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnwr6-config-9mcjh" event={"ID":"211475bd-28bf-4ed2-bc66-7ff9b073e730","Type":"ContainerStarted","Data":"daf4fcade275be31a6babdb894296aae88ec9f7cc9fd48676243d5184178ec74"} Mar 10 11:22:32 crc kubenswrapper[4794]: I0310 11:22:32.023288 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnwr6-config-9mcjh" event={"ID":"211475bd-28bf-4ed2-bc66-7ff9b073e730","Type":"ContainerStarted","Data":"b35b06fef2cc5d535c1e12952e791bfc4b82b3489219131fae67a0b275311886"} Mar 10 11:22:32 crc kubenswrapper[4794]: I0310 11:22:32.025444 4794 generic.go:334] "Generic (PLEG): container finished" podID="6d282049-4c70-4e8b-945e-3375dd7a0e95" containerID="bd98c3409e354c8613c7ae6f73fa8a748cb0fc838ed9790f5967f2db2e9a4e97" exitCode=0 Mar 10 11:22:32 crc kubenswrapper[4794]: I0310 11:22:32.025484 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5f9d47f567-qfb6h" 
event={"ID":"6d282049-4c70-4e8b-945e-3375dd7a0e95","Type":"ContainerDied","Data":"bd98c3409e354c8613c7ae6f73fa8a748cb0fc838ed9790f5967f2db2e9a4e97"} Mar 10 11:22:32 crc kubenswrapper[4794]: I0310 11:22:32.075182 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dnwr6-config-9mcjh" podStartSLOduration=4.075160376 podStartE2EDuration="4.075160376s" podCreationTimestamp="2026-03-10 11:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:22:32.045682759 +0000 UTC m=+5900.801853577" watchObservedRunningTime="2026-03-10 11:22:32.075160376 +0000 UTC m=+5900.831331194" Mar 10 11:22:33 crc kubenswrapper[4794]: I0310 11:22:33.037446 4794 generic.go:334] "Generic (PLEG): container finished" podID="211475bd-28bf-4ed2-bc66-7ff9b073e730" containerID="daf4fcade275be31a6babdb894296aae88ec9f7cc9fd48676243d5184178ec74" exitCode=0 Mar 10 11:22:33 crc kubenswrapper[4794]: I0310 11:22:33.037898 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnwr6-config-9mcjh" event={"ID":"211475bd-28bf-4ed2-bc66-7ff9b073e730","Type":"ContainerDied","Data":"daf4fcade275be31a6babdb894296aae88ec9f7cc9fd48676243d5184178ec74"} Mar 10 11:22:33 crc kubenswrapper[4794]: I0310 11:22:33.041501 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5f9d47f567-qfb6h" event={"ID":"6d282049-4c70-4e8b-945e-3375dd7a0e95","Type":"ContainerStarted","Data":"a8ef514ed4382467e4505296d4d95d88847f76294a1603dde47d798014b1c120"} Mar 10 11:22:33 crc kubenswrapper[4794]: I0310 11:22:33.041580 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5f9d47f567-qfb6h" event={"ID":"6d282049-4c70-4e8b-945e-3375dd7a0e95","Type":"ContainerStarted","Data":"f1063fac653ce341ac8df0e32df9c50d4062fc6d560f7325a8b8a31cada40d7d"} Mar 10 11:22:33 crc kubenswrapper[4794]: I0310 11:22:33.042560 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:33 crc kubenswrapper[4794]: I0310 11:22:33.042697 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-5f9d47f567-qfb6h" Mar 10 11:22:33 crc kubenswrapper[4794]: I0310 11:22:33.098280 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-5f9d47f567-qfb6h" podStartSLOduration=4.126215063 podStartE2EDuration="14.09825888s" podCreationTimestamp="2026-03-10 11:22:19 +0000 UTC" firstStartedPulling="2026-03-10 11:22:20.867830107 +0000 UTC m=+5889.624000925" lastFinishedPulling="2026-03-10 11:22:30.839873934 +0000 UTC m=+5899.596044742" observedRunningTime="2026-03-10 11:22:33.086448282 +0000 UTC m=+5901.842619100" watchObservedRunningTime="2026-03-10 11:22:33.09825888 +0000 UTC m=+5901.854429718" Mar 10 11:22:33 crc kubenswrapper[4794]: I0310 11:22:33.190022 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-dnwr6" Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.493715 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.580464 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/211475bd-28bf-4ed2-bc66-7ff9b073e730-additional-scripts\") pod \"211475bd-28bf-4ed2-bc66-7ff9b073e730\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.580601 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt4sf\" (UniqueName: \"kubernetes.io/projected/211475bd-28bf-4ed2-bc66-7ff9b073e730-kube-api-access-pt4sf\") pod \"211475bd-28bf-4ed2-bc66-7ff9b073e730\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.580648 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/211475bd-28bf-4ed2-bc66-7ff9b073e730-scripts\") pod \"211475bd-28bf-4ed2-bc66-7ff9b073e730\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.580697 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-log-ovn\") pod \"211475bd-28bf-4ed2-bc66-7ff9b073e730\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.580892 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-run\") pod \"211475bd-28bf-4ed2-bc66-7ff9b073e730\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.580898 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "211475bd-28bf-4ed2-bc66-7ff9b073e730" (UID: "211475bd-28bf-4ed2-bc66-7ff9b073e730"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.580985 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-run" (OuterVolumeSpecName: "var-run") pod "211475bd-28bf-4ed2-bc66-7ff9b073e730" (UID: "211475bd-28bf-4ed2-bc66-7ff9b073e730"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.580979 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-run-ovn\") pod \"211475bd-28bf-4ed2-bc66-7ff9b073e730\" (UID: \"211475bd-28bf-4ed2-bc66-7ff9b073e730\") " Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.581113 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "211475bd-28bf-4ed2-bc66-7ff9b073e730" (UID: "211475bd-28bf-4ed2-bc66-7ff9b073e730"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.581793 4794 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.581828 4794 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.581865 4794 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/211475bd-28bf-4ed2-bc66-7ff9b073e730-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.581945 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/211475bd-28bf-4ed2-bc66-7ff9b073e730-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "211475bd-28bf-4ed2-bc66-7ff9b073e730" (UID: "211475bd-28bf-4ed2-bc66-7ff9b073e730"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.582439 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/211475bd-28bf-4ed2-bc66-7ff9b073e730-scripts" (OuterVolumeSpecName: "scripts") pod "211475bd-28bf-4ed2-bc66-7ff9b073e730" (UID: "211475bd-28bf-4ed2-bc66-7ff9b073e730"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.589823 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/211475bd-28bf-4ed2-bc66-7ff9b073e730-kube-api-access-pt4sf" (OuterVolumeSpecName: "kube-api-access-pt4sf") pod "211475bd-28bf-4ed2-bc66-7ff9b073e730" (UID: "211475bd-28bf-4ed2-bc66-7ff9b073e730"). InnerVolumeSpecName "kube-api-access-pt4sf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.683671 4794 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/211475bd-28bf-4ed2-bc66-7ff9b073e730-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.683698 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt4sf\" (UniqueName: \"kubernetes.io/projected/211475bd-28bf-4ed2-bc66-7ff9b073e730-kube-api-access-pt4sf\") on node \"crc\" DevicePath \"\"" Mar 10 11:22:34 crc kubenswrapper[4794]: I0310 11:22:34.683712 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/211475bd-28bf-4ed2-bc66-7ff9b073e730-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.064927 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnwr6-config-9mcjh" event={"ID":"211475bd-28bf-4ed2-bc66-7ff9b073e730","Type":"ContainerDied","Data":"b35b06fef2cc5d535c1e12952e791bfc4b82b3489219131fae67a0b275311886"} Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.064987 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b35b06fef2cc5d535c1e12952e791bfc4b82b3489219131fae67a0b275311886" Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.065007 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnwr6-config-9mcjh" Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.592749 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dnwr6-config-9mcjh"] Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.603325 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dnwr6-config-9mcjh"] Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.760323 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dnwr6-config-f9z9r"] Mar 10 11:22:35 crc kubenswrapper[4794]: E0310 11:22:35.760849 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211475bd-28bf-4ed2-bc66-7ff9b073e730" containerName="ovn-config" Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.760884 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="211475bd-28bf-4ed2-bc66-7ff9b073e730" containerName="ovn-config" Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.761139 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="211475bd-28bf-4ed2-bc66-7ff9b073e730" containerName="ovn-config" Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.761923 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.764842 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.796398 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnwr6-config-f9z9r"]
Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.917326 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-log-ovn\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.917427 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-run\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.917454 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-scripts\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.917884 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vj4z\" (UniqueName: \"kubernetes.io/projected/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-kube-api-access-7vj4z\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.917939 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-additional-scripts\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:35 crc kubenswrapper[4794]: I0310 11:22:35.918006 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-run-ovn\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.007621 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="211475bd-28bf-4ed2-bc66-7ff9b073e730" path="/var/lib/kubelet/pods/211475bd-28bf-4ed2-bc66-7ff9b073e730/volumes"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.019997 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vj4z\" (UniqueName: \"kubernetes.io/projected/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-kube-api-access-7vj4z\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.020041 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-additional-scripts\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.020070 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-run-ovn\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.020103 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-log-ovn\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.020169 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-run\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.020186 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-scripts\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.020418 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-run-ovn\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.020496 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-log-ovn\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.020529 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-run\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.020877 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-additional-scripts\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.021994 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-scripts\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.038670 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vj4z\" (UniqueName: \"kubernetes.io/projected/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-kube-api-access-7vj4z\") pod \"ovn-controller-dnwr6-config-f9z9r\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") " pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.091262 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:36 crc kubenswrapper[4794]: I0310 11:22:36.607030 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnwr6-config-f9z9r"]
Mar 10 11:22:36 crc kubenswrapper[4794]: W0310 11:22:36.613224 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5669ea2e_fe93_4551_a6a4_4fdc9f8f3c60.slice/crio-459684c51bd10c744d3f457406a6f6dc5a9e3788cc9d190ff6d33561ae19cf7e WatchSource:0}: Error finding container 459684c51bd10c744d3f457406a6f6dc5a9e3788cc9d190ff6d33561ae19cf7e: Status 404 returned error can't find the container with id 459684c51bd10c744d3f457406a6f6dc5a9e3788cc9d190ff6d33561ae19cf7e
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.089405 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnwr6-config-f9z9r" event={"ID":"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60","Type":"ContainerStarted","Data":"e2b9fd2ac1d777b4a984fc95f971f66404bd76ee25222bb503db3d9cf0d13182"}
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.089686 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnwr6-config-f9z9r" event={"ID":"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60","Type":"ContainerStarted","Data":"459684c51bd10c744d3f457406a6f6dc5a9e3788cc9d190ff6d33561ae19cf7e"}
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.112084 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dnwr6-config-f9z9r" podStartSLOduration=2.112068341 podStartE2EDuration="2.112068341s" podCreationTimestamp="2026-03-10 11:22:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:22:37.111267575 +0000 UTC m=+5905.867438403" watchObservedRunningTime="2026-03-10 11:22:37.112068341 +0000 UTC m=+5905.868239159"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.619370 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-nkn6m"]
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.621543 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.623961 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.624087 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.623981 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.637519 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-nkn6m"]
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.758410 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/15527b8a-9b20-4baa-b371-cb2395457d00-config-data-merged\") pod \"octavia-rsyslog-nkn6m\" (UID: \"15527b8a-9b20-4baa-b371-cb2395457d00\") " pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.758478 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15527b8a-9b20-4baa-b371-cb2395457d00-scripts\") pod \"octavia-rsyslog-nkn6m\" (UID: \"15527b8a-9b20-4baa-b371-cb2395457d00\") " pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.758503 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/15527b8a-9b20-4baa-b371-cb2395457d00-hm-ports\") pod \"octavia-rsyslog-nkn6m\" (UID: \"15527b8a-9b20-4baa-b371-cb2395457d00\") " pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.758566 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15527b8a-9b20-4baa-b371-cb2395457d00-config-data\") pod \"octavia-rsyslog-nkn6m\" (UID: \"15527b8a-9b20-4baa-b371-cb2395457d00\") " pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.860417 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/15527b8a-9b20-4baa-b371-cb2395457d00-config-data-merged\") pod \"octavia-rsyslog-nkn6m\" (UID: \"15527b8a-9b20-4baa-b371-cb2395457d00\") " pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.860474 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15527b8a-9b20-4baa-b371-cb2395457d00-scripts\") pod \"octavia-rsyslog-nkn6m\" (UID: \"15527b8a-9b20-4baa-b371-cb2395457d00\") " pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.860496 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/15527b8a-9b20-4baa-b371-cb2395457d00-hm-ports\") pod \"octavia-rsyslog-nkn6m\" (UID: \"15527b8a-9b20-4baa-b371-cb2395457d00\") " pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.860550 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15527b8a-9b20-4baa-b371-cb2395457d00-config-data\") pod \"octavia-rsyslog-nkn6m\" (UID: \"15527b8a-9b20-4baa-b371-cb2395457d00\") " pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.861290 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/15527b8a-9b20-4baa-b371-cb2395457d00-config-data-merged\") pod \"octavia-rsyslog-nkn6m\" (UID: \"15527b8a-9b20-4baa-b371-cb2395457d00\") " pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.862241 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/15527b8a-9b20-4baa-b371-cb2395457d00-hm-ports\") pod \"octavia-rsyslog-nkn6m\" (UID: \"15527b8a-9b20-4baa-b371-cb2395457d00\") " pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.866572 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15527b8a-9b20-4baa-b371-cb2395457d00-scripts\") pod \"octavia-rsyslog-nkn6m\" (UID: \"15527b8a-9b20-4baa-b371-cb2395457d00\") " pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.870108 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15527b8a-9b20-4baa-b371-cb2395457d00-config-data\") pod \"octavia-rsyslog-nkn6m\" (UID: \"15527b8a-9b20-4baa-b371-cb2395457d00\") " pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:37 crc kubenswrapper[4794]: I0310 11:22:37.957733 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.114196 4794 generic.go:334] "Generic (PLEG): container finished" podID="5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60" containerID="e2b9fd2ac1d777b4a984fc95f971f66404bd76ee25222bb503db3d9cf0d13182" exitCode=0
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.114305 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnwr6-config-f9z9r" event={"ID":"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60","Type":"ContainerDied","Data":"e2b9fd2ac1d777b4a984fc95f971f66404bd76ee25222bb503db3d9cf0d13182"}
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.214695 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-5d45h"]
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.217113 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-5d45h"
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.220637 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data"
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.240664 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-5d45h"]
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.370465 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9950d0b1-6138-4bf8-9e37-a4a759bd44a2-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-5d45h\" (UID: \"9950d0b1-6138-4bf8-9e37-a4a759bd44a2\") " pod="openstack/octavia-image-upload-6f5964dbc9-5d45h"
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.370621 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/9950d0b1-6138-4bf8-9e37-a4a759bd44a2-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-5d45h\" (UID: \"9950d0b1-6138-4bf8-9e37-a4a759bd44a2\") " pod="openstack/octavia-image-upload-6f5964dbc9-5d45h"
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.472580 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9950d0b1-6138-4bf8-9e37-a4a759bd44a2-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-5d45h\" (UID: \"9950d0b1-6138-4bf8-9e37-a4a759bd44a2\") " pod="openstack/octavia-image-upload-6f5964dbc9-5d45h"
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.472693 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/9950d0b1-6138-4bf8-9e37-a4a759bd44a2-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-5d45h\" (UID: \"9950d0b1-6138-4bf8-9e37-a4a759bd44a2\") " pod="openstack/octavia-image-upload-6f5964dbc9-5d45h"
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.473291 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/9950d0b1-6138-4bf8-9e37-a4a759bd44a2-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-5d45h\" (UID: \"9950d0b1-6138-4bf8-9e37-a4a759bd44a2\") " pod="openstack/octavia-image-upload-6f5964dbc9-5d45h"
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.478838 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9950d0b1-6138-4bf8-9e37-a4a759bd44a2-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-5d45h\" (UID: \"9950d0b1-6138-4bf8-9e37-a4a759bd44a2\") " pod="openstack/octavia-image-upload-6f5964dbc9-5d45h"
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.534940 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-5d45h"
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.547385 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-nkn6m"]
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.771702 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-nkn6m"]
Mar 10 11:22:38 crc kubenswrapper[4794]: I0310 11:22:38.803594 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-5d45h"]
Mar 10 11:22:38 crc kubenswrapper[4794]: W0310 11:22:38.807547 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9950d0b1_6138_4bf8_9e37_a4a759bd44a2.slice/crio-bcbc54024c543e963accc023889e80c131dec1e4fe3ec6b5cad9e467e3a2e90d WatchSource:0}: Error finding container bcbc54024c543e963accc023889e80c131dec1e4fe3ec6b5cad9e467e3a2e90d: Status 404 returned error can't find the container with id bcbc54024c543e963accc023889e80c131dec1e4fe3ec6b5cad9e467e3a2e90d
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.128863 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-nkn6m" event={"ID":"15527b8a-9b20-4baa-b371-cb2395457d00","Type":"ContainerStarted","Data":"e823aaa9b68489b08b8c19614b70e40de0576d2e6038101859812f57e78cbd0a"}
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.131913 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-5d45h" event={"ID":"9950d0b1-6138-4bf8-9e37-a4a759bd44a2","Type":"ContainerStarted","Data":"bcbc54024c543e963accc023889e80c131dec1e4fe3ec6b5cad9e467e3a2e90d"}
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.519303 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.674048 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dnwr6-config-f9z9r"]
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.681588 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dnwr6-config-f9z9r"]
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.696268 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-run-ovn\") pod \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") "
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.696469 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60" (UID: "5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.696510 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vj4z\" (UniqueName: \"kubernetes.io/projected/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-kube-api-access-7vj4z\") pod \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") "
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.696657 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-additional-scripts\") pod \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") "
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.696808 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-scripts\") pod \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") "
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.696962 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-log-ovn\") pod \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") "
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.697057 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-run\") pod \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\" (UID: \"5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60\") "
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.697191 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60" (UID: "5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.697452 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60" (UID: "5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.697566 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-run" (OuterVolumeSpecName: "var-run") pod "5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60" (UID: "5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.697942 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-scripts" (OuterVolumeSpecName: "scripts") pod "5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60" (UID: "5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.698007 4794 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.698258 4794 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-run\") on node \"crc\" DevicePath \"\""
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.698272 4794 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.698286 4794 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.701366 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-kube-api-access-7vj4z" (OuterVolumeSpecName: "kube-api-access-7vj4z") pod "5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60" (UID: "5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60"). InnerVolumeSpecName "kube-api-access-7vj4z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.800232 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vj4z\" (UniqueName: \"kubernetes.io/projected/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-kube-api-access-7vj4z\") on node \"crc\" DevicePath \"\""
Mar 10 11:22:39 crc kubenswrapper[4794]: I0310 11:22:39.800276 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 11:22:40 crc kubenswrapper[4794]: I0310 11:22:40.015905 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60" path="/var/lib/kubelet/pods/5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60/volumes"
Mar 10 11:22:40 crc kubenswrapper[4794]: I0310 11:22:40.145863 4794 scope.go:117] "RemoveContainer" containerID="e2b9fd2ac1d777b4a984fc95f971f66404bd76ee25222bb503db3d9cf0d13182"
Mar 10 11:22:40 crc kubenswrapper[4794]: I0310 11:22:40.145911 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnwr6-config-f9z9r"
Mar 10 11:22:42 crc kubenswrapper[4794]: I0310 11:22:42.846111 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-7st8t"]
Mar 10 11:22:42 crc kubenswrapper[4794]: E0310 11:22:42.847104 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60" containerName="ovn-config"
Mar 10 11:22:42 crc kubenswrapper[4794]: I0310 11:22:42.847122 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60" containerName="ovn-config"
Mar 10 11:22:42 crc kubenswrapper[4794]: I0310 11:22:42.847372 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5669ea2e-fe93-4551-a6a4-4fdc9f8f3c60" containerName="ovn-config"
Mar 10 11:22:42 crc kubenswrapper[4794]: I0310 11:22:42.848652 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:22:42 crc kubenswrapper[4794]: I0310 11:22:42.851719 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts"
Mar 10 11:22:42 crc kubenswrapper[4794]: I0310 11:22:42.877411 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-7st8t"]
Mar 10 11:22:42 crc kubenswrapper[4794]: I0310 11:22:42.961015 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-scripts\") pod \"octavia-db-sync-7st8t\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") " pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:22:42 crc kubenswrapper[4794]: I0310 11:22:42.961060 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-config-data\") pod \"octavia-db-sync-7st8t\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") " pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:22:42 crc kubenswrapper[4794]: I0310 11:22:42.961106 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/22779500-ca5f-49be-8578-d0f7b28196fa-config-data-merged\") pod \"octavia-db-sync-7st8t\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") " pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:22:42 crc kubenswrapper[4794]: I0310 11:22:42.961182 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-combined-ca-bundle\") pod \"octavia-db-sync-7st8t\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") " pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:22:43 crc kubenswrapper[4794]: I0310 11:22:43.063235 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-config-data\") pod \"octavia-db-sync-7st8t\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") " pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:22:43 crc kubenswrapper[4794]: I0310 11:22:43.063320 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/22779500-ca5f-49be-8578-d0f7b28196fa-config-data-merged\") pod \"octavia-db-sync-7st8t\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") " pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:22:43 crc kubenswrapper[4794]: I0310 11:22:43.063392 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-combined-ca-bundle\") pod \"octavia-db-sync-7st8t\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") " pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:22:43 crc kubenswrapper[4794]: I0310 11:22:43.063554 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-scripts\") pod \"octavia-db-sync-7st8t\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") " pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:22:43 crc kubenswrapper[4794]: I0310 11:22:43.064124 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/22779500-ca5f-49be-8578-d0f7b28196fa-config-data-merged\") pod \"octavia-db-sync-7st8t\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") " pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:22:43 crc kubenswrapper[4794]: I0310 11:22:43.072057 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-scripts\") pod \"octavia-db-sync-7st8t\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") " pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:22:43 crc kubenswrapper[4794]: I0310 11:22:43.072207 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-combined-ca-bundle\") pod \"octavia-db-sync-7st8t\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") " pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:22:43 crc kubenswrapper[4794]: I0310 11:22:43.072834 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-config-data\") pod \"octavia-db-sync-7st8t\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") " pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:22:43 crc kubenswrapper[4794]: I0310 11:22:43.171216 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:22:43 crc kubenswrapper[4794]: I0310 11:22:43.570703 4794 scope.go:117] "RemoveContainer" containerID="cac451694e0fe7d0944aa0de56cf184c2c9bd2b792ea57e8d7bee677ac996613"
Mar 10 11:22:44 crc kubenswrapper[4794]: I0310 11:22:44.288191 4794 scope.go:117] "RemoveContainer" containerID="f11c8070e6ccf8a777e1b80ded382feeccedf6744a66a5a374be673e2f8ec2b8"
Mar 10 11:22:44 crc kubenswrapper[4794]: I0310 11:22:44.421378 4794 scope.go:117] "RemoveContainer" containerID="e6435e00dc9ed00c63a47426d0fe37e7439e843bd0c8aa87fdef1360a6df02b0"
Mar 10 11:22:44 crc kubenswrapper[4794]: I0310 11:22:44.740453 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-7st8t"]
Mar 10 11:22:46 crc kubenswrapper[4794]: I0310 11:22:46.214022 4794 scope.go:117] "RemoveContainer" containerID="a6d7da3bc9b8ea52a168771183dfea67df181fe896297055a7347ee15be0d992"
Mar 10 11:22:46 crc kubenswrapper[4794]: W0310 11:22:46.223615 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22779500_ca5f_49be_8578_d0f7b28196fa.slice/crio-fcb734e4766d057149f4f219d4399266ecab8fd8bb0f22b1dc83dd9473a1df4c WatchSource:0}: Error finding container fcb734e4766d057149f4f219d4399266ecab8fd8bb0f22b1dc83dd9473a1df4c: Status 404 returned error can't find the container with id fcb734e4766d057149f4f219d4399266ecab8fd8bb0f22b1dc83dd9473a1df4c
Mar 10 11:22:46 crc kubenswrapper[4794]: I0310 11:22:46.242028 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-7st8t" event={"ID":"22779500-ca5f-49be-8578-d0f7b28196fa","Type":"ContainerStarted","Data":"fcb734e4766d057149f4f219d4399266ecab8fd8bb0f22b1dc83dd9473a1df4c"}
Mar 10 11:22:46 crc kubenswrapper[4794]: I0310 11:22:46.345701 4794 scope.go:117] "RemoveContainer" containerID="5a17284487e5576d7d0597281abbadd9180a4c08785164d5d45f5bbc798aef41"
Mar 10 11:22:47 crc kubenswrapper[4794]: I0310 11:22:47.096956 4794 scope.go:117] "RemoveContainer" containerID="df5b7c3de89a7de22db06361467e37303d68f8464626231d377eab6f107e907e"
Mar 10 11:22:47 crc kubenswrapper[4794]: I0310 11:22:47.118083 4794 scope.go:117] "RemoveContainer" containerID="668ade1cad142e4840f5d744f2555b3126d4fdabf0172120dd69776c0f34849c"
Mar 10 11:22:48 crc kubenswrapper[4794]: I0310 11:22:48.264499 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-nkn6m" event={"ID":"15527b8a-9b20-4baa-b371-cb2395457d00","Type":"ContainerStarted","Data":"de7bfb824aa52dfee63602bdd66e8b9e70f430ee881a3f5b001a67a87971dac3"}
Mar 10 11:22:48 crc kubenswrapper[4794]: I0310 11:22:48.268655 4794 generic.go:334] "Generic (PLEG): container finished" podID="22779500-ca5f-49be-8578-d0f7b28196fa" containerID="515b2db4ef251605bbd6913ce03071a1c79cd8d744a50a654f697f6e6b0a490c" exitCode=0
Mar 10 11:22:48 crc kubenswrapper[4794]: I0310 11:22:48.268957 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-7st8t" event={"ID":"22779500-ca5f-49be-8578-d0f7b28196fa","Type":"ContainerDied","Data":"515b2db4ef251605bbd6913ce03071a1c79cd8d744a50a654f697f6e6b0a490c"}
Mar 10 11:22:49 crc kubenswrapper[4794]: I0310 11:22:49.280643 4794 generic.go:334] "Generic (PLEG): container finished" podID="15527b8a-9b20-4baa-b371-cb2395457d00" containerID="de7bfb824aa52dfee63602bdd66e8b9e70f430ee881a3f5b001a67a87971dac3" exitCode=0
Mar 10 11:22:49 crc kubenswrapper[4794]: I0310 11:22:49.280789 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-nkn6m" event={"ID":"15527b8a-9b20-4baa-b371-cb2395457d00","Type":"ContainerDied","Data":"de7bfb824aa52dfee63602bdd66e8b9e70f430ee881a3f5b001a67a87971dac3"}
Mar 10 11:22:53 crc kubenswrapper[4794]: I0310 11:22:53.318086 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-7st8t" event={"ID":"22779500-ca5f-49be-8578-d0f7b28196fa","Type":"ContainerStarted","Data":"f09f0c0d99a49400d7bd8787bf667338d715ae276d00619d2ebaeb027a81eb9e"}
Mar 10 11:22:53 crc kubenswrapper[4794]: I0310 11:22:53.322699 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-5d45h" event={"ID":"9950d0b1-6138-4bf8-9e37-a4a759bd44a2","Type":"ContainerStarted","Data":"ebb30217f7112e32171323d2589d9bad732c40b575d4e6a015ba66c99b0a4ed6"}
Mar 10 11:22:53 crc kubenswrapper[4794]: I0310 11:22:53.331186 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-nkn6m" event={"ID":"15527b8a-9b20-4baa-b371-cb2395457d00","Type":"ContainerStarted","Data":"6415a640cc838c675498491e0924a04ac1923f2bf0fb85645ca42913e4c24888"}
Mar 10 11:22:53 crc kubenswrapper[4794]: I0310 11:22:53.331311 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:22:53 crc kubenswrapper[4794]: I0310 11:22:53.342677 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-7st8t" podStartSLOduration=11.342658141 podStartE2EDuration="11.342658141s" podCreationTimestamp="2026-03-10 11:22:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:22:53.339721389 +0000 UTC m=+5922.095892217" watchObservedRunningTime="2026-03-10 11:22:53.342658141 +0000 UTC m=+5922.098828959"
Mar 10 11:22:53 crc kubenswrapper[4794]: I0310 11:22:53.366005 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-nkn6m" podStartSLOduration=2.068352126 podStartE2EDuration="16.365987676s" podCreationTimestamp="2026-03-10 11:22:37 +0000 UTC" firstStartedPulling="2026-03-10 11:22:38.549027856 +0000 UTC m=+5907.305198674" lastFinishedPulling="2026-03-10 11:22:52.846663416 +0000 UTC m=+5921.602834224" observedRunningTime="2026-03-10 11:22:53.359002469 +0000 UTC m=+5922.115173287" watchObservedRunningTime="2026-03-10 11:22:53.365987676 +0000 UTC m=+5922.122158494"
Mar 10 11:22:54 crc kubenswrapper[4794]: I0310 11:22:54.354607 4794 generic.go:334] "Generic (PLEG): container finished" podID="9950d0b1-6138-4bf8-9e37-a4a759bd44a2" containerID="ebb30217f7112e32171323d2589d9bad732c40b575d4e6a015ba66c99b0a4ed6" exitCode=0
Mar 10 11:22:54 crc kubenswrapper[4794]: I0310 11:22:54.354677 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-5d45h" event={"ID":"9950d0b1-6138-4bf8-9e37-a4a759bd44a2","Type":"ContainerDied","Data":"ebb30217f7112e32171323d2589d9bad732c40b575d4e6a015ba66c99b0a4ed6"}
Mar 10 11:22:54 crc kubenswrapper[4794]: I0310 11:22:54.425797 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-5f9d47f567-qfb6h"
Mar 10 11:22:54 crc kubenswrapper[4794]: I0310 11:22:54.439960 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-5f9d47f567-qfb6h"
Mar 10 11:22:55 crc kubenswrapper[4794]: I0310 11:22:55.369862 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-5d45h" event={"ID":"9950d0b1-6138-4bf8-9e37-a4a759bd44a2","Type":"ContainerStarted","Data":"d61482aa712e3556154fe25fd8d58af7f8ca1f3a5824cd022b99b9082c7b2f42"}
Mar 10 11:22:55 crc kubenswrapper[4794]: I0310 11:22:55.391313 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-6f5964dbc9-5d45h" podStartSLOduration=3.410431656 podStartE2EDuration="17.391290649s" podCreationTimestamp="2026-03-10 11:22:38 +0000 UTC" firstStartedPulling="2026-03-10 11:22:38.809825011 +0000 UTC m=+5907.565995829" lastFinishedPulling="2026-03-10 11:22:52.790683964 +0000 UTC m=+5921.546854822" observedRunningTime="2026-03-10 11:22:55.380261656 +0000 UTC m=+5924.136432474" watchObservedRunningTime="2026-03-10 11:22:55.391290649 +0000 UTC m=+5924.147461487"
Mar 10 11:22:58 crc kubenswrapper[4794]: I0310 11:22:58.399521 4794 generic.go:334] "Generic (PLEG): container finished" podID="22779500-ca5f-49be-8578-d0f7b28196fa" containerID="f09f0c0d99a49400d7bd8787bf667338d715ae276d00619d2ebaeb027a81eb9e" exitCode=0
Mar 10 11:22:58 crc kubenswrapper[4794]: I0310 11:22:58.399733 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-7st8t" event={"ID":"22779500-ca5f-49be-8578-d0f7b28196fa","Type":"ContainerDied","Data":"f09f0c0d99a49400d7bd8787bf667338d715ae276d00619d2ebaeb027a81eb9e"}
Mar 10 11:22:59 crc kubenswrapper[4794]: I0310 11:22:59.936844 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.042752 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-combined-ca-bundle\") pod \"22779500-ca5f-49be-8578-d0f7b28196fa\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") "
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.042832 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-config-data\") pod \"22779500-ca5f-49be-8578-d0f7b28196fa\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") "
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.042882 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/22779500-ca5f-49be-8578-d0f7b28196fa-config-data-merged\") pod \"22779500-ca5f-49be-8578-d0f7b28196fa\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") "
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.043047 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-scripts\") pod \"22779500-ca5f-49be-8578-d0f7b28196fa\" (UID: \"22779500-ca5f-49be-8578-d0f7b28196fa\") "
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.048317 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-config-data" (OuterVolumeSpecName: "config-data") pod "22779500-ca5f-49be-8578-d0f7b28196fa" (UID: "22779500-ca5f-49be-8578-d0f7b28196fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.048939 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-scripts" (OuterVolumeSpecName: "scripts") pod "22779500-ca5f-49be-8578-d0f7b28196fa" (UID: "22779500-ca5f-49be-8578-d0f7b28196fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.067528 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22779500-ca5f-49be-8578-d0f7b28196fa" (UID: "22779500-ca5f-49be-8578-d0f7b28196fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.068043 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22779500-ca5f-49be-8578-d0f7b28196fa-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "22779500-ca5f-49be-8578-d0f7b28196fa" (UID: "22779500-ca5f-49be-8578-d0f7b28196fa"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.144915 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.144946 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/22779500-ca5f-49be-8578-d0f7b28196fa-config-data-merged\") on node \"crc\" DevicePath \"\""
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.144955 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.144962 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22779500-ca5f-49be-8578-d0f7b28196fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.425949 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-7st8t" event={"ID":"22779500-ca5f-49be-8578-d0f7b28196fa","Type":"ContainerDied","Data":"fcb734e4766d057149f4f219d4399266ecab8fd8bb0f22b1dc83dd9473a1df4c"}
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.426008 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcb734e4766d057149f4f219d4399266ecab8fd8bb0f22b1dc83dd9473a1df4c"
Mar 10 11:23:00 crc kubenswrapper[4794]: I0310 11:23:00.426023 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-7st8t"
Mar 10 11:23:08 crc kubenswrapper[4794]: I0310 11:23:08.012642 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-nkn6m"
Mar 10 11:23:22 crc kubenswrapper[4794]: I0310 11:23:22.850293 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-5d45h"]
Mar 10 11:23:22 crc kubenswrapper[4794]: I0310 11:23:22.852627 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-6f5964dbc9-5d45h" podUID="9950d0b1-6138-4bf8-9e37-a4a759bd44a2" containerName="octavia-amphora-httpd" containerID="cri-o://d61482aa712e3556154fe25fd8d58af7f8ca1f3a5824cd022b99b9082c7b2f42" gracePeriod=30
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.432612 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-5d45h"
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.628268 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9950d0b1-6138-4bf8-9e37-a4a759bd44a2-httpd-config\") pod \"9950d0b1-6138-4bf8-9e37-a4a759bd44a2\" (UID: \"9950d0b1-6138-4bf8-9e37-a4a759bd44a2\") "
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.628481 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/9950d0b1-6138-4bf8-9e37-a4a759bd44a2-amphora-image\") pod \"9950d0b1-6138-4bf8-9e37-a4a759bd44a2\" (UID: \"9950d0b1-6138-4bf8-9e37-a4a759bd44a2\") "
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.667478 4794 generic.go:334] "Generic (PLEG): container finished" podID="9950d0b1-6138-4bf8-9e37-a4a759bd44a2" containerID="d61482aa712e3556154fe25fd8d58af7f8ca1f3a5824cd022b99b9082c7b2f42" exitCode=0
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.667514 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-5d45h" event={"ID":"9950d0b1-6138-4bf8-9e37-a4a759bd44a2","Type":"ContainerDied","Data":"d61482aa712e3556154fe25fd8d58af7f8ca1f3a5824cd022b99b9082c7b2f42"}
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.667550 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-5d45h" event={"ID":"9950d0b1-6138-4bf8-9e37-a4a759bd44a2","Type":"ContainerDied","Data":"bcbc54024c543e963accc023889e80c131dec1e4fe3ec6b5cad9e467e3a2e90d"}
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.667568 4794 scope.go:117] "RemoveContainer" containerID="d61482aa712e3556154fe25fd8d58af7f8ca1f3a5824cd022b99b9082c7b2f42"
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.667678 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-5d45h"
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.709682 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9950d0b1-6138-4bf8-9e37-a4a759bd44a2-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9950d0b1-6138-4bf8-9e37-a4a759bd44a2" (UID: "9950d0b1-6138-4bf8-9e37-a4a759bd44a2"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.731762 4794 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9950d0b1-6138-4bf8-9e37-a4a759bd44a2-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.752395 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9950d0b1-6138-4bf8-9e37-a4a759bd44a2-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "9950d0b1-6138-4bf8-9e37-a4a759bd44a2" (UID: "9950d0b1-6138-4bf8-9e37-a4a759bd44a2"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.766025 4794 scope.go:117] "RemoveContainer" containerID="ebb30217f7112e32171323d2589d9bad732c40b575d4e6a015ba66c99b0a4ed6"
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.808969 4794 scope.go:117] "RemoveContainer" containerID="d61482aa712e3556154fe25fd8d58af7f8ca1f3a5824cd022b99b9082c7b2f42"
Mar 10 11:23:23 crc kubenswrapper[4794]: E0310 11:23:23.809473 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d61482aa712e3556154fe25fd8d58af7f8ca1f3a5824cd022b99b9082c7b2f42\": container with ID starting with d61482aa712e3556154fe25fd8d58af7f8ca1f3a5824cd022b99b9082c7b2f42 not found: ID does not exist" containerID="d61482aa712e3556154fe25fd8d58af7f8ca1f3a5824cd022b99b9082c7b2f42"
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.809609 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d61482aa712e3556154fe25fd8d58af7f8ca1f3a5824cd022b99b9082c7b2f42"} err="failed to get container status \"d61482aa712e3556154fe25fd8d58af7f8ca1f3a5824cd022b99b9082c7b2f42\": rpc error: code = NotFound desc = could not find container \"d61482aa712e3556154fe25fd8d58af7f8ca1f3a5824cd022b99b9082c7b2f42\": container with ID starting with d61482aa712e3556154fe25fd8d58af7f8ca1f3a5824cd022b99b9082c7b2f42 not found: ID does not exist"
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.809705 4794 scope.go:117] "RemoveContainer" containerID="ebb30217f7112e32171323d2589d9bad732c40b575d4e6a015ba66c99b0a4ed6"
Mar 10 11:23:23 crc kubenswrapper[4794]: E0310 11:23:23.810073 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebb30217f7112e32171323d2589d9bad732c40b575d4e6a015ba66c99b0a4ed6\": container with ID starting with ebb30217f7112e32171323d2589d9bad732c40b575d4e6a015ba66c99b0a4ed6 not found: ID does not exist" containerID="ebb30217f7112e32171323d2589d9bad732c40b575d4e6a015ba66c99b0a4ed6"
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.810132 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb30217f7112e32171323d2589d9bad732c40b575d4e6a015ba66c99b0a4ed6"} err="failed to get container status \"ebb30217f7112e32171323d2589d9bad732c40b575d4e6a015ba66c99b0a4ed6\": rpc error: code = NotFound desc = could not find container \"ebb30217f7112e32171323d2589d9bad732c40b575d4e6a015ba66c99b0a4ed6\": container with ID starting with ebb30217f7112e32171323d2589d9bad732c40b575d4e6a015ba66c99b0a4ed6 not found: ID does not exist"
Mar 10 11:23:23 crc kubenswrapper[4794]: I0310 11:23:23.835018 4794 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/9950d0b1-6138-4bf8-9e37-a4a759bd44a2-amphora-image\") on node \"crc\" DevicePath \"\""
Mar 10 11:23:24 crc kubenswrapper[4794]: I0310 11:23:24.033712 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-5d45h"]
Mar 10 11:23:24 crc kubenswrapper[4794]: I0310 11:23:24.035387 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-5d45h"]
Mar 10 11:23:26 crc kubenswrapper[4794]: I0310 11:23:26.015745 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9950d0b1-6138-4bf8-9e37-a4a759bd44a2" path="/var/lib/kubelet/pods/9950d0b1-6138-4bf8-9e37-a4a759bd44a2/volumes"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.520324 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-g8rgx"]
Mar 10 11:23:42 crc kubenswrapper[4794]: E0310 11:23:42.521387 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9950d0b1-6138-4bf8-9e37-a4a759bd44a2" containerName="init"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.521410 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9950d0b1-6138-4bf8-9e37-a4a759bd44a2" containerName="init"
Mar 10 11:23:42 crc kubenswrapper[4794]: E0310 11:23:42.521434 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22779500-ca5f-49be-8578-d0f7b28196fa" containerName="init"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.521443 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="22779500-ca5f-49be-8578-d0f7b28196fa" containerName="init"
Mar 10 11:23:42 crc kubenswrapper[4794]: E0310 11:23:42.521458 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22779500-ca5f-49be-8578-d0f7b28196fa" containerName="octavia-db-sync"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.521467 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="22779500-ca5f-49be-8578-d0f7b28196fa" containerName="octavia-db-sync"
Mar 10 11:23:42 crc kubenswrapper[4794]: E0310 11:23:42.521499 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9950d0b1-6138-4bf8-9e37-a4a759bd44a2" containerName="octavia-amphora-httpd"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.521508 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9950d0b1-6138-4bf8-9e37-a4a759bd44a2" containerName="octavia-amphora-httpd"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.521735 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9950d0b1-6138-4bf8-9e37-a4a759bd44a2" containerName="octavia-amphora-httpd"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.521751 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="22779500-ca5f-49be-8578-d0f7b28196fa" containerName="octavia-db-sync"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.523040 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.526236 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.526638 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.528881 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.542964 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-g8rgx"]
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.768444 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0e5b10af-c321-47e8-9fa2-d143dbe38634-config-data-merged\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.768773 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0e5b10af-c321-47e8-9fa2-d143dbe38634-hm-ports\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.768913 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5b10af-c321-47e8-9fa2-d143dbe38634-combined-ca-bundle\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.769011 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5b10af-c321-47e8-9fa2-d143dbe38634-config-data\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.769150 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e5b10af-c321-47e8-9fa2-d143dbe38634-scripts\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.769232 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/0e5b10af-c321-47e8-9fa2-d143dbe38634-amphora-certs\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.871582 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0e5b10af-c321-47e8-9fa2-d143dbe38634-config-data-merged\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.871687 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0e5b10af-c321-47e8-9fa2-d143dbe38634-hm-ports\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.871827 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5b10af-c321-47e8-9fa2-d143dbe38634-combined-ca-bundle\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.871899 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5b10af-c321-47e8-9fa2-d143dbe38634-config-data\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.872088 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e5b10af-c321-47e8-9fa2-d143dbe38634-scripts\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.872173 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/0e5b10af-c321-47e8-9fa2-d143dbe38634-amphora-certs\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.872968 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0e5b10af-c321-47e8-9fa2-d143dbe38634-config-data-merged\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.874418 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0e5b10af-c321-47e8-9fa2-d143dbe38634-hm-ports\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.880763 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5b10af-c321-47e8-9fa2-d143dbe38634-combined-ca-bundle\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.881381 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/0e5b10af-c321-47e8-9fa2-d143dbe38634-amphora-certs\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.881517 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e5b10af-c321-47e8-9fa2-d143dbe38634-scripts\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.891924 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5b10af-c321-47e8-9fa2-d143dbe38634-config-data\") pod \"octavia-healthmanager-g8rgx\" (UID: \"0e5b10af-c321-47e8-9fa2-d143dbe38634\") " pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:42 crc kubenswrapper[4794]: I0310 11:23:42.980690 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-g8rgx"
Mar 10 11:23:43 crc kubenswrapper[4794]: I0310 11:23:43.536059 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-g8rgx"]
Mar 10 11:23:43 crc kubenswrapper[4794]: I0310 11:23:43.921881 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-g8rgx" event={"ID":"0e5b10af-c321-47e8-9fa2-d143dbe38634","Type":"ContainerStarted","Data":"730acd8490c14746a6e69d296812c95b4d41ef0aa3a68bf65aad588d7d91b113"}
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.050832 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-jn45g"]
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.052320 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-jn45g"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.056231 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.060509 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.066488 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-jn45g"]
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.187407 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-scripts\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.187502 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-hm-ports\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.187555 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-amphora-certs\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.187591 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-combined-ca-bundle\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.187635 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-config-data-merged\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.187712 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-config-data\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.289484 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-scripts\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.289560 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-hm-ports\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.289589 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-amphora-certs\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.289616 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-combined-ca-bundle\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.289981 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-config-data-merged\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.290096 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-config-data\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g"
Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.290567 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-config-data-merged\")
pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g" Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.290864 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-hm-ports\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g" Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.295689 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-combined-ca-bundle\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g" Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.309851 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-scripts\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g" Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.310051 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-amphora-certs\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g" Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.312135 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec60c79b-10a2-4b22-988f-4bc082f6b5ec-config-data\") pod \"octavia-housekeeping-jn45g\" (UID: \"ec60c79b-10a2-4b22-988f-4bc082f6b5ec\") " pod="openstack/octavia-housekeeping-jn45g" Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.385051 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-jn45g" Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.937391 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-g8rgx" event={"ID":"0e5b10af-c321-47e8-9fa2-d143dbe38634","Type":"ContainerStarted","Data":"b3ee2cf170e2d6f4011c686e133c8bd28c51530ad603e4e5dde39ff72aa7cecc"} Mar 10 11:23:44 crc kubenswrapper[4794]: I0310 11:23:44.947958 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-jn45g"] Mar 10 11:23:44 crc kubenswrapper[4794]: W0310 11:23:44.952447 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec60c79b_10a2_4b22_988f_4bc082f6b5ec.slice/crio-87c91cf3c48908bbce36fb2ee2e44c89fb1b0c6ca37e4d85404955928dba2dab WatchSource:0}: Error finding container 87c91cf3c48908bbce36fb2ee2e44c89fb1b0c6ca37e4d85404955928dba2dab: Status 404 returned error can't find the container with id 87c91cf3c48908bbce36fb2ee2e44c89fb1b0c6ca37e4d85404955928dba2dab Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.213327 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-dcpwj"] Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.215447 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.219159 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.219306 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.226044 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-dcpwj"] Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.308150 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/25562046-f83e-475d-bcf8-22c782c6595e-config-data-merged\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.308400 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25562046-f83e-475d-bcf8-22c782c6595e-scripts\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.308638 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/25562046-f83e-475d-bcf8-22c782c6595e-amphora-certs\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.308756 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25562046-f83e-475d-bcf8-22c782c6595e-combined-ca-bundle\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.308853 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/25562046-f83e-475d-bcf8-22c782c6595e-hm-ports\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.308912 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25562046-f83e-475d-bcf8-22c782c6595e-config-data\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.410556 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/25562046-f83e-475d-bcf8-22c782c6595e-hm-ports\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.410858 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25562046-f83e-475d-bcf8-22c782c6595e-config-data\") pod \"octavia-worker-dcpwj\" (UID: 
\"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.411023 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/25562046-f83e-475d-bcf8-22c782c6595e-config-data-merged\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.411206 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25562046-f83e-475d-bcf8-22c782c6595e-scripts\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.411425 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/25562046-f83e-475d-bcf8-22c782c6595e-amphora-certs\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.411617 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25562046-f83e-475d-bcf8-22c782c6595e-combined-ca-bundle\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.412626 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/25562046-f83e-475d-bcf8-22c782c6595e-config-data-merged\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.413258 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/25562046-f83e-475d-bcf8-22c782c6595e-hm-ports\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.417963 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25562046-f83e-475d-bcf8-22c782c6595e-config-data\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.418118 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/25562046-f83e-475d-bcf8-22c782c6595e-amphora-certs\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.425832 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25562046-f83e-475d-bcf8-22c782c6595e-combined-ca-bundle\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.429752 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/25562046-f83e-475d-bcf8-22c782c6595e-scripts\") pod \"octavia-worker-dcpwj\" (UID: \"25562046-f83e-475d-bcf8-22c782c6595e\") " pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.543429 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.962118 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-jn45g" event={"ID":"ec60c79b-10a2-4b22-988f-4bc082f6b5ec","Type":"ContainerStarted","Data":"87c91cf3c48908bbce36fb2ee2e44c89fb1b0c6ca37e4d85404955928dba2dab"} Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.964740 4794 generic.go:334] "Generic (PLEG): container finished" podID="0e5b10af-c321-47e8-9fa2-d143dbe38634" containerID="b3ee2cf170e2d6f4011c686e133c8bd28c51530ad603e4e5dde39ff72aa7cecc" exitCode=0 Mar 10 11:23:45 crc kubenswrapper[4794]: I0310 11:23:45.964785 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-g8rgx" event={"ID":"0e5b10af-c321-47e8-9fa2-d143dbe38634","Type":"ContainerDied","Data":"b3ee2cf170e2d6f4011c686e133c8bd28c51530ad603e4e5dde39ff72aa7cecc"} Mar 10 11:23:46 crc kubenswrapper[4794]: I0310 11:23:46.160622 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-dcpwj"] Mar 10 11:23:46 crc kubenswrapper[4794]: W0310 11:23:46.401737 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25562046_f83e_475d_bcf8_22c782c6595e.slice/crio-f863f89407a202447dc6ecf70bdde692498fca0995bf3fa1a925d5a27f8d24a4 WatchSource:0}: Error finding container f863f89407a202447dc6ecf70bdde692498fca0995bf3fa1a925d5a27f8d24a4: Status 404 returned error can't find the container with id f863f89407a202447dc6ecf70bdde692498fca0995bf3fa1a925d5a27f8d24a4 Mar 10 11:23:46 crc kubenswrapper[4794]: I0310 11:23:46.679045 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-g8rgx"] Mar 10 11:23:46 crc kubenswrapper[4794]: I0310 11:23:46.976694 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-dcpwj" event={"ID":"25562046-f83e-475d-bcf8-22c782c6595e","Type":"ContainerStarted","Data":"f863f89407a202447dc6ecf70bdde692498fca0995bf3fa1a925d5a27f8d24a4"} Mar 10 11:23:46 crc kubenswrapper[4794]: I0310 11:23:46.985407 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-g8rgx" event={"ID":"0e5b10af-c321-47e8-9fa2-d143dbe38634","Type":"ContainerStarted","Data":"b0e3aca7cc05e246d3497d1b0fdb84d8bc1a210833ff681933c305c4782f559c"} Mar 10 11:23:46 crc kubenswrapper[4794]: I0310 11:23:46.985585 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-g8rgx" Mar 10 11:23:47 crc kubenswrapper[4794]: I0310 11:23:47.015460 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-g8rgx" podStartSLOduration=5.015441278 podStartE2EDuration="5.015441278s" podCreationTimestamp="2026-03-10 11:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:23:47.008707438 +0000 UTC m=+5975.764878256" watchObservedRunningTime="2026-03-10 11:23:47.015441278 +0000 UTC m=+5975.771612096" Mar 10 11:23:47 crc kubenswrapper[4794]: I0310 11:23:47.996097 4794 generic.go:334] 
"Generic (PLEG): container finished" podID="ec60c79b-10a2-4b22-988f-4bc082f6b5ec" containerID="a6989227bd67723055e155c79f8f323e4c2e432a9473efb533753d3d9ff79a73" exitCode=0 Mar 10 11:23:47 crc kubenswrapper[4794]: I0310 11:23:47.996157 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-jn45g" event={"ID":"ec60c79b-10a2-4b22-988f-4bc082f6b5ec","Type":"ContainerDied","Data":"a6989227bd67723055e155c79f8f323e4c2e432a9473efb533753d3d9ff79a73"} Mar 10 11:23:49 crc kubenswrapper[4794]: I0310 11:23:49.007476 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-jn45g" event={"ID":"ec60c79b-10a2-4b22-988f-4bc082f6b5ec","Type":"ContainerStarted","Data":"8b40119d374adc80381116a686e73f0432ab6ce31436fe2f9880e70feb7f5e8e"} Mar 10 11:23:49 crc kubenswrapper[4794]: I0310 11:23:49.008049 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-jn45g" Mar 10 11:23:49 crc kubenswrapper[4794]: I0310 11:23:49.010740 4794 generic.go:334] "Generic (PLEG): container finished" podID="25562046-f83e-475d-bcf8-22c782c6595e" containerID="db53004447cd963a6064005db18f01332e0916d574d2f20775cf8d97c4afffed" exitCode=0 Mar 10 11:23:49 crc kubenswrapper[4794]: I0310 11:23:49.010780 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-dcpwj" event={"ID":"25562046-f83e-475d-bcf8-22c782c6595e","Type":"ContainerDied","Data":"db53004447cd963a6064005db18f01332e0916d574d2f20775cf8d97c4afffed"} Mar 10 11:23:49 crc kubenswrapper[4794]: I0310 11:23:49.035302 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-jn45g" podStartSLOduration=3.527364507 podStartE2EDuration="5.03528434s" podCreationTimestamp="2026-03-10 11:23:44 +0000 UTC" firstStartedPulling="2026-03-10 11:23:44.955447995 +0000 UTC m=+5973.711618813" lastFinishedPulling="2026-03-10 11:23:46.463367828 +0000 UTC m=+5975.219538646" observedRunningTime="2026-03-10 11:23:49.029972335 +0000 UTC m=+5977.786143153" watchObservedRunningTime="2026-03-10 11:23:49.03528434 +0000 UTC m=+5977.791455158" Mar 10 11:23:50 crc kubenswrapper[4794]: I0310 11:23:50.036039 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-dcpwj" event={"ID":"25562046-f83e-475d-bcf8-22c782c6595e","Type":"ContainerStarted","Data":"98a1b8a7e318584fcfaed77cf1004c87f290eaeb150092de03efb266d43ffefb"} Mar 10 11:23:50 crc kubenswrapper[4794]: I0310 11:23:50.036524 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-dcpwj" Mar 10 11:23:50 crc kubenswrapper[4794]: I0310 11:23:50.062386 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-dcpwj" podStartSLOduration=3.835482934 podStartE2EDuration="5.062371181s" podCreationTimestamp="2026-03-10 11:23:45 +0000 UTC" firstStartedPulling="2026-03-10 11:23:46.405717665 +0000 UTC m=+5975.161888483" lastFinishedPulling="2026-03-10 11:23:47.632605902 +0000 UTC m=+5976.388776730" observedRunningTime="2026-03-10 11:23:50.053882217 +0000 UTC m=+5978.810053035" watchObservedRunningTime="2026-03-10 11:23:50.062371181 +0000 UTC m=+5978.818541999" Mar 10 11:23:58 crc kubenswrapper[4794]: I0310 11:23:58.014031 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-g8rgx" Mar 10 11:23:59 crc kubenswrapper[4794]: I0310 11:23:59.435208 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/octavia-housekeeping-jn45g" Mar 10 11:24:00 crc kubenswrapper[4794]: I0310 11:24:00.148986 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552364-4nfmf"] Mar 10 11:24:00 crc kubenswrapper[4794]: I0310 11:24:00.151755 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552364-4nfmf" Mar 10 11:24:00 crc kubenswrapper[4794]: I0310 11:24:00.178095 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:24:00 crc kubenswrapper[4794]: I0310 11:24:00.178270 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:24:00 crc kubenswrapper[4794]: I0310 11:24:00.178532 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:24:00 crc kubenswrapper[4794]: I0310 11:24:00.180080 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552364-4nfmf"] Mar 10 11:24:00 crc kubenswrapper[4794]: I0310 11:24:00.276906 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq8gg\" (UniqueName: \"kubernetes.io/projected/d4483d52-5707-451b-bff3-9ba1e6984e82-kube-api-access-cq8gg\") pod \"auto-csr-approver-29552364-4nfmf\" (UID: \"d4483d52-5707-451b-bff3-9ba1e6984e82\") " pod="openshift-infra/auto-csr-approver-29552364-4nfmf" Mar 10 11:24:00 crc kubenswrapper[4794]: I0310 11:24:00.381992 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq8gg\" (UniqueName: \"kubernetes.io/projected/d4483d52-5707-451b-bff3-9ba1e6984e82-kube-api-access-cq8gg\") pod \"auto-csr-approver-29552364-4nfmf\" (UID: \"d4483d52-5707-451b-bff3-9ba1e6984e82\") " pod="openshift-infra/auto-csr-approver-29552364-4nfmf" Mar 10 11:24:00 crc kubenswrapper[4794]: I0310 11:24:00.408671 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq8gg\" (UniqueName: \"kubernetes.io/projected/d4483d52-5707-451b-bff3-9ba1e6984e82-kube-api-access-cq8gg\") pod \"auto-csr-approver-29552364-4nfmf\" (UID: \"d4483d52-5707-451b-bff3-9ba1e6984e82\") " pod="openshift-infra/auto-csr-approver-29552364-4nfmf" Mar 10 11:24:00 crc kubenswrapper[4794]: I0310 11:24:00.502414 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552364-4nfmf" Mar 10 11:24:00 crc kubenswrapper[4794]: I0310 11:24:00.590244 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-dcpwj" Mar 10 11:24:00 crc kubenswrapper[4794]: I0310 11:24:00.993229 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552364-4nfmf"] Mar 10 11:24:01 crc kubenswrapper[4794]: W0310 11:24:01.003326 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4483d52_5707_451b_bff3_9ba1e6984e82.slice/crio-04006d9a8c4920321b419f11a5dcbbb65cbf614bd72d8ed61e37f9b266705675 WatchSource:0}: Error finding container 04006d9a8c4920321b419f11a5dcbbb65cbf614bd72d8ed61e37f9b266705675: Status 404 returned error can't find the container with id 04006d9a8c4920321b419f11a5dcbbb65cbf614bd72d8ed61e37f9b266705675 Mar 10 11:24:01 crc kubenswrapper[4794]: I0310 11:24:01.197641 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552364-4nfmf" event={"ID":"d4483d52-5707-451b-bff3-9ba1e6984e82","Type":"ContainerStarted","Data":"04006d9a8c4920321b419f11a5dcbbb65cbf614bd72d8ed61e37f9b266705675"} Mar 10 11:24:03 crc kubenswrapper[4794]: I0310 11:24:03.215370 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552364-4nfmf" event={"ID":"d4483d52-5707-451b-bff3-9ba1e6984e82","Type":"ContainerStarted","Data":"c6781b4d2bc760d5d7461019b839e5c688a2f9c9be8d1c23c04d943703a000f9"} Mar 10 11:24:03 crc kubenswrapper[4794]: I0310 11:24:03.230861 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552364-4nfmf" podStartSLOduration=1.556389029 podStartE2EDuration="3.230846215s" podCreationTimestamp="2026-03-10 11:24:00 +0000 UTC" firstStartedPulling="2026-03-10 11:24:01.005904219 +0000 UTC m=+5989.762075037" lastFinishedPulling="2026-03-10 11:24:02.680361415 +0000 UTC m=+5991.436532223" observedRunningTime="2026-03-10 11:24:03.227758089 +0000 UTC m=+5991.983928897" watchObservedRunningTime="2026-03-10 11:24:03.230846215 +0000 UTC m=+5991.987017033" Mar 10 11:24:04 crc kubenswrapper[4794]: I0310 11:24:04.232889 4794 generic.go:334] "Generic (PLEG): container finished" podID="d4483d52-5707-451b-bff3-9ba1e6984e82" containerID="c6781b4d2bc760d5d7461019b839e5c688a2f9c9be8d1c23c04d943703a000f9" exitCode=0 Mar 10 11:24:04 crc kubenswrapper[4794]: I0310 11:24:04.232953 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552364-4nfmf" event={"ID":"d4483d52-5707-451b-bff3-9ba1e6984e82","Type":"ContainerDied","Data":"c6781b4d2bc760d5d7461019b839e5c688a2f9c9be8d1c23c04d943703a000f9"} Mar 10 11:24:05 crc kubenswrapper[4794]: I0310 11:24:05.643929 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552364-4nfmf" Mar 10 11:24:05 crc kubenswrapper[4794]: I0310 11:24:05.817738 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq8gg\" (UniqueName: \"kubernetes.io/projected/d4483d52-5707-451b-bff3-9ba1e6984e82-kube-api-access-cq8gg\") pod \"d4483d52-5707-451b-bff3-9ba1e6984e82\" (UID: \"d4483d52-5707-451b-bff3-9ba1e6984e82\") " Mar 10 11:24:05 crc kubenswrapper[4794]: I0310 11:24:05.824754 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4483d52-5707-451b-bff3-9ba1e6984e82-kube-api-access-cq8gg" (OuterVolumeSpecName: "kube-api-access-cq8gg") pod "d4483d52-5707-451b-bff3-9ba1e6984e82" (UID: "d4483d52-5707-451b-bff3-9ba1e6984e82"). InnerVolumeSpecName "kube-api-access-cq8gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:24:05 crc kubenswrapper[4794]: I0310 11:24:05.920422 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq8gg\" (UniqueName: \"kubernetes.io/projected/d4483d52-5707-451b-bff3-9ba1e6984e82-kube-api-access-cq8gg\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:06 crc kubenswrapper[4794]: I0310 11:24:06.255743 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552364-4nfmf" event={"ID":"d4483d52-5707-451b-bff3-9ba1e6984e82","Type":"ContainerDied","Data":"04006d9a8c4920321b419f11a5dcbbb65cbf614bd72d8ed61e37f9b266705675"} Mar 10 11:24:06 crc kubenswrapper[4794]: I0310 11:24:06.255800 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04006d9a8c4920321b419f11a5dcbbb65cbf614bd72d8ed61e37f9b266705675" Mar 10 11:24:06 crc kubenswrapper[4794]: I0310 11:24:06.256321 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552364-4nfmf" Mar 10 11:24:06 crc kubenswrapper[4794]: I0310 11:24:06.304380 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552358-j69qp"] Mar 10 11:24:06 crc kubenswrapper[4794]: I0310 11:24:06.312853 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552358-j69qp"] Mar 10 11:24:08 crc kubenswrapper[4794]: I0310 11:24:08.014928 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aadfef82-5fa2-476a-aa14-65e884ed00b1" path="/var/lib/kubelet/pods/aadfef82-5fa2-476a-aa14-65e884ed00b1/volumes" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.456773 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5dbd77ff65-fht8h"] Mar 10 11:24:13 crc kubenswrapper[4794]: E0310 11:24:13.457664 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4483d52-5707-451b-bff3-9ba1e6984e82" containerName="oc" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.457676 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4483d52-5707-451b-bff3-9ba1e6984e82" containerName="oc" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.458071 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4483d52-5707-451b-bff3-9ba1e6984e82" containerName="oc" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.459266 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.461774 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.461799 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.461997 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.461750 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-ftprz" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.495544 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5dbd77ff65-fht8h"] Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.512236 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.512544 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c9a8037c-9090-4a68-92a2-21b0e4a142b4" containerName="glance-log" containerID="cri-o://c4c053008c62120f41214b5b89fa8a72183957fc43951b46ce7668624899f2f5" gracePeriod=30 Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.513037 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c9a8037c-9090-4a68-92a2-21b0e4a142b4" containerName="glance-httpd" containerID="cri-o://df0b72b90c3e4b239a686b0ecec91563d34fd8961b41e357fbe0cb85abd935f8" gracePeriod=30 Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.595215 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0d5d143f-51bc-4019-88a3-3224640cdea1-horizon-secret-key\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.595514 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d5d143f-51bc-4019-88a3-3224640cdea1-config-data\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.595538 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d5d143f-51bc-4019-88a3-3224640cdea1-scripts\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.595699 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d5d143f-51bc-4019-88a3-3224640cdea1-logs\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.595793 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rvj6\" (UniqueName: 
\"kubernetes.io/projected/0d5d143f-51bc-4019-88a3-3224640cdea1-kube-api-access-7rvj6\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.601530 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6ffb66d69f-4mhg9"] Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.603211 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.614208 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.614451 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="58e94276-b957-49b1-b4db-874a151bfd3a" containerName="glance-log" containerID="cri-o://0311c0be9eec89c14e0736e3a96f5851e5d816408c26015edbbe888b94c0cbcb" gracePeriod=30 Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.614593 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="58e94276-b957-49b1-b4db-874a151bfd3a" containerName="glance-httpd" containerID="cri-o://73001e259d7cad5584fd2258090c25f4dd39102ca089343a21d440080cc36471" gracePeriod=30 Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.632188 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6ffb66d69f-4mhg9"] Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.697633 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b86814f6-b171-481c-9f93-dab25d50b2a9-scripts\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.697707 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d5d143f-51bc-4019-88a3-3224640cdea1-logs\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.697784 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b86814f6-b171-481c-9f93-dab25d50b2a9-config-data\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.697816 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rvj6\" (UniqueName: \"kubernetes.io/projected/0d5d143f-51bc-4019-88a3-3224640cdea1-kube-api-access-7rvj6\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.697872 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0d5d143f-51bc-4019-88a3-3224640cdea1-horizon-secret-key\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc 
kubenswrapper[4794]: I0310 11:24:13.697907 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gbnw\" (UniqueName: \"kubernetes.io/projected/b86814f6-b171-481c-9f93-dab25d50b2a9-kube-api-access-2gbnw\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.697933 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d5d143f-51bc-4019-88a3-3224640cdea1-config-data\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.697951 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d5d143f-51bc-4019-88a3-3224640cdea1-scripts\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.698014 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b86814f6-b171-481c-9f93-dab25d50b2a9-horizon-secret-key\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.698043 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b86814f6-b171-481c-9f93-dab25d50b2a9-logs\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.698097 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d5d143f-51bc-4019-88a3-3224640cdea1-logs\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.698888 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d5d143f-51bc-4019-88a3-3224640cdea1-scripts\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.699091 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d5d143f-51bc-4019-88a3-3224640cdea1-config-data\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.724959 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rvj6\" (UniqueName: \"kubernetes.io/projected/0d5d143f-51bc-4019-88a3-3224640cdea1-kube-api-access-7rvj6\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.726224 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/0d5d143f-51bc-4019-88a3-3224640cdea1-horizon-secret-key\") pod \"horizon-5dbd77ff65-fht8h\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.783860 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.800451 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b86814f6-b171-481c-9f93-dab25d50b2a9-config-data\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.800585 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gbnw\" (UniqueName: \"kubernetes.io/projected/b86814f6-b171-481c-9f93-dab25d50b2a9-kube-api-access-2gbnw\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.800721 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b86814f6-b171-481c-9f93-dab25d50b2a9-horizon-secret-key\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.800767 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b86814f6-b171-481c-9f93-dab25d50b2a9-logs\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.800804 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b86814f6-b171-481c-9f93-dab25d50b2a9-scripts\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.801416 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b86814f6-b171-481c-9f93-dab25d50b2a9-logs\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.801693 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b86814f6-b171-481c-9f93-dab25d50b2a9-scripts\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.802466 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b86814f6-b171-481c-9f93-dab25d50b2a9-config-data\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.807679 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/b86814f6-b171-481c-9f93-dab25d50b2a9-horizon-secret-key\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.820662 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gbnw\" (UniqueName: \"kubernetes.io/projected/b86814f6-b171-481c-9f93-dab25d50b2a9-kube-api-access-2gbnw\") pod \"horizon-6ffb66d69f-4mhg9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:13 crc kubenswrapper[4794]: I0310 11:24:13.917510 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.117429 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5dbd77ff65-fht8h"] Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.154204 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-56cc67dc8f-wf6qg"] Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.156128 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.164787 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56cc67dc8f-wf6qg"] Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.273642 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5dbd77ff65-fht8h"] Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.311539 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-logs\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.311619 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxfx6\" (UniqueName: \"kubernetes.io/projected/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-kube-api-access-jxfx6\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.311693 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-horizon-secret-key\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.311739 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-config-data\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.311787 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-scripts\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " 
pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.343717 4794 generic.go:334] "Generic (PLEG): container finished" podID="58e94276-b957-49b1-b4db-874a151bfd3a" containerID="0311c0be9eec89c14e0736e3a96f5851e5d816408c26015edbbe888b94c0cbcb" exitCode=143 Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.343784 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e94276-b957-49b1-b4db-874a151bfd3a","Type":"ContainerDied","Data":"0311c0be9eec89c14e0736e3a96f5851e5d816408c26015edbbe888b94c0cbcb"} Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.346226 4794 generic.go:334] "Generic (PLEG): container finished" podID="c9a8037c-9090-4a68-92a2-21b0e4a142b4" containerID="c4c053008c62120f41214b5b89fa8a72183957fc43951b46ce7668624899f2f5" exitCode=143 Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.346288 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c9a8037c-9090-4a68-92a2-21b0e4a142b4","Type":"ContainerDied","Data":"c4c053008c62120f41214b5b89fa8a72183957fc43951b46ce7668624899f2f5"} Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.347525 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dbd77ff65-fht8h" event={"ID":"0d5d143f-51bc-4019-88a3-3224640cdea1","Type":"ContainerStarted","Data":"cfb12a1a9ff719c547057bb1dc778e890669796c08d3ec94f192617c892274a1"} Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.413537 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-horizon-secret-key\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.413642 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-config-data\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.413764 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-scripts\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.413934 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-logs\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.414024 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxfx6\" (UniqueName: \"kubernetes.io/projected/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-kube-api-access-jxfx6\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.414612 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-logs\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.414662 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-scripts\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.415117 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-config-data\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.421103 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-horizon-secret-key\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.431964 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxfx6\" (UniqueName: \"kubernetes.io/projected/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-kube-api-access-jxfx6\") pod \"horizon-56cc67dc8f-wf6qg\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.438549 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6ffb66d69f-4mhg9"] Mar 10 11:24:14 crc kubenswrapper[4794]: W0310 11:24:14.449267 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb86814f6_b171_481c_9f93_dab25d50b2a9.slice/crio-9ffa2da217033f775dc0c545ea35a0c145071603674e176a6991a7b00297a1dd WatchSource:0}: Error finding container 9ffa2da217033f775dc0c545ea35a0c145071603674e176a6991a7b00297a1dd: Status 404 returned error can't find the container with id 9ffa2da217033f775dc0c545ea35a0c145071603674e176a6991a7b00297a1dd Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.477537 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:14 crc kubenswrapper[4794]: I0310 11:24:14.946805 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56cc67dc8f-wf6qg"] Mar 10 11:24:14 crc kubenswrapper[4794]: W0310 11:24:14.948758 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bc8c092_11cb_41cf_860b_4912ca0a0eeb.slice/crio-e38e07a56cde82d8d26ebcd4cab2feebb80150e0eda2cdd6167416d36ee40413 WatchSource:0}: Error finding container e38e07a56cde82d8d26ebcd4cab2feebb80150e0eda2cdd6167416d36ee40413: Status 404 returned error can't find the container with id e38e07a56cde82d8d26ebcd4cab2feebb80150e0eda2cdd6167416d36ee40413 Mar 10 11:24:15 crc kubenswrapper[4794]: I0310 11:24:15.356394 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56cc67dc8f-wf6qg" event={"ID":"9bc8c092-11cb-41cf-860b-4912ca0a0eeb","Type":"ContainerStarted","Data":"e38e07a56cde82d8d26ebcd4cab2feebb80150e0eda2cdd6167416d36ee40413"} Mar 10 11:24:15 crc kubenswrapper[4794]: I0310 11:24:15.357235 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6ffb66d69f-4mhg9" event={"ID":"b86814f6-b171-481c-9f93-dab25d50b2a9","Type":"ContainerStarted","Data":"9ffa2da217033f775dc0c545ea35a0c145071603674e176a6991a7b00297a1dd"} Mar 10 11:24:17 crc kubenswrapper[4794]: I0310 11:24:17.379514 4794 generic.go:334] "Generic (PLEG): container finished" podID="58e94276-b957-49b1-b4db-874a151bfd3a" containerID="73001e259d7cad5584fd2258090c25f4dd39102ca089343a21d440080cc36471" exitCode=0 Mar 10 11:24:17 crc kubenswrapper[4794]: I0310 11:24:17.379610 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e94276-b957-49b1-b4db-874a151bfd3a","Type":"ContainerDied","Data":"73001e259d7cad5584fd2258090c25f4dd39102ca089343a21d440080cc36471"} Mar 10 11:24:17 crc kubenswrapper[4794]: I0310 11:24:17.382103 4794 generic.go:334] "Generic (PLEG): container finished" podID="c9a8037c-9090-4a68-92a2-21b0e4a142b4" containerID="df0b72b90c3e4b239a686b0ecec91563d34fd8961b41e357fbe0cb85abd935f8" exitCode=0 Mar 10 11:24:17 crc kubenswrapper[4794]: I0310 11:24:17.382148 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c9a8037c-9090-4a68-92a2-21b0e4a142b4","Type":"ContainerDied","Data":"df0b72b90c3e4b239a686b0ecec91563d34fd8961b41e357fbe0cb85abd935f8"} Mar 10 11:24:20 crc kubenswrapper[4794]: I0310 11:24:20.753315 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="c9a8037c-9090-4a68-92a2-21b0e4a142b4" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.79:9292/healthcheck\": dial tcp 10.217.1.79:9292: connect: connection refused" Mar 10 11:24:20 crc kubenswrapper[4794]: I0310 11:24:20.753512 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="c9a8037c-9090-4a68-92a2-21b0e4a142b4" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.79:9292/healthcheck\": dial tcp 10.217.1.79:9292: connect: connection refused" Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.774502 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.828255 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.951478 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kvcs\" (UniqueName: \"kubernetes.io/projected/58e94276-b957-49b1-b4db-874a151bfd3a-kube-api-access-5kvcs\") pod \"58e94276-b957-49b1-b4db-874a151bfd3a\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.951600 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58e94276-b957-49b1-b4db-874a151bfd3a-logs\") pod \"58e94276-b957-49b1-b4db-874a151bfd3a\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.951684 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-combined-ca-bundle\") pod \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.951735 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-scripts\") pod \"58e94276-b957-49b1-b4db-874a151bfd3a\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.951777 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-scripts\") pod \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.951811 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cbkc\" (UniqueName: \"kubernetes.io/projected/c9a8037c-9090-4a68-92a2-21b0e4a142b4-kube-api-access-8cbkc\") pod \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.951833 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9a8037c-9090-4a68-92a2-21b0e4a142b4-httpd-run\") pod \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.951856 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e94276-b957-49b1-b4db-874a151bfd3a-httpd-run\") pod \"58e94276-b957-49b1-b4db-874a151bfd3a\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.951877 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/58e94276-b957-49b1-b4db-874a151bfd3a-ceph\") pod \"58e94276-b957-49b1-b4db-874a151bfd3a\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.951909 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/c9a8037c-9090-4a68-92a2-21b0e4a142b4-ceph\") pod \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.951931 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a8037c-9090-4a68-92a2-21b0e4a142b4-logs\") pod \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.951959 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-config-data\") pod \"58e94276-b957-49b1-b4db-874a151bfd3a\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.952006 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-config-data\") pod \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\" (UID: \"c9a8037c-9090-4a68-92a2-21b0e4a142b4\") " Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.952025 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-combined-ca-bundle\") pod \"58e94276-b957-49b1-b4db-874a151bfd3a\" (UID: \"58e94276-b957-49b1-b4db-874a151bfd3a\") " Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.953241 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a8037c-9090-4a68-92a2-21b0e4a142b4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c9a8037c-9090-4a68-92a2-21b0e4a142b4" (UID: "c9a8037c-9090-4a68-92a2-21b0e4a142b4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.953397 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a8037c-9090-4a68-92a2-21b0e4a142b4-logs" (OuterVolumeSpecName: "logs") pod "c9a8037c-9090-4a68-92a2-21b0e4a142b4" (UID: "c9a8037c-9090-4a68-92a2-21b0e4a142b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.954120 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58e94276-b957-49b1-b4db-874a151bfd3a-logs" (OuterVolumeSpecName: "logs") pod "58e94276-b957-49b1-b4db-874a151bfd3a" (UID: "58e94276-b957-49b1-b4db-874a151bfd3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.954421 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58e94276-b957-49b1-b4db-874a151bfd3a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "58e94276-b957-49b1-b4db-874a151bfd3a" (UID: "58e94276-b957-49b1-b4db-874a151bfd3a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.981769 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a8037c-9090-4a68-92a2-21b0e4a142b4-ceph" (OuterVolumeSpecName: "ceph") pod "c9a8037c-9090-4a68-92a2-21b0e4a142b4" (UID: "c9a8037c-9090-4a68-92a2-21b0e4a142b4"). 
InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.983099 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a8037c-9090-4a68-92a2-21b0e4a142b4-kube-api-access-8cbkc" (OuterVolumeSpecName: "kube-api-access-8cbkc") pod "c9a8037c-9090-4a68-92a2-21b0e4a142b4" (UID: "c9a8037c-9090-4a68-92a2-21b0e4a142b4"). InnerVolumeSpecName "kube-api-access-8cbkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.984025 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-scripts" (OuterVolumeSpecName: "scripts") pod "58e94276-b957-49b1-b4db-874a151bfd3a" (UID: "58e94276-b957-49b1-b4db-874a151bfd3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.985154 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e94276-b957-49b1-b4db-874a151bfd3a-kube-api-access-5kvcs" (OuterVolumeSpecName: "kube-api-access-5kvcs") pod "58e94276-b957-49b1-b4db-874a151bfd3a" (UID: "58e94276-b957-49b1-b4db-874a151bfd3a"). InnerVolumeSpecName "kube-api-access-5kvcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.988431 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e94276-b957-49b1-b4db-874a151bfd3a-ceph" (OuterVolumeSpecName: "ceph") pod "58e94276-b957-49b1-b4db-874a151bfd3a" (UID: "58e94276-b957-49b1-b4db-874a151bfd3a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:24:21 crc kubenswrapper[4794]: I0310 11:24:21.992001 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-scripts" (OuterVolumeSpecName: "scripts") pod "c9a8037c-9090-4a68-92a2-21b0e4a142b4" (UID: "c9a8037c-9090-4a68-92a2-21b0e4a142b4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.059380 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.059997 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.060009 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cbkc\" (UniqueName: \"kubernetes.io/projected/c9a8037c-9090-4a68-92a2-21b0e4a142b4-kube-api-access-8cbkc\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.060021 4794 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9a8037c-9090-4a68-92a2-21b0e4a142b4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.060031 4794 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e94276-b957-49b1-b4db-874a151bfd3a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.060041 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/58e94276-b957-49b1-b4db-874a151bfd3a-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.060049 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c9a8037c-9090-4a68-92a2-21b0e4a142b4-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.060057 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a8037c-9090-4a68-92a2-21b0e4a142b4-logs\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.060068 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kvcs\" (UniqueName: \"kubernetes.io/projected/58e94276-b957-49b1-b4db-874a151bfd3a-kube-api-access-5kvcs\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.060076 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58e94276-b957-49b1-b4db-874a151bfd3a-logs\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.118685 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9a8037c-9090-4a68-92a2-21b0e4a142b4" (UID: "c9a8037c-9090-4a68-92a2-21b0e4a142b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.126578 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58e94276-b957-49b1-b4db-874a151bfd3a" (UID: "58e94276-b957-49b1-b4db-874a151bfd3a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.132223 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-config-data" (OuterVolumeSpecName: "config-data") pod "c9a8037c-9090-4a68-92a2-21b0e4a142b4" (UID: "c9a8037c-9090-4a68-92a2-21b0e4a142b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.156840 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-config-data" (OuterVolumeSpecName: "config-data") pod "58e94276-b957-49b1-b4db-874a151bfd3a" (UID: "58e94276-b957-49b1-b4db-874a151bfd3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.162306 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.162355 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.162367 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a8037c-9090-4a68-92a2-21b0e4a142b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.162378 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e94276-b957-49b1-b4db-874a151bfd3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.659392 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dbd77ff65-fht8h" event={"ID":"0d5d143f-51bc-4019-88a3-3224640cdea1","Type":"ContainerStarted","Data":"e29ccc92db38c5fd7db0ece3a4243f727b77c693f3bea8cf12a64738d427102d"} Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.659445 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dbd77ff65-fht8h" event={"ID":"0d5d143f-51bc-4019-88a3-3224640cdea1","Type":"ContainerStarted","Data":"2692b49ac6f870602e533d34b3529ffc2a82bf08f8257e6846105ad47ec513d4"} Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.659508 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5dbd77ff65-fht8h" podUID="0d5d143f-51bc-4019-88a3-3224640cdea1" containerName="horizon-log" containerID="cri-o://2692b49ac6f870602e533d34b3529ffc2a82bf08f8257e6846105ad47ec513d4" gracePeriod=30 Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.659601 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5dbd77ff65-fht8h" podUID="0d5d143f-51bc-4019-88a3-3224640cdea1" containerName="horizon" containerID="cri-o://e29ccc92db38c5fd7db0ece3a4243f727b77c693f3bea8cf12a64738d427102d" gracePeriod=30 Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.662925 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.662921 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e94276-b957-49b1-b4db-874a151bfd3a","Type":"ContainerDied","Data":"12f3210cc6d0e47b42ffde6234908028d5d3111657f62698cda68bd331fae702"} Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.663047 4794 scope.go:117] "RemoveContainer" containerID="73001e259d7cad5584fd2258090c25f4dd39102ca089343a21d440080cc36471" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.670242 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56cc67dc8f-wf6qg" event={"ID":"9bc8c092-11cb-41cf-860b-4912ca0a0eeb","Type":"ContainerStarted","Data":"751ddd37c4de044ec7045631d07915f82ff9dc16dcd448acff16c7c4215f84cf"} Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.670318 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56cc67dc8f-wf6qg" event={"ID":"9bc8c092-11cb-41cf-860b-4912ca0a0eeb","Type":"ContainerStarted","Data":"89b01a6bbd20a9c236848a6e06620f29f7cf563663f553373aaacb6b207569ad"} Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.673947 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6ffb66d69f-4mhg9" event={"ID":"b86814f6-b171-481c-9f93-dab25d50b2a9","Type":"ContainerStarted","Data":"93a3d4858ce9325bb129c88aa42962084decd984424790e70676ca58ea506623"} Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.673999 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6ffb66d69f-4mhg9" event={"ID":"b86814f6-b171-481c-9f93-dab25d50b2a9","Type":"ContainerStarted","Data":"b91c8b16f3261f4d1dbce0cd84b40db955de5e6ef96f848d34c4279b7497911e"} Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.678433 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c9a8037c-9090-4a68-92a2-21b0e4a142b4","Type":"ContainerDied","Data":"06d5449fef5964f97f56dfee7a05b70a0811ea7c2f5746f960a1b3295a9103b2"} Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.678534 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.681017 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5dbd77ff65-fht8h" podStartSLOduration=2.574049655 podStartE2EDuration="9.680972079s" podCreationTimestamp="2026-03-10 11:24:13 +0000 UTC" firstStartedPulling="2026-03-10 11:24:14.275579053 +0000 UTC m=+6003.031749871" lastFinishedPulling="2026-03-10 11:24:21.382501477 +0000 UTC m=+6010.138672295" observedRunningTime="2026-03-10 11:24:22.677277355 +0000 UTC m=+6011.433448193" watchObservedRunningTime="2026-03-10 11:24:22.680972079 +0000 UTC m=+6011.437142907" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.706848 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-56cc67dc8f-wf6qg" podStartSLOduration=2.28505137 podStartE2EDuration="8.706827714s" podCreationTimestamp="2026-03-10 11:24:14 +0000 UTC" firstStartedPulling="2026-03-10 11:24:14.954320463 +0000 UTC m=+6003.710491281" lastFinishedPulling="2026-03-10 11:24:21.376096797 +0000 UTC m=+6010.132267625" observedRunningTime="2026-03-10 11:24:22.698306649 +0000 UTC m=+6011.454477477" watchObservedRunningTime="2026-03-10 11:24:22.706827714 +0000 UTC m=+6011.462998532" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.727593 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6ffb66d69f-4mhg9" podStartSLOduration=2.7434300670000002 podStartE2EDuration="9.72757448s" podCreationTimestamp="2026-03-10 11:24:13 +0000 UTC" firstStartedPulling="2026-03-10 11:24:14.453717736 +0000 UTC m=+6003.209888544" lastFinishedPulling="2026-03-10 11:24:21.437862139 +0000 UTC m=+6010.194032957" observedRunningTime="2026-03-10 11:24:22.716051691 +0000 UTC m=+6011.472222499" watchObservedRunningTime="2026-03-10 11:24:22.72757448 +0000 UTC m=+6011.483745298" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.730480 4794 scope.go:117] "RemoveContainer" containerID="0311c0be9eec89c14e0736e3a96f5851e5d816408c26015edbbe888b94c0cbcb" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.747793 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.759437 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.768857 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.777010 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.783547 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.783746 4794 scope.go:117] "RemoveContainer" containerID="df0b72b90c3e4b239a686b0ecec91563d34fd8961b41e357fbe0cb85abd935f8" Mar 10 11:24:22 crc kubenswrapper[4794]: E0310 11:24:22.784393 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a8037c-9090-4a68-92a2-21b0e4a142b4" containerName="glance-log" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.784417 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a8037c-9090-4a68-92a2-21b0e4a142b4" containerName="glance-log" Mar 10 11:24:22 crc kubenswrapper[4794]: E0310 
11:24:22.784451 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e94276-b957-49b1-b4db-874a151bfd3a" containerName="glance-log" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.784459 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e94276-b957-49b1-b4db-874a151bfd3a" containerName="glance-log" Mar 10 11:24:22 crc kubenswrapper[4794]: E0310 11:24:22.784470 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a8037c-9090-4a68-92a2-21b0e4a142b4" containerName="glance-httpd" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.784477 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a8037c-9090-4a68-92a2-21b0e4a142b4" containerName="glance-httpd" Mar 10 11:24:22 crc kubenswrapper[4794]: E0310 11:24:22.784564 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e94276-b957-49b1-b4db-874a151bfd3a" containerName="glance-httpd" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.784572 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e94276-b957-49b1-b4db-874a151bfd3a" containerName="glance-httpd" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.785041 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a8037c-9090-4a68-92a2-21b0e4a142b4" containerName="glance-httpd" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.785064 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="58e94276-b957-49b1-b4db-874a151bfd3a" containerName="glance-httpd" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.785076 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="58e94276-b957-49b1-b4db-874a151bfd3a" containerName="glance-log" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.785090 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a8037c-9090-4a68-92a2-21b0e4a142b4" containerName="glance-log" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.786433 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.790557 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.797867 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.799497 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.814172 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.814272 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fss2f" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.814803 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.814997 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.838661 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.843166 4794 scope.go:117] "RemoveContainer" containerID="c4c053008c62120f41214b5b89fa8a72183957fc43951b46ce7668624899f2f5" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.876656 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.876732 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.876798 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.876835 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6x9t\" (UniqueName: \"kubernetes.io/projected/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-kube-api-access-d6x9t\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.876873 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-logs\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.876920 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-scripts\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 
11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.876948 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2cr6\" (UniqueName: \"kubernetes.io/projected/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-kube-api-access-q2cr6\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.877018 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-logs\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.877043 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-ceph\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.877077 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-ceph\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.877108 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.877130 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.877158 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-config-data\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.877186 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.979971 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " 
pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.980076 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.980112 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6x9t\" (UniqueName: \"kubernetes.io/projected/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-kube-api-access-d6x9t\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.980152 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-logs\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.980210 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-scripts\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.980237 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2cr6\" (UniqueName: \"kubernetes.io/projected/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-kube-api-access-q2cr6\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.980322 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-logs\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.980370 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-ceph\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.980404 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-ceph\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.980441 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.980460 
4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.980494 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-config-data\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.980527 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.980589 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.980610 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.981418 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.981652 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-logs\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.984627 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-logs\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.992699 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-config-data\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.992750 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-ceph\") pod 
\"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:22 crc kubenswrapper[4794]: I0310 11:24:22.995968 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:23 crc kubenswrapper[4794]: I0310 11:24:23.001077 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:23 crc kubenswrapper[4794]: I0310 11:24:23.008889 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-scripts\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:23 crc kubenswrapper[4794]: I0310 11:24:23.012980 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:23 crc kubenswrapper[4794]: I0310 11:24:23.012993 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-ceph\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:23 crc kubenswrapper[4794]: I0310 11:24:23.013380 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2cr6\" (UniqueName: \"kubernetes.io/projected/89c1851f-3bf0-4d2b-b3da-db9f3c42cd51-kube-api-access-q2cr6\") pod \"glance-default-internal-api-0\" (UID: \"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51\") " pod="openstack/glance-default-internal-api-0" Mar 10 11:24:23 crc kubenswrapper[4794]: I0310 11:24:23.013466 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6x9t\" (UniqueName: \"kubernetes.io/projected/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-kube-api-access-d6x9t\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:23 crc kubenswrapper[4794]: I0310 11:24:23.013902 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f\") " pod="openstack/glance-default-external-api-0" Mar 10 11:24:23 crc kubenswrapper[4794]: I0310 11:24:23.134845 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 11:24:23 crc kubenswrapper[4794]: I0310 11:24:23.144824 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 11:24:23 crc kubenswrapper[4794]: I0310 11:24:23.762010 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 11:24:23 crc kubenswrapper[4794]: I0310 11:24:23.784438 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:23 crc kubenswrapper[4794]: I0310 11:24:23.877517 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 11:24:23 crc kubenswrapper[4794]: I0310 11:24:23.927038 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:23 crc kubenswrapper[4794]: I0310 11:24:23.927098 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:24 crc kubenswrapper[4794]: I0310 11:24:24.017495 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58e94276-b957-49b1-b4db-874a151bfd3a" path="/var/lib/kubelet/pods/58e94276-b957-49b1-b4db-874a151bfd3a/volumes" Mar 10 11:24:24 crc kubenswrapper[4794]: I0310 11:24:24.019840 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a8037c-9090-4a68-92a2-21b0e4a142b4" path="/var/lib/kubelet/pods/c9a8037c-9090-4a68-92a2-21b0e4a142b4/volumes" Mar 10 11:24:24 crc kubenswrapper[4794]: I0310 11:24:24.477953 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:24 crc kubenswrapper[4794]: I0310 11:24:24.478106 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:24 crc kubenswrapper[4794]: I0310 11:24:24.726449 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51","Type":"ContainerStarted","Data":"bccd16c4bcd9ff0ef6f0283093e61fcc4d3bd2b03a4afce8ee59dc442a77629b"} Mar 10 11:24:24 crc kubenswrapper[4794]: I0310 11:24:24.726489 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51","Type":"ContainerStarted","Data":"938c3c2477a8ce5e5b7dc228f1ab206a35b56358587e143f8a76a3153ec869f6"} Mar 10 11:24:24 crc kubenswrapper[4794]: I0310 11:24:24.736378 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f","Type":"ContainerStarted","Data":"91b9c870eb986700f4fa0cb314bcb9bd8e8901583c502b6cf236a64f678b2746"} Mar 10 11:24:24 crc kubenswrapper[4794]: I0310 11:24:24.736422 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f","Type":"ContainerStarted","Data":"9bb0201d56f4cbe83dd418723f57fe7b30b1fed146db72bfe1639a7848447b18"} Mar 10 11:24:25 crc kubenswrapper[4794]: I0310 11:24:25.749149 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"89c1851f-3bf0-4d2b-b3da-db9f3c42cd51","Type":"ContainerStarted","Data":"1ee9cd91e676ff1db981db991e448346d8f109486a9c00755d3e358349f431aa"} Mar 10 11:24:25 crc kubenswrapper[4794]: I0310 11:24:25.752884 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f","Type":"ContainerStarted","Data":"add37669b22ee434406b44fd1df3981b116dbc1fa6a938a4f1a7303d7eb9e73d"} Mar 10 11:24:25 crc kubenswrapper[4794]: I0310 11:24:25.774163 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.774145742 podStartE2EDuration="3.774145742s" podCreationTimestamp="2026-03-10 11:24:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:24:25.771989265 +0000 UTC m=+6014.528160093" watchObservedRunningTime="2026-03-10 11:24:25.774145742 +0000 UTC m=+6014.530316560" Mar 10 11:24:25 crc kubenswrapper[4794]: I0310 11:24:25.806719 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.806694485 podStartE2EDuration="3.806694485s" podCreationTimestamp="2026-03-10 11:24:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:24:25.799099048 +0000 UTC m=+6014.555269866" watchObservedRunningTime="2026-03-10 11:24:25.806694485 +0000 UTC m=+6014.562865303" Mar 10 11:24:33 crc kubenswrapper[4794]: I0310 11:24:33.135699 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 11:24:33 crc kubenswrapper[4794]: I0310 11:24:33.136005 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 11:24:33 crc kubenswrapper[4794]: I0310 11:24:33.146388 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 11:24:33 crc kubenswrapper[4794]: I0310 11:24:33.146437 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 11:24:33 crc kubenswrapper[4794]: I0310 11:24:33.177310 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 11:24:33 crc kubenswrapper[4794]: I0310 11:24:33.180361 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 11:24:33 crc kubenswrapper[4794]: I0310 11:24:33.183532 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 11:24:33 crc kubenswrapper[4794]: I0310 11:24:33.212236 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 11:24:33 crc kubenswrapper[4794]: I0310 11:24:33.838403 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 11:24:33 crc kubenswrapper[4794]: I0310 11:24:33.838735 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 11:24:33 crc kubenswrapper[4794]: I0310 11:24:33.838751 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 11:24:33 crc kubenswrapper[4794]: I0310 11:24:33.838761 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 11:24:33 crc kubenswrapper[4794]: I0310 11:24:33.920613 4794 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6ffb66d69f-4mhg9" podUID="b86814f6-b171-481c-9f93-dab25d50b2a9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.150:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.150:8080: connect: connection refused" Mar 10 11:24:34 crc kubenswrapper[4794]: I0310 11:24:34.480614 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-56cc67dc8f-wf6qg" podUID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.151:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.151:8080: connect: connection refused" Mar 10 11:24:35 crc kubenswrapper[4794]: I0310 11:24:35.849410 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 11:24:35 crc kubenswrapper[4794]: I0310 11:24:35.851871 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 11:24:35 crc kubenswrapper[4794]: I0310 11:24:35.851910 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 11:24:35 crc kubenswrapper[4794]: I0310 11:24:35.851877 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 11:24:35 crc kubenswrapper[4794]: I0310 11:24:35.896032 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 11:24:35 crc kubenswrapper[4794]: I0310 11:24:35.907848 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 11:24:36 crc kubenswrapper[4794]: I0310 11:24:36.239296 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 11:24:42 crc kubenswrapper[4794]: I0310 11:24:42.041418 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-e47a-account-create-update-89jdk"] Mar 10 11:24:42 crc kubenswrapper[4794]: I0310 11:24:42.050925 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-sqjmt"] Mar 10 11:24:42 crc kubenswrapper[4794]: I0310 11:24:42.059669 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-sqjmt"] Mar 10 11:24:42 crc kubenswrapper[4794]: I0310 11:24:42.068477 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-e47a-account-create-update-89jdk"] Mar 10 11:24:44 crc kubenswrapper[4794]: I0310 11:24:44.011314 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02aec78e-96f3-4d8a-81cb-412e143471ca" path="/var/lib/kubelet/pods/02aec78e-96f3-4d8a-81cb-412e143471ca/volumes" Mar 10 11:24:44 crc kubenswrapper[4794]: I0310 11:24:44.032728 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="578d197c-2cae-4127-ab66-d2159303508b" path="/var/lib/kubelet/pods/578d197c-2cae-4127-ab66-d2159303508b/volumes" Mar 10 11:24:46 crc kubenswrapper[4794]: I0310 11:24:46.146872 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:46 crc kubenswrapper[4794]: I0310 11:24:46.452027 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:47 crc kubenswrapper[4794]: I0310 11:24:47.593579 4794 scope.go:117] "RemoveContainer" 
containerID="cf8f62ca227888d5977c62623b0915a91525b21e29ecbbe6e6a719fd3611e51c" Mar 10 11:24:47 crc kubenswrapper[4794]: I0310 11:24:47.648302 4794 scope.go:117] "RemoveContainer" containerID="1b76cae5e4145b0868ffa23664644938667774bfc0547d584bee7ac98f723c3b" Mar 10 11:24:47 crc kubenswrapper[4794]: I0310 11:24:47.679960 4794 scope.go:117] "RemoveContainer" containerID="9f0c3f5bb5cac7658eec685d5eee8faa11d60ba5d34ea562347590f6fb17a493" Mar 10 11:24:47 crc kubenswrapper[4794]: I0310 11:24:47.726717 4794 scope.go:117] "RemoveContainer" containerID="46da1e90b20ca9a718374fb93120cea7c6e2d502fb42fcc666f60cf0d81a0be3" Mar 10 11:24:47 crc kubenswrapper[4794]: I0310 11:24:47.872546 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:24:48 crc kubenswrapper[4794]: I0310 11:24:48.041180 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-ptn9j"] Mar 10 11:24:48 crc kubenswrapper[4794]: I0310 11:24:48.052811 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-ptn9j"] Mar 10 11:24:48 crc kubenswrapper[4794]: I0310 11:24:48.181088 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:24:48 crc kubenswrapper[4794]: I0310 11:24:48.235665 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6ffb66d69f-4mhg9"] Mar 10 11:24:48 crc kubenswrapper[4794]: I0310 11:24:48.235860 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6ffb66d69f-4mhg9" podUID="b86814f6-b171-481c-9f93-dab25d50b2a9" containerName="horizon-log" containerID="cri-o://b91c8b16f3261f4d1dbce0cd84b40db955de5e6ef96f848d34c4279b7497911e" gracePeriod=30 Mar 10 11:24:48 crc kubenswrapper[4794]: I0310 11:24:48.236114 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6ffb66d69f-4mhg9" podUID="b86814f6-b171-481c-9f93-dab25d50b2a9" containerName="horizon" containerID="cri-o://93a3d4858ce9325bb129c88aa42962084decd984424790e70676ca58ea506623" gracePeriod=30 Mar 10 11:24:50 crc kubenswrapper[4794]: I0310 11:24:50.008788 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fb3e755-7a30-4965-b6a1-01e1a65e97f5" path="/var/lib/kubelet/pods/6fb3e755-7a30-4965-b6a1-01e1a65e97f5/volumes" Mar 10 11:24:52 crc kubenswrapper[4794]: I0310 11:24:52.023959 4794 generic.go:334] "Generic (PLEG): container finished" podID="b86814f6-b171-481c-9f93-dab25d50b2a9" containerID="93a3d4858ce9325bb129c88aa42962084decd984424790e70676ca58ea506623" exitCode=0 Mar 10 11:24:52 crc kubenswrapper[4794]: I0310 11:24:52.024045 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6ffb66d69f-4mhg9" event={"ID":"b86814f6-b171-481c-9f93-dab25d50b2a9","Type":"ContainerDied","Data":"93a3d4858ce9325bb129c88aa42962084decd984424790e70676ca58ea506623"} Mar 10 11:24:52 crc kubenswrapper[4794]: I0310 11:24:52.967767 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:24:52 crc kubenswrapper[4794]: I0310 11:24:52.968161 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.035941 4794 generic.go:334] "Generic (PLEG): container finished" podID="0d5d143f-51bc-4019-88a3-3224640cdea1" containerID="e29ccc92db38c5fd7db0ece3a4243f727b77c693f3bea8cf12a64738d427102d" exitCode=137 Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.035973 4794 generic.go:334] "Generic (PLEG): container finished" podID="0d5d143f-51bc-4019-88a3-3224640cdea1" containerID="2692b49ac6f870602e533d34b3529ffc2a82bf08f8257e6846105ad47ec513d4" exitCode=137 Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.035993 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dbd77ff65-fht8h" event={"ID":"0d5d143f-51bc-4019-88a3-3224640cdea1","Type":"ContainerDied","Data":"e29ccc92db38c5fd7db0ece3a4243f727b77c693f3bea8cf12a64738d427102d"} Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.036018 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dbd77ff65-fht8h" event={"ID":"0d5d143f-51bc-4019-88a3-3224640cdea1","Type":"ContainerDied","Data":"2692b49ac6f870602e533d34b3529ffc2a82bf08f8257e6846105ad47ec513d4"} Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.185084 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.253140 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d5d143f-51bc-4019-88a3-3224640cdea1-config-data\") pod \"0d5d143f-51bc-4019-88a3-3224640cdea1\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.253585 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d5d143f-51bc-4019-88a3-3224640cdea1-scripts\") pod \"0d5d143f-51bc-4019-88a3-3224640cdea1\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.253843 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d5d143f-51bc-4019-88a3-3224640cdea1-logs\") pod \"0d5d143f-51bc-4019-88a3-3224640cdea1\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.253959 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0d5d143f-51bc-4019-88a3-3224640cdea1-horizon-secret-key\") pod \"0d5d143f-51bc-4019-88a3-3224640cdea1\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.254064 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rvj6\" (UniqueName: \"kubernetes.io/projected/0d5d143f-51bc-4019-88a3-3224640cdea1-kube-api-access-7rvj6\") pod \"0d5d143f-51bc-4019-88a3-3224640cdea1\" (UID: \"0d5d143f-51bc-4019-88a3-3224640cdea1\") " Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.254665 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d5d143f-51bc-4019-88a3-3224640cdea1-logs" (OuterVolumeSpecName: "logs") pod "0d5d143f-51bc-4019-88a3-3224640cdea1" (UID: "0d5d143f-51bc-4019-88a3-3224640cdea1"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.258263 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5d143f-51bc-4019-88a3-3224640cdea1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0d5d143f-51bc-4019-88a3-3224640cdea1" (UID: "0d5d143f-51bc-4019-88a3-3224640cdea1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.258447 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d5d143f-51bc-4019-88a3-3224640cdea1-kube-api-access-7rvj6" (OuterVolumeSpecName: "kube-api-access-7rvj6") pod "0d5d143f-51bc-4019-88a3-3224640cdea1" (UID: "0d5d143f-51bc-4019-88a3-3224640cdea1"). InnerVolumeSpecName "kube-api-access-7rvj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.276778 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5d143f-51bc-4019-88a3-3224640cdea1-scripts" (OuterVolumeSpecName: "scripts") pod "0d5d143f-51bc-4019-88a3-3224640cdea1" (UID: "0d5d143f-51bc-4019-88a3-3224640cdea1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.285222 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5d143f-51bc-4019-88a3-3224640cdea1-config-data" (OuterVolumeSpecName: "config-data") pod "0d5d143f-51bc-4019-88a3-3224640cdea1" (UID: "0d5d143f-51bc-4019-88a3-3224640cdea1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.356611 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d5d143f-51bc-4019-88a3-3224640cdea1-logs\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.356654 4794 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0d5d143f-51bc-4019-88a3-3224640cdea1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.356673 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rvj6\" (UniqueName: \"kubernetes.io/projected/0d5d143f-51bc-4019-88a3-3224640cdea1-kube-api-access-7rvj6\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.356684 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d5d143f-51bc-4019-88a3-3224640cdea1-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.356697 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d5d143f-51bc-4019-88a3-3224640cdea1-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:24:53 crc kubenswrapper[4794]: I0310 11:24:53.919416 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6ffb66d69f-4mhg9" podUID="b86814f6-b171-481c-9f93-dab25d50b2a9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.150:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.150:8080: connect: connection refused" Mar 10 11:24:54 crc kubenswrapper[4794]: I0310 11:24:54.046155 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dbd77ff65-fht8h" event={"ID":"0d5d143f-51bc-4019-88a3-3224640cdea1","Type":"ContainerDied","Data":"cfb12a1a9ff719c547057bb1dc778e890669796c08d3ec94f192617c892274a1"} Mar 10 11:24:54 crc kubenswrapper[4794]: I0310 11:24:54.046200 4794 scope.go:117] "RemoveContainer" containerID="e29ccc92db38c5fd7db0ece3a4243f727b77c693f3bea8cf12a64738d427102d" Mar 10 11:24:54 crc kubenswrapper[4794]: I0310 11:24:54.046315 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5dbd77ff65-fht8h" Mar 10 11:24:54 crc kubenswrapper[4794]: I0310 11:24:54.104285 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5dbd77ff65-fht8h"] Mar 10 11:24:54 crc kubenswrapper[4794]: I0310 11:24:54.114753 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5dbd77ff65-fht8h"] Mar 10 11:24:54 crc kubenswrapper[4794]: I0310 11:24:54.258408 4794 scope.go:117] "RemoveContainer" containerID="2692b49ac6f870602e533d34b3529ffc2a82bf08f8257e6846105ad47ec513d4" Mar 10 11:24:56 crc kubenswrapper[4794]: I0310 11:24:56.015843 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d5d143f-51bc-4019-88a3-3224640cdea1" path="/var/lib/kubelet/pods/0d5d143f-51bc-4019-88a3-3224640cdea1/volumes" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.843905 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77df494f69-g2kl6"] Mar 10 11:25:01 crc kubenswrapper[4794]: E0310 11:25:01.844783 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d5d143f-51bc-4019-88a3-3224640cdea1" containerName="horizon" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.844801 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5d143f-51bc-4019-88a3-3224640cdea1" containerName="horizon" Mar 10 11:25:01 crc kubenswrapper[4794]: E0310 11:25:01.844815 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d5d143f-51bc-4019-88a3-3224640cdea1" containerName="horizon-log" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.844821 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5d143f-51bc-4019-88a3-3224640cdea1" containerName="horizon-log" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.845045 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d5d143f-51bc-4019-88a3-3224640cdea1" containerName="horizon-log" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.845059 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d5d143f-51bc-4019-88a3-3224640cdea1" containerName="horizon" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.846281 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.852217 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/446085d1-b68e-40ef-ac9c-01bc709de2a3-logs\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.852278 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdv55\" (UniqueName: \"kubernetes.io/projected/446085d1-b68e-40ef-ac9c-01bc709de2a3-kube-api-access-zdv55\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.852399 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/446085d1-b68e-40ef-ac9c-01bc709de2a3-config-data\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.852447 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/446085d1-b68e-40ef-ac9c-01bc709de2a3-scripts\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.852484 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/446085d1-b68e-40ef-ac9c-01bc709de2a3-horizon-secret-key\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.871667 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77df494f69-g2kl6"] Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.954593 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/446085d1-b68e-40ef-ac9c-01bc709de2a3-config-data\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.954677 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/446085d1-b68e-40ef-ac9c-01bc709de2a3-scripts\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.954721 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/446085d1-b68e-40ef-ac9c-01bc709de2a3-horizon-secret-key\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.954756 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/446085d1-b68e-40ef-ac9c-01bc709de2a3-logs\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.954797 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdv55\" (UniqueName: \"kubernetes.io/projected/446085d1-b68e-40ef-ac9c-01bc709de2a3-kube-api-access-zdv55\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.955529 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/446085d1-b68e-40ef-ac9c-01bc709de2a3-scripts\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.956140 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/446085d1-b68e-40ef-ac9c-01bc709de2a3-config-data\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.956452 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/446085d1-b68e-40ef-ac9c-01bc709de2a3-logs\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.973797 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/446085d1-b68e-40ef-ac9c-01bc709de2a3-horizon-secret-key\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:01 crc kubenswrapper[4794]: I0310 11:25:01.975648 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdv55\" (UniqueName: \"kubernetes.io/projected/446085d1-b68e-40ef-ac9c-01bc709de2a3-kube-api-access-zdv55\") pod \"horizon-77df494f69-g2kl6\" (UID: \"446085d1-b68e-40ef-ac9c-01bc709de2a3\") " pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:02 crc kubenswrapper[4794]: I0310 11:25:02.164160 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:02 crc kubenswrapper[4794]: I0310 11:25:02.643186 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77df494f69-g2kl6"] Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.135990 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77df494f69-g2kl6" event={"ID":"446085d1-b68e-40ef-ac9c-01bc709de2a3","Type":"ContainerStarted","Data":"ead70f30596fb3c780c69e542996faf75326ff99b09e176d1be70535c78399b4"} Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.136458 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77df494f69-g2kl6" event={"ID":"446085d1-b68e-40ef-ac9c-01bc709de2a3","Type":"ContainerStarted","Data":"71ed493c9893de25f9b62a49f28d3abad26ac4380ab96fdbc1ddfc11d65d22e7"} Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.136471 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77df494f69-g2kl6" event={"ID":"446085d1-b68e-40ef-ac9c-01bc709de2a3","Type":"ContainerStarted","Data":"709728e04484d9776cfc56783067067b6090e9418463c73cea90566507b658a8"} Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.175589 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-77df494f69-g2kl6" podStartSLOduration=2.175567181 podStartE2EDuration="2.175567181s" podCreationTimestamp="2026-03-10 11:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:25:03.153602337 +0000 UTC m=+6051.909773155" watchObservedRunningTime="2026-03-10 11:25:03.175567181 +0000 UTC m=+6051.931737999" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.296525 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-4rqg8"] Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.297695 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-4rqg8" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.313267 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-4rqg8"] Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.390700 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2rwv\" (UniqueName: \"kubernetes.io/projected/b17c00fd-9e0c-4ddc-b961-30d84e94b34b-kube-api-access-k2rwv\") pod \"heat-db-create-4rqg8\" (UID: \"b17c00fd-9e0c-4ddc-b961-30d84e94b34b\") " pod="openstack/heat-db-create-4rqg8" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.390751 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b17c00fd-9e0c-4ddc-b961-30d84e94b34b-operator-scripts\") pod \"heat-db-create-4rqg8\" (UID: \"b17c00fd-9e0c-4ddc-b961-30d84e94b34b\") " pod="openstack/heat-db-create-4rqg8" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.404986 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-3230-account-create-update-8576n"] Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.406325 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-3230-account-create-update-8576n" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.408491 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.417754 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-3230-account-create-update-8576n"] Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.492255 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndjzv\" (UniqueName: \"kubernetes.io/projected/eca7f66a-0319-4b41-a820-706bab3a4898-kube-api-access-ndjzv\") pod \"heat-3230-account-create-update-8576n\" (UID: \"eca7f66a-0319-4b41-a820-706bab3a4898\") " pod="openstack/heat-3230-account-create-update-8576n" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.492619 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2rwv\" (UniqueName: \"kubernetes.io/projected/b17c00fd-9e0c-4ddc-b961-30d84e94b34b-kube-api-access-k2rwv\") pod \"heat-db-create-4rqg8\" (UID: \"b17c00fd-9e0c-4ddc-b961-30d84e94b34b\") " pod="openstack/heat-db-create-4rqg8" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.492828 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b17c00fd-9e0c-4ddc-b961-30d84e94b34b-operator-scripts\") pod \"heat-db-create-4rqg8\" (UID: \"b17c00fd-9e0c-4ddc-b961-30d84e94b34b\") " pod="openstack/heat-db-create-4rqg8" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.492966 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca7f66a-0319-4b41-a820-706bab3a4898-operator-scripts\") pod \"heat-3230-account-create-update-8576n\" (UID: \"eca7f66a-0319-4b41-a820-706bab3a4898\") " pod="openstack/heat-3230-account-create-update-8576n" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.493629 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b17c00fd-9e0c-4ddc-b961-30d84e94b34b-operator-scripts\") pod \"heat-db-create-4rqg8\" (UID: \"b17c00fd-9e0c-4ddc-b961-30d84e94b34b\") " pod="openstack/heat-db-create-4rqg8" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.512790 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2rwv\" (UniqueName: \"kubernetes.io/projected/b17c00fd-9e0c-4ddc-b961-30d84e94b34b-kube-api-access-k2rwv\") pod \"heat-db-create-4rqg8\" (UID: \"b17c00fd-9e0c-4ddc-b961-30d84e94b34b\") " pod="openstack/heat-db-create-4rqg8" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.594143 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndjzv\" (UniqueName: \"kubernetes.io/projected/eca7f66a-0319-4b41-a820-706bab3a4898-kube-api-access-ndjzv\") pod \"heat-3230-account-create-update-8576n\" (UID: \"eca7f66a-0319-4b41-a820-706bab3a4898\") " pod="openstack/heat-3230-account-create-update-8576n" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.594249 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca7f66a-0319-4b41-a820-706bab3a4898-operator-scripts\") pod \"heat-3230-account-create-update-8576n\" (UID: 
\"eca7f66a-0319-4b41-a820-706bab3a4898\") " pod="openstack/heat-3230-account-create-update-8576n" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.595218 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca7f66a-0319-4b41-a820-706bab3a4898-operator-scripts\") pod \"heat-3230-account-create-update-8576n\" (UID: \"eca7f66a-0319-4b41-a820-706bab3a4898\") " pod="openstack/heat-3230-account-create-update-8576n" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.620620 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndjzv\" (UniqueName: \"kubernetes.io/projected/eca7f66a-0319-4b41-a820-706bab3a4898-kube-api-access-ndjzv\") pod \"heat-3230-account-create-update-8576n\" (UID: \"eca7f66a-0319-4b41-a820-706bab3a4898\") " pod="openstack/heat-3230-account-create-update-8576n" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.632437 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-4rqg8" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.724884 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-3230-account-create-update-8576n" Mar 10 11:25:03 crc kubenswrapper[4794]: I0310 11:25:03.918988 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6ffb66d69f-4mhg9" podUID="b86814f6-b171-481c-9f93-dab25d50b2a9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.150:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.150:8080: connect: connection refused" Mar 10 11:25:04 crc kubenswrapper[4794]: I0310 11:25:04.122763 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-4rqg8"] Mar 10 11:25:04 crc kubenswrapper[4794]: W0310 11:25:04.125567 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb17c00fd_9e0c_4ddc_b961_30d84e94b34b.slice/crio-60ce47dd4c7d3ae60076e497b0bddac8d6098b2f47154e60f50f15eaf4445aec WatchSource:0}: Error finding container 60ce47dd4c7d3ae60076e497b0bddac8d6098b2f47154e60f50f15eaf4445aec: Status 404 returned error can't find the container with id 60ce47dd4c7d3ae60076e497b0bddac8d6098b2f47154e60f50f15eaf4445aec Mar 10 11:25:04 crc kubenswrapper[4794]: I0310 11:25:04.149950 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-4rqg8" event={"ID":"b17c00fd-9e0c-4ddc-b961-30d84e94b34b","Type":"ContainerStarted","Data":"60ce47dd4c7d3ae60076e497b0bddac8d6098b2f47154e60f50f15eaf4445aec"} Mar 10 11:25:04 crc kubenswrapper[4794]: I0310 11:25:04.226268 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-3230-account-create-update-8576n"] Mar 10 11:25:04 crc kubenswrapper[4794]: W0310 11:25:04.235964 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeca7f66a_0319_4b41_a820_706bab3a4898.slice/crio-c23dfc4d893734c1c79bd21afc810479353708d497fd698dc21173db1504a0c5 WatchSource:0}: Error finding container c23dfc4d893734c1c79bd21afc810479353708d497fd698dc21173db1504a0c5: Status 404 returned error can't find the container with id c23dfc4d893734c1c79bd21afc810479353708d497fd698dc21173db1504a0c5 Mar 10 11:25:05 crc kubenswrapper[4794]: I0310 11:25:05.162767 4794 generic.go:334] "Generic (PLEG): container finished" 
podID="eca7f66a-0319-4b41-a820-706bab3a4898" containerID="e2adb5705bed8b7641cd0255c8510c9d1ddf5ef99d6c5b7d13eabbbcb14db9e3" exitCode=0 Mar 10 11:25:05 crc kubenswrapper[4794]: I0310 11:25:05.163268 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-3230-account-create-update-8576n" event={"ID":"eca7f66a-0319-4b41-a820-706bab3a4898","Type":"ContainerDied","Data":"e2adb5705bed8b7641cd0255c8510c9d1ddf5ef99d6c5b7d13eabbbcb14db9e3"} Mar 10 11:25:05 crc kubenswrapper[4794]: I0310 11:25:05.163309 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-3230-account-create-update-8576n" event={"ID":"eca7f66a-0319-4b41-a820-706bab3a4898","Type":"ContainerStarted","Data":"c23dfc4d893734c1c79bd21afc810479353708d497fd698dc21173db1504a0c5"} Mar 10 11:25:05 crc kubenswrapper[4794]: I0310 11:25:05.165594 4794 generic.go:334] "Generic (PLEG): container finished" podID="b17c00fd-9e0c-4ddc-b961-30d84e94b34b" containerID="867999d3e8a9525ffaf32ea5935511308d1a624934fa60e704d6dc4d18b8688d" exitCode=0 Mar 10 11:25:05 crc kubenswrapper[4794]: I0310 11:25:05.165628 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-4rqg8" event={"ID":"b17c00fd-9e0c-4ddc-b961-30d84e94b34b","Type":"ContainerDied","Data":"867999d3e8a9525ffaf32ea5935511308d1a624934fa60e704d6dc4d18b8688d"} Mar 10 11:25:06 crc kubenswrapper[4794]: I0310 11:25:06.684486 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-4rqg8" Mar 10 11:25:06 crc kubenswrapper[4794]: I0310 11:25:06.689258 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-3230-account-create-update-8576n" Mar 10 11:25:06 crc kubenswrapper[4794]: I0310 11:25:06.812648 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2rwv\" (UniqueName: \"kubernetes.io/projected/b17c00fd-9e0c-4ddc-b961-30d84e94b34b-kube-api-access-k2rwv\") pod \"b17c00fd-9e0c-4ddc-b961-30d84e94b34b\" (UID: \"b17c00fd-9e0c-4ddc-b961-30d84e94b34b\") " Mar 10 11:25:06 crc kubenswrapper[4794]: I0310 11:25:06.812696 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b17c00fd-9e0c-4ddc-b961-30d84e94b34b-operator-scripts\") pod \"b17c00fd-9e0c-4ddc-b961-30d84e94b34b\" (UID: \"b17c00fd-9e0c-4ddc-b961-30d84e94b34b\") " Mar 10 11:25:06 crc kubenswrapper[4794]: I0310 11:25:06.812778 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca7f66a-0319-4b41-a820-706bab3a4898-operator-scripts\") pod \"eca7f66a-0319-4b41-a820-706bab3a4898\" (UID: \"eca7f66a-0319-4b41-a820-706bab3a4898\") " Mar 10 11:25:06 crc kubenswrapper[4794]: I0310 11:25:06.812962 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndjzv\" (UniqueName: \"kubernetes.io/projected/eca7f66a-0319-4b41-a820-706bab3a4898-kube-api-access-ndjzv\") pod \"eca7f66a-0319-4b41-a820-706bab3a4898\" (UID: \"eca7f66a-0319-4b41-a820-706bab3a4898\") " Mar 10 11:25:06 crc kubenswrapper[4794]: I0310 11:25:06.814507 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b17c00fd-9e0c-4ddc-b961-30d84e94b34b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b17c00fd-9e0c-4ddc-b961-30d84e94b34b" (UID: "b17c00fd-9e0c-4ddc-b961-30d84e94b34b"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:25:06 crc kubenswrapper[4794]: I0310 11:25:06.815078 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca7f66a-0319-4b41-a820-706bab3a4898-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eca7f66a-0319-4b41-a820-706bab3a4898" (UID: "eca7f66a-0319-4b41-a820-706bab3a4898"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:25:06 crc kubenswrapper[4794]: I0310 11:25:06.819586 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b17c00fd-9e0c-4ddc-b961-30d84e94b34b-kube-api-access-k2rwv" (OuterVolumeSpecName: "kube-api-access-k2rwv") pod "b17c00fd-9e0c-4ddc-b961-30d84e94b34b" (UID: "b17c00fd-9e0c-4ddc-b961-30d84e94b34b"). InnerVolumeSpecName "kube-api-access-k2rwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:25:06 crc kubenswrapper[4794]: I0310 11:25:06.819768 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca7f66a-0319-4b41-a820-706bab3a4898-kube-api-access-ndjzv" (OuterVolumeSpecName: "kube-api-access-ndjzv") pod "eca7f66a-0319-4b41-a820-706bab3a4898" (UID: "eca7f66a-0319-4b41-a820-706bab3a4898"). InnerVolumeSpecName "kube-api-access-ndjzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:25:06 crc kubenswrapper[4794]: I0310 11:25:06.915470 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndjzv\" (UniqueName: \"kubernetes.io/projected/eca7f66a-0319-4b41-a820-706bab3a4898-kube-api-access-ndjzv\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:06 crc kubenswrapper[4794]: I0310 11:25:06.915500 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2rwv\" (UniqueName: \"kubernetes.io/projected/b17c00fd-9e0c-4ddc-b961-30d84e94b34b-kube-api-access-k2rwv\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:06 crc kubenswrapper[4794]: I0310 11:25:06.915512 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b17c00fd-9e0c-4ddc-b961-30d84e94b34b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:06 crc kubenswrapper[4794]: I0310 11:25:06.915521 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca7f66a-0319-4b41-a820-706bab3a4898-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:07 crc kubenswrapper[4794]: I0310 11:25:07.193225 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-4rqg8" Mar 10 11:25:07 crc kubenswrapper[4794]: I0310 11:25:07.193260 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-4rqg8" event={"ID":"b17c00fd-9e0c-4ddc-b961-30d84e94b34b","Type":"ContainerDied","Data":"60ce47dd4c7d3ae60076e497b0bddac8d6098b2f47154e60f50f15eaf4445aec"} Mar 10 11:25:07 crc kubenswrapper[4794]: I0310 11:25:07.193300 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60ce47dd4c7d3ae60076e497b0bddac8d6098b2f47154e60f50f15eaf4445aec" Mar 10 11:25:07 crc kubenswrapper[4794]: I0310 11:25:07.194574 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-3230-account-create-update-8576n" event={"ID":"eca7f66a-0319-4b41-a820-706bab3a4898","Type":"ContainerDied","Data":"c23dfc4d893734c1c79bd21afc810479353708d497fd698dc21173db1504a0c5"} Mar 10 11:25:07 crc kubenswrapper[4794]: I0310 11:25:07.194595 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c23dfc4d893734c1c79bd21afc810479353708d497fd698dc21173db1504a0c5" Mar 10 11:25:07 crc kubenswrapper[4794]: I0310 11:25:07.194604 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-3230-account-create-update-8576n" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.548417 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-84hf6"] Mar 10 11:25:08 crc kubenswrapper[4794]: E0310 11:25:08.549138 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca7f66a-0319-4b41-a820-706bab3a4898" containerName="mariadb-account-create-update" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.549154 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca7f66a-0319-4b41-a820-706bab3a4898" containerName="mariadb-account-create-update" Mar 10 11:25:08 crc kubenswrapper[4794]: E0310 11:25:08.549185 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b17c00fd-9e0c-4ddc-b961-30d84e94b34b" containerName="mariadb-database-create" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.549193 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b17c00fd-9e0c-4ddc-b961-30d84e94b34b" containerName="mariadb-database-create" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.549549 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b17c00fd-9e0c-4ddc-b961-30d84e94b34b" containerName="mariadb-database-create" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.549565 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca7f66a-0319-4b41-a820-706bab3a4898" containerName="mariadb-account-create-update" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.550391 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-84hf6" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.552694 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-vs27r" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.552794 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.568117 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-84hf6"] Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.650712 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdf038e-4f84-46d5-b1af-a95d083befbb-combined-ca-bundle\") pod \"heat-db-sync-84hf6\" (UID: \"1bdf038e-4f84-46d5-b1af-a95d083befbb\") " pod="openstack/heat-db-sync-84hf6" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.650802 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9q77\" (UniqueName: \"kubernetes.io/projected/1bdf038e-4f84-46d5-b1af-a95d083befbb-kube-api-access-m9q77\") pod \"heat-db-sync-84hf6\" (UID: \"1bdf038e-4f84-46d5-b1af-a95d083befbb\") " pod="openstack/heat-db-sync-84hf6" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.650835 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdf038e-4f84-46d5-b1af-a95d083befbb-config-data\") pod \"heat-db-sync-84hf6\" (UID: \"1bdf038e-4f84-46d5-b1af-a95d083befbb\") " pod="openstack/heat-db-sync-84hf6" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.751535 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdf038e-4f84-46d5-b1af-a95d083befbb-combined-ca-bundle\") pod \"heat-db-sync-84hf6\" (UID: \"1bdf038e-4f84-46d5-b1af-a95d083befbb\") " pod="openstack/heat-db-sync-84hf6" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.751637 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9q77\" (UniqueName: \"kubernetes.io/projected/1bdf038e-4f84-46d5-b1af-a95d083befbb-kube-api-access-m9q77\") pod \"heat-db-sync-84hf6\" (UID: \"1bdf038e-4f84-46d5-b1af-a95d083befbb\") " pod="openstack/heat-db-sync-84hf6" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.751666 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdf038e-4f84-46d5-b1af-a95d083befbb-config-data\") pod \"heat-db-sync-84hf6\" (UID: \"1bdf038e-4f84-46d5-b1af-a95d083befbb\") " pod="openstack/heat-db-sync-84hf6" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.756701 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdf038e-4f84-46d5-b1af-a95d083befbb-combined-ca-bundle\") pod \"heat-db-sync-84hf6\" (UID: \"1bdf038e-4f84-46d5-b1af-a95d083befbb\") " pod="openstack/heat-db-sync-84hf6" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.757098 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdf038e-4f84-46d5-b1af-a95d083befbb-config-data\") pod \"heat-db-sync-84hf6\" (UID: \"1bdf038e-4f84-46d5-b1af-a95d083befbb\") " pod="openstack/heat-db-sync-84hf6" 
Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.782246 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9q77\" (UniqueName: \"kubernetes.io/projected/1bdf038e-4f84-46d5-b1af-a95d083befbb-kube-api-access-m9q77\") pod \"heat-db-sync-84hf6\" (UID: \"1bdf038e-4f84-46d5-b1af-a95d083befbb\") " pod="openstack/heat-db-sync-84hf6" Mar 10 11:25:08 crc kubenswrapper[4794]: I0310 11:25:08.889609 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-84hf6" Mar 10 11:25:09 crc kubenswrapper[4794]: I0310 11:25:09.414536 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-84hf6"] Mar 10 11:25:10 crc kubenswrapper[4794]: I0310 11:25:10.232869 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-84hf6" event={"ID":"1bdf038e-4f84-46d5-b1af-a95d083befbb","Type":"ContainerStarted","Data":"c14d17bc15ca3fc3fa2d5d04485ea0989a73cbcd4aab8b0f0fed3a32e514d6fd"} Mar 10 11:25:12 crc kubenswrapper[4794]: I0310 11:25:12.164298 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:12 crc kubenswrapper[4794]: I0310 11:25:12.164381 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:13 crc kubenswrapper[4794]: I0310 11:25:13.921110 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6ffb66d69f-4mhg9" podUID="b86814f6-b171-481c-9f93-dab25d50b2a9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.150:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.150:8080: connect: connection refused" Mar 10 11:25:13 crc kubenswrapper[4794]: I0310 11:25:13.921672 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:25:17 crc kubenswrapper[4794]: I0310 11:25:17.055035 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-gdldc"] Mar 10 11:25:17 crc kubenswrapper[4794]: I0310 11:25:17.065468 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-a98d-account-create-update-8vn77"] Mar 10 11:25:17 crc kubenswrapper[4794]: I0310 11:25:17.073429 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-gdldc"] Mar 10 11:25:17 crc kubenswrapper[4794]: I0310 11:25:17.080942 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-a98d-account-create-update-8vn77"] Mar 10 11:25:17 crc kubenswrapper[4794]: I0310 11:25:17.397298 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-84hf6" event={"ID":"1bdf038e-4f84-46d5-b1af-a95d083befbb","Type":"ContainerStarted","Data":"459ff8ff80a916813fc22b8f98294568f8a6287d2cd9e78a538b29e33712353c"} Mar 10 11:25:17 crc kubenswrapper[4794]: I0310 11:25:17.420539 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-84hf6" podStartSLOduration=1.9150307180000001 podStartE2EDuration="9.420513282s" podCreationTimestamp="2026-03-10 11:25:08 +0000 UTC" firstStartedPulling="2026-03-10 11:25:09.433674659 +0000 UTC m=+6058.189845517" lastFinishedPulling="2026-03-10 11:25:16.939157253 +0000 UTC m=+6065.695328081" observedRunningTime="2026-03-10 11:25:17.411208143 +0000 UTC m=+6066.167378991" watchObservedRunningTime="2026-03-10 11:25:17.420513282 +0000 UTC m=+6066.176684110" Mar 10 11:25:18 crc 
kubenswrapper[4794]: I0310 11:25:18.020041 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="747409d2-0f42-4b0e-9daa-3a2aae24a583" path="/var/lib/kubelet/pods/747409d2-0f42-4b0e-9daa-3a2aae24a583/volumes" Mar 10 11:25:18 crc kubenswrapper[4794]: I0310 11:25:18.022498 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0" path="/var/lib/kubelet/pods/88f6ae7d-9e45-43bf-a76f-e3f9b2d13be0/volumes" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.356094 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.434461 4794 generic.go:334] "Generic (PLEG): container finished" podID="1bdf038e-4f84-46d5-b1af-a95d083befbb" containerID="459ff8ff80a916813fc22b8f98294568f8a6287d2cd9e78a538b29e33712353c" exitCode=0 Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.434564 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-84hf6" event={"ID":"1bdf038e-4f84-46d5-b1af-a95d083befbb","Type":"ContainerDied","Data":"459ff8ff80a916813fc22b8f98294568f8a6287d2cd9e78a538b29e33712353c"} Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.437632 4794 generic.go:334] "Generic (PLEG): container finished" podID="b86814f6-b171-481c-9f93-dab25d50b2a9" containerID="b91c8b16f3261f4d1dbce0cd84b40db955de5e6ef96f848d34c4279b7497911e" exitCode=137 Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.437686 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6ffb66d69f-4mhg9" event={"ID":"b86814f6-b171-481c-9f93-dab25d50b2a9","Type":"ContainerDied","Data":"b91c8b16f3261f4d1dbce0cd84b40db955de5e6ef96f848d34c4279b7497911e"} Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.437722 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6ffb66d69f-4mhg9" event={"ID":"b86814f6-b171-481c-9f93-dab25d50b2a9","Type":"ContainerDied","Data":"9ffa2da217033f775dc0c545ea35a0c145071603674e176a6991a7b00297a1dd"} Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.437746 4794 scope.go:117] "RemoveContainer" containerID="93a3d4858ce9325bb129c88aa42962084decd984424790e70676ca58ea506623" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.437824 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6ffb66d69f-4mhg9" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.526588 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b86814f6-b171-481c-9f93-dab25d50b2a9-config-data\") pod \"b86814f6-b171-481c-9f93-dab25d50b2a9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.526634 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gbnw\" (UniqueName: \"kubernetes.io/projected/b86814f6-b171-481c-9f93-dab25d50b2a9-kube-api-access-2gbnw\") pod \"b86814f6-b171-481c-9f93-dab25d50b2a9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.526664 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b86814f6-b171-481c-9f93-dab25d50b2a9-logs\") pod \"b86814f6-b171-481c-9f93-dab25d50b2a9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.526741 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b86814f6-b171-481c-9f93-dab25d50b2a9-scripts\") pod \"b86814f6-b171-481c-9f93-dab25d50b2a9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.526841 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b86814f6-b171-481c-9f93-dab25d50b2a9-horizon-secret-key\") pod \"b86814f6-b171-481c-9f93-dab25d50b2a9\" (UID: \"b86814f6-b171-481c-9f93-dab25d50b2a9\") " Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.529876 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b86814f6-b171-481c-9f93-dab25d50b2a9-logs" (OuterVolumeSpecName: "logs") pod "b86814f6-b171-481c-9f93-dab25d50b2a9" (UID: "b86814f6-b171-481c-9f93-dab25d50b2a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.543567 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b86814f6-b171-481c-9f93-dab25d50b2a9-kube-api-access-2gbnw" (OuterVolumeSpecName: "kube-api-access-2gbnw") pod "b86814f6-b171-481c-9f93-dab25d50b2a9" (UID: "b86814f6-b171-481c-9f93-dab25d50b2a9"). InnerVolumeSpecName "kube-api-access-2gbnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.551054 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b86814f6-b171-481c-9f93-dab25d50b2a9-scripts" (OuterVolumeSpecName: "scripts") pod "b86814f6-b171-481c-9f93-dab25d50b2a9" (UID: "b86814f6-b171-481c-9f93-dab25d50b2a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.545913 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b86814f6-b171-481c-9f93-dab25d50b2a9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b86814f6-b171-481c-9f93-dab25d50b2a9" (UID: "b86814f6-b171-481c-9f93-dab25d50b2a9"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.558072 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b86814f6-b171-481c-9f93-dab25d50b2a9-config-data" (OuterVolumeSpecName: "config-data") pod "b86814f6-b171-481c-9f93-dab25d50b2a9" (UID: "b86814f6-b171-481c-9f93-dab25d50b2a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.616300 4794 scope.go:117] "RemoveContainer" containerID="b91c8b16f3261f4d1dbce0cd84b40db955de5e6ef96f848d34c4279b7497911e" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.629118 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b86814f6-b171-481c-9f93-dab25d50b2a9-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.629152 4794 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b86814f6-b171-481c-9f93-dab25d50b2a9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.629162 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b86814f6-b171-481c-9f93-dab25d50b2a9-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.629171 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gbnw\" (UniqueName: \"kubernetes.io/projected/b86814f6-b171-481c-9f93-dab25d50b2a9-kube-api-access-2gbnw\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.629184 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b86814f6-b171-481c-9f93-dab25d50b2a9-logs\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.641984 4794 scope.go:117] "RemoveContainer" containerID="93a3d4858ce9325bb129c88aa42962084decd984424790e70676ca58ea506623" Mar 10 11:25:19 crc kubenswrapper[4794]: E0310 11:25:19.642424 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93a3d4858ce9325bb129c88aa42962084decd984424790e70676ca58ea506623\": container with ID starting with 93a3d4858ce9325bb129c88aa42962084decd984424790e70676ca58ea506623 not found: ID does not exist" containerID="93a3d4858ce9325bb129c88aa42962084decd984424790e70676ca58ea506623" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.642463 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a3d4858ce9325bb129c88aa42962084decd984424790e70676ca58ea506623"} err="failed to get container status \"93a3d4858ce9325bb129c88aa42962084decd984424790e70676ca58ea506623\": rpc error: code = NotFound desc = could not find container \"93a3d4858ce9325bb129c88aa42962084decd984424790e70676ca58ea506623\": container with ID starting with 93a3d4858ce9325bb129c88aa42962084decd984424790e70676ca58ea506623 not found: ID does not exist" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.642482 4794 scope.go:117] "RemoveContainer" containerID="b91c8b16f3261f4d1dbce0cd84b40db955de5e6ef96f848d34c4279b7497911e" Mar 10 11:25:19 crc kubenswrapper[4794]: E0310 11:25:19.642851 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"b91c8b16f3261f4d1dbce0cd84b40db955de5e6ef96f848d34c4279b7497911e\": container with ID starting with b91c8b16f3261f4d1dbce0cd84b40db955de5e6ef96f848d34c4279b7497911e not found: ID does not exist" containerID="b91c8b16f3261f4d1dbce0cd84b40db955de5e6ef96f848d34c4279b7497911e" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.642904 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b91c8b16f3261f4d1dbce0cd84b40db955de5e6ef96f848d34c4279b7497911e"} err="failed to get container status \"b91c8b16f3261f4d1dbce0cd84b40db955de5e6ef96f848d34c4279b7497911e\": rpc error: code = NotFound desc = could not find container \"b91c8b16f3261f4d1dbce0cd84b40db955de5e6ef96f848d34c4279b7497911e\": container with ID starting with b91c8b16f3261f4d1dbce0cd84b40db955de5e6ef96f848d34c4279b7497911e not found: ID does not exist" Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.794063 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6ffb66d69f-4mhg9"] Mar 10 11:25:19 crc kubenswrapper[4794]: I0310 11:25:19.806743 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6ffb66d69f-4mhg9"] Mar 10 11:25:20 crc kubenswrapper[4794]: I0310 11:25:20.012539 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b86814f6-b171-481c-9f93-dab25d50b2a9" path="/var/lib/kubelet/pods/b86814f6-b171-481c-9f93-dab25d50b2a9/volumes" Mar 10 11:25:20 crc kubenswrapper[4794]: I0310 11:25:20.869602 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-84hf6" Mar 10 11:25:21 crc kubenswrapper[4794]: I0310 11:25:21.055218 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdf038e-4f84-46d5-b1af-a95d083befbb-config-data\") pod \"1bdf038e-4f84-46d5-b1af-a95d083befbb\" (UID: \"1bdf038e-4f84-46d5-b1af-a95d083befbb\") " Mar 10 11:25:21 crc kubenswrapper[4794]: I0310 11:25:21.055308 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9q77\" (UniqueName: \"kubernetes.io/projected/1bdf038e-4f84-46d5-b1af-a95d083befbb-kube-api-access-m9q77\") pod \"1bdf038e-4f84-46d5-b1af-a95d083befbb\" (UID: \"1bdf038e-4f84-46d5-b1af-a95d083befbb\") " Mar 10 11:25:21 crc kubenswrapper[4794]: I0310 11:25:21.055374 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdf038e-4f84-46d5-b1af-a95d083befbb-combined-ca-bundle\") pod \"1bdf038e-4f84-46d5-b1af-a95d083befbb\" (UID: \"1bdf038e-4f84-46d5-b1af-a95d083befbb\") " Mar 10 11:25:21 crc kubenswrapper[4794]: I0310 11:25:21.067268 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bdf038e-4f84-46d5-b1af-a95d083befbb-kube-api-access-m9q77" (OuterVolumeSpecName: "kube-api-access-m9q77") pod "1bdf038e-4f84-46d5-b1af-a95d083befbb" (UID: "1bdf038e-4f84-46d5-b1af-a95d083befbb"). InnerVolumeSpecName "kube-api-access-m9q77". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:25:21 crc kubenswrapper[4794]: I0310 11:25:21.096812 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdf038e-4f84-46d5-b1af-a95d083befbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bdf038e-4f84-46d5-b1af-a95d083befbb" (UID: "1bdf038e-4f84-46d5-b1af-a95d083befbb"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:25:21 crc kubenswrapper[4794]: I0310 11:25:21.158630 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9q77\" (UniqueName: \"kubernetes.io/projected/1bdf038e-4f84-46d5-b1af-a95d083befbb-kube-api-access-m9q77\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:21 crc kubenswrapper[4794]: I0310 11:25:21.158669 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdf038e-4f84-46d5-b1af-a95d083befbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:21 crc kubenswrapper[4794]: I0310 11:25:21.161809 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdf038e-4f84-46d5-b1af-a95d083befbb-config-data" (OuterVolumeSpecName: "config-data") pod "1bdf038e-4f84-46d5-b1af-a95d083befbb" (UID: "1bdf038e-4f84-46d5-b1af-a95d083befbb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:25:21 crc kubenswrapper[4794]: I0310 11:25:21.259821 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdf038e-4f84-46d5-b1af-a95d083befbb-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:21 crc kubenswrapper[4794]: I0310 11:25:21.465038 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-84hf6" event={"ID":"1bdf038e-4f84-46d5-b1af-a95d083befbb","Type":"ContainerDied","Data":"c14d17bc15ca3fc3fa2d5d04485ea0989a73cbcd4aab8b0f0fed3a32e514d6fd"} Mar 10 11:25:21 crc kubenswrapper[4794]: I0310 11:25:21.465369 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c14d17bc15ca3fc3fa2d5d04485ea0989a73cbcd4aab8b0f0fed3a32e514d6fd" Mar 10 11:25:21 crc kubenswrapper[4794]: I0310 11:25:21.465130 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-84hf6" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.604320 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6fbd4f9675-8th7w"] Mar 10 11:25:22 crc kubenswrapper[4794]: E0310 11:25:22.604931 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86814f6-b171-481c-9f93-dab25d50b2a9" containerName="horizon-log" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.604948 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86814f6-b171-481c-9f93-dab25d50b2a9" containerName="horizon-log" Mar 10 11:25:22 crc kubenswrapper[4794]: E0310 11:25:22.604972 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bdf038e-4f84-46d5-b1af-a95d083befbb" containerName="heat-db-sync" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.604981 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bdf038e-4f84-46d5-b1af-a95d083befbb" containerName="heat-db-sync" Mar 10 11:25:22 crc kubenswrapper[4794]: E0310 11:25:22.605012 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86814f6-b171-481c-9f93-dab25d50b2a9" containerName="horizon" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.605019 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86814f6-b171-481c-9f93-dab25d50b2a9" containerName="horizon" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.605239 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b86814f6-b171-481c-9f93-dab25d50b2a9" containerName="horizon-log" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.605253 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b86814f6-b171-481c-9f93-dab25d50b2a9" containerName="horizon" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.605268 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bdf038e-4f84-46d5-b1af-a95d083befbb" containerName="heat-db-sync" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.606144 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.608846 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.608942 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-vs27r" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.608969 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.625831 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6fbd4f9675-8th7w"] Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.730396 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7bff4bb77-gfswn"] Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.731683 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.733839 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.755244 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7bff4bb77-gfswn"] Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.790131 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7fce03-02d0-47e4-804e-2fa5ac594bdc-config-data\") pod \"heat-engine-6fbd4f9675-8th7w\" (UID: \"2a7fce03-02d0-47e4-804e-2fa5ac594bdc\") " pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.790491 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6scd\" (UniqueName: \"kubernetes.io/projected/2a7fce03-02d0-47e4-804e-2fa5ac594bdc-kube-api-access-s6scd\") pod \"heat-engine-6fbd4f9675-8th7w\" (UID: \"2a7fce03-02d0-47e4-804e-2fa5ac594bdc\") " pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.790834 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7fce03-02d0-47e4-804e-2fa5ac594bdc-combined-ca-bundle\") pod \"heat-engine-6fbd4f9675-8th7w\" (UID: \"2a7fce03-02d0-47e4-804e-2fa5ac594bdc\") " pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.790878 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a7fce03-02d0-47e4-804e-2fa5ac594bdc-config-data-custom\") pod \"heat-engine-6fbd4f9675-8th7w\" (UID: \"2a7fce03-02d0-47e4-804e-2fa5ac594bdc\") " pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.841032 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-f448c9dcf-rwf4x"] Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.842625 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.844582 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.853824 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-f448c9dcf-rwf4x"] Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.893547 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49071c63-b14a-4c35-8d14-5c0e2b2deea3-config-data-custom\") pod \"heat-api-f448c9dcf-rwf4x\" (UID: \"49071c63-b14a-4c35-8d14-5c0e2b2deea3\") " pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.893632 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7fce03-02d0-47e4-804e-2fa5ac594bdc-config-data\") pod \"heat-engine-6fbd4f9675-8th7w\" (UID: \"2a7fce03-02d0-47e4-804e-2fa5ac594bdc\") " pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.893670 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4wpf\" (UniqueName: \"kubernetes.io/projected/49071c63-b14a-4c35-8d14-5c0e2b2deea3-kube-api-access-n4wpf\") pod \"heat-api-f448c9dcf-rwf4x\" (UID: \"49071c63-b14a-4c35-8d14-5c0e2b2deea3\") " pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.893711 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsmkc\" (UniqueName: \"kubernetes.io/projected/e9f71665-b982-459c-814e-fba7eedcb66b-kube-api-access-tsmkc\") pod \"heat-cfnapi-7bff4bb77-gfswn\" (UID: \"e9f71665-b982-459c-814e-fba7eedcb66b\") " pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.893734 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f71665-b982-459c-814e-fba7eedcb66b-combined-ca-bundle\") pod \"heat-cfnapi-7bff4bb77-gfswn\" (UID: \"e9f71665-b982-459c-814e-fba7eedcb66b\") " pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.893765 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9f71665-b982-459c-814e-fba7eedcb66b-config-data-custom\") pod \"heat-cfnapi-7bff4bb77-gfswn\" (UID: \"e9f71665-b982-459c-814e-fba7eedcb66b\") " pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.893799 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6scd\" (UniqueName: \"kubernetes.io/projected/2a7fce03-02d0-47e4-804e-2fa5ac594bdc-kube-api-access-s6scd\") pod \"heat-engine-6fbd4f9675-8th7w\" (UID: \"2a7fce03-02d0-47e4-804e-2fa5ac594bdc\") " pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.893856 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7fce03-02d0-47e4-804e-2fa5ac594bdc-combined-ca-bundle\") pod \"heat-engine-6fbd4f9675-8th7w\" (UID: 
\"2a7fce03-02d0-47e4-804e-2fa5ac594bdc\") " pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.893959 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a7fce03-02d0-47e4-804e-2fa5ac594bdc-config-data-custom\") pod \"heat-engine-6fbd4f9675-8th7w\" (UID: \"2a7fce03-02d0-47e4-804e-2fa5ac594bdc\") " pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.894018 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49071c63-b14a-4c35-8d14-5c0e2b2deea3-config-data\") pod \"heat-api-f448c9dcf-rwf4x\" (UID: \"49071c63-b14a-4c35-8d14-5c0e2b2deea3\") " pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.894033 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49071c63-b14a-4c35-8d14-5c0e2b2deea3-combined-ca-bundle\") pod \"heat-api-f448c9dcf-rwf4x\" (UID: \"49071c63-b14a-4c35-8d14-5c0e2b2deea3\") " pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.894067 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f71665-b982-459c-814e-fba7eedcb66b-config-data\") pod \"heat-cfnapi-7bff4bb77-gfswn\" (UID: \"e9f71665-b982-459c-814e-fba7eedcb66b\") " pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.904495 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a7fce03-02d0-47e4-804e-2fa5ac594bdc-config-data-custom\") pod \"heat-engine-6fbd4f9675-8th7w\" (UID: \"2a7fce03-02d0-47e4-804e-2fa5ac594bdc\") " pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.904584 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7fce03-02d0-47e4-804e-2fa5ac594bdc-config-data\") pod \"heat-engine-6fbd4f9675-8th7w\" (UID: \"2a7fce03-02d0-47e4-804e-2fa5ac594bdc\") " pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.905751 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7fce03-02d0-47e4-804e-2fa5ac594bdc-combined-ca-bundle\") pod \"heat-engine-6fbd4f9675-8th7w\" (UID: \"2a7fce03-02d0-47e4-804e-2fa5ac594bdc\") " pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.912293 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6scd\" (UniqueName: \"kubernetes.io/projected/2a7fce03-02d0-47e4-804e-2fa5ac594bdc-kube-api-access-s6scd\") pod \"heat-engine-6fbd4f9675-8th7w\" (UID: \"2a7fce03-02d0-47e4-804e-2fa5ac594bdc\") " pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.931153 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.967828 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.967877 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.995410 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49071c63-b14a-4c35-8d14-5c0e2b2deea3-config-data-custom\") pod \"heat-api-f448c9dcf-rwf4x\" (UID: \"49071c63-b14a-4c35-8d14-5c0e2b2deea3\") " pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.995485 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4wpf\" (UniqueName: \"kubernetes.io/projected/49071c63-b14a-4c35-8d14-5c0e2b2deea3-kube-api-access-n4wpf\") pod \"heat-api-f448c9dcf-rwf4x\" (UID: \"49071c63-b14a-4c35-8d14-5c0e2b2deea3\") " pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.995519 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsmkc\" (UniqueName: \"kubernetes.io/projected/e9f71665-b982-459c-814e-fba7eedcb66b-kube-api-access-tsmkc\") pod \"heat-cfnapi-7bff4bb77-gfswn\" (UID: \"e9f71665-b982-459c-814e-fba7eedcb66b\") " pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.995539 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f71665-b982-459c-814e-fba7eedcb66b-combined-ca-bundle\") pod \"heat-cfnapi-7bff4bb77-gfswn\" (UID: \"e9f71665-b982-459c-814e-fba7eedcb66b\") " pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.995573 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9f71665-b982-459c-814e-fba7eedcb66b-config-data-custom\") pod \"heat-cfnapi-7bff4bb77-gfswn\" (UID: \"e9f71665-b982-459c-814e-fba7eedcb66b\") " pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.995633 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49071c63-b14a-4c35-8d14-5c0e2b2deea3-config-data\") pod \"heat-api-f448c9dcf-rwf4x\" (UID: \"49071c63-b14a-4c35-8d14-5c0e2b2deea3\") " pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.995653 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49071c63-b14a-4c35-8d14-5c0e2b2deea3-combined-ca-bundle\") pod \"heat-api-f448c9dcf-rwf4x\" (UID: \"49071c63-b14a-4c35-8d14-5c0e2b2deea3\") " pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 
11:25:22 crc kubenswrapper[4794]: I0310 11:25:22.995675 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f71665-b982-459c-814e-fba7eedcb66b-config-data\") pod \"heat-cfnapi-7bff4bb77-gfswn\" (UID: \"e9f71665-b982-459c-814e-fba7eedcb66b\") " pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:23 crc kubenswrapper[4794]: I0310 11:25:23.001154 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49071c63-b14a-4c35-8d14-5c0e2b2deea3-config-data-custom\") pod \"heat-api-f448c9dcf-rwf4x\" (UID: \"49071c63-b14a-4c35-8d14-5c0e2b2deea3\") " pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:23 crc kubenswrapper[4794]: I0310 11:25:23.001367 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f71665-b982-459c-814e-fba7eedcb66b-config-data\") pod \"heat-cfnapi-7bff4bb77-gfswn\" (UID: \"e9f71665-b982-459c-814e-fba7eedcb66b\") " pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:23 crc kubenswrapper[4794]: I0310 11:25:23.003297 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9f71665-b982-459c-814e-fba7eedcb66b-config-data-custom\") pod \"heat-cfnapi-7bff4bb77-gfswn\" (UID: \"e9f71665-b982-459c-814e-fba7eedcb66b\") " pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:23 crc kubenswrapper[4794]: I0310 11:25:23.015202 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49071c63-b14a-4c35-8d14-5c0e2b2deea3-combined-ca-bundle\") pod \"heat-api-f448c9dcf-rwf4x\" (UID: \"49071c63-b14a-4c35-8d14-5c0e2b2deea3\") " pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:23 crc kubenswrapper[4794]: I0310 11:25:23.016753 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f71665-b982-459c-814e-fba7eedcb66b-combined-ca-bundle\") pod \"heat-cfnapi-7bff4bb77-gfswn\" (UID: \"e9f71665-b982-459c-814e-fba7eedcb66b\") " pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:23 crc kubenswrapper[4794]: I0310 11:25:23.026988 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4wpf\" (UniqueName: \"kubernetes.io/projected/49071c63-b14a-4c35-8d14-5c0e2b2deea3-kube-api-access-n4wpf\") pod \"heat-api-f448c9dcf-rwf4x\" (UID: \"49071c63-b14a-4c35-8d14-5c0e2b2deea3\") " pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:23 crc kubenswrapper[4794]: I0310 11:25:23.027811 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsmkc\" (UniqueName: \"kubernetes.io/projected/e9f71665-b982-459c-814e-fba7eedcb66b-kube-api-access-tsmkc\") pod \"heat-cfnapi-7bff4bb77-gfswn\" (UID: \"e9f71665-b982-459c-814e-fba7eedcb66b\") " pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:23 crc kubenswrapper[4794]: I0310 11:25:23.030539 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49071c63-b14a-4c35-8d14-5c0e2b2deea3-config-data\") pod \"heat-api-f448c9dcf-rwf4x\" (UID: \"49071c63-b14a-4c35-8d14-5c0e2b2deea3\") " pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:23 crc kubenswrapper[4794]: I0310 11:25:23.077266 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:23 crc kubenswrapper[4794]: I0310 11:25:23.165992 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:23 crc kubenswrapper[4794]: I0310 11:25:23.545037 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6fbd4f9675-8th7w"] Mar 10 11:25:23 crc kubenswrapper[4794]: I0310 11:25:23.746547 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7bff4bb77-gfswn"] Mar 10 11:25:23 crc kubenswrapper[4794]: I0310 11:25:23.880611 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-f448c9dcf-rwf4x"] Mar 10 11:25:24 crc kubenswrapper[4794]: I0310 11:25:24.499679 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6fbd4f9675-8th7w" event={"ID":"2a7fce03-02d0-47e4-804e-2fa5ac594bdc","Type":"ContainerStarted","Data":"3d1bfbf605b8525f3b6b1bb614907178878f3b969f0f1ee20b855df8232470ce"} Mar 10 11:25:24 crc kubenswrapper[4794]: I0310 11:25:24.500035 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6fbd4f9675-8th7w" event={"ID":"2a7fce03-02d0-47e4-804e-2fa5ac594bdc","Type":"ContainerStarted","Data":"93fb3dcd6c817f89eccd6158fbffca6105f9c1dff8a1e3d3d4178f1feccedd36"} Mar 10 11:25:24 crc kubenswrapper[4794]: I0310 11:25:24.500053 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:24 crc kubenswrapper[4794]: I0310 11:25:24.501078 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-f448c9dcf-rwf4x" event={"ID":"49071c63-b14a-4c35-8d14-5c0e2b2deea3","Type":"ContainerStarted","Data":"9a3f536465bda456f8d5db9f5eb3e2e200557944d74c29d7382fdec66b084671"} Mar 10 11:25:24 crc kubenswrapper[4794]: I0310 11:25:24.502287 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7bff4bb77-gfswn" event={"ID":"e9f71665-b982-459c-814e-fba7eedcb66b","Type":"ContainerStarted","Data":"f0be8325d39228d7260c9ee8f2ad6670cdd952f6ace9f298c923b048664e33a6"} Mar 10 11:25:24 crc kubenswrapper[4794]: I0310 11:25:24.518993 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6fbd4f9675-8th7w" podStartSLOduration=2.51897394 podStartE2EDuration="2.51897394s" podCreationTimestamp="2026-03-10 11:25:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:25:24.516255316 +0000 UTC m=+6073.272426134" watchObservedRunningTime="2026-03-10 11:25:24.51897394 +0000 UTC m=+6073.275144758" Mar 10 11:25:24 crc kubenswrapper[4794]: I0310 11:25:24.658034 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:25 crc kubenswrapper[4794]: I0310 11:25:25.032028 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vlmhb"] Mar 10 11:25:25 crc kubenswrapper[4794]: I0310 11:25:25.043735 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vlmhb"] Mar 10 11:25:26 crc kubenswrapper[4794]: I0310 11:25:26.010590 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a7ac128-d40e-4917-b3f9-4d87040dddbe" path="/var/lib/kubelet/pods/5a7ac128-d40e-4917-b3f9-4d87040dddbe/volumes" Mar 10 11:25:26 crc kubenswrapper[4794]: I0310 11:25:26.381733 4794 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-77df494f69-g2kl6" Mar 10 11:25:26 crc kubenswrapper[4794]: I0310 11:25:26.487144 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-56cc67dc8f-wf6qg"] Mar 10 11:25:26 crc kubenswrapper[4794]: I0310 11:25:26.487409 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-56cc67dc8f-wf6qg" podUID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" containerName="horizon-log" containerID="cri-o://89b01a6bbd20a9c236848a6e06620f29f7cf563663f553373aaacb6b207569ad" gracePeriod=30 Mar 10 11:25:26 crc kubenswrapper[4794]: I0310 11:25:26.487561 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-56cc67dc8f-wf6qg" podUID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" containerName="horizon" containerID="cri-o://751ddd37c4de044ec7045631d07915f82ff9dc16dcd448acff16c7c4215f84cf" gracePeriod=30 Mar 10 11:25:27 crc kubenswrapper[4794]: I0310 11:25:27.538471 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-f448c9dcf-rwf4x" event={"ID":"49071c63-b14a-4c35-8d14-5c0e2b2deea3","Type":"ContainerStarted","Data":"14388431def721ca5071f72f48ce048c84f43311d59ebc3ba7959b7921c22946"} Mar 10 11:25:27 crc kubenswrapper[4794]: I0310 11:25:27.538743 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:27 crc kubenswrapper[4794]: I0310 11:25:27.542310 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7bff4bb77-gfswn" event={"ID":"e9f71665-b982-459c-814e-fba7eedcb66b","Type":"ContainerStarted","Data":"7282323b17381cf032aadd8e2000056f3e35503817541acb26158d3daa7590aa"} Mar 10 11:25:27 crc kubenswrapper[4794]: I0310 11:25:27.543194 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:27 crc kubenswrapper[4794]: I0310 11:25:27.558888 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-f448c9dcf-rwf4x" podStartSLOduration=3.154809516 podStartE2EDuration="5.558868615s" podCreationTimestamp="2026-03-10 11:25:22 +0000 UTC" firstStartedPulling="2026-03-10 11:25:23.884204888 +0000 UTC m=+6072.640375706" lastFinishedPulling="2026-03-10 11:25:26.288263997 +0000 UTC m=+6075.044434805" observedRunningTime="2026-03-10 11:25:27.552827637 +0000 UTC m=+6076.308998455" watchObservedRunningTime="2026-03-10 11:25:27.558868615 +0000 UTC m=+6076.315039433" Mar 10 11:25:27 crc kubenswrapper[4794]: I0310 11:25:27.576607 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7bff4bb77-gfswn" podStartSLOduration=3.042317036 podStartE2EDuration="5.576588687s" podCreationTimestamp="2026-03-10 11:25:22 +0000 UTC" firstStartedPulling="2026-03-10 11:25:23.755356578 +0000 UTC m=+6072.511527386" lastFinishedPulling="2026-03-10 11:25:26.289628219 +0000 UTC m=+6075.045799037" observedRunningTime="2026-03-10 11:25:27.569827366 +0000 UTC m=+6076.325998194" watchObservedRunningTime="2026-03-10 11:25:27.576588687 +0000 UTC m=+6076.332759505" Mar 10 11:25:30 crc kubenswrapper[4794]: I0310 11:25:30.570953 4794 generic.go:334] "Generic (PLEG): container finished" podID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" containerID="751ddd37c4de044ec7045631d07915f82ff9dc16dcd448acff16c7c4215f84cf" exitCode=0 Mar 10 11:25:30 crc kubenswrapper[4794]: I0310 11:25:30.571025 4794 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/horizon-56cc67dc8f-wf6qg" event={"ID":"9bc8c092-11cb-41cf-860b-4912ca0a0eeb","Type":"ContainerDied","Data":"751ddd37c4de044ec7045631d07915f82ff9dc16dcd448acff16c7c4215f84cf"} Mar 10 11:25:34 crc kubenswrapper[4794]: I0310 11:25:34.479080 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-56cc67dc8f-wf6qg" podUID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.151:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.151:8080: connect: connection refused" Mar 10 11:25:34 crc kubenswrapper[4794]: I0310 11:25:34.636710 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-f448c9dcf-rwf4x" Mar 10 11:25:34 crc kubenswrapper[4794]: I0310 11:25:34.685565 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7bff4bb77-gfswn" Mar 10 11:25:42 crc kubenswrapper[4794]: I0310 11:25:42.984601 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6fbd4f9675-8th7w" Mar 10 11:25:44 crc kubenswrapper[4794]: I0310 11:25:44.478754 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-56cc67dc8f-wf6qg" podUID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.151:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.151:8080: connect: connection refused" Mar 10 11:25:47 crc kubenswrapper[4794]: I0310 11:25:47.879374 4794 scope.go:117] "RemoveContainer" containerID="540523ba9349540477c0ccb3362badc240145da4c88f1e70720b28dd75d7c9d6" Mar 10 11:25:47 crc kubenswrapper[4794]: I0310 11:25:47.915345 4794 scope.go:117] "RemoveContainer" containerID="a00e706210e6e39ad85924a95fe1833658e45660d98e81d480c96460dd273372" Mar 10 11:25:47 crc kubenswrapper[4794]: I0310 11:25:47.976831 4794 scope.go:117] "RemoveContainer" containerID="b37a53f395fddadf0dac7239afbff2042aa89978fd87bcd7c0081f6643ae744a" Mar 10 11:25:48 crc kubenswrapper[4794]: I0310 11:25:48.035642 4794 scope.go:117] "RemoveContainer" containerID="65123d0ed48834e813578fec9bb56c6a2df752c50179e00ad5ca7fe24b94d03b" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.187134 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755"] Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.203551 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.207673 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.235860 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755"] Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.399746 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6c90642-0761-4f5d-82bc-1546e809efe1-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755\" (UID: \"e6c90642-0761-4f5d-82bc-1546e809efe1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.400006 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6c90642-0761-4f5d-82bc-1546e809efe1-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755\" (UID: \"e6c90642-0761-4f5d-82bc-1546e809efe1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.400069 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-952nd\" (UniqueName: \"kubernetes.io/projected/e6c90642-0761-4f5d-82bc-1546e809efe1-kube-api-access-952nd\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755\" (UID: \"e6c90642-0761-4f5d-82bc-1546e809efe1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.503263 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6c90642-0761-4f5d-82bc-1546e809efe1-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755\" (UID: \"e6c90642-0761-4f5d-82bc-1546e809efe1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.504213 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6c90642-0761-4f5d-82bc-1546e809efe1-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755\" (UID: \"e6c90642-0761-4f5d-82bc-1546e809efe1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.504599 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-952nd\" (UniqueName: \"kubernetes.io/projected/e6c90642-0761-4f5d-82bc-1546e809efe1-kube-api-access-952nd\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755\" (UID: \"e6c90642-0761-4f5d-82bc-1546e809efe1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.505046 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e6c90642-0761-4f5d-82bc-1546e809efe1-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755\" (UID: \"e6c90642-0761-4f5d-82bc-1546e809efe1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.505429 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6c90642-0761-4f5d-82bc-1546e809efe1-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755\" (UID: \"e6c90642-0761-4f5d-82bc-1546e809efe1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.541077 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-952nd\" (UniqueName: \"kubernetes.io/projected/e6c90642-0761-4f5d-82bc-1546e809efe1-kube-api-access-952nd\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755\" (UID: \"e6c90642-0761-4f5d-82bc-1546e809efe1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.558156 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.967380 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.967849 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.967917 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.968943 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 11:25:52 crc kubenswrapper[4794]: I0310 11:25:52.969034 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" gracePeriod=600 Mar 10 11:25:53 crc kubenswrapper[4794]: E0310 11:25:53.119756 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:25:53 crc kubenswrapper[4794]: I0310 11:25:53.232065 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755"] Mar 10 11:25:53 crc kubenswrapper[4794]: I0310 11:25:53.821929 4794 generic.go:334] "Generic (PLEG): container finished" podID="e6c90642-0761-4f5d-82bc-1546e809efe1" containerID="3f3b07a4a2b89acdae9addd745fea0f4c6ae461489e98724ced925fd6c1648b4" exitCode=0 Mar 10 11:25:53 crc kubenswrapper[4794]: I0310 11:25:53.822089 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" event={"ID":"e6c90642-0761-4f5d-82bc-1546e809efe1","Type":"ContainerDied","Data":"3f3b07a4a2b89acdae9addd745fea0f4c6ae461489e98724ced925fd6c1648b4"} Mar 10 11:25:53 crc kubenswrapper[4794]: I0310 11:25:53.822633 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" event={"ID":"e6c90642-0761-4f5d-82bc-1546e809efe1","Type":"ContainerStarted","Data":"f8415fe810eda3cb004f87c36f0cdc0f0eadbcafcd953b5c69c4ce36f3cf9abd"} Mar 10 11:25:53 crc kubenswrapper[4794]: I0310 11:25:53.830434 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" exitCode=0 Mar 10 11:25:53 crc kubenswrapper[4794]: I0310 11:25:53.830536 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77"} Mar 10 11:25:53 crc kubenswrapper[4794]: I0310 11:25:53.830584 4794 scope.go:117] "RemoveContainer" containerID="14e62759b4835fd8e09988559aeb5396bdb9e62fa6007a87b363f008bdc5ba42" Mar 10 11:25:53 crc kubenswrapper[4794]: I0310 11:25:53.831640 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:25:53 crc kubenswrapper[4794]: E0310 11:25:53.832222 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:25:54 crc kubenswrapper[4794]: I0310 11:25:54.481194 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-56cc67dc8f-wf6qg" podUID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.151:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.151:8080: connect: connection refused" Mar 10 11:25:54 crc kubenswrapper[4794]: I0310 11:25:54.481382 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:25:55 crc kubenswrapper[4794]: I0310 11:25:55.864197 4794 generic.go:334] "Generic (PLEG): container finished" 
podID="e6c90642-0761-4f5d-82bc-1546e809efe1" containerID="fd6a5d3f2ba582e4d1596845fffcf1dba1469178c7f50a367907bc2bfbf3e9be" exitCode=0 Mar 10 11:25:55 crc kubenswrapper[4794]: I0310 11:25:55.864283 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" event={"ID":"e6c90642-0761-4f5d-82bc-1546e809efe1","Type":"ContainerDied","Data":"fd6a5d3f2ba582e4d1596845fffcf1dba1469178c7f50a367907bc2bfbf3e9be"} Mar 10 11:25:56 crc kubenswrapper[4794]: I0310 11:25:56.887490 4794 generic.go:334] "Generic (PLEG): container finished" podID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" containerID="89b01a6bbd20a9c236848a6e06620f29f7cf563663f553373aaacb6b207569ad" exitCode=137 Mar 10 11:25:56 crc kubenswrapper[4794]: I0310 11:25:56.887858 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56cc67dc8f-wf6qg" event={"ID":"9bc8c092-11cb-41cf-860b-4912ca0a0eeb","Type":"ContainerDied","Data":"89b01a6bbd20a9c236848a6e06620f29f7cf563663f553373aaacb6b207569ad"} Mar 10 11:25:56 crc kubenswrapper[4794]: I0310 11:25:56.893943 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" event={"ID":"e6c90642-0761-4f5d-82bc-1546e809efe1","Type":"ContainerStarted","Data":"12a5d2d4ec1aee9356d8eb29d3c2c647bb6961c5b9cc554a5f23a71e5e2d43e5"} Mar 10 11:25:56 crc kubenswrapper[4794]: I0310 11:25:56.921241 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" podStartSLOduration=3.584721667 podStartE2EDuration="4.921221726s" podCreationTimestamp="2026-03-10 11:25:52 +0000 UTC" firstStartedPulling="2026-03-10 11:25:53.824120321 +0000 UTC m=+6102.580291149" lastFinishedPulling="2026-03-10 11:25:55.16062038 +0000 UTC m=+6103.916791208" observedRunningTime="2026-03-10 11:25:56.909322565 +0000 UTC m=+6105.665493383" watchObservedRunningTime="2026-03-10 11:25:56.921221726 +0000 UTC m=+6105.677392544" Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.086836 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.230661 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-logs\") pod \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.230802 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-horizon-secret-key\") pod \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.230928 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxfx6\" (UniqueName: \"kubernetes.io/projected/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-kube-api-access-jxfx6\") pod \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.231068 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-scripts\") pod \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.231126 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-logs" (OuterVolumeSpecName: "logs") pod "9bc8c092-11cb-41cf-860b-4912ca0a0eeb" (UID: "9bc8c092-11cb-41cf-860b-4912ca0a0eeb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.231247 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-config-data\") pod \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\" (UID: \"9bc8c092-11cb-41cf-860b-4912ca0a0eeb\") " Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.231693 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-logs\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.238311 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-kube-api-access-jxfx6" (OuterVolumeSpecName: "kube-api-access-jxfx6") pod "9bc8c092-11cb-41cf-860b-4912ca0a0eeb" (UID: "9bc8c092-11cb-41cf-860b-4912ca0a0eeb"). InnerVolumeSpecName "kube-api-access-jxfx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.239021 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9bc8c092-11cb-41cf-860b-4912ca0a0eeb" (UID: "9bc8c092-11cb-41cf-860b-4912ca0a0eeb"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.260992 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-config-data" (OuterVolumeSpecName: "config-data") pod "9bc8c092-11cb-41cf-860b-4912ca0a0eeb" (UID: "9bc8c092-11cb-41cf-860b-4912ca0a0eeb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.268973 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-scripts" (OuterVolumeSpecName: "scripts") pod "9bc8c092-11cb-41cf-860b-4912ca0a0eeb" (UID: "9bc8c092-11cb-41cf-860b-4912ca0a0eeb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.339266 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.339297 4794 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.339311 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxfx6\" (UniqueName: \"kubernetes.io/projected/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-kube-api-access-jxfx6\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.339343 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bc8c092-11cb-41cf-860b-4912ca0a0eeb-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.906375 4794 generic.go:334] "Generic (PLEG): container finished" podID="e6c90642-0761-4f5d-82bc-1546e809efe1" containerID="12a5d2d4ec1aee9356d8eb29d3c2c647bb6961c5b9cc554a5f23a71e5e2d43e5" exitCode=0 Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.906428 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" event={"ID":"e6c90642-0761-4f5d-82bc-1546e809efe1","Type":"ContainerDied","Data":"12a5d2d4ec1aee9356d8eb29d3c2c647bb6961c5b9cc554a5f23a71e5e2d43e5"} Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.910495 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56cc67dc8f-wf6qg" event={"ID":"9bc8c092-11cb-41cf-860b-4912ca0a0eeb","Type":"ContainerDied","Data":"e38e07a56cde82d8d26ebcd4cab2feebb80150e0eda2cdd6167416d36ee40413"} Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.910551 4794 scope.go:117] "RemoveContainer" containerID="751ddd37c4de044ec7045631d07915f82ff9dc16dcd448acff16c7c4215f84cf" Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.910591 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-56cc67dc8f-wf6qg" Mar 10 11:25:57 crc kubenswrapper[4794]: I0310 11:25:57.989888 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-56cc67dc8f-wf6qg"] Mar 10 11:25:58 crc kubenswrapper[4794]: I0310 11:25:58.009228 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-56cc67dc8f-wf6qg"] Mar 10 11:25:58 crc kubenswrapper[4794]: I0310 11:25:58.118853 4794 scope.go:117] "RemoveContainer" containerID="89b01a6bbd20a9c236848a6e06620f29f7cf563663f553373aaacb6b207569ad" Mar 10 11:25:59 crc kubenswrapper[4794]: I0310 11:25:59.341764 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" Mar 10 11:25:59 crc kubenswrapper[4794]: I0310 11:25:59.382505 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6c90642-0761-4f5d-82bc-1546e809efe1-bundle\") pod \"e6c90642-0761-4f5d-82bc-1546e809efe1\" (UID: \"e6c90642-0761-4f5d-82bc-1546e809efe1\") " Mar 10 11:25:59 crc kubenswrapper[4794]: I0310 11:25:59.382561 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6c90642-0761-4f5d-82bc-1546e809efe1-util\") pod \"e6c90642-0761-4f5d-82bc-1546e809efe1\" (UID: \"e6c90642-0761-4f5d-82bc-1546e809efe1\") " Mar 10 11:25:59 crc kubenswrapper[4794]: I0310 11:25:59.382602 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-952nd\" (UniqueName: \"kubernetes.io/projected/e6c90642-0761-4f5d-82bc-1546e809efe1-kube-api-access-952nd\") pod \"e6c90642-0761-4f5d-82bc-1546e809efe1\" (UID: \"e6c90642-0761-4f5d-82bc-1546e809efe1\") " Mar 10 11:25:59 crc kubenswrapper[4794]: I0310 11:25:59.386180 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6c90642-0761-4f5d-82bc-1546e809efe1-bundle" (OuterVolumeSpecName: "bundle") pod "e6c90642-0761-4f5d-82bc-1546e809efe1" (UID: "e6c90642-0761-4f5d-82bc-1546e809efe1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:25:59 crc kubenswrapper[4794]: I0310 11:25:59.391592 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c90642-0761-4f5d-82bc-1546e809efe1-kube-api-access-952nd" (OuterVolumeSpecName: "kube-api-access-952nd") pod "e6c90642-0761-4f5d-82bc-1546e809efe1" (UID: "e6c90642-0761-4f5d-82bc-1546e809efe1"). InnerVolumeSpecName "kube-api-access-952nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:25:59 crc kubenswrapper[4794]: I0310 11:25:59.405098 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6c90642-0761-4f5d-82bc-1546e809efe1-util" (OuterVolumeSpecName: "util") pod "e6c90642-0761-4f5d-82bc-1546e809efe1" (UID: "e6c90642-0761-4f5d-82bc-1546e809efe1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:25:59 crc kubenswrapper[4794]: I0310 11:25:59.484819 4794 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6c90642-0761-4f5d-82bc-1546e809efe1-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:59 crc kubenswrapper[4794]: I0310 11:25:59.484854 4794 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6c90642-0761-4f5d-82bc-1546e809efe1-util\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:59 crc kubenswrapper[4794]: I0310 11:25:59.484865 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-952nd\" (UniqueName: \"kubernetes.io/projected/e6c90642-0761-4f5d-82bc-1546e809efe1-kube-api-access-952nd\") on node \"crc\" DevicePath \"\"" Mar 10 11:25:59 crc kubenswrapper[4794]: I0310 11:25:59.938525 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" Mar 10 11:25:59 crc kubenswrapper[4794]: I0310 11:25:59.938416 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755" event={"ID":"e6c90642-0761-4f5d-82bc-1546e809efe1","Type":"ContainerDied","Data":"f8415fe810eda3cb004f87c36f0cdc0f0eadbcafcd953b5c69c4ce36f3cf9abd"} Mar 10 11:25:59 crc kubenswrapper[4794]: I0310 11:25:59.940638 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8415fe810eda3cb004f87c36f0cdc0f0eadbcafcd953b5c69c4ce36f3cf9abd" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.013626 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" path="/var/lib/kubelet/pods/9bc8c092-11cb-41cf-860b-4912ca0a0eeb/volumes" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.167399 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552366-zhtn9"] Mar 10 11:26:00 crc kubenswrapper[4794]: E0310 11:26:00.167895 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c90642-0761-4f5d-82bc-1546e809efe1" containerName="extract" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.167915 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c90642-0761-4f5d-82bc-1546e809efe1" containerName="extract" Mar 10 11:26:00 crc kubenswrapper[4794]: E0310 11:26:00.167928 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c90642-0761-4f5d-82bc-1546e809efe1" containerName="util" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.167936 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c90642-0761-4f5d-82bc-1546e809efe1" containerName="util" Mar 10 11:26:00 crc kubenswrapper[4794]: E0310 11:26:00.167948 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" containerName="horizon" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.167958 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" containerName="horizon" Mar 10 11:26:00 crc kubenswrapper[4794]: E0310 11:26:00.167975 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c90642-0761-4f5d-82bc-1546e809efe1" containerName="pull" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.167982 4794 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e6c90642-0761-4f5d-82bc-1546e809efe1" containerName="pull" Mar 10 11:26:00 crc kubenswrapper[4794]: E0310 11:26:00.167994 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" containerName="horizon-log" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.168001 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" containerName="horizon-log" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.168193 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" containerName="horizon" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.168208 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c90642-0761-4f5d-82bc-1546e809efe1" containerName="extract" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.168218 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc8c092-11cb-41cf-860b-4912ca0a0eeb" containerName="horizon-log" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.168954 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552366-zhtn9" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.172086 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.172395 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.172853 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.181949 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552366-zhtn9"] Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.203381 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k8lk\" (UniqueName: \"kubernetes.io/projected/faf5136a-197d-4f67-9a0d-01ab38902d79-kube-api-access-5k8lk\") pod \"auto-csr-approver-29552366-zhtn9\" (UID: \"faf5136a-197d-4f67-9a0d-01ab38902d79\") " pod="openshift-infra/auto-csr-approver-29552366-zhtn9" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.305809 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k8lk\" (UniqueName: \"kubernetes.io/projected/faf5136a-197d-4f67-9a0d-01ab38902d79-kube-api-access-5k8lk\") pod \"auto-csr-approver-29552366-zhtn9\" (UID: \"faf5136a-197d-4f67-9a0d-01ab38902d79\") " pod="openshift-infra/auto-csr-approver-29552366-zhtn9" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.324274 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k8lk\" (UniqueName: \"kubernetes.io/projected/faf5136a-197d-4f67-9a0d-01ab38902d79-kube-api-access-5k8lk\") pod \"auto-csr-approver-29552366-zhtn9\" (UID: \"faf5136a-197d-4f67-9a0d-01ab38902d79\") " pod="openshift-infra/auto-csr-approver-29552366-zhtn9" Mar 10 11:26:00 crc kubenswrapper[4794]: I0310 11:26:00.522453 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552366-zhtn9" Mar 10 11:26:01 crc kubenswrapper[4794]: I0310 11:26:01.026042 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552366-zhtn9"] Mar 10 11:26:01 crc kubenswrapper[4794]: I0310 11:26:01.957415 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552366-zhtn9" event={"ID":"faf5136a-197d-4f67-9a0d-01ab38902d79","Type":"ContainerStarted","Data":"07770b830788b71abec8e392f4f50829e86e41515384c52807bc5a23a41fdc40"} Mar 10 11:26:05 crc kubenswrapper[4794]: I0310 11:26:05.999355 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:26:06 crc kubenswrapper[4794]: E0310 11:26:06.000056 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:26:07 crc kubenswrapper[4794]: I0310 11:26:07.029554 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552366-zhtn9" event={"ID":"faf5136a-197d-4f67-9a0d-01ab38902d79","Type":"ContainerStarted","Data":"b41be85390c1e60a649cb2f1053d837910650d40dfb00dab3397d1ab8f2ad778"} Mar 10 11:26:07 crc kubenswrapper[4794]: I0310 11:26:07.058367 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552366-zhtn9" podStartSLOduration=2.11420678 podStartE2EDuration="7.05832719s" podCreationTimestamp="2026-03-10 11:26:00 +0000 UTC" firstStartedPulling="2026-03-10 11:26:01.045129682 +0000 UTC m=+6109.801300520" lastFinishedPulling="2026-03-10 11:26:05.989250112 +0000 UTC m=+6114.745420930" observedRunningTime="2026-03-10 11:26:07.054351515 +0000 UTC m=+6115.810522333" watchObservedRunningTime="2026-03-10 11:26:07.05832719 +0000 UTC m=+6115.814498008" Mar 10 11:26:07 crc kubenswrapper[4794]: I0310 11:26:07.113827 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-nbph9"] Mar 10 11:26:07 crc kubenswrapper[4794]: I0310 11:26:07.125299 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5620-account-create-update-scwqm"] Mar 10 11:26:07 crc kubenswrapper[4794]: I0310 11:26:07.134645 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-nbph9"] Mar 10 11:26:07 crc kubenswrapper[4794]: I0310 11:26:07.147110 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5620-account-create-update-scwqm"] Mar 10 11:26:08 crc kubenswrapper[4794]: I0310 11:26:08.012560 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55b82cce-abab-4c3f-869e-e4910f4d2435" path="/var/lib/kubelet/pods/55b82cce-abab-4c3f-869e-e4910f4d2435/volumes" Mar 10 11:26:08 crc kubenswrapper[4794]: I0310 11:26:08.014202 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f233e4b9-a4e1-4c20-99a5-54632094041d" path="/var/lib/kubelet/pods/f233e4b9-a4e1-4c20-99a5-54632094041d/volumes" Mar 10 11:26:09 crc kubenswrapper[4794]: I0310 11:26:09.899022 4794 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-ld8g6"] Mar 10 11:26:09 crc kubenswrapper[4794]: I0310 11:26:09.901446 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-ld8g6" Mar 10 11:26:09 crc kubenswrapper[4794]: I0310 11:26:09.904033 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 10 11:26:09 crc kubenswrapper[4794]: I0310 11:26:09.919706 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-z6nds" Mar 10 11:26:09 crc kubenswrapper[4794]: I0310 11:26:09.919984 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 10 11:26:09 crc kubenswrapper[4794]: I0310 11:26:09.920468 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-ld8g6"] Mar 10 11:26:09 crc kubenswrapper[4794]: I0310 11:26:09.996848 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkl85\" (UniqueName: \"kubernetes.io/projected/6a7f71bd-0553-4555-b978-b4e470af8a84-kube-api-access-wkl85\") pod \"obo-prometheus-operator-68bc856cb9-ld8g6\" (UID: \"6a7f71bd-0553-4555-b978-b4e470af8a84\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-ld8g6" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.027770 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-klbpr"] Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.028940 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-klbpr" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.033493 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-txzhj" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.033746 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.054132 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-klbpr"] Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.055396 4794 generic.go:334] "Generic (PLEG): container finished" podID="faf5136a-197d-4f67-9a0d-01ab38902d79" containerID="b41be85390c1e60a649cb2f1053d837910650d40dfb00dab3397d1ab8f2ad778" exitCode=0 Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.055423 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552366-zhtn9" event={"ID":"faf5136a-197d-4f67-9a0d-01ab38902d79","Type":"ContainerDied","Data":"b41be85390c1e60a649cb2f1053d837910650d40dfb00dab3397d1ab8f2ad778"} Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.088230 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t"] Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.089585 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.097088 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t"] Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.098164 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkl85\" (UniqueName: \"kubernetes.io/projected/6a7f71bd-0553-4555-b978-b4e470af8a84-kube-api-access-wkl85\") pod \"obo-prometheus-operator-68bc856cb9-ld8g6\" (UID: \"6a7f71bd-0553-4555-b978-b4e470af8a84\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-ld8g6" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.124196 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkl85\" (UniqueName: \"kubernetes.io/projected/6a7f71bd-0553-4555-b978-b4e470af8a84-kube-api-access-wkl85\") pod \"obo-prometheus-operator-68bc856cb9-ld8g6\" (UID: \"6a7f71bd-0553-4555-b978-b4e470af8a84\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-ld8g6" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.199702 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e8018367-607a-4c95-8e53-f06d848933cb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t\" (UID: \"e8018367-607a-4c95-8e53-f06d848933cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.200045 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97b0679e-d4da-48f2-9f6e-62fbf2c3fb87-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-696c9775d9-klbpr\" (UID: \"97b0679e-d4da-48f2-9f6e-62fbf2c3fb87\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-klbpr" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.200145 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e8018367-607a-4c95-8e53-f06d848933cb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t\" (UID: \"e8018367-607a-4c95-8e53-f06d848933cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.200182 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97b0679e-d4da-48f2-9f6e-62fbf2c3fb87-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-696c9775d9-klbpr\" (UID: \"97b0679e-d4da-48f2-9f6e-62fbf2c3fb87\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-klbpr" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.218449 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-ld8g6" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.234841 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2tfm5"] Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.236268 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-2tfm5" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.238479 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.238808 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-fk2zb" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.251033 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2tfm5"] Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.305458 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e8018367-607a-4c95-8e53-f06d848933cb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t\" (UID: \"e8018367-607a-4c95-8e53-f06d848933cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.305536 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97b0679e-d4da-48f2-9f6e-62fbf2c3fb87-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-696c9775d9-klbpr\" (UID: \"97b0679e-d4da-48f2-9f6e-62fbf2c3fb87\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-klbpr" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.305624 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e8018367-607a-4c95-8e53-f06d848933cb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t\" (UID: \"e8018367-607a-4c95-8e53-f06d848933cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.305646 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97b0679e-d4da-48f2-9f6e-62fbf2c3fb87-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-696c9775d9-klbpr\" (UID: \"97b0679e-d4da-48f2-9f6e-62fbf2c3fb87\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-klbpr" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.314726 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e8018367-607a-4c95-8e53-f06d848933cb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t\" (UID: \"e8018367-607a-4c95-8e53-f06d848933cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.314800 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e8018367-607a-4c95-8e53-f06d848933cb-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t\" (UID: \"e8018367-607a-4c95-8e53-f06d848933cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.314830 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97b0679e-d4da-48f2-9f6e-62fbf2c3fb87-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-696c9775d9-klbpr\" (UID: \"97b0679e-d4da-48f2-9f6e-62fbf2c3fb87\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-klbpr" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.317613 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97b0679e-d4da-48f2-9f6e-62fbf2c3fb87-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-696c9775d9-klbpr\" (UID: \"97b0679e-d4da-48f2-9f6e-62fbf2c3fb87\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-klbpr" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.345994 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-klbpr" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.359512 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-kt69n"] Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.360846 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-kt69n" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.372986 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-tfg49" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.379753 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-kt69n"] Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.407281 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbhfw\" (UniqueName: \"kubernetes.io/projected/53147876-0193-4a98-bf6d-fd9d34f1d84a-kube-api-access-zbhfw\") pod \"observability-operator-59bdc8b94-2tfm5\" (UID: \"53147876-0193-4a98-bf6d-fd9d34f1d84a\") " pod="openshift-operators/observability-operator-59bdc8b94-2tfm5" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.407430 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/53147876-0193-4a98-bf6d-fd9d34f1d84a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2tfm5\" (UID: \"53147876-0193-4a98-bf6d-fd9d34f1d84a\") " pod="openshift-operators/observability-operator-59bdc8b94-2tfm5" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.408352 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.511957 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvrzg\" (UniqueName: \"kubernetes.io/projected/5a0fb20a-63c7-403c-9154-744f6d841f43-kube-api-access-jvrzg\") pod \"perses-operator-5bf474d74f-kt69n\" (UID: \"5a0fb20a-63c7-403c-9154-744f6d841f43\") " pod="openshift-operators/perses-operator-5bf474d74f-kt69n" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.512140 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/53147876-0193-4a98-bf6d-fd9d34f1d84a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2tfm5\" (UID: \"53147876-0193-4a98-bf6d-fd9d34f1d84a\") " pod="openshift-operators/observability-operator-59bdc8b94-2tfm5" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.516518 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a0fb20a-63c7-403c-9154-744f6d841f43-openshift-service-ca\") pod \"perses-operator-5bf474d74f-kt69n\" (UID: \"5a0fb20a-63c7-403c-9154-744f6d841f43\") " pod="openshift-operators/perses-operator-5bf474d74f-kt69n" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.516732 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbhfw\" (UniqueName: \"kubernetes.io/projected/53147876-0193-4a98-bf6d-fd9d34f1d84a-kube-api-access-zbhfw\") pod \"observability-operator-59bdc8b94-2tfm5\" (UID: \"53147876-0193-4a98-bf6d-fd9d34f1d84a\") " pod="openshift-operators/observability-operator-59bdc8b94-2tfm5" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.520597 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/53147876-0193-4a98-bf6d-fd9d34f1d84a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2tfm5\" (UID: \"53147876-0193-4a98-bf6d-fd9d34f1d84a\") " pod="openshift-operators/observability-operator-59bdc8b94-2tfm5" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.545097 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbhfw\" (UniqueName: \"kubernetes.io/projected/53147876-0193-4a98-bf6d-fd9d34f1d84a-kube-api-access-zbhfw\") pod \"observability-operator-59bdc8b94-2tfm5\" (UID: \"53147876-0193-4a98-bf6d-fd9d34f1d84a\") " pod="openshift-operators/observability-operator-59bdc8b94-2tfm5" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.619060 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvrzg\" (UniqueName: \"kubernetes.io/projected/5a0fb20a-63c7-403c-9154-744f6d841f43-kube-api-access-jvrzg\") pod \"perses-operator-5bf474d74f-kt69n\" (UID: \"5a0fb20a-63c7-403c-9154-744f6d841f43\") " pod="openshift-operators/perses-operator-5bf474d74f-kt69n" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.619162 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a0fb20a-63c7-403c-9154-744f6d841f43-openshift-service-ca\") pod \"perses-operator-5bf474d74f-kt69n\" (UID: \"5a0fb20a-63c7-403c-9154-744f6d841f43\") " 
pod="openshift-operators/perses-operator-5bf474d74f-kt69n" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.620038 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a0fb20a-63c7-403c-9154-744f6d841f43-openshift-service-ca\") pod \"perses-operator-5bf474d74f-kt69n\" (UID: \"5a0fb20a-63c7-403c-9154-744f6d841f43\") " pod="openshift-operators/perses-operator-5bf474d74f-kt69n" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.648478 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvrzg\" (UniqueName: \"kubernetes.io/projected/5a0fb20a-63c7-403c-9154-744f6d841f43-kube-api-access-jvrzg\") pod \"perses-operator-5bf474d74f-kt69n\" (UID: \"5a0fb20a-63c7-403c-9154-744f6d841f43\") " pod="openshift-operators/perses-operator-5bf474d74f-kt69n" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.732133 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-ld8g6"] Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.743951 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-2tfm5" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.752294 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-kt69n" Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.962682 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-klbpr"] Mar 10 11:26:10 crc kubenswrapper[4794]: W0310 11:26:10.973097 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8018367_607a_4c95_8e53_f06d848933cb.slice/crio-c16469ed9eece1d03e2f166abcf06718406b6631b2f743350f5ad9af3bef742f WatchSource:0}: Error finding container c16469ed9eece1d03e2f166abcf06718406b6631b2f743350f5ad9af3bef742f: Status 404 returned error can't find the container with id c16469ed9eece1d03e2f166abcf06718406b6631b2f743350f5ad9af3bef742f Mar 10 11:26:10 crc kubenswrapper[4794]: I0310 11:26:10.982654 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t"] Mar 10 11:26:11 crc kubenswrapper[4794]: I0310 11:26:11.121588 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-ld8g6" event={"ID":"6a7f71bd-0553-4555-b978-b4e470af8a84","Type":"ContainerStarted","Data":"13edd23ba4a4b458b2a62c508a9852d126c5aeef47b9a8c94eec53554159dfc9"} Mar 10 11:26:11 crc kubenswrapper[4794]: I0310 11:26:11.162606 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t" event={"ID":"e8018367-607a-4c95-8e53-f06d848933cb","Type":"ContainerStarted","Data":"c16469ed9eece1d03e2f166abcf06718406b6631b2f743350f5ad9af3bef742f"} Mar 10 11:26:11 crc kubenswrapper[4794]: I0310 11:26:11.190670 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-klbpr" event={"ID":"97b0679e-d4da-48f2-9f6e-62fbf2c3fb87","Type":"ContainerStarted","Data":"50fe9d861e0ccf8e57d941470be59af453970dd5be65a25df554571982c552e2"} Mar 10 11:26:11 crc kubenswrapper[4794]: I0310 11:26:11.258731 4794 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-kt69n"] Mar 10 11:26:11 crc kubenswrapper[4794]: W0310 11:26:11.287247 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a0fb20a_63c7_403c_9154_744f6d841f43.slice/crio-aab261f2c4a6a9544ff54f163bc2267a7fca4cbcec3ba3f8813f108002df1730 WatchSource:0}: Error finding container aab261f2c4a6a9544ff54f163bc2267a7fca4cbcec3ba3f8813f108002df1730: Status 404 returned error can't find the container with id aab261f2c4a6a9544ff54f163bc2267a7fca4cbcec3ba3f8813f108002df1730 Mar 10 11:26:11 crc kubenswrapper[4794]: W0310 11:26:11.534299 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53147876_0193_4a98_bf6d_fd9d34f1d84a.slice/crio-fc188b9126677c695d5f540fcc1cf42c6b6306e796a864bcd21bf03bbc9873e2 WatchSource:0}: Error finding container fc188b9126677c695d5f540fcc1cf42c6b6306e796a864bcd21bf03bbc9873e2: Status 404 returned error can't find the container with id fc188b9126677c695d5f540fcc1cf42c6b6306e796a864bcd21bf03bbc9873e2 Mar 10 11:26:11 crc kubenswrapper[4794]: I0310 11:26:11.537192 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2tfm5"] Mar 10 11:26:11 crc kubenswrapper[4794]: I0310 11:26:11.685767 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552366-zhtn9" Mar 10 11:26:11 crc kubenswrapper[4794]: I0310 11:26:11.785289 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k8lk\" (UniqueName: \"kubernetes.io/projected/faf5136a-197d-4f67-9a0d-01ab38902d79-kube-api-access-5k8lk\") pod \"faf5136a-197d-4f67-9a0d-01ab38902d79\" (UID: \"faf5136a-197d-4f67-9a0d-01ab38902d79\") " Mar 10 11:26:11 crc kubenswrapper[4794]: I0310 11:26:11.794508 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faf5136a-197d-4f67-9a0d-01ab38902d79-kube-api-access-5k8lk" (OuterVolumeSpecName: "kube-api-access-5k8lk") pod "faf5136a-197d-4f67-9a0d-01ab38902d79" (UID: "faf5136a-197d-4f67-9a0d-01ab38902d79"). InnerVolumeSpecName "kube-api-access-5k8lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:26:11 crc kubenswrapper[4794]: I0310 11:26:11.887612 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k8lk\" (UniqueName: \"kubernetes.io/projected/faf5136a-197d-4f67-9a0d-01ab38902d79-kube-api-access-5k8lk\") on node \"crc\" DevicePath \"\"" Mar 10 11:26:12 crc kubenswrapper[4794]: I0310 11:26:12.178055 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552360-dc7fk"] Mar 10 11:26:12 crc kubenswrapper[4794]: I0310 11:26:12.189352 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552360-dc7fk"] Mar 10 11:26:12 crc kubenswrapper[4794]: I0310 11:26:12.226608 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552366-zhtn9" event={"ID":"faf5136a-197d-4f67-9a0d-01ab38902d79","Type":"ContainerDied","Data":"07770b830788b71abec8e392f4f50829e86e41515384c52807bc5a23a41fdc40"} Mar 10 11:26:12 crc kubenswrapper[4794]: I0310 11:26:12.226650 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552366-zhtn9" Mar 10 11:26:12 crc kubenswrapper[4794]: I0310 11:26:12.226647 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07770b830788b71abec8e392f4f50829e86e41515384c52807bc5a23a41fdc40" Mar 10 11:26:12 crc kubenswrapper[4794]: I0310 11:26:12.229220 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-kt69n" event={"ID":"5a0fb20a-63c7-403c-9154-744f6d841f43","Type":"ContainerStarted","Data":"aab261f2c4a6a9544ff54f163bc2267a7fca4cbcec3ba3f8813f108002df1730"} Mar 10 11:26:12 crc kubenswrapper[4794]: I0310 11:26:12.235832 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-2tfm5" event={"ID":"53147876-0193-4a98-bf6d-fd9d34f1d84a","Type":"ContainerStarted","Data":"fc188b9126677c695d5f540fcc1cf42c6b6306e796a864bcd21bf03bbc9873e2"} Mar 10 11:26:12 crc kubenswrapper[4794]: E0310 11:26:12.270897 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaf5136a_197d_4f67_9a0d_01ab38902d79.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaf5136a_197d_4f67_9a0d_01ab38902d79.slice/crio-07770b830788b71abec8e392f4f50829e86e41515384c52807bc5a23a41fdc40\": RecentStats: unable to find data in memory cache]" Mar 10 11:26:14 crc kubenswrapper[4794]: I0310 11:26:14.023274 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1112032b-20d2-4c4b-8f2b-3c9712c8cdf8" path="/var/lib/kubelet/pods/1112032b-20d2-4c4b-8f2b-3c9712c8cdf8/volumes" Mar 10 11:26:16 crc kubenswrapper[4794]: I0310 11:26:16.056745 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-z96r7"] Mar 10 11:26:16 crc kubenswrapper[4794]: I0310 11:26:16.077182 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-z96r7"] Mar 10 11:26:18 crc kubenswrapper[4794]: I0310 11:26:18.013150 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a29fab98-30cc-441b-a7ad-17257e3f75a6" path="/var/lib/kubelet/pods/a29fab98-30cc-441b-a7ad-17257e3f75a6/volumes" Mar 10 11:26:21 crc kubenswrapper[4794]: I0310 11:26:20.999579 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:26:21 crc kubenswrapper[4794]: E0310 11:26:21.002135 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:26:23 crc kubenswrapper[4794]: I0310 11:26:23.396929 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t" event={"ID":"e8018367-607a-4c95-8e53-f06d848933cb","Type":"ContainerStarted","Data":"d6b16daac43e913672b47886dde8e82fb14699a55bb84ac1a429c60eac2d9d26"} Mar 10 11:26:23 crc kubenswrapper[4794]: I0310 11:26:23.399401 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-ld8g6" event={"ID":"6a7f71bd-0553-4555-b978-b4e470af8a84","Type":"ContainerStarted","Data":"0a2d636b46a782de76061cc602499824c9951271b11a87720a06d7e3b9a4f63c"} Mar 10 11:26:23 crc kubenswrapper[4794]: I0310 11:26:23.401836 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-klbpr" event={"ID":"97b0679e-d4da-48f2-9f6e-62fbf2c3fb87","Type":"ContainerStarted","Data":"0e9608af6c6d981c8f4bcb3268d6f3abff0f3bc88c953a3d6535c80fbdf34f28"} Mar 10 11:26:23 crc kubenswrapper[4794]: I0310 11:26:23.403895 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-kt69n" event={"ID":"5a0fb20a-63c7-403c-9154-744f6d841f43","Type":"ContainerStarted","Data":"a13090d8cbf1b6dc34b0adf2afd3fb8570be0fc2b0106be1fe6794b4294bfc60"} Mar 10 11:26:23 crc kubenswrapper[4794]: I0310 11:26:23.404095 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-kt69n" Mar 10 11:26:23 crc kubenswrapper[4794]: I0310 11:26:23.405865 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-2tfm5" event={"ID":"53147876-0193-4a98-bf6d-fd9d34f1d84a","Type":"ContainerStarted","Data":"3df0eeaec78678d8914ebac446e2966d11397f65ff430a03a41e018d2b776802"} Mar 10 11:26:23 crc kubenswrapper[4794]: I0310 11:26:23.407754 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-2tfm5" Mar 10 11:26:23 crc kubenswrapper[4794]: I0310 11:26:23.410203 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-2tfm5" Mar 10 11:26:23 crc kubenswrapper[4794]: I0310 11:26:23.425600 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t" podStartSLOduration=1.725363607 podStartE2EDuration="13.425581152s" podCreationTimestamp="2026-03-10 11:26:10 +0000 UTC" firstStartedPulling="2026-03-10 11:26:10.979513548 +0000 UTC m=+6119.735684376" lastFinishedPulling="2026-03-10 11:26:22.679731103 +0000 UTC m=+6131.435901921" observedRunningTime="2026-03-10 11:26:23.421115273 +0000 UTC m=+6132.177286101" watchObservedRunningTime="2026-03-10 11:26:23.425581152 +0000 UTC m=+6132.181751970" Mar 10 11:26:23 crc kubenswrapper[4794]: I0310 11:26:23.497445 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-kt69n" podStartSLOduration=2.150345262 podStartE2EDuration="13.497426558s" podCreationTimestamp="2026-03-10 11:26:10 +0000 UTC" firstStartedPulling="2026-03-10 11:26:11.301863689 +0000 UTC m=+6120.058034507" lastFinishedPulling="2026-03-10 11:26:22.648944975 +0000 UTC m=+6131.405115803" observedRunningTime="2026-03-10 11:26:23.448220966 +0000 UTC m=+6132.204391774" watchObservedRunningTime="2026-03-10 11:26:23.497426558 +0000 UTC m=+6132.253597376" Mar 10 11:26:23 crc kubenswrapper[4794]: I0310 11:26:23.571506 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-ld8g6" podStartSLOduration=2.663023128 podStartE2EDuration="14.571490282s" podCreationTimestamp="2026-03-10 11:26:09 +0000 UTC" firstStartedPulling="2026-03-10 11:26:10.740621985 +0000 UTC m=+6119.496792803" 
lastFinishedPulling="2026-03-10 11:26:22.649089139 +0000 UTC m=+6131.405259957" observedRunningTime="2026-03-10 11:26:23.569650755 +0000 UTC m=+6132.325821583" watchObservedRunningTime="2026-03-10 11:26:23.571490282 +0000 UTC m=+6132.327661100" Mar 10 11:26:23 crc kubenswrapper[4794]: I0310 11:26:23.586986 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-696c9775d9-klbpr" podStartSLOduration=1.921004145 podStartE2EDuration="13.586968934s" podCreationTimestamp="2026-03-10 11:26:10 +0000 UTC" firstStartedPulling="2026-03-10 11:26:10.98663049 +0000 UTC m=+6119.742801308" lastFinishedPulling="2026-03-10 11:26:22.652595279 +0000 UTC m=+6131.408766097" observedRunningTime="2026-03-10 11:26:23.512650271 +0000 UTC m=+6132.268821089" watchObservedRunningTime="2026-03-10 11:26:23.586968934 +0000 UTC m=+6132.343139752" Mar 10 11:26:23 crc kubenswrapper[4794]: I0310 11:26:23.631973 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-2tfm5" podStartSLOduration=2.444142454 podStartE2EDuration="13.631949604s" podCreationTimestamp="2026-03-10 11:26:10 +0000 UTC" firstStartedPulling="2026-03-10 11:26:11.540313669 +0000 UTC m=+6120.296484487" lastFinishedPulling="2026-03-10 11:26:22.728120819 +0000 UTC m=+6131.484291637" observedRunningTime="2026-03-10 11:26:23.627601538 +0000 UTC m=+6132.383772356" watchObservedRunningTime="2026-03-10 11:26:23.631949604 +0000 UTC m=+6132.388120422" Mar 10 11:26:30 crc kubenswrapper[4794]: I0310 11:26:30.756996 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-kt69n" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.363403 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.364114 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="9629922b-07aa-4027-a102-5d957b2ca8af" containerName="openstackclient" containerID="cri-o://2eca94f4dde83de73167c8dab6b4d9a2d353616bc86ccdcb0e93d1b10c6e8e36" gracePeriod=2 Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.370826 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.400417 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 11:26:33 crc kubenswrapper[4794]: E0310 11:26:33.400799 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faf5136a-197d-4f67-9a0d-01ab38902d79" containerName="oc" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.400814 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="faf5136a-197d-4f67-9a0d-01ab38902d79" containerName="oc" Mar 10 11:26:33 crc kubenswrapper[4794]: E0310 11:26:33.400824 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9629922b-07aa-4027-a102-5d957b2ca8af" containerName="openstackclient" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.400831 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9629922b-07aa-4027-a102-5d957b2ca8af" containerName="openstackclient" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.401051 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9629922b-07aa-4027-a102-5d957b2ca8af" containerName="openstackclient" Mar 10 11:26:33 crc 
kubenswrapper[4794]: I0310 11:26:33.401073 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="faf5136a-197d-4f67-9a0d-01ab38902d79" containerName="oc" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.401712 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.436213 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.436808 4794 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9629922b-07aa-4027-a102-5d957b2ca8af" podUID="3b5553b5-8c76-4d5d-9501-48df7e9b14d6" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.566707 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2qgz\" (UniqueName: \"kubernetes.io/projected/3b5553b5-8c76-4d5d-9501-48df7e9b14d6-kube-api-access-j2qgz\") pod \"openstackclient\" (UID: \"3b5553b5-8c76-4d5d-9501-48df7e9b14d6\") " pod="openstack/openstackclient" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.566762 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3b5553b5-8c76-4d5d-9501-48df7e9b14d6-openstack-config-secret\") pod \"openstackclient\" (UID: \"3b5553b5-8c76-4d5d-9501-48df7e9b14d6\") " pod="openstack/openstackclient" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.566815 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3b5553b5-8c76-4d5d-9501-48df7e9b14d6-openstack-config\") pod \"openstackclient\" (UID: \"3b5553b5-8c76-4d5d-9501-48df7e9b14d6\") " pod="openstack/openstackclient" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.624640 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.625835 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.633574 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-hr8v5" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.648671 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.676468 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2qgz\" (UniqueName: \"kubernetes.io/projected/3b5553b5-8c76-4d5d-9501-48df7e9b14d6-kube-api-access-j2qgz\") pod \"openstackclient\" (UID: \"3b5553b5-8c76-4d5d-9501-48df7e9b14d6\") " pod="openstack/openstackclient" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.676532 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3b5553b5-8c76-4d5d-9501-48df7e9b14d6-openstack-config-secret\") pod \"openstackclient\" (UID: \"3b5553b5-8c76-4d5d-9501-48df7e9b14d6\") " pod="openstack/openstackclient" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.676579 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3b5553b5-8c76-4d5d-9501-48df7e9b14d6-openstack-config\") pod \"openstackclient\" (UID: \"3b5553b5-8c76-4d5d-9501-48df7e9b14d6\") " pod="openstack/openstackclient" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.677448 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3b5553b5-8c76-4d5d-9501-48df7e9b14d6-openstack-config\") pod \"openstackclient\" (UID: \"3b5553b5-8c76-4d5d-9501-48df7e9b14d6\") " pod="openstack/openstackclient" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.687892 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3b5553b5-8c76-4d5d-9501-48df7e9b14d6-openstack-config-secret\") pod \"openstackclient\" (UID: \"3b5553b5-8c76-4d5d-9501-48df7e9b14d6\") " pod="openstack/openstackclient" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.713924 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2qgz\" (UniqueName: \"kubernetes.io/projected/3b5553b5-8c76-4d5d-9501-48df7e9b14d6-kube-api-access-j2qgz\") pod \"openstackclient\" (UID: \"3b5553b5-8c76-4d5d-9501-48df7e9b14d6\") " pod="openstack/openstackclient" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.747293 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.782968 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8snmp\" (UniqueName: \"kubernetes.io/projected/faba7423-9101-4887-9104-69b12739b3a3-kube-api-access-8snmp\") pod \"kube-state-metrics-0\" (UID: \"faba7423-9101-4887-9104-69b12739b3a3\") " pod="openstack/kube-state-metrics-0" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.890464 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8snmp\" (UniqueName: \"kubernetes.io/projected/faba7423-9101-4887-9104-69b12739b3a3-kube-api-access-8snmp\") pod \"kube-state-metrics-0\" (UID: \"faba7423-9101-4887-9104-69b12739b3a3\") " pod="openstack/kube-state-metrics-0" Mar 10 11:26:33 crc kubenswrapper[4794]: I0310 11:26:33.961589 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8snmp\" (UniqueName: \"kubernetes.io/projected/faba7423-9101-4887-9104-69b12739b3a3-kube-api-access-8snmp\") pod \"kube-state-metrics-0\" (UID: \"faba7423-9101-4887-9104-69b12739b3a3\") " pod="openstack/kube-state-metrics-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.240968 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.382053 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.384133 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.386671 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-2x4sc" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.386928 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.387060 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.387321 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.387338 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.400852 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.508403 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92c89072-b510-442b-9009-bfb1363a34ef-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.508741 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ql49\" (UniqueName: \"kubernetes.io/projected/92c89072-b510-442b-9009-bfb1363a34ef-kube-api-access-6ql49\") pod 
\"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.508772 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/92c89072-b510-442b-9009-bfb1363a34ef-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.508804 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92c89072-b510-442b-9009-bfb1363a34ef-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.508829 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/92c89072-b510-442b-9009-bfb1363a34ef-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.508916 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92c89072-b510-442b-9009-bfb1363a34ef-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.508945 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/92c89072-b510-442b-9009-bfb1363a34ef-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.522199 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.615107 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92c89072-b510-442b-9009-bfb1363a34ef-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.615176 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/92c89072-b510-442b-9009-bfb1363a34ef-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.615232 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92c89072-b510-442b-9009-bfb1363a34ef-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 
11:26:34.615253 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ql49\" (UniqueName: \"kubernetes.io/projected/92c89072-b510-442b-9009-bfb1363a34ef-kube-api-access-6ql49\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.615279 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/92c89072-b510-442b-9009-bfb1363a34ef-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.615311 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92c89072-b510-442b-9009-bfb1363a34ef-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.615354 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/92c89072-b510-442b-9009-bfb1363a34ef-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.616100 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/92c89072-b510-442b-9009-bfb1363a34ef-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.626828 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92c89072-b510-442b-9009-bfb1363a34ef-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.630860 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/92c89072-b510-442b-9009-bfb1363a34ef-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.632036 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/92c89072-b510-442b-9009-bfb1363a34ef-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.632675 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92c89072-b510-442b-9009-bfb1363a34ef-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.641686 
4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92c89072-b510-442b-9009-bfb1363a34ef-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.647758 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ql49\" (UniqueName: \"kubernetes.io/projected/92c89072-b510-442b-9009-bfb1363a34ef-kube-api-access-6ql49\") pod \"alertmanager-metric-storage-0\" (UID: \"92c89072-b510-442b-9009-bfb1363a34ef\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.752175 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.975450 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.978073 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.983029 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.983174 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.983304 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.983563 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-57rn9" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.983681 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.983773 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.983872 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.992172 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 10 11:26:34 crc kubenswrapper[4794]: I0310 11:26:34.994969 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.001480 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:26:35 crc kubenswrapper[4794]: E0310 11:26:35.001958 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" 
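Two short sketches to make the recurring mechanics above easier to follow. First, the repeated E0310 "back-off 5m0s restarting failed container=machine-config-daemon" entries (11:26:06, 11:26:21, 11:26:35) show the kubelet's crash-loop backoff already saturated at its cap: each sync attempt is rejected while the backoff window is still open. A minimal Go sketch of that policy, assuming the commonly cited kubelet defaults (10s initial delay, doubled per failed restart, capped at 5m); the real implementation lives in the kubelet's backoff helpers and differs in detail:

package main

import (
	"fmt"
	"time"
)

// crashLoopDelays lists successive restart delays under a doubling
// backoff. The constants are assumptions matching the widely cited
// kubelet defaults, not values read from this cluster.
func crashLoopDelays(restarts int) []time.Duration {
	const (
		initialDelay = 10 * time.Second
		maxDelay     = 5 * time.Minute // the "back-off 5m0s" in the log
	)
	delays := make([]time.Duration, 0, restarts)
	d := initialDelay
	for i := 0; i < restarts; i++ {
		delays = append(delays, d)
		d *= 2
		if d > maxDelay {
			d = maxDelay
		}
	}
	return delays
}

func main() {
	// Prints [10s 20s 40s 1m20s 2m40s 5m0s 5m0s 5m0s]: after roughly six
	// failed restarts every further retry waits the full 5m cap, which is
	// why the same CrashLoopBackOff line keeps reappearing in this log.
	fmt.Println(crashLoopDelays(8))
}

Second, the pod_startup_latency_tracker entries can be checked by hand: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling) from that figure. Replaying the auto-csr-approver-29552366-zhtn9 numbers from the 11:26:07 entry under that reading:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied verbatim from the "Observed pod startup
	// duration" entry for auto-csr-approver-29552366-zhtn9.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2026-03-10 11:26:00 +0000 UTC")
	firstPull := parse("2026-03-10 11:26:01.045129682 +0000 UTC")
	lastPull := parse("2026-03-10 11:26:05.989250112 +0000 UTC")
	running := parse("2026-03-10 11:26:07.05832719 +0000 UTC")

	e2e := running.Sub(created)          // podStartE2EDuration  ~= 7.058s
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration ~= 2.114s
	fmt.Println(e2e, slo)
}

The same arithmetic reproduces the 11:26:23 entries for the obo-prometheus-operator and observability-operator pods to within rounding of the monotonic (m=+...) offsets.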
Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.159749 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-14068e82-922c-4e6e-b173-1afb08de01fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-14068e82-922c-4e6e-b173-1afb08de01fe\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.160083 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a1c58e50-939f-4c26-8214-ea21470b3f12-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.160124 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a1c58e50-939f-4c26-8214-ea21470b3f12-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.160149 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a1c58e50-939f-4c26-8214-ea21470b3f12-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.163590 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a1c58e50-939f-4c26-8214-ea21470b3f12-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.163680 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a1c58e50-939f-4c26-8214-ea21470b3f12-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.163725 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a1c58e50-939f-4c26-8214-ea21470b3f12-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.163792 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a1c58e50-939f-4c26-8214-ea21470b3f12-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.163878 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqrvk\" (UniqueName: \"kubernetes.io/projected/a1c58e50-939f-4c26-8214-ea21470b3f12-kube-api-access-gqrvk\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.163945 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a1c58e50-939f-4c26-8214-ea21470b3f12-config\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.317470 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.321362 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-14068e82-922c-4e6e-b173-1afb08de01fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-14068e82-922c-4e6e-b173-1afb08de01fe\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.321471 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a1c58e50-939f-4c26-8214-ea21470b3f12-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.321513 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a1c58e50-939f-4c26-8214-ea21470b3f12-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.321536 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a1c58e50-939f-4c26-8214-ea21470b3f12-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.321562 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a1c58e50-939f-4c26-8214-ea21470b3f12-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.321601 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a1c58e50-939f-4c26-8214-ea21470b3f12-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.321639 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/a1c58e50-939f-4c26-8214-ea21470b3f12-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.321695 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a1c58e50-939f-4c26-8214-ea21470b3f12-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.321761 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqrvk\" (UniqueName: \"kubernetes.io/projected/a1c58e50-939f-4c26-8214-ea21470b3f12-kube-api-access-gqrvk\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.321789 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a1c58e50-939f-4c26-8214-ea21470b3f12-config\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.323068 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a1c58e50-939f-4c26-8214-ea21470b3f12-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.323995 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a1c58e50-939f-4c26-8214-ea21470b3f12-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.324443 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a1c58e50-939f-4c26-8214-ea21470b3f12-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.330907 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a1c58e50-939f-4c26-8214-ea21470b3f12-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.341174 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a1c58e50-939f-4c26-8214-ea21470b3f12-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.343864 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/a1c58e50-939f-4c26-8214-ea21470b3f12-config\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.347995 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a1c58e50-939f-4c26-8214-ea21470b3f12-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.350441 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.350489 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-14068e82-922c-4e6e-b173-1afb08de01fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-14068e82-922c-4e6e-b173-1afb08de01fe\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b6317738df107c1cfa5c582e8ed230baeaf74920028c8a73b4937704e55221f0/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.355107 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a1c58e50-939f-4c26-8214-ea21470b3f12-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.369235 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqrvk\" (UniqueName: \"kubernetes.io/projected/a1c58e50-939f-4c26-8214-ea21470b3f12-kube-api-access-gqrvk\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.572690 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-14068e82-922c-4e6e-b173-1afb08de01fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-14068e82-922c-4e6e-b173-1afb08de01fe\") pod \"prometheus-metric-storage-0\" (UID: \"a1c58e50-939f-4c26-8214-ea21470b3f12\") " pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.594591 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3b5553b5-8c76-4d5d-9501-48df7e9b14d6","Type":"ContainerStarted","Data":"8b1ba465435e00fae3c1cce4dc3e916df154a3b1ae0b3ab8e900f83d631c8587"} Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.594636 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3b5553b5-8c76-4d5d-9501-48df7e9b14d6","Type":"ContainerStarted","Data":"5a0a23c93c8c56d5ba8df509fce7a745beb0e294477146b2d5e823a7f1d52c2f"} Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.607513 4794 generic.go:334] "Generic (PLEG): container finished" podID="9629922b-07aa-4027-a102-5d957b2ca8af" containerID="2eca94f4dde83de73167c8dab6b4d9a2d353616bc86ccdcb0e93d1b10c6e8e36" exitCode=137 Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.612676 4794 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"faba7423-9101-4887-9104-69b12739b3a3","Type":"ContainerStarted","Data":"9eda3e104fb3a8629d8fa383a02c398b6e62f40c4c4f1b293a7dc22f6c53ae10"} Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.623956 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.623940488 podStartE2EDuration="2.623940488s" podCreationTimestamp="2026-03-10 11:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:26:35.621995967 +0000 UTC m=+6144.378166785" watchObservedRunningTime="2026-03-10 11:26:35.623940488 +0000 UTC m=+6144.380111296" Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.666259 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 10 11:26:35 crc kubenswrapper[4794]: I0310 11:26:35.668655 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.387972 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.450914 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.456389 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9629922b-07aa-4027-a102-5d957b2ca8af-openstack-config-secret\") pod \"9629922b-07aa-4027-a102-5d957b2ca8af\" (UID: \"9629922b-07aa-4027-a102-5d957b2ca8af\") " Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.456682 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9629922b-07aa-4027-a102-5d957b2ca8af-openstack-config\") pod \"9629922b-07aa-4027-a102-5d957b2ca8af\" (UID: \"9629922b-07aa-4027-a102-5d957b2ca8af\") " Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.456742 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb54l\" (UniqueName: \"kubernetes.io/projected/9629922b-07aa-4027-a102-5d957b2ca8af-kube-api-access-jb54l\") pod \"9629922b-07aa-4027-a102-5d957b2ca8af\" (UID: \"9629922b-07aa-4027-a102-5d957b2ca8af\") " Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.463589 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9629922b-07aa-4027-a102-5d957b2ca8af-kube-api-access-jb54l" (OuterVolumeSpecName: "kube-api-access-jb54l") pod "9629922b-07aa-4027-a102-5d957b2ca8af" (UID: "9629922b-07aa-4027-a102-5d957b2ca8af"). InnerVolumeSpecName "kube-api-access-jb54l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.484949 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9629922b-07aa-4027-a102-5d957b2ca8af-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9629922b-07aa-4027-a102-5d957b2ca8af" (UID: "9629922b-07aa-4027-a102-5d957b2ca8af"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.512159 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9629922b-07aa-4027-a102-5d957b2ca8af-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9629922b-07aa-4027-a102-5d957b2ca8af" (UID: "9629922b-07aa-4027-a102-5d957b2ca8af"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.561216 4794 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9629922b-07aa-4027-a102-5d957b2ca8af-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.561250 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb54l\" (UniqueName: \"kubernetes.io/projected/9629922b-07aa-4027-a102-5d957b2ca8af-kube-api-access-jb54l\") on node \"crc\" DevicePath \"\"" Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.561259 4794 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9629922b-07aa-4027-a102-5d957b2ca8af-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.629802 4794 scope.go:117] "RemoveContainer" containerID="2eca94f4dde83de73167c8dab6b4d9a2d353616bc86ccdcb0e93d1b10c6e8e36" Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.630059 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.637363 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"92c89072-b510-442b-9009-bfb1363a34ef","Type":"ContainerStarted","Data":"b3a1649d8c18f07950f65511de20e907bfc6b4aff70a11317d5b2c225fe31aff"} Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.641076 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a1c58e50-939f-4c26-8214-ea21470b3f12","Type":"ContainerStarted","Data":"b55c7f396b82d1ab658b0cdb9f82e7720cc1b24db7717474092b0ed36d854063"} Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.644257 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"faba7423-9101-4887-9104-69b12739b3a3","Type":"ContainerStarted","Data":"5df3a260a824d89b81a326b9eb4271c47a12350938ac0ef1979f37a595b05973"} Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.644574 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.675707 4794 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9629922b-07aa-4027-a102-5d957b2ca8af" podUID="3b5553b5-8c76-4d5d-9501-48df7e9b14d6" Mar 10 11:26:36 crc kubenswrapper[4794]: I0310 11:26:36.684186 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.065839429 podStartE2EDuration="3.684143129s" podCreationTimestamp="2026-03-10 11:26:33 +0000 UTC" firstStartedPulling="2026-03-10 11:26:35.162275662 +0000 UTC m=+6143.918446480" lastFinishedPulling="2026-03-10 11:26:35.780579362 +0000 UTC m=+6144.536750180" 
observedRunningTime="2026-03-10 11:26:36.670423642 +0000 UTC m=+6145.426594470" watchObservedRunningTime="2026-03-10 11:26:36.684143129 +0000 UTC m=+6145.440313957" Mar 10 11:26:38 crc kubenswrapper[4794]: I0310 11:26:38.014078 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9629922b-07aa-4027-a102-5d957b2ca8af" path="/var/lib/kubelet/pods/9629922b-07aa-4027-a102-5d957b2ca8af/volumes" Mar 10 11:26:42 crc kubenswrapper[4794]: I0310 11:26:42.740359 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"92c89072-b510-442b-9009-bfb1363a34ef","Type":"ContainerStarted","Data":"b98ffd0761b148bebfe3ad5e4739f4a53b9edf96c79882490aa92811f30e0cbb"} Mar 10 11:26:43 crc kubenswrapper[4794]: I0310 11:26:43.755716 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a1c58e50-939f-4c26-8214-ea21470b3f12","Type":"ContainerStarted","Data":"1e6ba4efe189ce5e8106e3545feeb3ddbbf2fb3af96f2c70c5a2bd711e626c4d"} Mar 10 11:26:44 crc kubenswrapper[4794]: I0310 11:26:44.246797 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 11:26:45 crc kubenswrapper[4794]: I0310 11:26:45.055691 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6f51-account-create-update-fmxtb"] Mar 10 11:26:45 crc kubenswrapper[4794]: I0310 11:26:45.068991 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-789kv"] Mar 10 11:26:45 crc kubenswrapper[4794]: I0310 11:26:45.079713 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-789kv"] Mar 10 11:26:45 crc kubenswrapper[4794]: I0310 11:26:45.090349 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6f51-account-create-update-fmxtb"] Mar 10 11:26:46 crc kubenswrapper[4794]: I0310 11:26:46.035252 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78311413-38d5-422f-8153-57eb3eed4494" path="/var/lib/kubelet/pods/78311413-38d5-422f-8153-57eb3eed4494/volumes" Mar 10 11:26:46 crc kubenswrapper[4794]: I0310 11:26:46.036587 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6" path="/var/lib/kubelet/pods/d43bfe5e-a75c-45e1-b0f0-5d5654ecd8b6/volumes" Mar 10 11:26:48 crc kubenswrapper[4794]: I0310 11:26:48.234724 4794 scope.go:117] "RemoveContainer" containerID="8223a7484a0c224e84cd0f81a8a2a7f0e59bfc770ed59b29bee05e0336459770" Mar 10 11:26:48 crc kubenswrapper[4794]: I0310 11:26:48.277995 4794 scope.go:117] "RemoveContainer" containerID="4878c96a1ee18ad5d4b6a0bd581fdea7d5b56e4a4eddbe56c45938b20143151f" Mar 10 11:26:48 crc kubenswrapper[4794]: I0310 11:26:48.322683 4794 scope.go:117] "RemoveContainer" containerID="5ee176fc2bcf33fedd90521d107fe2ef16ba0f4fe4f2c020bbed70829eace224" Mar 10 11:26:48 crc kubenswrapper[4794]: I0310 11:26:48.377730 4794 scope.go:117] "RemoveContainer" containerID="2a0243d78f2df57b859dc77fe6752ee13edf53e83f10cd4e0616c56259649ee2" Mar 10 11:26:48 crc kubenswrapper[4794]: I0310 11:26:48.423543 4794 scope.go:117] "RemoveContainer" containerID="8b51557da52d041a9e6ea7d9dd6c8758de33c61c9d4e6bb19394f4eeed2df9ad" Mar 10 11:26:48 crc kubenswrapper[4794]: I0310 11:26:48.498179 4794 scope.go:117] "RemoveContainer" containerID="0dbe5a9bbb2cc0e10af62f800ea9466d45a3ab1efbdd5694adf30e7f1e9ed1b9" Mar 10 11:26:48 crc kubenswrapper[4794]: I0310 11:26:48.528901 4794 scope.go:117] 
"RemoveContainer" containerID="0726f8b9b4d8ca49671d811007c1986ff0185fb4312ebb37ee616f1a79a95fdf" Mar 10 11:26:48 crc kubenswrapper[4794]: I0310 11:26:48.608286 4794 scope.go:117] "RemoveContainer" containerID="f6b5ae7b564ee3eb0e9d6fcbfce48d6c75d9bcf011234b05cf4926acb5f8a13d" Mar 10 11:26:48 crc kubenswrapper[4794]: I0310 11:26:48.999497 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:26:49 crc kubenswrapper[4794]: E0310 11:26:48.999978 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:26:50 crc kubenswrapper[4794]: I0310 11:26:50.844217 4794 generic.go:334] "Generic (PLEG): container finished" podID="92c89072-b510-442b-9009-bfb1363a34ef" containerID="b98ffd0761b148bebfe3ad5e4739f4a53b9edf96c79882490aa92811f30e0cbb" exitCode=0 Mar 10 11:26:50 crc kubenswrapper[4794]: I0310 11:26:50.844261 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"92c89072-b510-442b-9009-bfb1363a34ef","Type":"ContainerDied","Data":"b98ffd0761b148bebfe3ad5e4739f4a53b9edf96c79882490aa92811f30e0cbb"} Mar 10 11:26:51 crc kubenswrapper[4794]: I0310 11:26:51.040948 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jgrbz"] Mar 10 11:26:51 crc kubenswrapper[4794]: I0310 11:26:51.054682 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jgrbz"] Mar 10 11:26:52 crc kubenswrapper[4794]: I0310 11:26:52.018387 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5acb60a-9ac5-4cda-ab04-5542bd65e4ad" path="/var/lib/kubelet/pods/a5acb60a-9ac5-4cda-ab04-5542bd65e4ad/volumes" Mar 10 11:26:52 crc kubenswrapper[4794]: I0310 11:26:52.873200 4794 generic.go:334] "Generic (PLEG): container finished" podID="a1c58e50-939f-4c26-8214-ea21470b3f12" containerID="1e6ba4efe189ce5e8106e3545feeb3ddbbf2fb3af96f2c70c5a2bd711e626c4d" exitCode=0 Mar 10 11:26:52 crc kubenswrapper[4794]: I0310 11:26:52.873260 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a1c58e50-939f-4c26-8214-ea21470b3f12","Type":"ContainerDied","Data":"1e6ba4efe189ce5e8106e3545feeb3ddbbf2fb3af96f2c70c5a2bd711e626c4d"} Mar 10 11:26:53 crc kubenswrapper[4794]: I0310 11:26:53.886795 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"92c89072-b510-442b-9009-bfb1363a34ef","Type":"ContainerStarted","Data":"c8ade3678eeca5126b5d68140ae9241fd7a9d5a9baa940153a545c17ed481231"} Mar 10 11:26:59 crc kubenswrapper[4794]: I0310 11:26:59.976164 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"92c89072-b510-442b-9009-bfb1363a34ef","Type":"ContainerStarted","Data":"0e545a7d1bfcd7283fb8b87619403fd7b74bf720d3fdf002a4d6d6a0bf1cd1bb"} Mar 10 11:26:59 crc kubenswrapper[4794]: I0310 11:26:59.976785 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Mar 10 11:26:59 crc kubenswrapper[4794]: I0310 11:26:59.980300 4794 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a1c58e50-939f-4c26-8214-ea21470b3f12","Type":"ContainerStarted","Data":"9e9761070fb74be423b8afb22efb3519241cb2c8bc191b912a32a38cc18af6a0"} Mar 10 11:26:59 crc kubenswrapper[4794]: I0310 11:26:59.982833 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Mar 10 11:27:00 crc kubenswrapper[4794]: I0310 11:27:00.027095 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=8.253422672 podStartE2EDuration="26.027076579s" podCreationTimestamp="2026-03-10 11:26:34 +0000 UTC" firstStartedPulling="2026-03-10 11:26:35.712510754 +0000 UTC m=+6144.468681572" lastFinishedPulling="2026-03-10 11:26:53.486164651 +0000 UTC m=+6162.242335479" observedRunningTime="2026-03-10 11:27:00.010926676 +0000 UTC m=+6168.767097544" watchObservedRunningTime="2026-03-10 11:27:00.027076579 +0000 UTC m=+6168.783247397" Mar 10 11:27:04 crc kubenswrapper[4794]: I0310 11:27:03.999617 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:27:04 crc kubenswrapper[4794]: E0310 11:27:04.000998 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:27:04 crc kubenswrapper[4794]: I0310 11:27:04.009926 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 11:27:04 crc kubenswrapper[4794]: I0310 11:27:04.034857 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a1c58e50-939f-4c26-8214-ea21470b3f12","Type":"ContainerStarted","Data":"38d694907494ea837f9d6c05e010a7eb3c9172bc471f09c496d1cbe89804b9dc"} Mar 10 11:27:07 crc kubenswrapper[4794]: I0310 11:27:07.072943 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a1c58e50-939f-4c26-8214-ea21470b3f12","Type":"ContainerStarted","Data":"268d6b9b2c4305fdeab86b8e4441ac270b2031fedf0ae5785cd826047e8e38c2"} Mar 10 11:27:07 crc kubenswrapper[4794]: I0310 11:27:07.116247 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.945613265 podStartE2EDuration="34.116225157s" podCreationTimestamp="2026-03-10 11:26:33 +0000 UTC" firstStartedPulling="2026-03-10 11:26:36.454346609 +0000 UTC m=+6145.210517427" lastFinishedPulling="2026-03-10 11:27:06.624958511 +0000 UTC m=+6175.381129319" observedRunningTime="2026-03-10 11:27:07.109400275 +0000 UTC m=+6175.865571163" watchObservedRunningTime="2026-03-10 11:27:07.116225157 +0000 UTC m=+6175.872395985" Mar 10 11:27:10 crc kubenswrapper[4794]: I0310 11:27:10.669591 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 10 11:27:13 crc kubenswrapper[4794]: I0310 11:27:13.927440 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:27:13 crc kubenswrapper[4794]: I0310 11:27:13.930509 4794 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 11:27:13 crc kubenswrapper[4794]: I0310 11:27:13.932605 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 11:27:13 crc kubenswrapper[4794]: I0310 11:27:13.933533 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 11:27:13 crc kubenswrapper[4794]: I0310 11:27:13.951133 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.022348 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdh45\" (UniqueName: \"kubernetes.io/projected/88a6b831-1306-46c9-9c99-d31a3078f532-kube-api-access-qdh45\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.022410 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.022445 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-config-data\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.022554 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.022591 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-scripts\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.022616 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88a6b831-1306-46c9-9c99-d31a3078f532-log-httpd\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.022852 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88a6b831-1306-46c9-9c99-d31a3078f532-run-httpd\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.125781 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdh45\" (UniqueName: \"kubernetes.io/projected/88a6b831-1306-46c9-9c99-d31a3078f532-kube-api-access-qdh45\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc 
kubenswrapper[4794]: I0310 11:27:14.125855 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.125885 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-config-data\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.125971 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.126007 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-scripts\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.126030 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88a6b831-1306-46c9-9c99-d31a3078f532-log-httpd\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.126107 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88a6b831-1306-46c9-9c99-d31a3078f532-run-httpd\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.126966 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88a6b831-1306-46c9-9c99-d31a3078f532-log-httpd\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.129889 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88a6b831-1306-46c9-9c99-d31a3078f532-run-httpd\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.132287 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.132522 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.132913 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-scripts\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.139241 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-config-data\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.141966 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdh45\" (UniqueName: \"kubernetes.io/projected/88a6b831-1306-46c9-9c99-d31a3078f532-kube-api-access-qdh45\") pod \"ceilometer-0\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.265495 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 11:27:14 crc kubenswrapper[4794]: I0310 11:27:14.818566 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:27:15 crc kubenswrapper[4794]: I0310 11:27:15.201022 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88a6b831-1306-46c9-9c99-d31a3078f532","Type":"ContainerStarted","Data":"db7d79b6bb5f062912226c84786e46269396cc8487a6f6702c27f4e4275dc6cd"} Mar 10 11:27:15 crc kubenswrapper[4794]: I0310 11:27:15.998727 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:27:15 crc kubenswrapper[4794]: E0310 11:27:15.999373 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:27:16 crc kubenswrapper[4794]: I0310 11:27:16.230502 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88a6b831-1306-46c9-9c99-d31a3078f532","Type":"ContainerStarted","Data":"c03db4107aa78bf9d52cf84d5bccd6df397f0dd26da5c7e91ba3897deb89b9ef"} Mar 10 11:27:17 crc kubenswrapper[4794]: I0310 11:27:17.241880 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88a6b831-1306-46c9-9c99-d31a3078f532","Type":"ContainerStarted","Data":"fc3500b985444ba6a90e62acb7d871d40afaffe0c0430da3f307e3c465a30f48"} Mar 10 11:27:17 crc kubenswrapper[4794]: I0310 11:27:17.242280 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88a6b831-1306-46c9-9c99-d31a3078f532","Type":"ContainerStarted","Data":"b8115589ade0581832b073c52ad0c6ca4a94ecc5eadc47609b95c19ac8c6c806"} Mar 10 11:27:20 crc kubenswrapper[4794]: I0310 11:27:20.290612 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88a6b831-1306-46c9-9c99-d31a3078f532","Type":"ContainerStarted","Data":"b712de111e658d060b3a1b566511693f47f67d861bd8a821ff26e515298623f1"} Mar 10 11:27:20 crc kubenswrapper[4794]: I0310 11:27:20.291562 4794 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 11:27:20 crc kubenswrapper[4794]: I0310 11:27:20.334211 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.114899647 podStartE2EDuration="7.334179521s" podCreationTimestamp="2026-03-10 11:27:13 +0000 UTC" firstStartedPulling="2026-03-10 11:27:14.831238452 +0000 UTC m=+6183.587409270" lastFinishedPulling="2026-03-10 11:27:19.050518326 +0000 UTC m=+6187.806689144" observedRunningTime="2026-03-10 11:27:20.318895365 +0000 UTC m=+6189.075066203" watchObservedRunningTime="2026-03-10 11:27:20.334179521 +0000 UTC m=+6189.090350389" Mar 10 11:27:20 crc kubenswrapper[4794]: I0310 11:27:20.671025 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 10 11:27:20 crc kubenswrapper[4794]: I0310 11:27:20.672746 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 10 11:27:21 crc kubenswrapper[4794]: I0310 11:27:21.303901 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.600433 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-f8bl4"] Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.602596 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f8bl4" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.611558 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-f8bl4"] Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.701187 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/907874fb-cbaf-44b2-8fa6-2fc67601929e-operator-scripts\") pod \"aodh-db-create-f8bl4\" (UID: \"907874fb-cbaf-44b2-8fa6-2fc67601929e\") " pod="openstack/aodh-db-create-f8bl4" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.701355 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x6p5\" (UniqueName: \"kubernetes.io/projected/907874fb-cbaf-44b2-8fa6-2fc67601929e-kube-api-access-4x6p5\") pod \"aodh-db-create-f8bl4\" (UID: \"907874fb-cbaf-44b2-8fa6-2fc67601929e\") " pod="openstack/aodh-db-create-f8bl4" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.711436 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-8c84-account-create-update-r5vsv"] Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.713116 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-8c84-account-create-update-r5vsv" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.723101 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-8c84-account-create-update-r5vsv"] Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.757818 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.804142 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/907874fb-cbaf-44b2-8fa6-2fc67601929e-operator-scripts\") pod \"aodh-db-create-f8bl4\" (UID: \"907874fb-cbaf-44b2-8fa6-2fc67601929e\") " pod="openstack/aodh-db-create-f8bl4" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.804220 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2m6f\" (UniqueName: \"kubernetes.io/projected/c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50-kube-api-access-g2m6f\") pod \"aodh-8c84-account-create-update-r5vsv\" (UID: \"c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50\") " pod="openstack/aodh-8c84-account-create-update-r5vsv" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.804257 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50-operator-scripts\") pod \"aodh-8c84-account-create-update-r5vsv\" (UID: \"c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50\") " pod="openstack/aodh-8c84-account-create-update-r5vsv" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.804601 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x6p5\" (UniqueName: \"kubernetes.io/projected/907874fb-cbaf-44b2-8fa6-2fc67601929e-kube-api-access-4x6p5\") pod \"aodh-db-create-f8bl4\" (UID: \"907874fb-cbaf-44b2-8fa6-2fc67601929e\") " pod="openstack/aodh-db-create-f8bl4" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.804980 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/907874fb-cbaf-44b2-8fa6-2fc67601929e-operator-scripts\") pod \"aodh-db-create-f8bl4\" (UID: \"907874fb-cbaf-44b2-8fa6-2fc67601929e\") " pod="openstack/aodh-db-create-f8bl4" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.824902 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x6p5\" (UniqueName: \"kubernetes.io/projected/907874fb-cbaf-44b2-8fa6-2fc67601929e-kube-api-access-4x6p5\") pod \"aodh-db-create-f8bl4\" (UID: \"907874fb-cbaf-44b2-8fa6-2fc67601929e\") " pod="openstack/aodh-db-create-f8bl4" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.906422 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2m6f\" (UniqueName: \"kubernetes.io/projected/c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50-kube-api-access-g2m6f\") pod \"aodh-8c84-account-create-update-r5vsv\" (UID: \"c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50\") " pod="openstack/aodh-8c84-account-create-update-r5vsv" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.906495 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50-operator-scripts\") pod \"aodh-8c84-account-create-update-r5vsv\" (UID: 
\"c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50\") " pod="openstack/aodh-8c84-account-create-update-r5vsv" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.907221 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50-operator-scripts\") pod \"aodh-8c84-account-create-update-r5vsv\" (UID: \"c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50\") " pod="openstack/aodh-8c84-account-create-update-r5vsv" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.923415 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f8bl4" Mar 10 11:27:25 crc kubenswrapper[4794]: I0310 11:27:25.927646 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2m6f\" (UniqueName: \"kubernetes.io/projected/c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50-kube-api-access-g2m6f\") pod \"aodh-8c84-account-create-update-r5vsv\" (UID: \"c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50\") " pod="openstack/aodh-8c84-account-create-update-r5vsv" Mar 10 11:27:26 crc kubenswrapper[4794]: I0310 11:27:26.071604 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-8c84-account-create-update-r5vsv" Mar 10 11:27:26 crc kubenswrapper[4794]: I0310 11:27:26.415481 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-f8bl4"] Mar 10 11:27:26 crc kubenswrapper[4794]: I0310 11:27:26.611465 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-8c84-account-create-update-r5vsv"] Mar 10 11:27:26 crc kubenswrapper[4794]: W0310 11:27:26.612690 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9a7f680_5d9b_47a8_ba2b_9bdd1ddc8c50.slice/crio-1598a3c0d804344c24b6c9bd38b0edee10902e6df7c612d2dfa682106a8368ad WatchSource:0}: Error finding container 1598a3c0d804344c24b6c9bd38b0edee10902e6df7c612d2dfa682106a8368ad: Status 404 returned error can't find the container with id 1598a3c0d804344c24b6c9bd38b0edee10902e6df7c612d2dfa682106a8368ad Mar 10 11:27:27 crc kubenswrapper[4794]: I0310 11:27:27.372546 4794 generic.go:334] "Generic (PLEG): container finished" podID="907874fb-cbaf-44b2-8fa6-2fc67601929e" containerID="41a112b424ec84b7af820cce3394a5a3a48ff208147e1fcbcfd7e84b4384e475" exitCode=0 Mar 10 11:27:27 crc kubenswrapper[4794]: I0310 11:27:27.372599 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f8bl4" event={"ID":"907874fb-cbaf-44b2-8fa6-2fc67601929e","Type":"ContainerDied","Data":"41a112b424ec84b7af820cce3394a5a3a48ff208147e1fcbcfd7e84b4384e475"} Mar 10 11:27:27 crc kubenswrapper[4794]: I0310 11:27:27.372657 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f8bl4" event={"ID":"907874fb-cbaf-44b2-8fa6-2fc67601929e","Type":"ContainerStarted","Data":"ef989e819e6a9be270a5648e9f5e7aa9761d267ae9cf6dfbda2ea128dc5cae1f"} Mar 10 11:27:27 crc kubenswrapper[4794]: I0310 11:27:27.374900 4794 generic.go:334] "Generic (PLEG): container finished" podID="c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50" containerID="5748208f75481a6d6f6879054091990ba41585f9207f37bf95a510dbb7244b06" exitCode=0 Mar 10 11:27:27 crc kubenswrapper[4794]: I0310 11:27:27.374942 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-8c84-account-create-update-r5vsv" 
event={"ID":"c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50","Type":"ContainerDied","Data":"5748208f75481a6d6f6879054091990ba41585f9207f37bf95a510dbb7244b06"} Mar 10 11:27:27 crc kubenswrapper[4794]: I0310 11:27:27.374964 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-8c84-account-create-update-r5vsv" event={"ID":"c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50","Type":"ContainerStarted","Data":"1598a3c0d804344c24b6c9bd38b0edee10902e6df7c612d2dfa682106a8368ad"} Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.210894 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f8bl4" Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.211836 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-8c84-account-create-update-r5vsv" Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.386622 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x6p5\" (UniqueName: \"kubernetes.io/projected/907874fb-cbaf-44b2-8fa6-2fc67601929e-kube-api-access-4x6p5\") pod \"907874fb-cbaf-44b2-8fa6-2fc67601929e\" (UID: \"907874fb-cbaf-44b2-8fa6-2fc67601929e\") " Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.386892 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50-operator-scripts\") pod \"c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50\" (UID: \"c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50\") " Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.386991 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2m6f\" (UniqueName: \"kubernetes.io/projected/c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50-kube-api-access-g2m6f\") pod \"c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50\" (UID: \"c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50\") " Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.387043 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/907874fb-cbaf-44b2-8fa6-2fc67601929e-operator-scripts\") pod \"907874fb-cbaf-44b2-8fa6-2fc67601929e\" (UID: \"907874fb-cbaf-44b2-8fa6-2fc67601929e\") " Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.388418 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50" (UID: "c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.388459 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/907874fb-cbaf-44b2-8fa6-2fc67601929e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "907874fb-cbaf-44b2-8fa6-2fc67601929e" (UID: "907874fb-cbaf-44b2-8fa6-2fc67601929e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.395686 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50-kube-api-access-g2m6f" (OuterVolumeSpecName: "kube-api-access-g2m6f") pod "c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50" (UID: "c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50"). InnerVolumeSpecName "kube-api-access-g2m6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.396228 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/907874fb-cbaf-44b2-8fa6-2fc67601929e-kube-api-access-4x6p5" (OuterVolumeSpecName: "kube-api-access-4x6p5") pod "907874fb-cbaf-44b2-8fa6-2fc67601929e" (UID: "907874fb-cbaf-44b2-8fa6-2fc67601929e"). InnerVolumeSpecName "kube-api-access-4x6p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.401457 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-8c84-account-create-update-r5vsv" Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.401445 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-8c84-account-create-update-r5vsv" event={"ID":"c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50","Type":"ContainerDied","Data":"1598a3c0d804344c24b6c9bd38b0edee10902e6df7c612d2dfa682106a8368ad"} Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.402316 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1598a3c0d804344c24b6c9bd38b0edee10902e6df7c612d2dfa682106a8368ad" Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.404092 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f8bl4" event={"ID":"907874fb-cbaf-44b2-8fa6-2fc67601929e","Type":"ContainerDied","Data":"ef989e819e6a9be270a5648e9f5e7aa9761d267ae9cf6dfbda2ea128dc5cae1f"} Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.404141 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef989e819e6a9be270a5648e9f5e7aa9761d267ae9cf6dfbda2ea128dc5cae1f" Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.404468 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-f8bl4" Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.489922 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.489968 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2m6f\" (UniqueName: \"kubernetes.io/projected/c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50-kube-api-access-g2m6f\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.489988 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/907874fb-cbaf-44b2-8fa6-2fc67601929e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:29 crc kubenswrapper[4794]: I0310 11:27:29.490008 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x6p5\" (UniqueName: \"kubernetes.io/projected/907874fb-cbaf-44b2-8fa6-2fc67601929e-kube-api-access-4x6p5\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:30 crc kubenswrapper[4794]: I0310 11:27:30.000828 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:27:30 crc kubenswrapper[4794]: E0310 11:27:30.001445 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:27:30 crc kubenswrapper[4794]: I0310 11:27:30.997271 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-lxf6c"] Mar 10 11:27:30 crc kubenswrapper[4794]: E0310 11:27:30.998950 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50" containerName="mariadb-account-create-update" Mar 10 11:27:30 crc kubenswrapper[4794]: I0310 11:27:30.998990 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50" containerName="mariadb-account-create-update" Mar 10 11:27:30 crc kubenswrapper[4794]: E0310 11:27:30.999086 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907874fb-cbaf-44b2-8fa6-2fc67601929e" containerName="mariadb-database-create" Mar 10 11:27:30 crc kubenswrapper[4794]: I0310 11:27:30.999105 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="907874fb-cbaf-44b2-8fa6-2fc67601929e" containerName="mariadb-database-create" Mar 10 11:27:30 crc kubenswrapper[4794]: I0310 11:27:30.999776 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="907874fb-cbaf-44b2-8fa6-2fc67601929e" containerName="mariadb-database-create" Mar 10 11:27:30 crc kubenswrapper[4794]: I0310 11:27:30.999805 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50" containerName="mariadb-account-create-update" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.001004 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.004409 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-jxfxv" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.004499 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.004610 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.004513 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.010731 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-lxf6c"] Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.131816 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-scripts\") pod \"aodh-db-sync-lxf6c\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.131923 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-combined-ca-bundle\") pod \"aodh-db-sync-lxf6c\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.132000 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-config-data\") pod \"aodh-db-sync-lxf6c\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.132022 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6w4p\" (UniqueName: \"kubernetes.io/projected/bcb1116e-257f-487b-91f8-f72da443ced5-kube-api-access-p6w4p\") pod \"aodh-db-sync-lxf6c\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.233595 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-scripts\") pod \"aodh-db-sync-lxf6c\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.233681 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-combined-ca-bundle\") pod \"aodh-db-sync-lxf6c\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.233765 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-config-data\") pod \"aodh-db-sync-lxf6c\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 
11:27:31.233803 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6w4p\" (UniqueName: \"kubernetes.io/projected/bcb1116e-257f-487b-91f8-f72da443ced5-kube-api-access-p6w4p\") pod \"aodh-db-sync-lxf6c\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.239313 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-scripts\") pod \"aodh-db-sync-lxf6c\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.242042 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-config-data\") pod \"aodh-db-sync-lxf6c\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.242682 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-combined-ca-bundle\") pod \"aodh-db-sync-lxf6c\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.248931 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6w4p\" (UniqueName: \"kubernetes.io/projected/bcb1116e-257f-487b-91f8-f72da443ced5-kube-api-access-p6w4p\") pod \"aodh-db-sync-lxf6c\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.334150 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:31 crc kubenswrapper[4794]: I0310 11:27:31.851017 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-lxf6c"] Mar 10 11:27:31 crc kubenswrapper[4794]: W0310 11:27:31.857213 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcb1116e_257f_487b_91f8_f72da443ced5.slice/crio-9a2540b9fadec6fa940fa028eb1269722b8dbc1b7962cd4aa13b30af7630709f WatchSource:0}: Error finding container 9a2540b9fadec6fa940fa028eb1269722b8dbc1b7962cd4aa13b30af7630709f: Status 404 returned error can't find the container with id 9a2540b9fadec6fa940fa028eb1269722b8dbc1b7962cd4aa13b30af7630709f Mar 10 11:27:33 crc kubenswrapper[4794]: I0310 11:27:33.492462 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-lxf6c" event={"ID":"bcb1116e-257f-487b-91f8-f72da443ced5","Type":"ContainerStarted","Data":"9a2540b9fadec6fa940fa028eb1269722b8dbc1b7962cd4aa13b30af7630709f"} Mar 10 11:27:39 crc kubenswrapper[4794]: I0310 11:27:39.567530 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-lxf6c" event={"ID":"bcb1116e-257f-487b-91f8-f72da443ced5","Type":"ContainerStarted","Data":"df4c64b5f42d494a8c18fa9d668b31c1c660e3dc676ec247b706c06bbb972256"} Mar 10 11:27:39 crc kubenswrapper[4794]: I0310 11:27:39.602857 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-lxf6c" podStartSLOduration=2.987361099 podStartE2EDuration="9.602837788s" podCreationTimestamp="2026-03-10 11:27:30 +0000 UTC" firstStartedPulling="2026-03-10 11:27:31.859817452 +0000 UTC m=+6200.615988270" lastFinishedPulling="2026-03-10 11:27:38.475294141 +0000 UTC m=+6207.231464959" observedRunningTime="2026-03-10 11:27:39.594693304 +0000 UTC m=+6208.350864132" watchObservedRunningTime="2026-03-10 11:27:39.602837788 +0000 UTC m=+6208.359008616" Mar 10 11:27:41 crc kubenswrapper[4794]: I0310 11:27:41.599254 4794 generic.go:334] "Generic (PLEG): container finished" podID="bcb1116e-257f-487b-91f8-f72da443ced5" containerID="df4c64b5f42d494a8c18fa9d668b31c1c660e3dc676ec247b706c06bbb972256" exitCode=0 Mar 10 11:27:41 crc kubenswrapper[4794]: I0310 11:27:41.599462 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-lxf6c" event={"ID":"bcb1116e-257f-487b-91f8-f72da443ced5","Type":"ContainerDied","Data":"df4c64b5f42d494a8c18fa9d668b31c1c660e3dc676ec247b706c06bbb972256"} Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.059836 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.175882 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6w4p\" (UniqueName: \"kubernetes.io/projected/bcb1116e-257f-487b-91f8-f72da443ced5-kube-api-access-p6w4p\") pod \"bcb1116e-257f-487b-91f8-f72da443ced5\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.175977 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-combined-ca-bundle\") pod \"bcb1116e-257f-487b-91f8-f72da443ced5\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.176051 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-config-data\") pod \"bcb1116e-257f-487b-91f8-f72da443ced5\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.176125 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-scripts\") pod \"bcb1116e-257f-487b-91f8-f72da443ced5\" (UID: \"bcb1116e-257f-487b-91f8-f72da443ced5\") " Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.181604 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-scripts" (OuterVolumeSpecName: "scripts") pod "bcb1116e-257f-487b-91f8-f72da443ced5" (UID: "bcb1116e-257f-487b-91f8-f72da443ced5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.205628 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb1116e-257f-487b-91f8-f72da443ced5-kube-api-access-p6w4p" (OuterVolumeSpecName: "kube-api-access-p6w4p") pod "bcb1116e-257f-487b-91f8-f72da443ced5" (UID: "bcb1116e-257f-487b-91f8-f72da443ced5"). InnerVolumeSpecName "kube-api-access-p6w4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.249486 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcb1116e-257f-487b-91f8-f72da443ced5" (UID: "bcb1116e-257f-487b-91f8-f72da443ced5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.253272 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-config-data" (OuterVolumeSpecName: "config-data") pod "bcb1116e-257f-487b-91f8-f72da443ced5" (UID: "bcb1116e-257f-487b-91f8-f72da443ced5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.278117 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6w4p\" (UniqueName: \"kubernetes.io/projected/bcb1116e-257f-487b-91f8-f72da443ced5-kube-api-access-p6w4p\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.278152 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.278162 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.278172 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb1116e-257f-487b-91f8-f72da443ced5-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.625966 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-lxf6c" event={"ID":"bcb1116e-257f-487b-91f8-f72da443ced5","Type":"ContainerDied","Data":"9a2540b9fadec6fa940fa028eb1269722b8dbc1b7962cd4aa13b30af7630709f"} Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.626002 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a2540b9fadec6fa940fa028eb1269722b8dbc1b7962cd4aa13b30af7630709f" Mar 10 11:27:43 crc kubenswrapper[4794]: I0310 11:27:43.626042 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-lxf6c" Mar 10 11:27:44 crc kubenswrapper[4794]: I0310 11:27:44.281743 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.000302 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:27:45 crc kubenswrapper[4794]: E0310 11:27:45.000640 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.683074 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 10 11:27:45 crc kubenswrapper[4794]: E0310 11:27:45.683651 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb1116e-257f-487b-91f8-f72da443ced5" containerName="aodh-db-sync" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.683662 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb1116e-257f-487b-91f8-f72da443ced5" containerName="aodh-db-sync" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.683862 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb1116e-257f-487b-91f8-f72da443ced5" containerName="aodh-db-sync" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.685572 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.688397 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.688561 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-jxfxv" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.690117 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.697117 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.836101 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff3182a7-ac39-40fa-a387-f9cc59b2782b-scripts\") pod \"aodh-0\" (UID: \"ff3182a7-ac39-40fa-a387-f9cc59b2782b\") " pod="openstack/aodh-0" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.836164 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3182a7-ac39-40fa-a387-f9cc59b2782b-config-data\") pod \"aodh-0\" (UID: \"ff3182a7-ac39-40fa-a387-f9cc59b2782b\") " pod="openstack/aodh-0" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.836255 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8tn8\" (UniqueName: \"kubernetes.io/projected/ff3182a7-ac39-40fa-a387-f9cc59b2782b-kube-api-access-l8tn8\") pod \"aodh-0\" (UID: \"ff3182a7-ac39-40fa-a387-f9cc59b2782b\") " pod="openstack/aodh-0" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.836360 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3182a7-ac39-40fa-a387-f9cc59b2782b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ff3182a7-ac39-40fa-a387-f9cc59b2782b\") " pod="openstack/aodh-0" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.940002 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8tn8\" (UniqueName: \"kubernetes.io/projected/ff3182a7-ac39-40fa-a387-f9cc59b2782b-kube-api-access-l8tn8\") pod \"aodh-0\" (UID: \"ff3182a7-ac39-40fa-a387-f9cc59b2782b\") " pod="openstack/aodh-0" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.940075 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3182a7-ac39-40fa-a387-f9cc59b2782b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ff3182a7-ac39-40fa-a387-f9cc59b2782b\") " pod="openstack/aodh-0" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.940351 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff3182a7-ac39-40fa-a387-f9cc59b2782b-scripts\") pod \"aodh-0\" (UID: \"ff3182a7-ac39-40fa-a387-f9cc59b2782b\") " pod="openstack/aodh-0" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.940410 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3182a7-ac39-40fa-a387-f9cc59b2782b-config-data\") pod \"aodh-0\" (UID: \"ff3182a7-ac39-40fa-a387-f9cc59b2782b\") " pod="openstack/aodh-0" Mar 10 11:27:45 crc kubenswrapper[4794]: 
I0310 11:27:45.947090 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3182a7-ac39-40fa-a387-f9cc59b2782b-config-data\") pod \"aodh-0\" (UID: \"ff3182a7-ac39-40fa-a387-f9cc59b2782b\") " pod="openstack/aodh-0" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.948110 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3182a7-ac39-40fa-a387-f9cc59b2782b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ff3182a7-ac39-40fa-a387-f9cc59b2782b\") " pod="openstack/aodh-0" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.958225 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff3182a7-ac39-40fa-a387-f9cc59b2782b-scripts\") pod \"aodh-0\" (UID: \"ff3182a7-ac39-40fa-a387-f9cc59b2782b\") " pod="openstack/aodh-0" Mar 10 11:27:45 crc kubenswrapper[4794]: I0310 11:27:45.968605 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8tn8\" (UniqueName: \"kubernetes.io/projected/ff3182a7-ac39-40fa-a387-f9cc59b2782b-kube-api-access-l8tn8\") pod \"aodh-0\" (UID: \"ff3182a7-ac39-40fa-a387-f9cc59b2782b\") " pod="openstack/aodh-0" Mar 10 11:27:46 crc kubenswrapper[4794]: I0310 11:27:46.005647 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 10 11:27:46 crc kubenswrapper[4794]: I0310 11:27:46.488756 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 10 11:27:46 crc kubenswrapper[4794]: I0310 11:27:46.658040 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ff3182a7-ac39-40fa-a387-f9cc59b2782b","Type":"ContainerStarted","Data":"2938be7048105c1c51c5c68a94bbf04b4e8c2e0bbbbba252d8e47ba7020d1eaa"} Mar 10 11:27:47 crc kubenswrapper[4794]: I0310 11:27:47.632405 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:27:47 crc kubenswrapper[4794]: I0310 11:27:47.633017 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="ceilometer-central-agent" containerID="cri-o://c03db4107aa78bf9d52cf84d5bccd6df397f0dd26da5c7e91ba3897deb89b9ef" gracePeriod=30 Mar 10 11:27:47 crc kubenswrapper[4794]: I0310 11:27:47.633090 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="proxy-httpd" containerID="cri-o://b712de111e658d060b3a1b566511693f47f67d861bd8a821ff26e515298623f1" gracePeriod=30 Mar 10 11:27:47 crc kubenswrapper[4794]: I0310 11:27:47.633120 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="sg-core" containerID="cri-o://fc3500b985444ba6a90e62acb7d871d40afaffe0c0430da3f307e3c465a30f48" gracePeriod=30 Mar 10 11:27:47 crc kubenswrapper[4794]: I0310 11:27:47.633112 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="ceilometer-notification-agent" containerID="cri-o://b8115589ade0581832b073c52ad0c6ca4a94ecc5eadc47609b95c19ac8c6c806" gracePeriod=30 Mar 10 11:27:47 crc kubenswrapper[4794]: I0310 11:27:47.668057 4794 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/aodh-0" event={"ID":"ff3182a7-ac39-40fa-a387-f9cc59b2782b","Type":"ContainerStarted","Data":"f12e7dc3ffe4721204c8469b93acbb8594b65f16009f68c1190c3014e6fe65f1"} Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:48.690192 4794 generic.go:334] "Generic (PLEG): container finished" podID="88a6b831-1306-46c9-9c99-d31a3078f532" containerID="b712de111e658d060b3a1b566511693f47f67d861bd8a821ff26e515298623f1" exitCode=0 Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:48.690466 4794 generic.go:334] "Generic (PLEG): container finished" podID="88a6b831-1306-46c9-9c99-d31a3078f532" containerID="fc3500b985444ba6a90e62acb7d871d40afaffe0c0430da3f307e3c465a30f48" exitCode=2 Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:48.690475 4794 generic.go:334] "Generic (PLEG): container finished" podID="88a6b831-1306-46c9-9c99-d31a3078f532" containerID="b8115589ade0581832b073c52ad0c6ca4a94ecc5eadc47609b95c19ac8c6c806" exitCode=0 Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:48.690483 4794 generic.go:334] "Generic (PLEG): container finished" podID="88a6b831-1306-46c9-9c99-d31a3078f532" containerID="c03db4107aa78bf9d52cf84d5bccd6df397f0dd26da5c7e91ba3897deb89b9ef" exitCode=0 Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:48.690277 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88a6b831-1306-46c9-9c99-d31a3078f532","Type":"ContainerDied","Data":"b712de111e658d060b3a1b566511693f47f67d861bd8a821ff26e515298623f1"} Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:48.690523 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88a6b831-1306-46c9-9c99-d31a3078f532","Type":"ContainerDied","Data":"fc3500b985444ba6a90e62acb7d871d40afaffe0c0430da3f307e3c465a30f48"} Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:48.690537 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88a6b831-1306-46c9-9c99-d31a3078f532","Type":"ContainerDied","Data":"b8115589ade0581832b073c52ad0c6ca4a94ecc5eadc47609b95c19ac8c6c806"} Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:48.690547 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88a6b831-1306-46c9-9c99-d31a3078f532","Type":"ContainerDied","Data":"c03db4107aa78bf9d52cf84d5bccd6df397f0dd26da5c7e91ba3897deb89b9ef"} Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:48.876263 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.006204 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-sg-core-conf-yaml\") pod \"88a6b831-1306-46c9-9c99-d31a3078f532\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.006374 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-combined-ca-bundle\") pod \"88a6b831-1306-46c9-9c99-d31a3078f532\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.006444 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88a6b831-1306-46c9-9c99-d31a3078f532-log-httpd\") pod \"88a6b831-1306-46c9-9c99-d31a3078f532\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.006537 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-config-data\") pod \"88a6b831-1306-46c9-9c99-d31a3078f532\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.006598 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-scripts\") pod \"88a6b831-1306-46c9-9c99-d31a3078f532\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.006730 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88a6b831-1306-46c9-9c99-d31a3078f532-run-httpd\") pod \"88a6b831-1306-46c9-9c99-d31a3078f532\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.006760 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdh45\" (UniqueName: \"kubernetes.io/projected/88a6b831-1306-46c9-9c99-d31a3078f532-kube-api-access-qdh45\") pod \"88a6b831-1306-46c9-9c99-d31a3078f532\" (UID: \"88a6b831-1306-46c9-9c99-d31a3078f532\") " Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.007995 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a6b831-1306-46c9-9c99-d31a3078f532-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "88a6b831-1306-46c9-9c99-d31a3078f532" (UID: "88a6b831-1306-46c9-9c99-d31a3078f532"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.008095 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a6b831-1306-46c9-9c99-d31a3078f532-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "88a6b831-1306-46c9-9c99-d31a3078f532" (UID: "88a6b831-1306-46c9-9c99-d31a3078f532"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.014702 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a6b831-1306-46c9-9c99-d31a3078f532-kube-api-access-qdh45" (OuterVolumeSpecName: "kube-api-access-qdh45") pod "88a6b831-1306-46c9-9c99-d31a3078f532" (UID: "88a6b831-1306-46c9-9c99-d31a3078f532"). InnerVolumeSpecName "kube-api-access-qdh45". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.023001 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-scripts" (OuterVolumeSpecName: "scripts") pod "88a6b831-1306-46c9-9c99-d31a3078f532" (UID: "88a6b831-1306-46c9-9c99-d31a3078f532"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.074400 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-k47gx"] Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.084444 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7991-account-create-update-rhvh4"] Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.095766 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-269d-account-create-update-ld4wn"] Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.104507 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "88a6b831-1306-46c9-9c99-d31a3078f532" (UID: "88a6b831-1306-46c9-9c99-d31a3078f532"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.106372 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xb889"] Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.108799 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88a6b831-1306-46c9-9c99-d31a3078f532-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.108815 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.108823 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88a6b831-1306-46c9-9c99-d31a3078f532-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.108831 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdh45\" (UniqueName: \"kubernetes.io/projected/88a6b831-1306-46c9-9c99-d31a3078f532-kube-api-access-qdh45\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.108843 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.116742 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c39c-account-create-update-4gr8p"] Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.120493 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88a6b831-1306-46c9-9c99-d31a3078f532" (UID: "88a6b831-1306-46c9-9c99-d31a3078f532"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.128224 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-k47gx"] Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.139135 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-269d-account-create-update-ld4wn"] Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.151735 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7991-account-create-update-rhvh4"] Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.165776 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xb889"] Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.169462 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-config-data" (OuterVolumeSpecName: "config-data") pod "88a6b831-1306-46c9-9c99-d31a3078f532" (UID: "88a6b831-1306-46c9-9c99-d31a3078f532"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.177629 4794 scope.go:117] "RemoveContainer" containerID="3aa8acc367871bb87e19d6efb9c07640890b28e221aa9bf63add21509165b115" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.186003 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c39c-account-create-update-4gr8p"] Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.210661 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.210693 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a6b831-1306-46c9-9c99-d31a3078f532-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.705897 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88a6b831-1306-46c9-9c99-d31a3078f532","Type":"ContainerDied","Data":"db7d79b6bb5f062912226c84786e46269396cc8487a6f6702c27f4e4275dc6cd"} Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.706265 4794 scope.go:117] "RemoveContainer" containerID="b712de111e658d060b3a1b566511693f47f67d861bd8a821ff26e515298623f1" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.706007 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.720960 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ff3182a7-ac39-40fa-a387-f9cc59b2782b","Type":"ContainerStarted","Data":"db8349deb89998a8e3d824b7d0e3c86f7cef807f8ed12201ad75381da3870634"} Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.760727 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.779380 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.795807 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:27:49 crc kubenswrapper[4794]: E0310 11:27:49.796386 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="ceilometer-central-agent" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.796408 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="ceilometer-central-agent" Mar 10 11:27:49 crc kubenswrapper[4794]: E0310 11:27:49.796430 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="proxy-httpd" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.796439 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="proxy-httpd" Mar 10 11:27:49 crc kubenswrapper[4794]: E0310 11:27:49.796459 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="sg-core" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.796467 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="sg-core" Mar 10 11:27:49 crc kubenswrapper[4794]: 
E0310 11:27:49.796484 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="ceilometer-notification-agent" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.796494 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="ceilometer-notification-agent" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.796751 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="ceilometer-central-agent" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.796781 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="sg-core" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.796802 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="proxy-httpd" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.796817 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" containerName="ceilometer-notification-agent" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.799197 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.803931 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.806024 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.810199 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.925273 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e925561f-cfd5-420a-930e-0d1fa394fdd5-run-httpd\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.925323 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjsqq\" (UniqueName: \"kubernetes.io/projected/e925561f-cfd5-420a-930e-0d1fa394fdd5-kube-api-access-kjsqq\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.925375 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.925933 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-scripts\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.926024 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e925561f-cfd5-420a-930e-0d1fa394fdd5-log-httpd\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.926147 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.926178 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-config-data\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:49 crc kubenswrapper[4794]: I0310 11:27:49.996106 4794 scope.go:117] "RemoveContainer" containerID="fc3500b985444ba6a90e62acb7d871d40afaffe0c0430da3f307e3c465a30f48" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.027523 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d" path="/var/lib/kubelet/pods/1e45f8c3-9a9c-4446-9d3b-0a3ca2a3959d/volumes" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.031104 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e925561f-cfd5-420a-930e-0d1fa394fdd5-run-httpd\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.031287 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjsqq\" (UniqueName: \"kubernetes.io/projected/e925561f-cfd5-420a-930e-0d1fa394fdd5-kube-api-access-kjsqq\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.031417 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.031591 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-scripts\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.031690 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e925561f-cfd5-420a-930e-0d1fa394fdd5-log-httpd\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.031837 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.031941 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-config-data\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.032076 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e925561f-cfd5-420a-930e-0d1fa394fdd5-log-httpd\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.031599 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d995cba-d0f5-433c-bb91-a7b20e3a055d" path="/var/lib/kubelet/pods/3d995cba-d0f5-433c-bb91-a7b20e3a055d/volumes" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.031613 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e925561f-cfd5-420a-930e-0d1fa394fdd5-run-httpd\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.033896 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72480687-9623-4a72-9714-16a4ac7143f2" path="/var/lib/kubelet/pods/72480687-9623-4a72-9714-16a4ac7143f2/volumes" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.035001 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88a6b831-1306-46c9-9c99-d31a3078f532" path="/var/lib/kubelet/pods/88a6b831-1306-46c9-9c99-d31a3078f532/volumes" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.035997 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d48c817c-bed9-4483-b4af-83b6cf201c8e" path="/var/lib/kubelet/pods/d48c817c-bed9-4483-b4af-83b6cf201c8e/volumes" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.037504 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8d7a332-73d9-4294-9cdd-5c9fde7561bc" path="/var/lib/kubelet/pods/f8d7a332-73d9-4294-9cdd-5c9fde7561bc/volumes" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.038194 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-j9xxz"] Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.041031 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.041239 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-scripts\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.042025 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-config-data\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.045135 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.046379 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjsqq\" (UniqueName: \"kubernetes.io/projected/e925561f-cfd5-420a-930e-0d1fa394fdd5-kube-api-access-kjsqq\") pod \"ceilometer-0\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.051885 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-j9xxz"] Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.073555 4794 scope.go:117] "RemoveContainer" containerID="b8115589ade0581832b073c52ad0c6ca4a94ecc5eadc47609b95c19ac8c6c806" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.123164 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.245246 4794 scope.go:117] "RemoveContainer" containerID="c03db4107aa78bf9d52cf84d5bccd6df397f0dd26da5c7e91ba3897deb89b9ef" Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.680033 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.730827 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e925561f-cfd5-420a-930e-0d1fa394fdd5","Type":"ContainerStarted","Data":"b66b14c2a2b0e86b35412f44bf85726ca1b3b46c4faf6e74a353e0cb6385d7b5"} Mar 10 11:27:50 crc kubenswrapper[4794]: I0310 11:27:50.734123 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ff3182a7-ac39-40fa-a387-f9cc59b2782b","Type":"ContainerStarted","Data":"b3a3bd38b232854a027473d10c134b1abb4332c5ded6ea05be07d8623b780424"} Mar 10 11:27:51 crc kubenswrapper[4794]: I0310 11:27:51.751671 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e925561f-cfd5-420a-930e-0d1fa394fdd5","Type":"ContainerStarted","Data":"db7bdc3f7a7ea3537cd2920286083f1bd31c0533725f3aced43ff828d043bc49"} Mar 10 11:27:52 crc kubenswrapper[4794]: I0310 11:27:52.012061 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a3b4ff-cf31-4fa5-8482-17824a6b6d6f" path="/var/lib/kubelet/pods/55a3b4ff-cf31-4fa5-8482-17824a6b6d6f/volumes" Mar 10 11:27:52 crc kubenswrapper[4794]: I0310 11:27:52.764669 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e925561f-cfd5-420a-930e-0d1fa394fdd5","Type":"ContainerStarted","Data":"8877b6ec2312b89f77411d26fbb5826f3b009f8d842a72c58142e45ab7103ecf"} Mar 10 11:27:52 crc kubenswrapper[4794]: I0310 11:27:52.767383 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ff3182a7-ac39-40fa-a387-f9cc59b2782b","Type":"ContainerStarted","Data":"0e11cafb92eb4d98aa95353ee43058a4bcbf063990da8f4427bbf896c4a418e0"} Mar 10 11:27:52 crc kubenswrapper[4794]: I0310 11:27:52.800214 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.619811139 podStartE2EDuration="7.80019396s" podCreationTimestamp="2026-03-10 11:27:45 +0000 UTC" firstStartedPulling="2026-03-10 11:27:46.49412868 +0000 UTC m=+6215.250299498" lastFinishedPulling="2026-03-10 11:27:51.674511501 +0000 UTC 
m=+6220.430682319" observedRunningTime="2026-03-10 11:27:52.787273388 +0000 UTC m=+6221.543444216" watchObservedRunningTime="2026-03-10 11:27:52.80019396 +0000 UTC m=+6221.556364778" Mar 10 11:27:53 crc kubenswrapper[4794]: I0310 11:27:53.780449 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e925561f-cfd5-420a-930e-0d1fa394fdd5","Type":"ContainerStarted","Data":"91ddf3c1e2942abe31d064fcf7f93a62e2c9df09bb01e7057a5785ea96eb7eac"} Mar 10 11:27:55 crc kubenswrapper[4794]: I0310 11:27:55.802966 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e925561f-cfd5-420a-930e-0d1fa394fdd5","Type":"ContainerStarted","Data":"42bdfec4747abe5f78a64f9efecf2a66a80d1894083a349e2891d50996c451d3"} Mar 10 11:27:55 crc kubenswrapper[4794]: I0310 11:27:55.803549 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.000817 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:27:58 crc kubenswrapper[4794]: E0310 11:27:58.002118 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.506493 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.914427482 podStartE2EDuration="9.506471397s" podCreationTimestamp="2026-03-10 11:27:49 +0000 UTC" firstStartedPulling="2026-03-10 11:27:50.669497637 +0000 UTC m=+6219.425668455" lastFinishedPulling="2026-03-10 11:27:55.261541552 +0000 UTC m=+6224.017712370" observedRunningTime="2026-03-10 11:27:55.839525457 +0000 UTC m=+6224.595696275" watchObservedRunningTime="2026-03-10 11:27:58.506471397 +0000 UTC m=+6227.262642225" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.514409 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-gspnw"] Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.516112 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-gspnw" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.535198 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-gspnw"] Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.582042 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f15d67e-f384-4070-806e-6a213e97b3d9-operator-scripts\") pod \"manila-db-create-gspnw\" (UID: \"3f15d67e-f384-4070-806e-6a213e97b3d9\") " pod="openstack/manila-db-create-gspnw" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.582143 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8hx4\" (UniqueName: \"kubernetes.io/projected/3f15d67e-f384-4070-806e-6a213e97b3d9-kube-api-access-x8hx4\") pod \"manila-db-create-gspnw\" (UID: \"3f15d67e-f384-4070-806e-6a213e97b3d9\") " pod="openstack/manila-db-create-gspnw" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.683640 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8hx4\" (UniqueName: \"kubernetes.io/projected/3f15d67e-f384-4070-806e-6a213e97b3d9-kube-api-access-x8hx4\") pod \"manila-db-create-gspnw\" (UID: \"3f15d67e-f384-4070-806e-6a213e97b3d9\") " pod="openstack/manila-db-create-gspnw" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.683840 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f15d67e-f384-4070-806e-6a213e97b3d9-operator-scripts\") pod \"manila-db-create-gspnw\" (UID: \"3f15d67e-f384-4070-806e-6a213e97b3d9\") " pod="openstack/manila-db-create-gspnw" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.684602 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f15d67e-f384-4070-806e-6a213e97b3d9-operator-scripts\") pod \"manila-db-create-gspnw\" (UID: \"3f15d67e-f384-4070-806e-6a213e97b3d9\") " pod="openstack/manila-db-create-gspnw" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.713729 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8hx4\" (UniqueName: \"kubernetes.io/projected/3f15d67e-f384-4070-806e-6a213e97b3d9-kube-api-access-x8hx4\") pod \"manila-db-create-gspnw\" (UID: \"3f15d67e-f384-4070-806e-6a213e97b3d9\") " pod="openstack/manila-db-create-gspnw" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.730216 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-1caa-account-create-update-8zbdf"] Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.731633 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-1caa-account-create-update-8zbdf" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.733700 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.747297 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-1caa-account-create-update-8zbdf"] Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.785994 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw5kb\" (UniqueName: \"kubernetes.io/projected/a835411b-024f-4d3b-a8ae-dcba59606ca6-kube-api-access-nw5kb\") pod \"manila-1caa-account-create-update-8zbdf\" (UID: \"a835411b-024f-4d3b-a8ae-dcba59606ca6\") " pod="openstack/manila-1caa-account-create-update-8zbdf" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.786188 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a835411b-024f-4d3b-a8ae-dcba59606ca6-operator-scripts\") pod \"manila-1caa-account-create-update-8zbdf\" (UID: \"a835411b-024f-4d3b-a8ae-dcba59606ca6\") " pod="openstack/manila-1caa-account-create-update-8zbdf" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.887843 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a835411b-024f-4d3b-a8ae-dcba59606ca6-operator-scripts\") pod \"manila-1caa-account-create-update-8zbdf\" (UID: \"a835411b-024f-4d3b-a8ae-dcba59606ca6\") " pod="openstack/manila-1caa-account-create-update-8zbdf" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.888023 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw5kb\" (UniqueName: \"kubernetes.io/projected/a835411b-024f-4d3b-a8ae-dcba59606ca6-kube-api-access-nw5kb\") pod \"manila-1caa-account-create-update-8zbdf\" (UID: \"a835411b-024f-4d3b-a8ae-dcba59606ca6\") " pod="openstack/manila-1caa-account-create-update-8zbdf" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.888815 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a835411b-024f-4d3b-a8ae-dcba59606ca6-operator-scripts\") pod \"manila-1caa-account-create-update-8zbdf\" (UID: \"a835411b-024f-4d3b-a8ae-dcba59606ca6\") " pod="openstack/manila-1caa-account-create-update-8zbdf" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.891160 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-gspnw" Mar 10 11:27:58 crc kubenswrapper[4794]: I0310 11:27:58.917527 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw5kb\" (UniqueName: \"kubernetes.io/projected/a835411b-024f-4d3b-a8ae-dcba59606ca6-kube-api-access-nw5kb\") pod \"manila-1caa-account-create-update-8zbdf\" (UID: \"a835411b-024f-4d3b-a8ae-dcba59606ca6\") " pod="openstack/manila-1caa-account-create-update-8zbdf" Mar 10 11:27:59 crc kubenswrapper[4794]: I0310 11:27:59.100509 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-1caa-account-create-update-8zbdf" Mar 10 11:27:59 crc kubenswrapper[4794]: I0310 11:27:59.485764 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-gspnw"] Mar 10 11:27:59 crc kubenswrapper[4794]: I0310 11:27:59.831051 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-1caa-account-create-update-8zbdf"] Mar 10 11:27:59 crc kubenswrapper[4794]: W0310 11:27:59.843715 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda835411b_024f_4d3b_a8ae_dcba59606ca6.slice/crio-d8174872e90ecb9264a93ad47694a776503e1930f44aacbd867c011a8f193485 WatchSource:0}: Error finding container d8174872e90ecb9264a93ad47694a776503e1930f44aacbd867c011a8f193485: Status 404 returned error can't find the container with id d8174872e90ecb9264a93ad47694a776503e1930f44aacbd867c011a8f193485 Mar 10 11:27:59 crc kubenswrapper[4794]: I0310 11:27:59.855186 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1caa-account-create-update-8zbdf" event={"ID":"a835411b-024f-4d3b-a8ae-dcba59606ca6","Type":"ContainerStarted","Data":"d8174872e90ecb9264a93ad47694a776503e1930f44aacbd867c011a8f193485"} Mar 10 11:27:59 crc kubenswrapper[4794]: I0310 11:27:59.857078 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-gspnw" event={"ID":"3f15d67e-f384-4070-806e-6a213e97b3d9","Type":"ContainerStarted","Data":"7de966b2e69162def51fa7897e8a587d06aedd23363e5d0a71598dabf97a75d3"} Mar 10 11:27:59 crc kubenswrapper[4794]: I0310 11:27:59.857102 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-gspnw" event={"ID":"3f15d67e-f384-4070-806e-6a213e97b3d9","Type":"ContainerStarted","Data":"e45dc8d842409b0829520d957e23760967297dc1781f12939ecff3d8882bc6b7"} Mar 10 11:27:59 crc kubenswrapper[4794]: I0310 11:27:59.879369 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-gspnw" podStartSLOduration=1.879350637 podStartE2EDuration="1.879350637s" podCreationTimestamp="2026-03-10 11:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:27:59.871926276 +0000 UTC m=+6228.628097094" watchObservedRunningTime="2026-03-10 11:27:59.879350637 +0000 UTC m=+6228.635521455" Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.036666 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-d82lf"] Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.046182 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-d82lf"] Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.177541 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552368-98mgx"] Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.179591 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552368-98mgx" Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.185924 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.186202 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.186343 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.186236 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552368-98mgx"] Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.234090 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2qm7\" (UniqueName: \"kubernetes.io/projected/deafc0de-3f00-45c6-b5d7-cea8a3a66ce7-kube-api-access-c2qm7\") pod \"auto-csr-approver-29552368-98mgx\" (UID: \"deafc0de-3f00-45c6-b5d7-cea8a3a66ce7\") " pod="openshift-infra/auto-csr-approver-29552368-98mgx" Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.335947 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2qm7\" (UniqueName: \"kubernetes.io/projected/deafc0de-3f00-45c6-b5d7-cea8a3a66ce7-kube-api-access-c2qm7\") pod \"auto-csr-approver-29552368-98mgx\" (UID: \"deafc0de-3f00-45c6-b5d7-cea8a3a66ce7\") " pod="openshift-infra/auto-csr-approver-29552368-98mgx" Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.354362 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2qm7\" (UniqueName: \"kubernetes.io/projected/deafc0de-3f00-45c6-b5d7-cea8a3a66ce7-kube-api-access-c2qm7\") pod \"auto-csr-approver-29552368-98mgx\" (UID: \"deafc0de-3f00-45c6-b5d7-cea8a3a66ce7\") " pod="openshift-infra/auto-csr-approver-29552368-98mgx" Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.523514 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552368-98mgx" Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.869372 4794 generic.go:334] "Generic (PLEG): container finished" podID="a835411b-024f-4d3b-a8ae-dcba59606ca6" containerID="2fed274478277fb3a33d47601fb13570f7076a27f2eeb0cf6854d67b70670990" exitCode=0 Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.869478 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1caa-account-create-update-8zbdf" event={"ID":"a835411b-024f-4d3b-a8ae-dcba59606ca6","Type":"ContainerDied","Data":"2fed274478277fb3a33d47601fb13570f7076a27f2eeb0cf6854d67b70670990"} Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.872033 4794 generic.go:334] "Generic (PLEG): container finished" podID="3f15d67e-f384-4070-806e-6a213e97b3d9" containerID="7de966b2e69162def51fa7897e8a587d06aedd23363e5d0a71598dabf97a75d3" exitCode=0 Mar 10 11:28:00 crc kubenswrapper[4794]: I0310 11:28:00.872059 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-gspnw" event={"ID":"3f15d67e-f384-4070-806e-6a213e97b3d9","Type":"ContainerDied","Data":"7de966b2e69162def51fa7897e8a587d06aedd23363e5d0a71598dabf97a75d3"} Mar 10 11:28:00 crc kubenswrapper[4794]: W0310 11:28:00.996293 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeafc0de_3f00_45c6_b5d7_cea8a3a66ce7.slice/crio-f0ffe724e5bd58ce14e9bc1998f5152892d995b66048904aeeefa6cca748ede9 WatchSource:0}: Error finding container f0ffe724e5bd58ce14e9bc1998f5152892d995b66048904aeeefa6cca748ede9: Status 404 returned error can't find the container with id f0ffe724e5bd58ce14e9bc1998f5152892d995b66048904aeeefa6cca748ede9 Mar 10 11:28:01 crc kubenswrapper[4794]: I0310 11:28:01.005031 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552368-98mgx"] Mar 10 11:28:01 crc kubenswrapper[4794]: I0310 11:28:01.886507 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552368-98mgx" event={"ID":"deafc0de-3f00-45c6-b5d7-cea8a3a66ce7","Type":"ContainerStarted","Data":"f0ffe724e5bd58ce14e9bc1998f5152892d995b66048904aeeefa6cca748ede9"} Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.013891 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb00adea-5b3d-4e1e-a32c-b9378e9aa75e" path="/var/lib/kubelet/pods/bb00adea-5b3d-4e1e-a32c-b9378e9aa75e/volumes" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.510247 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1caa-account-create-update-8zbdf" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.540903 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-gspnw" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.611493 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw5kb\" (UniqueName: \"kubernetes.io/projected/a835411b-024f-4d3b-a8ae-dcba59606ca6-kube-api-access-nw5kb\") pod \"a835411b-024f-4d3b-a8ae-dcba59606ca6\" (UID: \"a835411b-024f-4d3b-a8ae-dcba59606ca6\") " Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.611586 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f15d67e-f384-4070-806e-6a213e97b3d9-operator-scripts\") pod \"3f15d67e-f384-4070-806e-6a213e97b3d9\" (UID: \"3f15d67e-f384-4070-806e-6a213e97b3d9\") " Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.611635 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a835411b-024f-4d3b-a8ae-dcba59606ca6-operator-scripts\") pod \"a835411b-024f-4d3b-a8ae-dcba59606ca6\" (UID: \"a835411b-024f-4d3b-a8ae-dcba59606ca6\") " Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.611697 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8hx4\" (UniqueName: \"kubernetes.io/projected/3f15d67e-f384-4070-806e-6a213e97b3d9-kube-api-access-x8hx4\") pod \"3f15d67e-f384-4070-806e-6a213e97b3d9\" (UID: \"3f15d67e-f384-4070-806e-6a213e97b3d9\") " Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.612440 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f15d67e-f384-4070-806e-6a213e97b3d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f15d67e-f384-4070-806e-6a213e97b3d9" (UID: "3f15d67e-f384-4070-806e-6a213e97b3d9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.612878 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a835411b-024f-4d3b-a8ae-dcba59606ca6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a835411b-024f-4d3b-a8ae-dcba59606ca6" (UID: "a835411b-024f-4d3b-a8ae-dcba59606ca6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.616773 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f15d67e-f384-4070-806e-6a213e97b3d9-kube-api-access-x8hx4" (OuterVolumeSpecName: "kube-api-access-x8hx4") pod "3f15d67e-f384-4070-806e-6a213e97b3d9" (UID: "3f15d67e-f384-4070-806e-6a213e97b3d9"). InnerVolumeSpecName "kube-api-access-x8hx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.616815 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a835411b-024f-4d3b-a8ae-dcba59606ca6-kube-api-access-nw5kb" (OuterVolumeSpecName: "kube-api-access-nw5kb") pod "a835411b-024f-4d3b-a8ae-dcba59606ca6" (UID: "a835411b-024f-4d3b-a8ae-dcba59606ca6"). InnerVolumeSpecName "kube-api-access-nw5kb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.713661 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw5kb\" (UniqueName: \"kubernetes.io/projected/a835411b-024f-4d3b-a8ae-dcba59606ca6-kube-api-access-nw5kb\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.713718 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f15d67e-f384-4070-806e-6a213e97b3d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.713728 4794 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a835411b-024f-4d3b-a8ae-dcba59606ca6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.713737 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8hx4\" (UniqueName: \"kubernetes.io/projected/3f15d67e-f384-4070-806e-6a213e97b3d9-kube-api-access-x8hx4\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.900063 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1caa-account-create-update-8zbdf" event={"ID":"a835411b-024f-4d3b-a8ae-dcba59606ca6","Type":"ContainerDied","Data":"d8174872e90ecb9264a93ad47694a776503e1930f44aacbd867c011a8f193485"} Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.900119 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8174872e90ecb9264a93ad47694a776503e1930f44aacbd867c011a8f193485" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.900134 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1caa-account-create-update-8zbdf" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.903269 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552368-98mgx" event={"ID":"deafc0de-3f00-45c6-b5d7-cea8a3a66ce7","Type":"ContainerStarted","Data":"1075b91d496197f6e02538e923217d308f6944a7353740d9f4d4fc912b3a1597"} Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.909878 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-gspnw" event={"ID":"3f15d67e-f384-4070-806e-6a213e97b3d9","Type":"ContainerDied","Data":"e45dc8d842409b0829520d957e23760967297dc1781f12939ecff3d8882bc6b7"} Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.909919 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e45dc8d842409b0829520d957e23760967297dc1781f12939ecff3d8882bc6b7" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.909978 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-gspnw" Mar 10 11:28:02 crc kubenswrapper[4794]: I0310 11:28:02.948067 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552368-98mgx" podStartSLOduration=1.867179815 podStartE2EDuration="2.948039648s" podCreationTimestamp="2026-03-10 11:28:00 +0000 UTC" firstStartedPulling="2026-03-10 11:28:00.998866595 +0000 UTC m=+6229.755037413" lastFinishedPulling="2026-03-10 11:28:02.079726428 +0000 UTC m=+6230.835897246" observedRunningTime="2026-03-10 11:28:02.92528378 +0000 UTC m=+6231.681454638" watchObservedRunningTime="2026-03-10 11:28:02.948039648 +0000 UTC m=+6231.704210496" Mar 10 11:28:03 crc kubenswrapper[4794]: I0310 11:28:03.920248 4794 generic.go:334] "Generic (PLEG): container finished" podID="deafc0de-3f00-45c6-b5d7-cea8a3a66ce7" containerID="1075b91d496197f6e02538e923217d308f6944a7353740d9f4d4fc912b3a1597" exitCode=0 Mar 10 11:28:03 crc kubenswrapper[4794]: I0310 11:28:03.920355 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552368-98mgx" event={"ID":"deafc0de-3f00-45c6-b5d7-cea8a3a66ce7","Type":"ContainerDied","Data":"1075b91d496197f6e02538e923217d308f6944a7353740d9f4d4fc912b3a1597"} Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.197885 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-gdd98"] Mar 10 11:28:04 crc kubenswrapper[4794]: E0310 11:28:04.198252 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f15d67e-f384-4070-806e-6a213e97b3d9" containerName="mariadb-database-create" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.198267 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f15d67e-f384-4070-806e-6a213e97b3d9" containerName="mariadb-database-create" Mar 10 11:28:04 crc kubenswrapper[4794]: E0310 11:28:04.198284 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a835411b-024f-4d3b-a8ae-dcba59606ca6" containerName="mariadb-account-create-update" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.198290 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a835411b-024f-4d3b-a8ae-dcba59606ca6" containerName="mariadb-account-create-update" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.198503 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f15d67e-f384-4070-806e-6a213e97b3d9" containerName="mariadb-database-create" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.198522 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a835411b-024f-4d3b-a8ae-dcba59606ca6" containerName="mariadb-account-create-update" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.199152 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.201845 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.203263 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-nttq2" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.218054 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-gdd98"] Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.352774 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-combined-ca-bundle\") pod \"manila-db-sync-gdd98\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.352821 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btlc6\" (UniqueName: \"kubernetes.io/projected/31990e7b-b9b5-408f-aa66-e066c7b58fd4-kube-api-access-btlc6\") pod \"manila-db-sync-gdd98\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.353188 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-job-config-data\") pod \"manila-db-sync-gdd98\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.353738 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-config-data\") pod \"manila-db-sync-gdd98\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.455305 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-job-config-data\") pod \"manila-db-sync-gdd98\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.455886 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-config-data\") pod \"manila-db-sync-gdd98\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.455967 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-combined-ca-bundle\") pod \"manila-db-sync-gdd98\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.456001 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btlc6\" (UniqueName: \"kubernetes.io/projected/31990e7b-b9b5-408f-aa66-e066c7b58fd4-kube-api-access-btlc6\") pod \"manila-db-sync-gdd98\" (UID: 
\"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.462747 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-config-data\") pod \"manila-db-sync-gdd98\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.462946 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-combined-ca-bundle\") pod \"manila-db-sync-gdd98\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.478473 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-job-config-data\") pod \"manila-db-sync-gdd98\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.490709 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btlc6\" (UniqueName: \"kubernetes.io/projected/31990e7b-b9b5-408f-aa66-e066c7b58fd4-kube-api-access-btlc6\") pod \"manila-db-sync-gdd98\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:04 crc kubenswrapper[4794]: I0310 11:28:04.517918 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:05 crc kubenswrapper[4794]: I0310 11:28:05.317258 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552368-98mgx" Mar 10 11:28:05 crc kubenswrapper[4794]: I0310 11:28:05.481732 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2qm7\" (UniqueName: \"kubernetes.io/projected/deafc0de-3f00-45c6-b5d7-cea8a3a66ce7-kube-api-access-c2qm7\") pod \"deafc0de-3f00-45c6-b5d7-cea8a3a66ce7\" (UID: \"deafc0de-3f00-45c6-b5d7-cea8a3a66ce7\") " Mar 10 11:28:05 crc kubenswrapper[4794]: I0310 11:28:05.493463 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deafc0de-3f00-45c6-b5d7-cea8a3a66ce7-kube-api-access-c2qm7" (OuterVolumeSpecName: "kube-api-access-c2qm7") pod "deafc0de-3f00-45c6-b5d7-cea8a3a66ce7" (UID: "deafc0de-3f00-45c6-b5d7-cea8a3a66ce7"). InnerVolumeSpecName "kube-api-access-c2qm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:28:05 crc kubenswrapper[4794]: I0310 11:28:05.497090 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-gdd98"] Mar 10 11:28:05 crc kubenswrapper[4794]: W0310 11:28:05.502984 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31990e7b_b9b5_408f_aa66_e066c7b58fd4.slice/crio-3bf6fcd8eb8e8c48269afddb45469f0d366b2d5a6c95c105becfbf9d9ea27ebd WatchSource:0}: Error finding container 3bf6fcd8eb8e8c48269afddb45469f0d366b2d5a6c95c105becfbf9d9ea27ebd: Status 404 returned error can't find the container with id 3bf6fcd8eb8e8c48269afddb45469f0d366b2d5a6c95c105becfbf9d9ea27ebd Mar 10 11:28:05 crc kubenswrapper[4794]: I0310 11:28:05.584350 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2qm7\" (UniqueName: \"kubernetes.io/projected/deafc0de-3f00-45c6-b5d7-cea8a3a66ce7-kube-api-access-c2qm7\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:05 crc kubenswrapper[4794]: I0310 11:28:05.947850 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552368-98mgx" event={"ID":"deafc0de-3f00-45c6-b5d7-cea8a3a66ce7","Type":"ContainerDied","Data":"f0ffe724e5bd58ce14e9bc1998f5152892d995b66048904aeeefa6cca748ede9"} Mar 10 11:28:05 crc kubenswrapper[4794]: I0310 11:28:05.948084 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0ffe724e5bd58ce14e9bc1998f5152892d995b66048904aeeefa6cca748ede9" Mar 10 11:28:05 crc kubenswrapper[4794]: I0310 11:28:05.947938 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552368-98mgx" Mar 10 11:28:05 crc kubenswrapper[4794]: I0310 11:28:05.950128 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-gdd98" event={"ID":"31990e7b-b9b5-408f-aa66-e066c7b58fd4","Type":"ContainerStarted","Data":"3bf6fcd8eb8e8c48269afddb45469f0d366b2d5a6c95c105becfbf9d9ea27ebd"} Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.014809 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zppgc"] Mar 10 11:28:06 crc kubenswrapper[4794]: E0310 11:28:06.015148 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deafc0de-3f00-45c6-b5d7-cea8a3a66ce7" containerName="oc" Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.015171 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="deafc0de-3f00-45c6-b5d7-cea8a3a66ce7" containerName="oc" Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.015393 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="deafc0de-3f00-45c6-b5d7-cea8a3a66ce7" containerName="oc" Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.016944 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.037406 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zppgc"] Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.196497 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdtgs\" (UniqueName: \"kubernetes.io/projected/81fd453f-c42b-4432-8d2c-dfc34ee6241c-kube-api-access-cdtgs\") pod \"redhat-marketplace-zppgc\" (UID: \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\") " pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.196673 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fd453f-c42b-4432-8d2c-dfc34ee6241c-utilities\") pod \"redhat-marketplace-zppgc\" (UID: \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\") " pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.196771 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fd453f-c42b-4432-8d2c-dfc34ee6241c-catalog-content\") pod \"redhat-marketplace-zppgc\" (UID: \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\") " pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.298312 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdtgs\" (UniqueName: \"kubernetes.io/projected/81fd453f-c42b-4432-8d2c-dfc34ee6241c-kube-api-access-cdtgs\") pod \"redhat-marketplace-zppgc\" (UID: \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\") " pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.298443 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fd453f-c42b-4432-8d2c-dfc34ee6241c-utilities\") pod \"redhat-marketplace-zppgc\" (UID: \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\") " pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.298515 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fd453f-c42b-4432-8d2c-dfc34ee6241c-catalog-content\") pod \"redhat-marketplace-zppgc\" (UID: \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\") " pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.298962 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fd453f-c42b-4432-8d2c-dfc34ee6241c-catalog-content\") pod \"redhat-marketplace-zppgc\" (UID: \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\") " pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.299499 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fd453f-c42b-4432-8d2c-dfc34ee6241c-utilities\") pod \"redhat-marketplace-zppgc\" (UID: \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\") " pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.324417 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-cdtgs\" (UniqueName: \"kubernetes.io/projected/81fd453f-c42b-4432-8d2c-dfc34ee6241c-kube-api-access-cdtgs\") pod \"redhat-marketplace-zppgc\" (UID: \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\") " pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.341684 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.392360 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552362-d8s7g"] Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.431549 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552362-d8s7g"] Mar 10 11:28:06 crc kubenswrapper[4794]: W0310 11:28:06.972351 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81fd453f_c42b_4432_8d2c_dfc34ee6241c.slice/crio-a5af0e18410fc2fb909c13031b41e5e24d7297c9422708dd628f7889605da697 WatchSource:0}: Error finding container a5af0e18410fc2fb909c13031b41e5e24d7297c9422708dd628f7889605da697: Status 404 returned error can't find the container with id a5af0e18410fc2fb909c13031b41e5e24d7297c9422708dd628f7889605da697 Mar 10 11:28:06 crc kubenswrapper[4794]: I0310 11:28:06.985050 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zppgc"] Mar 10 11:28:08 crc kubenswrapper[4794]: I0310 11:28:08.007592 4794 generic.go:334] "Generic (PLEG): container finished" podID="81fd453f-c42b-4432-8d2c-dfc34ee6241c" containerID="9c6b32cfa64031423a2f2f720df35653c8f48099175fdc1a5c660ebdd398d874" exitCode=0 Mar 10 11:28:08 crc kubenswrapper[4794]: I0310 11:28:08.031969 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="756d4872-e20d-4b8a-8c2e-b75813c15334" path="/var/lib/kubelet/pods/756d4872-e20d-4b8a-8c2e-b75813c15334/volumes" Mar 10 11:28:08 crc kubenswrapper[4794]: I0310 11:28:08.032690 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zppgc" event={"ID":"81fd453f-c42b-4432-8d2c-dfc34ee6241c","Type":"ContainerDied","Data":"9c6b32cfa64031423a2f2f720df35653c8f48099175fdc1a5c660ebdd398d874"} Mar 10 11:28:08 crc kubenswrapper[4794]: I0310 11:28:08.032724 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zppgc" event={"ID":"81fd453f-c42b-4432-8d2c-dfc34ee6241c","Type":"ContainerStarted","Data":"a5af0e18410fc2fb909c13031b41e5e24d7297c9422708dd628f7889605da697"} Mar 10 11:28:11 crc kubenswrapper[4794]: I0310 11:28:10.999721 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:28:11 crc kubenswrapper[4794]: E0310 11:28:11.000627 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:28:11 crc kubenswrapper[4794]: I0310 11:28:11.058720 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zppgc" 
event={"ID":"81fd453f-c42b-4432-8d2c-dfc34ee6241c","Type":"ContainerStarted","Data":"ab8fea3eb6e261288778386a535aa54e0c32c52bf9e229f3c6d3ace93323da9c"} Mar 10 11:28:11 crc kubenswrapper[4794]: I0310 11:28:11.061463 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-gdd98" event={"ID":"31990e7b-b9b5-408f-aa66-e066c7b58fd4","Type":"ContainerStarted","Data":"f8f54ccae0f40c98d9e23426c2df80e4ca6f3481cf1cb4619461559ebeb54161"} Mar 10 11:28:12 crc kubenswrapper[4794]: I0310 11:28:12.073264 4794 generic.go:334] "Generic (PLEG): container finished" podID="81fd453f-c42b-4432-8d2c-dfc34ee6241c" containerID="ab8fea3eb6e261288778386a535aa54e0c32c52bf9e229f3c6d3ace93323da9c" exitCode=0 Mar 10 11:28:12 crc kubenswrapper[4794]: I0310 11:28:12.076236 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zppgc" event={"ID":"81fd453f-c42b-4432-8d2c-dfc34ee6241c","Type":"ContainerDied","Data":"ab8fea3eb6e261288778386a535aa54e0c32c52bf9e229f3c6d3ace93323da9c"} Mar 10 11:28:12 crc kubenswrapper[4794]: I0310 11:28:12.103759 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-gdd98" podStartSLOduration=3.25682982 podStartE2EDuration="8.103739174s" podCreationTimestamp="2026-03-10 11:28:04 +0000 UTC" firstStartedPulling="2026-03-10 11:28:05.505557143 +0000 UTC m=+6234.261727961" lastFinishedPulling="2026-03-10 11:28:10.352466497 +0000 UTC m=+6239.108637315" observedRunningTime="2026-03-10 11:28:12.093928788 +0000 UTC m=+6240.850099646" watchObservedRunningTime="2026-03-10 11:28:12.103739174 +0000 UTC m=+6240.859909992" Mar 10 11:28:13 crc kubenswrapper[4794]: I0310 11:28:13.043081 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccrj5"] Mar 10 11:28:13 crc kubenswrapper[4794]: I0310 11:28:13.059019 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccrj5"] Mar 10 11:28:13 crc kubenswrapper[4794]: I0310 11:28:13.086135 4794 generic.go:334] "Generic (PLEG): container finished" podID="31990e7b-b9b5-408f-aa66-e066c7b58fd4" containerID="f8f54ccae0f40c98d9e23426c2df80e4ca6f3481cf1cb4619461559ebeb54161" exitCode=0 Mar 10 11:28:13 crc kubenswrapper[4794]: I0310 11:28:13.086175 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-gdd98" event={"ID":"31990e7b-b9b5-408f-aa66-e066c7b58fd4","Type":"ContainerDied","Data":"f8f54ccae0f40c98d9e23426c2df80e4ca6f3481cf1cb4619461559ebeb54161"} Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.013387 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78550cc2-d68e-4b15-98f8-281fb85642df" path="/var/lib/kubelet/pods/78550cc2-d68e-4b15-98f8-281fb85642df/volumes" Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.039366 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-4sqhj"] Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.053105 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-4sqhj"] Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.099435 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zppgc" event={"ID":"81fd453f-c42b-4432-8d2c-dfc34ee6241c","Type":"ContainerStarted","Data":"d4c8537903b62043899faf687979b13e02576af699a55eabd48a96aa686abc7e"} Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.134885 4794 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zppgc" podStartSLOduration=5.4360998 podStartE2EDuration="9.134838366s" podCreationTimestamp="2026-03-10 11:28:05 +0000 UTC" firstStartedPulling="2026-03-10 11:28:09.834699396 +0000 UTC m=+6238.590870214" lastFinishedPulling="2026-03-10 11:28:13.533437952 +0000 UTC m=+6242.289608780" observedRunningTime="2026-03-10 11:28:14.123400431 +0000 UTC m=+6242.879571289" watchObservedRunningTime="2026-03-10 11:28:14.134838366 +0000 UTC m=+6242.891009184" Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.635661 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.808239 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-job-config-data\") pod \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.808484 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-combined-ca-bundle\") pod \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.808538 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-config-data\") pod \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.808630 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btlc6\" (UniqueName: \"kubernetes.io/projected/31990e7b-b9b5-408f-aa66-e066c7b58fd4-kube-api-access-btlc6\") pod \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\" (UID: \"31990e7b-b9b5-408f-aa66-e066c7b58fd4\") " Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.815717 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31990e7b-b9b5-408f-aa66-e066c7b58fd4-kube-api-access-btlc6" (OuterVolumeSpecName: "kube-api-access-btlc6") pod "31990e7b-b9b5-408f-aa66-e066c7b58fd4" (UID: "31990e7b-b9b5-408f-aa66-e066c7b58fd4"). InnerVolumeSpecName "kube-api-access-btlc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.816686 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "31990e7b-b9b5-408f-aa66-e066c7b58fd4" (UID: "31990e7b-b9b5-408f-aa66-e066c7b58fd4"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.826808 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-config-data" (OuterVolumeSpecName: "config-data") pod "31990e7b-b9b5-408f-aa66-e066c7b58fd4" (UID: "31990e7b-b9b5-408f-aa66-e066c7b58fd4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.844846 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31990e7b-b9b5-408f-aa66-e066c7b58fd4" (UID: "31990e7b-b9b5-408f-aa66-e066c7b58fd4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.913007 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.913055 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.913073 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btlc6\" (UniqueName: \"kubernetes.io/projected/31990e7b-b9b5-408f-aa66-e066c7b58fd4-kube-api-access-btlc6\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:14 crc kubenswrapper[4794]: I0310 11:28:14.913091 4794 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/31990e7b-b9b5-408f-aa66-e066c7b58fd4-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.111835 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-gdd98" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.113605 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-gdd98" event={"ID":"31990e7b-b9b5-408f-aa66-e066c7b58fd4","Type":"ContainerDied","Data":"3bf6fcd8eb8e8c48269afddb45469f0d366b2d5a6c95c105becfbf9d9ea27ebd"} Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.113816 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bf6fcd8eb8e8c48269afddb45469f0d366b2d5a6c95c105becfbf9d9ea27ebd" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.569455 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 10 11:28:15 crc kubenswrapper[4794]: E0310 11:28:15.570499 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31990e7b-b9b5-408f-aa66-e066c7b58fd4" containerName="manila-db-sync" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.570599 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="31990e7b-b9b5-408f-aa66-e066c7b58fd4" containerName="manila-db-sync" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.570946 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="31990e7b-b9b5-408f-aa66-e066c7b58fd4" containerName="manila-db-sync" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.575434 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.585670 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-nttq2" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.585864 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.586107 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.586313 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.605384 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.640178 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75677c58ff-4skth"] Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.642077 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.667113 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75677c58ff-4skth"] Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.678144 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.679966 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.684106 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.733413 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.735114 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/957e6f09-08a4-459b-9261-346d34354b23-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.735175 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-config\") pod \"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.735198 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shkr9\" (UniqueName: \"kubernetes.io/projected/9d32f18f-c413-4235-947d-181f43e86242-kube-api-access-shkr9\") pod \"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.735247 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-ovsdbserver-sb\") pod 
\"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.735277 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/957e6f09-08a4-459b-9261-346d34354b23-config-data\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.735401 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/957e6f09-08a4-459b-9261-346d34354b23-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.735468 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz6jx\" (UniqueName: \"kubernetes.io/projected/957e6f09-08a4-459b-9261-346d34354b23-kube-api-access-mz6jx\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.735606 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/957e6f09-08a4-459b-9261-346d34354b23-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.735696 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-dns-svc\") pod \"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.735768 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-ovsdbserver-nb\") pod \"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.735824 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/957e6f09-08a4-459b-9261-346d34354b23-scripts\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.835616 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.837082 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-ovsdbserver-nb\") pod \"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.837148 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a601e4ec-cffc-427d-890b-aeb8f9e7a224-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.837180 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/957e6f09-08a4-459b-9261-346d34354b23-scripts\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.837212 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/a601e4ec-cffc-427d-890b-aeb8f9e7a224-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.837243 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/957e6f09-08a4-459b-9261-346d34354b23-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.837279 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a601e4ec-cffc-427d-890b-aeb8f9e7a224-scripts\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.837318 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-config\") pod \"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.837371 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shkr9\" (UniqueName: \"kubernetes.io/projected/9d32f18f-c413-4235-947d-181f43e86242-kube-api-access-shkr9\") pod \"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.837399 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a601e4ec-cffc-427d-890b-aeb8f9e7a224-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.837423 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a601e4ec-cffc-427d-890b-aeb8f9e7a224-ceph\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.837565 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dj8r\" (UniqueName: 
\"kubernetes.io/projected/a601e4ec-cffc-427d-890b-aeb8f9e7a224-kube-api-access-5dj8r\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.837635 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a601e4ec-cffc-427d-890b-aeb8f9e7a224-config-data\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.837669 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-ovsdbserver-sb\") pod \"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.838510 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/957e6f09-08a4-459b-9261-346d34354b23-config-data\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.838148 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-config\") pod \"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.838449 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-ovsdbserver-sb\") pod \"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.838607 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/957e6f09-08a4-459b-9261-346d34354b23-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.838664 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz6jx\" (UniqueName: \"kubernetes.io/projected/957e6f09-08a4-459b-9261-346d34354b23-kube-api-access-mz6jx\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.838668 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-ovsdbserver-nb\") pod \"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.838719 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/957e6f09-08a4-459b-9261-346d34354b23-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " 
pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.838833 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/957e6f09-08a4-459b-9261-346d34354b23-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.838867 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-dns-svc\") pod \"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.839645 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-dns-svc\") pod \"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.838904 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a601e4ec-cffc-427d-890b-aeb8f9e7a224-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.846349 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.854182 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.856879 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.857298 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/957e6f09-08a4-459b-9261-346d34354b23-scripts\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.857393 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/957e6f09-08a4-459b-9261-346d34354b23-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.858113 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/957e6f09-08a4-459b-9261-346d34354b23-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.858318 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/957e6f09-08a4-459b-9261-346d34354b23-config-data\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.874180 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mz6jx\" (UniqueName: \"kubernetes.io/projected/957e6f09-08a4-459b-9261-346d34354b23-kube-api-access-mz6jx\") pod \"manila-scheduler-0\" (UID: \"957e6f09-08a4-459b-9261-346d34354b23\") " pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.878714 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shkr9\" (UniqueName: \"kubernetes.io/projected/9d32f18f-c413-4235-947d-181f43e86242-kube-api-access-shkr9\") pod \"dnsmasq-dns-75677c58ff-4skth\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.932454 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941162 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lw7n\" (UniqueName: \"kubernetes.io/projected/f56fdc61-c6d4-4840-bbce-da97847489bd-kube-api-access-6lw7n\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941247 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f56fdc61-c6d4-4840-bbce-da97847489bd-config-data-custom\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941275 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f56fdc61-c6d4-4840-bbce-da97847489bd-etc-machine-id\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941307 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a601e4ec-cffc-427d-890b-aeb8f9e7a224-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941344 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f56fdc61-c6d4-4840-bbce-da97847489bd-logs\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941383 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a601e4ec-cffc-427d-890b-aeb8f9e7a224-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941410 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f56fdc61-c6d4-4840-bbce-da97847489bd-config-data\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941428 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/a601e4ec-cffc-427d-890b-aeb8f9e7a224-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941457 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a601e4ec-cffc-427d-890b-aeb8f9e7a224-scripts\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941483 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a601e4ec-cffc-427d-890b-aeb8f9e7a224-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941498 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a601e4ec-cffc-427d-890b-aeb8f9e7a224-ceph\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941531 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dj8r\" (UniqueName: \"kubernetes.io/projected/a601e4ec-cffc-427d-890b-aeb8f9e7a224-kube-api-access-5dj8r\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941553 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a601e4ec-cffc-427d-890b-aeb8f9e7a224-config-data\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941569 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f56fdc61-c6d4-4840-bbce-da97847489bd-scripts\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941585 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f56fdc61-c6d4-4840-bbce-da97847489bd-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.941686 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a601e4ec-cffc-427d-890b-aeb8f9e7a224-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.943866 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/a601e4ec-cffc-427d-890b-aeb8f9e7a224-var-lib-manila\") pod \"manila-share-share1-0\" (UID: 
\"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.946700 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a601e4ec-cffc-427d-890b-aeb8f9e7a224-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.949390 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a601e4ec-cffc-427d-890b-aeb8f9e7a224-scripts\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.949619 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a601e4ec-cffc-427d-890b-aeb8f9e7a224-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.950732 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a601e4ec-cffc-427d-890b-aeb8f9e7a224-ceph\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.951351 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a601e4ec-cffc-427d-890b-aeb8f9e7a224-config-data\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.962968 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:15 crc kubenswrapper[4794]: I0310 11:28:15.963817 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dj8r\" (UniqueName: \"kubernetes.io/projected/a601e4ec-cffc-427d-890b-aeb8f9e7a224-kube-api-access-5dj8r\") pod \"manila-share-share1-0\" (UID: \"a601e4ec-cffc-427d-890b-aeb8f9e7a224\") " pod="openstack/manila-share-share1-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.013012 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.025531 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ebea665-f36f-45ef-95a5-bdeacd279dd3" path="/var/lib/kubelet/pods/0ebea665-f36f-45ef-95a5-bdeacd279dd3/volumes" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.043573 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f56fdc61-c6d4-4840-bbce-da97847489bd-config-data-custom\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.044171 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f56fdc61-c6d4-4840-bbce-da97847489bd-etc-machine-id\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.044211 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f56fdc61-c6d4-4840-bbce-da97847489bd-logs\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.044259 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f56fdc61-c6d4-4840-bbce-da97847489bd-config-data\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.044317 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f56fdc61-c6d4-4840-bbce-da97847489bd-etc-machine-id\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.044411 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f56fdc61-c6d4-4840-bbce-da97847489bd-scripts\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.044433 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f56fdc61-c6d4-4840-bbce-da97847489bd-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.044454 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lw7n\" (UniqueName: \"kubernetes.io/projected/f56fdc61-c6d4-4840-bbce-da97847489bd-kube-api-access-6lw7n\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.056038 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f56fdc61-c6d4-4840-bbce-da97847489bd-logs\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.060902 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f56fdc61-c6d4-4840-bbce-da97847489bd-config-data-custom\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.069743 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f56fdc61-c6d4-4840-bbce-da97847489bd-scripts\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.073152 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lw7n\" (UniqueName: \"kubernetes.io/projected/f56fdc61-c6d4-4840-bbce-da97847489bd-kube-api-access-6lw7n\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.073533 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f56fdc61-c6d4-4840-bbce-da97847489bd-config-data\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.075691 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f56fdc61-c6d4-4840-bbce-da97847489bd-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f56fdc61-c6d4-4840-bbce-da97847489bd\") " pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.128136 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.342294 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.344103 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.661399 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.717366 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75677c58ff-4skth"] Mar 10 11:28:16 crc kubenswrapper[4794]: I0310 11:28:16.921923 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 10 11:28:17 crc kubenswrapper[4794]: I0310 11:28:17.055307 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 10 11:28:17 crc kubenswrapper[4794]: W0310 11:28:17.063934 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf56fdc61_c6d4_4840_bbce_da97847489bd.slice/crio-60e421b09b9facf1873bf0b03a3302264ece2001bd43030f31d8c40ea8b8baf1 WatchSource:0}: Error finding container 60e421b09b9facf1873bf0b03a3302264ece2001bd43030f31d8c40ea8b8baf1: Status 404 returned error can't find the container with id 60e421b09b9facf1873bf0b03a3302264ece2001bd43030f31d8c40ea8b8baf1 Mar 10 11:28:17 crc kubenswrapper[4794]: I0310 11:28:17.144362 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"a601e4ec-cffc-427d-890b-aeb8f9e7a224","Type":"ContainerStarted","Data":"2871527d328ee37a7d116f5956e85e0b324b595895816f9029c66d39b47599b4"} Mar 10 11:28:17 crc kubenswrapper[4794]: I0310 11:28:17.146056 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75677c58ff-4skth" event={"ID":"9d32f18f-c413-4235-947d-181f43e86242","Type":"ContainerStarted","Data":"6186c51dcaacb52efcc0898684d006779b37c821bd7ac9e9a93c07fc05e4123c"} Mar 10 11:28:17 crc kubenswrapper[4794]: I0310 11:28:17.148838 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"957e6f09-08a4-459b-9261-346d34354b23","Type":"ContainerStarted","Data":"b38a5c26000aae0f285adab56efc6f7bdfae6fac5f9eab394e1a644ff7ef2f34"} Mar 10 11:28:17 crc kubenswrapper[4794]: I0310 11:28:17.157129 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f56fdc61-c6d4-4840-bbce-da97847489bd","Type":"ContainerStarted","Data":"60e421b09b9facf1873bf0b03a3302264ece2001bd43030f31d8c40ea8b8baf1"} Mar 10 11:28:17 crc kubenswrapper[4794]: I0310 11:28:17.453859 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-zppgc" podUID="81fd453f-c42b-4432-8d2c-dfc34ee6241c" containerName="registry-server" probeResult="failure" output=< Mar 10 11:28:17 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 11:28:17 crc kubenswrapper[4794]: > Mar 10 11:28:18 crc kubenswrapper[4794]: I0310 11:28:18.172399 4794 generic.go:334] "Generic (PLEG): container finished" podID="9d32f18f-c413-4235-947d-181f43e86242" containerID="d5c1c8f3c053ed1820759e5ee0a138342f7dccbaa62ead391643ea0f2ce661c0" exitCode=0 Mar 10 11:28:18 crc kubenswrapper[4794]: I0310 
11:28:18.173450 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75677c58ff-4skth" event={"ID":"9d32f18f-c413-4235-947d-181f43e86242","Type":"ContainerDied","Data":"d5c1c8f3c053ed1820759e5ee0a138342f7dccbaa62ead391643ea0f2ce661c0"} Mar 10 11:28:18 crc kubenswrapper[4794]: I0310 11:28:18.186570 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"957e6f09-08a4-459b-9261-346d34354b23","Type":"ContainerStarted","Data":"353b83c6bdefd1d5958eab0a8a2db8072b738b031b81151367c73eafcd84fdf4"} Mar 10 11:28:18 crc kubenswrapper[4794]: I0310 11:28:18.188285 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f56fdc61-c6d4-4840-bbce-da97847489bd","Type":"ContainerStarted","Data":"7dbe714d09d878eba7fc4adb3c589d85fb1460fa87c8f61e2fc7f6959112f1ed"} Mar 10 11:28:18 crc kubenswrapper[4794]: I0310 11:28:18.188339 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f56fdc61-c6d4-4840-bbce-da97847489bd","Type":"ContainerStarted","Data":"7d401525349e135bb441420f7393bedc0c2703c6126ee45a6d1033955d4db059"} Mar 10 11:28:18 crc kubenswrapper[4794]: I0310 11:28:18.189278 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 10 11:28:18 crc kubenswrapper[4794]: I0310 11:28:18.213067 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.213049462 podStartE2EDuration="3.213049462s" podCreationTimestamp="2026-03-10 11:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:28:18.211451362 +0000 UTC m=+6246.967622190" watchObservedRunningTime="2026-03-10 11:28:18.213049462 +0000 UTC m=+6246.969220280" Mar 10 11:28:19 crc kubenswrapper[4794]: I0310 11:28:19.200082 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"957e6f09-08a4-459b-9261-346d34354b23","Type":"ContainerStarted","Data":"6a953c121a5d0ca6b4348d3d2400bb89e5f60dd4b994dc4f74ebb729220904cd"} Mar 10 11:28:19 crc kubenswrapper[4794]: I0310 11:28:19.203561 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75677c58ff-4skth" event={"ID":"9d32f18f-c413-4235-947d-181f43e86242","Type":"ContainerStarted","Data":"7ae636b9693b7b3345a998134f7e867e69dd368cf61616a283bb22292fb17cf0"} Mar 10 11:28:19 crc kubenswrapper[4794]: I0310 11:28:19.203896 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:19 crc kubenswrapper[4794]: I0310 11:28:19.223826 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.161323461 podStartE2EDuration="4.223803744s" podCreationTimestamp="2026-03-10 11:28:15 +0000 UTC" firstStartedPulling="2026-03-10 11:28:16.674085802 +0000 UTC m=+6245.430256620" lastFinishedPulling="2026-03-10 11:28:17.736566075 +0000 UTC m=+6246.492736903" observedRunningTime="2026-03-10 11:28:19.214975939 +0000 UTC m=+6247.971146757" watchObservedRunningTime="2026-03-10 11:28:19.223803744 +0000 UTC m=+6247.979974572" Mar 10 11:28:19 crc kubenswrapper[4794]: I0310 11:28:19.234155 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75677c58ff-4skth" podStartSLOduration=4.234138036 podStartE2EDuration="4.234138036s" 
podCreationTimestamp="2026-03-10 11:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:28:19.231113971 +0000 UTC m=+6247.987284819" watchObservedRunningTime="2026-03-10 11:28:19.234138036 +0000 UTC m=+6247.990308854" Mar 10 11:28:20 crc kubenswrapper[4794]: I0310 11:28:20.133955 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 11:28:20 crc kubenswrapper[4794]: I0310 11:28:20.596952 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:28:20 crc kubenswrapper[4794]: I0310 11:28:20.602643 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="ceilometer-central-agent" containerID="cri-o://db7bdc3f7a7ea3537cd2920286083f1bd31c0533725f3aced43ff828d043bc49" gracePeriod=30 Mar 10 11:28:20 crc kubenswrapper[4794]: I0310 11:28:20.602702 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="sg-core" containerID="cri-o://91ddf3c1e2942abe31d064fcf7f93a62e2c9df09bb01e7057a5785ea96eb7eac" gracePeriod=30 Mar 10 11:28:20 crc kubenswrapper[4794]: I0310 11:28:20.602733 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="ceilometer-notification-agent" containerID="cri-o://8877b6ec2312b89f77411d26fbb5826f3b009f8d842a72c58142e45ab7103ecf" gracePeriod=30 Mar 10 11:28:20 crc kubenswrapper[4794]: I0310 11:28:20.602702 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="proxy-httpd" containerID="cri-o://42bdfec4747abe5f78a64f9efecf2a66a80d1894083a349e2891d50996c451d3" gracePeriod=30 Mar 10 11:28:21 crc kubenswrapper[4794]: I0310 11:28:21.234602 4794 generic.go:334] "Generic (PLEG): container finished" podID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerID="42bdfec4747abe5f78a64f9efecf2a66a80d1894083a349e2891d50996c451d3" exitCode=0 Mar 10 11:28:21 crc kubenswrapper[4794]: I0310 11:28:21.234630 4794 generic.go:334] "Generic (PLEG): container finished" podID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerID="91ddf3c1e2942abe31d064fcf7f93a62e2c9df09bb01e7057a5785ea96eb7eac" exitCode=2 Mar 10 11:28:21 crc kubenswrapper[4794]: I0310 11:28:21.234637 4794 generic.go:334] "Generic (PLEG): container finished" podID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerID="db7bdc3f7a7ea3537cd2920286083f1bd31c0533725f3aced43ff828d043bc49" exitCode=0 Mar 10 11:28:21 crc kubenswrapper[4794]: I0310 11:28:21.234656 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e925561f-cfd5-420a-930e-0d1fa394fdd5","Type":"ContainerDied","Data":"42bdfec4747abe5f78a64f9efecf2a66a80d1894083a349e2891d50996c451d3"} Mar 10 11:28:21 crc kubenswrapper[4794]: I0310 11:28:21.234678 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e925561f-cfd5-420a-930e-0d1fa394fdd5","Type":"ContainerDied","Data":"91ddf3c1e2942abe31d064fcf7f93a62e2c9df09bb01e7057a5785ea96eb7eac"} Mar 10 11:28:21 crc kubenswrapper[4794]: I0310 11:28:21.234687 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e925561f-cfd5-420a-930e-0d1fa394fdd5","Type":"ContainerDied","Data":"db7bdc3f7a7ea3537cd2920286083f1bd31c0533725f3aced43ff828d043bc49"} Mar 10 11:28:22 crc kubenswrapper[4794]: I0310 11:28:22.247780 4794 generic.go:334] "Generic (PLEG): container finished" podID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerID="8877b6ec2312b89f77411d26fbb5826f3b009f8d842a72c58142e45ab7103ecf" exitCode=0 Mar 10 11:28:22 crc kubenswrapper[4794]: I0310 11:28:22.247862 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e925561f-cfd5-420a-930e-0d1fa394fdd5","Type":"ContainerDied","Data":"8877b6ec2312b89f77411d26fbb5826f3b009f8d842a72c58142e45ab7103ecf"} Mar 10 11:28:22 crc kubenswrapper[4794]: I0310 11:28:22.999560 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:28:23 crc kubenswrapper[4794]: E0310 11:28:23.000305 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.683848 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.754637 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-config-data\") pod \"e925561f-cfd5-420a-930e-0d1fa394fdd5\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.754710 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-sg-core-conf-yaml\") pod \"e925561f-cfd5-420a-930e-0d1fa394fdd5\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.754832 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjsqq\" (UniqueName: \"kubernetes.io/projected/e925561f-cfd5-420a-930e-0d1fa394fdd5-kube-api-access-kjsqq\") pod \"e925561f-cfd5-420a-930e-0d1fa394fdd5\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.754902 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e925561f-cfd5-420a-930e-0d1fa394fdd5-log-httpd\") pod \"e925561f-cfd5-420a-930e-0d1fa394fdd5\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.754997 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e925561f-cfd5-420a-930e-0d1fa394fdd5-run-httpd\") pod \"e925561f-cfd5-420a-930e-0d1fa394fdd5\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.755040 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-combined-ca-bundle\") pod \"e925561f-cfd5-420a-930e-0d1fa394fdd5\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.755063 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-scripts\") pod \"e925561f-cfd5-420a-930e-0d1fa394fdd5\" (UID: \"e925561f-cfd5-420a-930e-0d1fa394fdd5\") " Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.756414 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e925561f-cfd5-420a-930e-0d1fa394fdd5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e925561f-cfd5-420a-930e-0d1fa394fdd5" (UID: "e925561f-cfd5-420a-930e-0d1fa394fdd5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.756826 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e925561f-cfd5-420a-930e-0d1fa394fdd5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e925561f-cfd5-420a-930e-0d1fa394fdd5" (UID: "e925561f-cfd5-420a-930e-0d1fa394fdd5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.761490 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e925561f-cfd5-420a-930e-0d1fa394fdd5-kube-api-access-kjsqq" (OuterVolumeSpecName: "kube-api-access-kjsqq") pod "e925561f-cfd5-420a-930e-0d1fa394fdd5" (UID: "e925561f-cfd5-420a-930e-0d1fa394fdd5"). InnerVolumeSpecName "kube-api-access-kjsqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.761589 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-scripts" (OuterVolumeSpecName: "scripts") pod "e925561f-cfd5-420a-930e-0d1fa394fdd5" (UID: "e925561f-cfd5-420a-930e-0d1fa394fdd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.790966 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e925561f-cfd5-420a-930e-0d1fa394fdd5" (UID: "e925561f-cfd5-420a-930e-0d1fa394fdd5"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.857449 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjsqq\" (UniqueName: \"kubernetes.io/projected/e925561f-cfd5-420a-930e-0d1fa394fdd5-kube-api-access-kjsqq\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.857482 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e925561f-cfd5-420a-930e-0d1fa394fdd5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.857497 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e925561f-cfd5-420a-930e-0d1fa394fdd5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.857509 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.857522 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.866941 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e925561f-cfd5-420a-930e-0d1fa394fdd5" (UID: "e925561f-cfd5-420a-930e-0d1fa394fdd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.887613 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-config-data" (OuterVolumeSpecName: "config-data") pod "e925561f-cfd5-420a-930e-0d1fa394fdd5" (UID: "e925561f-cfd5-420a-930e-0d1fa394fdd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.959869 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:24 crc kubenswrapper[4794]: I0310 11:28:24.960214 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e925561f-cfd5-420a-930e-0d1fa394fdd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.299916 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e925561f-cfd5-420a-930e-0d1fa394fdd5","Type":"ContainerDied","Data":"b66b14c2a2b0e86b35412f44bf85726ca1b3b46c4faf6e74a353e0cb6385d7b5"} Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.299970 4794 scope.go:117] "RemoveContainer" containerID="42bdfec4747abe5f78a64f9efecf2a66a80d1894083a349e2891d50996c451d3" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.300101 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.310189 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"a601e4ec-cffc-427d-890b-aeb8f9e7a224","Type":"ContainerStarted","Data":"fd9a581d455a3edc8d9d3e072801ea143a12f9f2d837a13a6125b8b759f8230a"} Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.410353 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.415749 4794 scope.go:117] "RemoveContainer" containerID="91ddf3c1e2942abe31d064fcf7f93a62e2c9df09bb01e7057a5785ea96eb7eac" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.428416 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.440498 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:28:25 crc kubenswrapper[4794]: E0310 11:28:25.440923 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="proxy-httpd" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.440934 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="proxy-httpd" Mar 10 11:28:25 crc kubenswrapper[4794]: E0310 11:28:25.440944 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="ceilometer-central-agent" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.440950 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="ceilometer-central-agent" Mar 10 11:28:25 crc kubenswrapper[4794]: E0310 11:28:25.440968 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="sg-core" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.440976 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="sg-core" Mar 10 11:28:25 crc kubenswrapper[4794]: E0310 11:28:25.440995 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="ceilometer-notification-agent" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.441000 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="ceilometer-notification-agent" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.441174 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="ceilometer-central-agent" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.441190 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="proxy-httpd" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.441206 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="sg-core" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.441217 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" containerName="ceilometer-notification-agent" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.444000 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.448860 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.449001 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.465640 4794 scope.go:117] "RemoveContainer" containerID="8877b6ec2312b89f77411d26fbb5826f3b009f8d842a72c58142e45ab7103ecf" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.491162 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.506581 4794 scope.go:117] "RemoveContainer" containerID="db7bdc3f7a7ea3537cd2920286083f1bd31c0533725f3aced43ff828d043bc49" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.575776 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.575817 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e493776-c193-4e89-bdeb-b19ca57df652-log-httpd\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.575843 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-scripts\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.575996 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tslpq\" (UniqueName: \"kubernetes.io/projected/8e493776-c193-4e89-bdeb-b19ca57df652-kube-api-access-tslpq\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.576110 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.576184 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e493776-c193-4e89-bdeb-b19ca57df652-run-httpd\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.576284 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-config-data\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 
11:28:25.677912 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.677968 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e493776-c193-4e89-bdeb-b19ca57df652-run-httpd\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.678010 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-config-data\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.678111 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.678132 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e493776-c193-4e89-bdeb-b19ca57df652-log-httpd\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.678151 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-scripts\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.678177 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tslpq\" (UniqueName: \"kubernetes.io/projected/8e493776-c193-4e89-bdeb-b19ca57df652-kube-api-access-tslpq\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.681255 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e493776-c193-4e89-bdeb-b19ca57df652-run-httpd\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.681489 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e493776-c193-4e89-bdeb-b19ca57df652-log-httpd\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.684367 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.691972 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.705465 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-config-data\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.706067 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-scripts\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.708695 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tslpq\" (UniqueName: \"kubernetes.io/projected/8e493776-c193-4e89-bdeb-b19ca57df652-kube-api-access-tslpq\") pod \"ceilometer-0\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.777590 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.936262 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 10 11:28:25 crc kubenswrapper[4794]: I0310 11:28:25.966258 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.028845 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e925561f-cfd5-420a-930e-0d1fa394fdd5" path="/var/lib/kubelet/pods/e925561f-cfd5-420a-930e-0d1fa394fdd5/volumes" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.049960 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7654548499-nb8kl"] Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.050184 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7654548499-nb8kl" podUID="692a8a38-92c2-437f-995f-e8595cc09a32" containerName="dnsmasq-dns" containerID="cri-o://0998a61af3fe096aad73a3b9691001b780ca6e82bb7571816afe3ae1f19fb9bf" gracePeriod=10 Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.315658 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.328472 4794 generic.go:334] "Generic (PLEG): container finished" podID="692a8a38-92c2-437f-995f-e8595cc09a32" containerID="0998a61af3fe096aad73a3b9691001b780ca6e82bb7571816afe3ae1f19fb9bf" exitCode=0 Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.328529 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7654548499-nb8kl" event={"ID":"692a8a38-92c2-437f-995f-e8595cc09a32","Type":"ContainerDied","Data":"0998a61af3fe096aad73a3b9691001b780ca6e82bb7571816afe3ae1f19fb9bf"} Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.333279 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"a601e4ec-cffc-427d-890b-aeb8f9e7a224","Type":"ContainerStarted","Data":"06894a250e31dc18d986dc3d79dee8e0655bc8a529ad596d9c5b5388f2e161d7"} Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.358995 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.953448931 podStartE2EDuration="11.358975225s" podCreationTimestamp="2026-03-10 11:28:15 +0000 UTC" firstStartedPulling="2026-03-10 11:28:16.957007996 +0000 UTC m=+6245.713178814" lastFinishedPulling="2026-03-10 11:28:24.36253429 +0000 UTC m=+6253.118705108" observedRunningTime="2026-03-10 11:28:26.355876688 +0000 UTC m=+6255.112047506" watchObservedRunningTime="2026-03-10 11:28:26.358975225 +0000 UTC m=+6255.115146043" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.416560 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.464214 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.653762 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.658989 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zppgc"] Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.704370 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmqdd\" (UniqueName: \"kubernetes.io/projected/692a8a38-92c2-437f-995f-e8595cc09a32-kube-api-access-qmqdd\") pod \"692a8a38-92c2-437f-995f-e8595cc09a32\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.704569 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-ovsdbserver-sb\") pod \"692a8a38-92c2-437f-995f-e8595cc09a32\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.704733 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-config\") pod \"692a8a38-92c2-437f-995f-e8595cc09a32\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.704787 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-dns-svc\") pod \"692a8a38-92c2-437f-995f-e8595cc09a32\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.704847 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-ovsdbserver-nb\") pod \"692a8a38-92c2-437f-995f-e8595cc09a32\" (UID: \"692a8a38-92c2-437f-995f-e8595cc09a32\") " Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.743478 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/692a8a38-92c2-437f-995f-e8595cc09a32-kube-api-access-qmqdd" (OuterVolumeSpecName: "kube-api-access-qmqdd") pod 
"692a8a38-92c2-437f-995f-e8595cc09a32" (UID: "692a8a38-92c2-437f-995f-e8595cc09a32"). InnerVolumeSpecName "kube-api-access-qmqdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.792182 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-config" (OuterVolumeSpecName: "config") pod "692a8a38-92c2-437f-995f-e8595cc09a32" (UID: "692a8a38-92c2-437f-995f-e8595cc09a32"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.813017 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "692a8a38-92c2-437f-995f-e8595cc09a32" (UID: "692a8a38-92c2-437f-995f-e8595cc09a32"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.817187 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.817282 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.817360 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmqdd\" (UniqueName: \"kubernetes.io/projected/692a8a38-92c2-437f-995f-e8595cc09a32-kube-api-access-qmqdd\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.825573 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "692a8a38-92c2-437f-995f-e8595cc09a32" (UID: "692a8a38-92c2-437f-995f-e8595cc09a32"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.830793 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "692a8a38-92c2-437f-995f-e8595cc09a32" (UID: "692a8a38-92c2-437f-995f-e8595cc09a32"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.919954 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:26 crc kubenswrapper[4794]: I0310 11:28:26.919993 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692a8a38-92c2-437f-995f-e8595cc09a32-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:27 crc kubenswrapper[4794]: I0310 11:28:27.354794 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e493776-c193-4e89-bdeb-b19ca57df652","Type":"ContainerStarted","Data":"88a97723577b089c875abe472a0931ac8a90f3b5e1a3401f35e815e904d0c000"} Mar 10 11:28:27 crc kubenswrapper[4794]: I0310 11:28:27.355120 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e493776-c193-4e89-bdeb-b19ca57df652","Type":"ContainerStarted","Data":"1fea346552eb4ad20848fc4d3aea2ed5f6068b6f959af5a51c0e7e2c1dd88c6a"} Mar 10 11:28:27 crc kubenswrapper[4794]: I0310 11:28:27.358443 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7654548499-nb8kl" Mar 10 11:28:27 crc kubenswrapper[4794]: I0310 11:28:27.362645 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7654548499-nb8kl" event={"ID":"692a8a38-92c2-437f-995f-e8595cc09a32","Type":"ContainerDied","Data":"63d1ad395c9e293c6fb66138e2ff234eb3bdb74fc8745384c82a167ce43fd4a5"} Mar 10 11:28:27 crc kubenswrapper[4794]: I0310 11:28:27.362717 4794 scope.go:117] "RemoveContainer" containerID="0998a61af3fe096aad73a3b9691001b780ca6e82bb7571816afe3ae1f19fb9bf" Mar 10 11:28:27 crc kubenswrapper[4794]: I0310 11:28:27.406065 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7654548499-nb8kl"] Mar 10 11:28:27 crc kubenswrapper[4794]: I0310 11:28:27.406380 4794 scope.go:117] "RemoveContainer" containerID="5f737eee6aa7c3a803b3628c1c63d1d2efb8ca6a307af4fa82979f8362cc39f3" Mar 10 11:28:27 crc kubenswrapper[4794]: I0310 11:28:27.417831 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7654548499-nb8kl"] Mar 10 11:28:28 crc kubenswrapper[4794]: I0310 11:28:28.020216 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="692a8a38-92c2-437f-995f-e8595cc09a32" path="/var/lib/kubelet/pods/692a8a38-92c2-437f-995f-e8595cc09a32/volumes" Mar 10 11:28:28 crc kubenswrapper[4794]: I0310 11:28:28.047827 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-gz7lg"] Mar 10 11:28:28 crc kubenswrapper[4794]: I0310 11:28:28.055852 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-gz7lg"] Mar 10 11:28:28 crc kubenswrapper[4794]: I0310 11:28:28.391226 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e493776-c193-4e89-bdeb-b19ca57df652","Type":"ContainerStarted","Data":"0fcf42edf8b7766bfe217b4e35a0deafd44b116cdc06cf652c7640c237c80090"} Mar 10 11:28:28 crc kubenswrapper[4794]: I0310 11:28:28.392731 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zppgc" podUID="81fd453f-c42b-4432-8d2c-dfc34ee6241c" containerName="registry-server" 
containerID="cri-o://d4c8537903b62043899faf687979b13e02576af699a55eabd48a96aa686abc7e" gracePeriod=2 Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.089579 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.191582 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fd453f-c42b-4432-8d2c-dfc34ee6241c-utilities\") pod \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\" (UID: \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\") " Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.191626 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fd453f-c42b-4432-8d2c-dfc34ee6241c-catalog-content\") pod \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\" (UID: \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\") " Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.191728 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdtgs\" (UniqueName: \"kubernetes.io/projected/81fd453f-c42b-4432-8d2c-dfc34ee6241c-kube-api-access-cdtgs\") pod \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\" (UID: \"81fd453f-c42b-4432-8d2c-dfc34ee6241c\") " Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.192385 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81fd453f-c42b-4432-8d2c-dfc34ee6241c-utilities" (OuterVolumeSpecName: "utilities") pod "81fd453f-c42b-4432-8d2c-dfc34ee6241c" (UID: "81fd453f-c42b-4432-8d2c-dfc34ee6241c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.198201 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81fd453f-c42b-4432-8d2c-dfc34ee6241c-kube-api-access-cdtgs" (OuterVolumeSpecName: "kube-api-access-cdtgs") pod "81fd453f-c42b-4432-8d2c-dfc34ee6241c" (UID: "81fd453f-c42b-4432-8d2c-dfc34ee6241c"). InnerVolumeSpecName "kube-api-access-cdtgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.216418 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81fd453f-c42b-4432-8d2c-dfc34ee6241c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81fd453f-c42b-4432-8d2c-dfc34ee6241c" (UID: "81fd453f-c42b-4432-8d2c-dfc34ee6241c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.294230 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdtgs\" (UniqueName: \"kubernetes.io/projected/81fd453f-c42b-4432-8d2c-dfc34ee6241c-kube-api-access-cdtgs\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.294267 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fd453f-c42b-4432-8d2c-dfc34ee6241c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.294280 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fd453f-c42b-4432-8d2c-dfc34ee6241c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.406780 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e493776-c193-4e89-bdeb-b19ca57df652","Type":"ContainerStarted","Data":"15fcf4b361319af8c0e055136272bc8edfc2e6cbb2fe4fad2a79c13e82e3fad6"} Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.410500 4794 generic.go:334] "Generic (PLEG): container finished" podID="81fd453f-c42b-4432-8d2c-dfc34ee6241c" containerID="d4c8537903b62043899faf687979b13e02576af699a55eabd48a96aa686abc7e" exitCode=0 Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.410540 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zppgc" event={"ID":"81fd453f-c42b-4432-8d2c-dfc34ee6241c","Type":"ContainerDied","Data":"d4c8537903b62043899faf687979b13e02576af699a55eabd48a96aa686abc7e"} Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.410566 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zppgc" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.410577 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zppgc" event={"ID":"81fd453f-c42b-4432-8d2c-dfc34ee6241c","Type":"ContainerDied","Data":"a5af0e18410fc2fb909c13031b41e5e24d7297c9422708dd628f7889605da697"} Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.410595 4794 scope.go:117] "RemoveContainer" containerID="d4c8537903b62043899faf687979b13e02576af699a55eabd48a96aa686abc7e" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.451004 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zppgc"] Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.455153 4794 scope.go:117] "RemoveContainer" containerID="ab8fea3eb6e261288778386a535aa54e0c32c52bf9e229f3c6d3ace93323da9c" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.459238 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zppgc"] Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.481538 4794 scope.go:117] "RemoveContainer" containerID="9c6b32cfa64031423a2f2f720df35653c8f48099175fdc1a5c660ebdd398d874" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.514076 4794 scope.go:117] "RemoveContainer" containerID="d4c8537903b62043899faf687979b13e02576af699a55eabd48a96aa686abc7e" Mar 10 11:28:29 crc kubenswrapper[4794]: E0310 11:28:29.514870 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4c8537903b62043899faf687979b13e02576af699a55eabd48a96aa686abc7e\": container with ID starting with d4c8537903b62043899faf687979b13e02576af699a55eabd48a96aa686abc7e not found: ID does not exist" containerID="d4c8537903b62043899faf687979b13e02576af699a55eabd48a96aa686abc7e" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.514927 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4c8537903b62043899faf687979b13e02576af699a55eabd48a96aa686abc7e"} err="failed to get container status \"d4c8537903b62043899faf687979b13e02576af699a55eabd48a96aa686abc7e\": rpc error: code = NotFound desc = could not find container \"d4c8537903b62043899faf687979b13e02576af699a55eabd48a96aa686abc7e\": container with ID starting with d4c8537903b62043899faf687979b13e02576af699a55eabd48a96aa686abc7e not found: ID does not exist" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.514959 4794 scope.go:117] "RemoveContainer" containerID="ab8fea3eb6e261288778386a535aa54e0c32c52bf9e229f3c6d3ace93323da9c" Mar 10 11:28:29 crc kubenswrapper[4794]: E0310 11:28:29.515476 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab8fea3eb6e261288778386a535aa54e0c32c52bf9e229f3c6d3ace93323da9c\": container with ID starting with ab8fea3eb6e261288778386a535aa54e0c32c52bf9e229f3c6d3ace93323da9c not found: ID does not exist" containerID="ab8fea3eb6e261288778386a535aa54e0c32c52bf9e229f3c6d3ace93323da9c" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.515518 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab8fea3eb6e261288778386a535aa54e0c32c52bf9e229f3c6d3ace93323da9c"} err="failed to get container status \"ab8fea3eb6e261288778386a535aa54e0c32c52bf9e229f3c6d3ace93323da9c\": rpc error: code = NotFound desc = could not find 
container \"ab8fea3eb6e261288778386a535aa54e0c32c52bf9e229f3c6d3ace93323da9c\": container with ID starting with ab8fea3eb6e261288778386a535aa54e0c32c52bf9e229f3c6d3ace93323da9c not found: ID does not exist" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.515544 4794 scope.go:117] "RemoveContainer" containerID="9c6b32cfa64031423a2f2f720df35653c8f48099175fdc1a5c660ebdd398d874" Mar 10 11:28:29 crc kubenswrapper[4794]: E0310 11:28:29.516034 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c6b32cfa64031423a2f2f720df35653c8f48099175fdc1a5c660ebdd398d874\": container with ID starting with 9c6b32cfa64031423a2f2f720df35653c8f48099175fdc1a5c660ebdd398d874 not found: ID does not exist" containerID="9c6b32cfa64031423a2f2f720df35653c8f48099175fdc1a5c660ebdd398d874" Mar 10 11:28:29 crc kubenswrapper[4794]: I0310 11:28:29.516080 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c6b32cfa64031423a2f2f720df35653c8f48099175fdc1a5c660ebdd398d874"} err="failed to get container status \"9c6b32cfa64031423a2f2f720df35653c8f48099175fdc1a5c660ebdd398d874\": rpc error: code = NotFound desc = could not find container \"9c6b32cfa64031423a2f2f720df35653c8f48099175fdc1a5c660ebdd398d874\": container with ID starting with 9c6b32cfa64031423a2f2f720df35653c8f48099175fdc1a5c660ebdd398d874 not found: ID does not exist" Mar 10 11:28:30 crc kubenswrapper[4794]: I0310 11:28:30.017850 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81fd453f-c42b-4432-8d2c-dfc34ee6241c" path="/var/lib/kubelet/pods/81fd453f-c42b-4432-8d2c-dfc34ee6241c/volumes" Mar 10 11:28:30 crc kubenswrapper[4794]: I0310 11:28:30.018772 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5ddb993-25ab-4586-8570-ed2365d197f8" path="/var/lib/kubelet/pods/f5ddb993-25ab-4586-8570-ed2365d197f8/volumes" Mar 10 11:28:30 crc kubenswrapper[4794]: I0310 11:28:30.284703 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:28:31 crc kubenswrapper[4794]: I0310 11:28:31.445406 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e493776-c193-4e89-bdeb-b19ca57df652","Type":"ContainerStarted","Data":"22582a7e83a8964571ae371aaaeffc71e6c679c3fe6b356bb1f9e35542d335d6"} Mar 10 11:28:31 crc kubenswrapper[4794]: I0310 11:28:31.445567 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="ceilometer-central-agent" containerID="cri-o://88a97723577b089c875abe472a0931ac8a90f3b5e1a3401f35e815e904d0c000" gracePeriod=30 Mar 10 11:28:31 crc kubenswrapper[4794]: I0310 11:28:31.445615 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 11:28:31 crc kubenswrapper[4794]: I0310 11:28:31.445640 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="ceilometer-notification-agent" containerID="cri-o://0fcf42edf8b7766bfe217b4e35a0deafd44b116cdc06cf652c7640c237c80090" gracePeriod=30 Mar 10 11:28:31 crc kubenswrapper[4794]: I0310 11:28:31.445616 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="sg-core" 
containerID="cri-o://15fcf4b361319af8c0e055136272bc8edfc2e6cbb2fe4fad2a79c13e82e3fad6" gracePeriod=30 Mar 10 11:28:31 crc kubenswrapper[4794]: I0310 11:28:31.445692 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="proxy-httpd" containerID="cri-o://22582a7e83a8964571ae371aaaeffc71e6c679c3fe6b356bb1f9e35542d335d6" gracePeriod=30 Mar 10 11:28:31 crc kubenswrapper[4794]: I0310 11:28:31.471964 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.758536629 podStartE2EDuration="6.471947309s" podCreationTimestamp="2026-03-10 11:28:25 +0000 UTC" firstStartedPulling="2026-03-10 11:28:26.323619045 +0000 UTC m=+6255.079789863" lastFinishedPulling="2026-03-10 11:28:31.037029705 +0000 UTC m=+6259.793200543" observedRunningTime="2026-03-10 11:28:31.463479856 +0000 UTC m=+6260.219650684" watchObservedRunningTime="2026-03-10 11:28:31.471947309 +0000 UTC m=+6260.228118127" Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.474509 4794 generic.go:334] "Generic (PLEG): container finished" podID="8e493776-c193-4e89-bdeb-b19ca57df652" containerID="22582a7e83a8964571ae371aaaeffc71e6c679c3fe6b356bb1f9e35542d335d6" exitCode=0 Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.474910 4794 generic.go:334] "Generic (PLEG): container finished" podID="8e493776-c193-4e89-bdeb-b19ca57df652" containerID="15fcf4b361319af8c0e055136272bc8edfc2e6cbb2fe4fad2a79c13e82e3fad6" exitCode=2 Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.474919 4794 generic.go:334] "Generic (PLEG): container finished" podID="8e493776-c193-4e89-bdeb-b19ca57df652" containerID="0fcf42edf8b7766bfe217b4e35a0deafd44b116cdc06cf652c7640c237c80090" exitCode=0 Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.474926 4794 generic.go:334] "Generic (PLEG): container finished" podID="8e493776-c193-4e89-bdeb-b19ca57df652" containerID="88a97723577b089c875abe472a0931ac8a90f3b5e1a3401f35e815e904d0c000" exitCode=0 Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.474620 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e493776-c193-4e89-bdeb-b19ca57df652","Type":"ContainerDied","Data":"22582a7e83a8964571ae371aaaeffc71e6c679c3fe6b356bb1f9e35542d335d6"} Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.474965 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e493776-c193-4e89-bdeb-b19ca57df652","Type":"ContainerDied","Data":"15fcf4b361319af8c0e055136272bc8edfc2e6cbb2fe4fad2a79c13e82e3fad6"} Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.474978 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e493776-c193-4e89-bdeb-b19ca57df652","Type":"ContainerDied","Data":"0fcf42edf8b7766bfe217b4e35a0deafd44b116cdc06cf652c7640c237c80090"} Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.474987 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e493776-c193-4e89-bdeb-b19ca57df652","Type":"ContainerDied","Data":"88a97723577b089c875abe472a0931ac8a90f3b5e1a3401f35e815e904d0c000"} Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.802302 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.983317 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-config-data\") pod \"8e493776-c193-4e89-bdeb-b19ca57df652\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.984408 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-combined-ca-bundle\") pod \"8e493776-c193-4e89-bdeb-b19ca57df652\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.984478 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tslpq\" (UniqueName: \"kubernetes.io/projected/8e493776-c193-4e89-bdeb-b19ca57df652-kube-api-access-tslpq\") pod \"8e493776-c193-4e89-bdeb-b19ca57df652\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.984547 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e493776-c193-4e89-bdeb-b19ca57df652-run-httpd\") pod \"8e493776-c193-4e89-bdeb-b19ca57df652\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.984626 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e493776-c193-4e89-bdeb-b19ca57df652-log-httpd\") pod \"8e493776-c193-4e89-bdeb-b19ca57df652\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.984722 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-scripts\") pod \"8e493776-c193-4e89-bdeb-b19ca57df652\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.984788 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-sg-core-conf-yaml\") pod \"8e493776-c193-4e89-bdeb-b19ca57df652\" (UID: \"8e493776-c193-4e89-bdeb-b19ca57df652\") " Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.985242 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e493776-c193-4e89-bdeb-b19ca57df652-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8e493776-c193-4e89-bdeb-b19ca57df652" (UID: "8e493776-c193-4e89-bdeb-b19ca57df652"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.985313 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e493776-c193-4e89-bdeb-b19ca57df652-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8e493776-c193-4e89-bdeb-b19ca57df652" (UID: "8e493776-c193-4e89-bdeb-b19ca57df652"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.985505 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e493776-c193-4e89-bdeb-b19ca57df652-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.985520 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e493776-c193-4e89-bdeb-b19ca57df652-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.991183 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-scripts" (OuterVolumeSpecName: "scripts") pod "8e493776-c193-4e89-bdeb-b19ca57df652" (UID: "8e493776-c193-4e89-bdeb-b19ca57df652"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:28:32 crc kubenswrapper[4794]: I0310 11:28:32.992259 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e493776-c193-4e89-bdeb-b19ca57df652-kube-api-access-tslpq" (OuterVolumeSpecName: "kube-api-access-tslpq") pod "8e493776-c193-4e89-bdeb-b19ca57df652" (UID: "8e493776-c193-4e89-bdeb-b19ca57df652"). InnerVolumeSpecName "kube-api-access-tslpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.019631 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8e493776-c193-4e89-bdeb-b19ca57df652" (UID: "8e493776-c193-4e89-bdeb-b19ca57df652"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.088270 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.088314 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tslpq\" (UniqueName: \"kubernetes.io/projected/8e493776-c193-4e89-bdeb-b19ca57df652-kube-api-access-tslpq\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.088363 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.100566 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e493776-c193-4e89-bdeb-b19ca57df652" (UID: "8e493776-c193-4e89-bdeb-b19ca57df652"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.109591 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-config-data" (OuterVolumeSpecName: "config-data") pod "8e493776-c193-4e89-bdeb-b19ca57df652" (UID: "8e493776-c193-4e89-bdeb-b19ca57df652"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.190541 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.190974 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e493776-c193-4e89-bdeb-b19ca57df652-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.486301 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e493776-c193-4e89-bdeb-b19ca57df652","Type":"ContainerDied","Data":"1fea346552eb4ad20848fc4d3aea2ed5f6068b6f959af5a51c0e7e2c1dd88c6a"} Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.486385 4794 scope.go:117] "RemoveContainer" containerID="22582a7e83a8964571ae371aaaeffc71e6c679c3fe6b356bb1f9e35542d335d6" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.486394 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.518345 4794 scope.go:117] "RemoveContainer" containerID="15fcf4b361319af8c0e055136272bc8edfc2e6cbb2fe4fad2a79c13e82e3fad6" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.542241 4794 scope.go:117] "RemoveContainer" containerID="0fcf42edf8b7766bfe217b4e35a0deafd44b116cdc06cf652c7640c237c80090" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.543856 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.564047 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.579544 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:28:33 crc kubenswrapper[4794]: E0310 11:28:33.580053 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fd453f-c42b-4432-8d2c-dfc34ee6241c" containerName="registry-server" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580074 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fd453f-c42b-4432-8d2c-dfc34ee6241c" containerName="registry-server" Mar 10 11:28:33 crc kubenswrapper[4794]: E0310 11:28:33.580092 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fd453f-c42b-4432-8d2c-dfc34ee6241c" containerName="extract-utilities" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580099 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fd453f-c42b-4432-8d2c-dfc34ee6241c" containerName="extract-utilities" Mar 10 11:28:33 crc kubenswrapper[4794]: E0310 11:28:33.580109 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fd453f-c42b-4432-8d2c-dfc34ee6241c" containerName="extract-content" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580115 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fd453f-c42b-4432-8d2c-dfc34ee6241c" containerName="extract-content" Mar 10 11:28:33 crc kubenswrapper[4794]: E0310 11:28:33.580123 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="ceilometer-central-agent" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580128 4794 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="ceilometer-central-agent" Mar 10 11:28:33 crc kubenswrapper[4794]: E0310 11:28:33.580146 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="692a8a38-92c2-437f-995f-e8595cc09a32" containerName="dnsmasq-dns" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580152 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="692a8a38-92c2-437f-995f-e8595cc09a32" containerName="dnsmasq-dns" Mar 10 11:28:33 crc kubenswrapper[4794]: E0310 11:28:33.580164 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="sg-core" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580170 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="sg-core" Mar 10 11:28:33 crc kubenswrapper[4794]: E0310 11:28:33.580236 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="proxy-httpd" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580244 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="proxy-httpd" Mar 10 11:28:33 crc kubenswrapper[4794]: E0310 11:28:33.580257 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="ceilometer-notification-agent" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580263 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="ceilometer-notification-agent" Mar 10 11:28:33 crc kubenswrapper[4794]: E0310 11:28:33.580281 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="692a8a38-92c2-437f-995f-e8595cc09a32" containerName="init" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580288 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="692a8a38-92c2-437f-995f-e8595cc09a32" containerName="init" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580541 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="ceilometer-central-agent" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580560 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="sg-core" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580589 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="81fd453f-c42b-4432-8d2c-dfc34ee6241c" containerName="registry-server" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580682 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="ceilometer-notification-agent" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580699 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" containerName="proxy-httpd" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.580718 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="692a8a38-92c2-437f-995f-e8595cc09a32" containerName="dnsmasq-dns" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.583829 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.584029 4794 scope.go:117] "RemoveContainer" containerID="88a97723577b089c875abe472a0931ac8a90f3b5e1a3401f35e815e904d0c000" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.588117 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.589210 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.593255 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.701253 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp5qj\" (UniqueName: \"kubernetes.io/projected/871071a4-bdcf-4d0a-bb26-21205ac2c2da-kube-api-access-pp5qj\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.701324 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/871071a4-bdcf-4d0a-bb26-21205ac2c2da-run-httpd\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.701376 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/871071a4-bdcf-4d0a-bb26-21205ac2c2da-log-httpd\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.701416 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/871071a4-bdcf-4d0a-bb26-21205ac2c2da-scripts\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.701509 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/871071a4-bdcf-4d0a-bb26-21205ac2c2da-config-data\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.701787 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871071a4-bdcf-4d0a-bb26-21205ac2c2da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.702199 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/871071a4-bdcf-4d0a-bb26-21205ac2c2da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.805696 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/871071a4-bdcf-4d0a-bb26-21205ac2c2da-run-httpd\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.804712 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/871071a4-bdcf-4d0a-bb26-21205ac2c2da-run-httpd\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.805881 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/871071a4-bdcf-4d0a-bb26-21205ac2c2da-log-httpd\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.806519 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/871071a4-bdcf-4d0a-bb26-21205ac2c2da-log-httpd\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.806875 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/871071a4-bdcf-4d0a-bb26-21205ac2c2da-scripts\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.808177 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/871071a4-bdcf-4d0a-bb26-21205ac2c2da-config-data\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.808544 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871071a4-bdcf-4d0a-bb26-21205ac2c2da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.808848 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/871071a4-bdcf-4d0a-bb26-21205ac2c2da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.809193 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp5qj\" (UniqueName: \"kubernetes.io/projected/871071a4-bdcf-4d0a-bb26-21205ac2c2da-kube-api-access-pp5qj\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.813166 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/871071a4-bdcf-4d0a-bb26-21205ac2c2da-config-data\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.813726 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/871071a4-bdcf-4d0a-bb26-21205ac2c2da-scripts\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.829830 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/871071a4-bdcf-4d0a-bb26-21205ac2c2da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.831607 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871071a4-bdcf-4d0a-bb26-21205ac2c2da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.839028 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp5qj\" (UniqueName: \"kubernetes.io/projected/871071a4-bdcf-4d0a-bb26-21205ac2c2da-kube-api-access-pp5qj\") pod \"ceilometer-0\" (UID: \"871071a4-bdcf-4d0a-bb26-21205ac2c2da\") " pod="openstack/ceilometer-0" Mar 10 11:28:33 crc kubenswrapper[4794]: I0310 11:28:33.908618 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 11:28:34 crc kubenswrapper[4794]: I0310 11:28:34.025817 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e493776-c193-4e89-bdeb-b19ca57df652" path="/var/lib/kubelet/pods/8e493776-c193-4e89-bdeb-b19ca57df652/volumes" Mar 10 11:28:34 crc kubenswrapper[4794]: I0310 11:28:34.444857 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 11:28:34 crc kubenswrapper[4794]: W0310 11:28:34.455975 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod871071a4_bdcf_4d0a_bb26_21205ac2c2da.slice/crio-3d45ac5e1815078ea16010458e41dd4d7d2d589cdb1f5a3d9991f4c832deeede WatchSource:0}: Error finding container 3d45ac5e1815078ea16010458e41dd4d7d2d589cdb1f5a3d9991f4c832deeede: Status 404 returned error can't find the container with id 3d45ac5e1815078ea16010458e41dd4d7d2d589cdb1f5a3d9991f4c832deeede Mar 10 11:28:34 crc kubenswrapper[4794]: I0310 11:28:34.514648 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"871071a4-bdcf-4d0a-bb26-21205ac2c2da","Type":"ContainerStarted","Data":"3d45ac5e1815078ea16010458e41dd4d7d2d589cdb1f5a3d9991f4c832deeede"} Mar 10 11:28:34 crc kubenswrapper[4794]: I0310 11:28:34.999596 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:28:35 crc kubenswrapper[4794]: E0310 11:28:35.000316 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:28:35 crc kubenswrapper[4794]: I0310 11:28:35.534276 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"871071a4-bdcf-4d0a-bb26-21205ac2c2da","Type":"ContainerStarted","Data":"b6bdb8446302abca4dbdf6f81e4f220b13744710d3cf38a4df274de7a6e2794e"} Mar 10 11:28:36 crc kubenswrapper[4794]: I0310 11:28:36.026460 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 10 11:28:36 crc kubenswrapper[4794]: I0310 11:28:36.551208 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"871071a4-bdcf-4d0a-bb26-21205ac2c2da","Type":"ContainerStarted","Data":"5074b8f5bbb043850f056a1e7bcf9ee2377bc6aa4b8d6529df309df17d6b2eaf"} Mar 10 11:28:37 crc kubenswrapper[4794]: I0310 11:28:37.532502 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 10 11:28:37 crc kubenswrapper[4794]: I0310 11:28:37.566993 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"871071a4-bdcf-4d0a-bb26-21205ac2c2da","Type":"ContainerStarted","Data":"a0c54f0f2ec4e727d6cafa1dc68967d04c468a78072e6a88f290269e285ff148"} Mar 10 11:28:37 crc kubenswrapper[4794]: I0310 11:28:37.613763 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 10 11:28:37 crc kubenswrapper[4794]: I0310 11:28:37.695427 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Mar 10 11:28:38 crc kubenswrapper[4794]: I0310 11:28:38.580595 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"871071a4-bdcf-4d0a-bb26-21205ac2c2da","Type":"ContainerStarted","Data":"e0e5019709c972aa90e5890ac0bb225de2fa6b0d8a52bcf2cd6b101cebab8bda"} Mar 10 11:28:38 crc kubenswrapper[4794]: I0310 11:28:38.582761 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 11:28:38 crc kubenswrapper[4794]: I0310 11:28:38.602037 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.877200524 podStartE2EDuration="5.602012181s" podCreationTimestamp="2026-03-10 11:28:33 +0000 UTC" firstStartedPulling="2026-03-10 11:28:34.458550656 +0000 UTC m=+6263.214721484" lastFinishedPulling="2026-03-10 11:28:38.183362333 +0000 UTC m=+6266.939533141" observedRunningTime="2026-03-10 11:28:38.598669257 +0000 UTC m=+6267.354840095" watchObservedRunningTime="2026-03-10 11:28:38.602012181 +0000 UTC m=+6267.358183019" Mar 10 11:28:49 crc kubenswrapper[4794]: I0310 11:28:49.247880 4794 scope.go:117] "RemoveContainer" containerID="4975fc730fbc2ec9e2e46c6ac387b74769008e53ba5af3bdf14a35383647b75e" Mar 10 11:28:49 crc kubenswrapper[4794]: I0310 11:28:49.272663 4794 scope.go:117] "RemoveContainer" containerID="5ef4cdc386b2a6e49916e7674d285fa07042c3f40995f8a61f9371bb22c25ee2" Mar 10 11:28:49 crc kubenswrapper[4794]: I0310 11:28:49.340569 4794 scope.go:117] "RemoveContainer" containerID="dc08acba3012e6ff76d1ff5183e98caa3586ffe0e5900aaef669b064c0118c40" Mar 10 11:28:49 crc kubenswrapper[4794]: I0310 11:28:49.399727 4794 scope.go:117] "RemoveContainer" containerID="09c3473db17e343c7f1c600d8e6f23efb32b817c496a01ac3b0db591c660983f" Mar 10 11:28:49 crc kubenswrapper[4794]: I0310 11:28:49.462955 4794 scope.go:117] "RemoveContainer" containerID="3ca83dd317eca7e463fd21ea015fe795c89d895fc3025b08a9ab2f17569254f2" Mar 10 11:28:49 crc kubenswrapper[4794]: I0310 11:28:49.522731 4794 scope.go:117] "RemoveContainer" 
containerID="9adfab5f90e9c45df62adb600b920a1f53fc2bf231d031997e441a12504b5b8b" Mar 10 11:28:49 crc kubenswrapper[4794]: I0310 11:28:49.551749 4794 scope.go:117] "RemoveContainer" containerID="5075d75ab9e59f54be1a73af28ebc58af063daad4d65fb8e4ffe7ca6622a936c" Mar 10 11:28:49 crc kubenswrapper[4794]: I0310 11:28:49.576890 4794 scope.go:117] "RemoveContainer" containerID="d3d17c41a6c2d8c7d790eacd943449ab5c1b8ed1ad57dec4177444f257a59c81" Mar 10 11:28:49 crc kubenswrapper[4794]: I0310 11:28:49.598016 4794 scope.go:117] "RemoveContainer" containerID="3d4ca50092d0be9e95a29c530920908827707f1a7bef1a1178f00ec55ae70682" Mar 10 11:28:49 crc kubenswrapper[4794]: I0310 11:28:49.645020 4794 scope.go:117] "RemoveContainer" containerID="6b09461dda48becbbd59e698f79d7c35bcad430638877567314883f2a3979d99" Mar 10 11:28:49 crc kubenswrapper[4794]: I0310 11:28:49.671947 4794 scope.go:117] "RemoveContainer" containerID="2892665528e4b37193ed63e0bb5ff16567ce35d3f8ea7280dc41b117ce794dcf" Mar 10 11:28:49 crc kubenswrapper[4794]: I0310 11:28:49.706236 4794 scope.go:117] "RemoveContainer" containerID="daf4fcade275be31a6babdb894296aae88ec9f7cc9fd48676243d5184178ec74" Mar 10 11:28:49 crc kubenswrapper[4794]: I0310 11:28:49.998995 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:28:49 crc kubenswrapper[4794]: E0310 11:28:49.999304 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:29:03 crc kubenswrapper[4794]: I0310 11:29:03.914799 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 11:29:03 crc kubenswrapper[4794]: I0310 11:29:03.999802 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:29:04 crc kubenswrapper[4794]: E0310 11:29:04.000097 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:29:12 crc kubenswrapper[4794]: I0310 11:29:12.073653 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-8qjtb"] Mar 10 11:29:12 crc kubenswrapper[4794]: I0310 11:29:12.083864 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2a16-account-create-update-5q9jn"] Mar 10 11:29:12 crc kubenswrapper[4794]: I0310 11:29:12.093459 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2a16-account-create-update-5q9jn"] Mar 10 11:29:12 crc kubenswrapper[4794]: I0310 11:29:12.125321 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-8qjtb"] Mar 10 11:29:14 crc kubenswrapper[4794]: I0310 11:29:14.018914 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aecfaacc-1770-4c75-90e5-18bf8f45581d" 
path="/var/lib/kubelet/pods/aecfaacc-1770-4c75-90e5-18bf8f45581d/volumes" Mar 10 11:29:14 crc kubenswrapper[4794]: I0310 11:29:14.024053 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8b55040-929c-4088-9d23-532663500a6b" path="/var/lib/kubelet/pods/e8b55040-929c-4088-9d23-532663500a6b/volumes" Mar 10 11:29:16 crc kubenswrapper[4794]: I0310 11:29:16.000304 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:29:16 crc kubenswrapper[4794]: E0310 11:29:16.001128 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:29:20 crc kubenswrapper[4794]: I0310 11:29:20.058144 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-v6gbl"] Mar 10 11:29:20 crc kubenswrapper[4794]: I0310 11:29:20.072621 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-v6gbl"] Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.022108 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d82833-c203-4aa4-9829-f6392d598df1" path="/var/lib/kubelet/pods/68d82833-c203-4aa4-9829-f6392d598df1/volumes" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.111691 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d986dd4f7-j95pr"] Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.114310 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.117373 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.130314 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d986dd4f7-j95pr"] Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.299519 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvw2t\" (UniqueName: \"kubernetes.io/projected/df9a8fa7-2924-4579-8728-2fdba5c829ea-kube-api-access-nvw2t\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.300086 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-openstack-cell1\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.300259 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-config\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.300355 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-dns-svc\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.300424 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-ovsdbserver-sb\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.300622 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-ovsdbserver-nb\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.402793 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvw2t\" (UniqueName: \"kubernetes.io/projected/df9a8fa7-2924-4579-8728-2fdba5c829ea-kube-api-access-nvw2t\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.402850 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-openstack-cell1\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: 
\"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.402916 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-config\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.402945 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-dns-svc\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.402983 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-ovsdbserver-sb\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.403085 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-ovsdbserver-nb\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.403894 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-openstack-cell1\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.404301 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-dns-svc\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.404533 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-ovsdbserver-nb\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.404640 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-config\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.405059 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-ovsdbserver-sb\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.435080 
4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvw2t\" (UniqueName: \"kubernetes.io/projected/df9a8fa7-2924-4579-8728-2fdba5c829ea-kube-api-access-nvw2t\") pod \"dnsmasq-dns-5d986dd4f7-j95pr\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.440416 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:22 crc kubenswrapper[4794]: I0310 11:29:22.972641 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d986dd4f7-j95pr"] Mar 10 11:29:23 crc kubenswrapper[4794]: I0310 11:29:23.146550 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" event={"ID":"df9a8fa7-2924-4579-8728-2fdba5c829ea","Type":"ContainerStarted","Data":"15a5cf112e00e7f2705d6d98390dff06c85588b0d61fa17a24f5b6a1fc9f0e52"} Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.160784 4794 generic.go:334] "Generic (PLEG): container finished" podID="df9a8fa7-2924-4579-8728-2fdba5c829ea" containerID="df7a6405e05a57c0583703670873e5ce73db52c1cce9d9c2ec1e3f90eda8998b" exitCode=0 Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.161039 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" event={"ID":"df9a8fa7-2924-4579-8728-2fdba5c829ea","Type":"ContainerDied","Data":"df7a6405e05a57c0583703670873e5ce73db52c1cce9d9c2ec1e3f90eda8998b"} Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.379900 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c57hw"] Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.382984 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.397950 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c57hw"] Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.555522 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkmqm\" (UniqueName: \"kubernetes.io/projected/faa599aa-6af6-418a-8db0-6e5b9cd859c7-kube-api-access-gkmqm\") pod \"certified-operators-c57hw\" (UID: \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\") " pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.555897 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa599aa-6af6-418a-8db0-6e5b9cd859c7-catalog-content\") pod \"certified-operators-c57hw\" (UID: \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\") " pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.556053 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa599aa-6af6-418a-8db0-6e5b9cd859c7-utilities\") pod \"certified-operators-c57hw\" (UID: \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\") " pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.657828 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa599aa-6af6-418a-8db0-6e5b9cd859c7-utilities\") pod \"certified-operators-c57hw\" (UID: \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\") " pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.657976 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkmqm\" (UniqueName: \"kubernetes.io/projected/faa599aa-6af6-418a-8db0-6e5b9cd859c7-kube-api-access-gkmqm\") pod \"certified-operators-c57hw\" (UID: \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\") " pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.658016 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa599aa-6af6-418a-8db0-6e5b9cd859c7-catalog-content\") pod \"certified-operators-c57hw\" (UID: \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\") " pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.658295 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa599aa-6af6-418a-8db0-6e5b9cd859c7-utilities\") pod \"certified-operators-c57hw\" (UID: \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\") " pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.658614 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa599aa-6af6-418a-8db0-6e5b9cd859c7-catalog-content\") pod \"certified-operators-c57hw\" (UID: \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\") " pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.679622 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gkmqm\" (UniqueName: \"kubernetes.io/projected/faa599aa-6af6-418a-8db0-6e5b9cd859c7-kube-api-access-gkmqm\") pod \"certified-operators-c57hw\" (UID: \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\") " pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:24 crc kubenswrapper[4794]: I0310 11:29:24.724251 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:25 crc kubenswrapper[4794]: I0310 11:29:25.174214 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" event={"ID":"df9a8fa7-2924-4579-8728-2fdba5c829ea","Type":"ContainerStarted","Data":"abc7007853a1092955051446ebe967264570b717f712ef4bfbbe1774ca9baba7"} Mar 10 11:29:25 crc kubenswrapper[4794]: I0310 11:29:25.174673 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:25 crc kubenswrapper[4794]: I0310 11:29:25.207000 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c57hw"] Mar 10 11:29:25 crc kubenswrapper[4794]: I0310 11:29:25.245365 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" podStartSLOduration=3.245311693 podStartE2EDuration="3.245311693s" podCreationTimestamp="2026-03-10 11:29:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:29:25.198603149 +0000 UTC m=+6313.954773987" watchObservedRunningTime="2026-03-10 11:29:25.245311693 +0000 UTC m=+6314.001482541" Mar 10 11:29:26 crc kubenswrapper[4794]: I0310 11:29:26.184545 4794 generic.go:334] "Generic (PLEG): container finished" podID="faa599aa-6af6-418a-8db0-6e5b9cd859c7" containerID="fbf10a7a5657fde0e0454909c5825a7689c99ffc06543f5aa244b14b066ef454" exitCode=0 Mar 10 11:29:26 crc kubenswrapper[4794]: I0310 11:29:26.184611 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c57hw" event={"ID":"faa599aa-6af6-418a-8db0-6e5b9cd859c7","Type":"ContainerDied","Data":"fbf10a7a5657fde0e0454909c5825a7689c99ffc06543f5aa244b14b066ef454"} Mar 10 11:29:26 crc kubenswrapper[4794]: I0310 11:29:26.184987 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c57hw" event={"ID":"faa599aa-6af6-418a-8db0-6e5b9cd859c7","Type":"ContainerStarted","Data":"159cc4b3d95904882f1e750743891af512570d72eaa0665d713f01fb32fc4938"} Mar 10 11:29:27 crc kubenswrapper[4794]: I0310 11:29:27.199717 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c57hw" event={"ID":"faa599aa-6af6-418a-8db0-6e5b9cd859c7","Type":"ContainerStarted","Data":"d36dc53bb8de6fcc73bfdf40201f001eefdc3818c61cdc1ef9f66f58214359c1"} Mar 10 11:29:29 crc kubenswrapper[4794]: I0310 11:29:29.231024 4794 generic.go:334] "Generic (PLEG): container finished" podID="faa599aa-6af6-418a-8db0-6e5b9cd859c7" containerID="d36dc53bb8de6fcc73bfdf40201f001eefdc3818c61cdc1ef9f66f58214359c1" exitCode=0 Mar 10 11:29:29 crc kubenswrapper[4794]: I0310 11:29:29.231082 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c57hw" event={"ID":"faa599aa-6af6-418a-8db0-6e5b9cd859c7","Type":"ContainerDied","Data":"d36dc53bb8de6fcc73bfdf40201f001eefdc3818c61cdc1ef9f66f58214359c1"} Mar 10 11:29:30 crc 
kubenswrapper[4794]: I0310 11:29:29.999859 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:29:30 crc kubenswrapper[4794]: E0310 11:29:30.000737 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:29:30 crc kubenswrapper[4794]: I0310 11:29:30.246807 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c57hw" event={"ID":"faa599aa-6af6-418a-8db0-6e5b9cd859c7","Type":"ContainerStarted","Data":"b77011c2da3abead50d24bada80ce1fed0f3b51f42890393a1355f2d532a9f63"} Mar 10 11:29:30 crc kubenswrapper[4794]: I0310 11:29:30.280415 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c57hw" podStartSLOduration=2.811458992 podStartE2EDuration="6.28040028s" podCreationTimestamp="2026-03-10 11:29:24 +0000 UTC" firstStartedPulling="2026-03-10 11:29:26.186479844 +0000 UTC m=+6314.942650662" lastFinishedPulling="2026-03-10 11:29:29.655421112 +0000 UTC m=+6318.411591950" observedRunningTime="2026-03-10 11:29:30.278634617 +0000 UTC m=+6319.034805435" watchObservedRunningTime="2026-03-10 11:29:30.28040028 +0000 UTC m=+6319.036571088" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.441774 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.522263 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75677c58ff-4skth"] Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.522572 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75677c58ff-4skth" podUID="9d32f18f-c413-4235-947d-181f43e86242" containerName="dnsmasq-dns" containerID="cri-o://7ae636b9693b7b3345a998134f7e867e69dd368cf61616a283bb22292fb17cf0" gracePeriod=10 Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.672640 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f66c686f-d8wjz"] Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.676113 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.697458 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f66c686f-d8wjz"] Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.844891 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-openstack-cell1\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.844948 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4gr7\" (UniqueName: \"kubernetes.io/projected/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-kube-api-access-t4gr7\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.845029 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-config\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.845071 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-ovsdbserver-nb\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.845217 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-dns-svc\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.845408 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-ovsdbserver-sb\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.948316 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-openstack-cell1\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.948394 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4gr7\" (UniqueName: \"kubernetes.io/projected/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-kube-api-access-t4gr7\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.948465 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-config\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.948518 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-ovsdbserver-nb\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.948552 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-dns-svc\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.948618 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-ovsdbserver-sb\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.949451 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-openstack-cell1\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.949495 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-ovsdbserver-nb\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.949557 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-config\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.949666 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-ovsdbserver-sb\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.951212 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-dns-svc\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:32 crc kubenswrapper[4794]: I0310 11:29:32.978311 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4gr7\" (UniqueName: 
\"kubernetes.io/projected/68926d9f-d2a2-4ddb-a755-42a2bc0d614d-kube-api-access-t4gr7\") pod \"dnsmasq-dns-55f66c686f-d8wjz\" (UID: \"68926d9f-d2a2-4ddb-a755-42a2bc0d614d\") " pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.029921 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.303144 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.336464 4794 generic.go:334] "Generic (PLEG): container finished" podID="9d32f18f-c413-4235-947d-181f43e86242" containerID="7ae636b9693b7b3345a998134f7e867e69dd368cf61616a283bb22292fb17cf0" exitCode=0 Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.336511 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75677c58ff-4skth" event={"ID":"9d32f18f-c413-4235-947d-181f43e86242","Type":"ContainerDied","Data":"7ae636b9693b7b3345a998134f7e867e69dd368cf61616a283bb22292fb17cf0"} Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.336540 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75677c58ff-4skth" event={"ID":"9d32f18f-c413-4235-947d-181f43e86242","Type":"ContainerDied","Data":"6186c51dcaacb52efcc0898684d006779b37c821bd7ac9e9a93c07fc05e4123c"} Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.336602 4794 scope.go:117] "RemoveContainer" containerID="7ae636b9693b7b3345a998134f7e867e69dd368cf61616a283bb22292fb17cf0" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.390492 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-ovsdbserver-sb\") pod \"9d32f18f-c413-4235-947d-181f43e86242\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.390535 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-dns-svc\") pod \"9d32f18f-c413-4235-947d-181f43e86242\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.390572 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shkr9\" (UniqueName: \"kubernetes.io/projected/9d32f18f-c413-4235-947d-181f43e86242-kube-api-access-shkr9\") pod \"9d32f18f-c413-4235-947d-181f43e86242\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.390756 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-config\") pod \"9d32f18f-c413-4235-947d-181f43e86242\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.390784 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-ovsdbserver-nb\") pod \"9d32f18f-c413-4235-947d-181f43e86242\" (UID: \"9d32f18f-c413-4235-947d-181f43e86242\") " Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.411564 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9d32f18f-c413-4235-947d-181f43e86242-kube-api-access-shkr9" (OuterVolumeSpecName: "kube-api-access-shkr9") pod "9d32f18f-c413-4235-947d-181f43e86242" (UID: "9d32f18f-c413-4235-947d-181f43e86242"). InnerVolumeSpecName "kube-api-access-shkr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.411676 4794 scope.go:117] "RemoveContainer" containerID="d5c1c8f3c053ed1820759e5ee0a138342f7dccbaa62ead391643ea0f2ce661c0" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.488797 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d32f18f-c413-4235-947d-181f43e86242" (UID: "9d32f18f-c413-4235-947d-181f43e86242"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.494794 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.494819 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shkr9\" (UniqueName: \"kubernetes.io/projected/9d32f18f-c413-4235-947d-181f43e86242-kube-api-access-shkr9\") on node \"crc\" DevicePath \"\"" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.516689 4794 scope.go:117] "RemoveContainer" containerID="7ae636b9693b7b3345a998134f7e867e69dd368cf61616a283bb22292fb17cf0" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.517646 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9d32f18f-c413-4235-947d-181f43e86242" (UID: "9d32f18f-c413-4235-947d-181f43e86242"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:29:33 crc kubenswrapper[4794]: E0310 11:29:33.518662 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae636b9693b7b3345a998134f7e867e69dd368cf61616a283bb22292fb17cf0\": container with ID starting with 7ae636b9693b7b3345a998134f7e867e69dd368cf61616a283bb22292fb17cf0 not found: ID does not exist" containerID="7ae636b9693b7b3345a998134f7e867e69dd368cf61616a283bb22292fb17cf0" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.518699 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae636b9693b7b3345a998134f7e867e69dd368cf61616a283bb22292fb17cf0"} err="failed to get container status \"7ae636b9693b7b3345a998134f7e867e69dd368cf61616a283bb22292fb17cf0\": rpc error: code = NotFound desc = could not find container \"7ae636b9693b7b3345a998134f7e867e69dd368cf61616a283bb22292fb17cf0\": container with ID starting with 7ae636b9693b7b3345a998134f7e867e69dd368cf61616a283bb22292fb17cf0 not found: ID does not exist" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.518723 4794 scope.go:117] "RemoveContainer" containerID="d5c1c8f3c053ed1820759e5ee0a138342f7dccbaa62ead391643ea0f2ce661c0" Mar 10 11:29:33 crc kubenswrapper[4794]: E0310 11:29:33.519080 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c1c8f3c053ed1820759e5ee0a138342f7dccbaa62ead391643ea0f2ce661c0\": container with ID starting with d5c1c8f3c053ed1820759e5ee0a138342f7dccbaa62ead391643ea0f2ce661c0 not found: ID does not exist" containerID="d5c1c8f3c053ed1820759e5ee0a138342f7dccbaa62ead391643ea0f2ce661c0" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.519119 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c1c8f3c053ed1820759e5ee0a138342f7dccbaa62ead391643ea0f2ce661c0"} err="failed to get container status \"d5c1c8f3c053ed1820759e5ee0a138342f7dccbaa62ead391643ea0f2ce661c0\": rpc error: code = NotFound desc = could not find container \"d5c1c8f3c053ed1820759e5ee0a138342f7dccbaa62ead391643ea0f2ce661c0\": container with ID starting with d5c1c8f3c053ed1820759e5ee0a138342f7dccbaa62ead391643ea0f2ce661c0 not found: ID does not exist" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.531841 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-config" (OuterVolumeSpecName: "config") pod "9d32f18f-c413-4235-947d-181f43e86242" (UID: "9d32f18f-c413-4235-947d-181f43e86242"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.542846 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9d32f18f-c413-4235-947d-181f43e86242" (UID: "9d32f18f-c413-4235-947d-181f43e86242"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.597741 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.597781 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.597790 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d32f18f-c413-4235-947d-181f43e86242-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 11:29:33 crc kubenswrapper[4794]: I0310 11:29:33.808500 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f66c686f-d8wjz"] Mar 10 11:29:34 crc kubenswrapper[4794]: I0310 11:29:34.346954 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75677c58ff-4skth" Mar 10 11:29:34 crc kubenswrapper[4794]: I0310 11:29:34.348671 4794 generic.go:334] "Generic (PLEG): container finished" podID="68926d9f-d2a2-4ddb-a755-42a2bc0d614d" containerID="16750010c0f0cd0128ffdfb857bb1d0f464c22ee8750f2bdb349e860df2421f7" exitCode=0 Mar 10 11:29:34 crc kubenswrapper[4794]: I0310 11:29:34.348717 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" event={"ID":"68926d9f-d2a2-4ddb-a755-42a2bc0d614d","Type":"ContainerDied","Data":"16750010c0f0cd0128ffdfb857bb1d0f464c22ee8750f2bdb349e860df2421f7"} Mar 10 11:29:34 crc kubenswrapper[4794]: I0310 11:29:34.348764 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" event={"ID":"68926d9f-d2a2-4ddb-a755-42a2bc0d614d","Type":"ContainerStarted","Data":"767488504fcb3721d92a34e7fcedb945165ca6a8a21b35b179e2c858e78c1a76"} Mar 10 11:29:34 crc kubenswrapper[4794]: I0310 11:29:34.413736 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75677c58ff-4skth"] Mar 10 11:29:34 crc kubenswrapper[4794]: I0310 11:29:34.422307 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75677c58ff-4skth"] Mar 10 11:29:34 crc kubenswrapper[4794]: I0310 11:29:34.724419 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:34 crc kubenswrapper[4794]: I0310 11:29:34.724468 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:35 crc kubenswrapper[4794]: I0310 11:29:35.361855 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" event={"ID":"68926d9f-d2a2-4ddb-a755-42a2bc0d614d","Type":"ContainerStarted","Data":"ca09f35c0e7e6f0662a486460a6b676515b97246302a9e6e14b25c85ae22d26b"} Mar 10 11:29:35 crc kubenswrapper[4794]: I0310 11:29:35.362202 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:35 crc kubenswrapper[4794]: I0310 11:29:35.388557 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" podStartSLOduration=3.388524766 podStartE2EDuration="3.388524766s" 
podCreationTimestamp="2026-03-10 11:29:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:29:35.380833772 +0000 UTC m=+6324.137004590" watchObservedRunningTime="2026-03-10 11:29:35.388524766 +0000 UTC m=+6324.144695594" Mar 10 11:29:35 crc kubenswrapper[4794]: I0310 11:29:35.780437 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-c57hw" podUID="faa599aa-6af6-418a-8db0-6e5b9cd859c7" containerName="registry-server" probeResult="failure" output=< Mar 10 11:29:35 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 11:29:35 crc kubenswrapper[4794]: > Mar 10 11:29:36 crc kubenswrapper[4794]: I0310 11:29:36.009557 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d32f18f-c413-4235-947d-181f43e86242" path="/var/lib/kubelet/pods/9d32f18f-c413-4235-947d-181f43e86242/volumes" Mar 10 11:29:43 crc kubenswrapper[4794]: I0310 11:29:43.032455 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f66c686f-d8wjz" Mar 10 11:29:43 crc kubenswrapper[4794]: I0310 11:29:43.093147 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d986dd4f7-j95pr"] Mar 10 11:29:43 crc kubenswrapper[4794]: I0310 11:29:43.094029 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" podUID="df9a8fa7-2924-4579-8728-2fdba5c829ea" containerName="dnsmasq-dns" containerID="cri-o://abc7007853a1092955051446ebe967264570b717f712ef4bfbbe1774ca9baba7" gracePeriod=10 Mar 10 11:29:43 crc kubenswrapper[4794]: I0310 11:29:43.510281 4794 generic.go:334] "Generic (PLEG): container finished" podID="df9a8fa7-2924-4579-8728-2fdba5c829ea" containerID="abc7007853a1092955051446ebe967264570b717f712ef4bfbbe1774ca9baba7" exitCode=0 Mar 10 11:29:43 crc kubenswrapper[4794]: I0310 11:29:43.510323 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" event={"ID":"df9a8fa7-2924-4579-8728-2fdba5c829ea","Type":"ContainerDied","Data":"abc7007853a1092955051446ebe967264570b717f712ef4bfbbe1774ca9baba7"} Mar 10 11:29:43 crc kubenswrapper[4794]: I0310 11:29:43.816878 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:43 crc kubenswrapper[4794]: I0310 11:29:43.942591 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-ovsdbserver-sb\") pod \"df9a8fa7-2924-4579-8728-2fdba5c829ea\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " Mar 10 11:29:43 crc kubenswrapper[4794]: I0310 11:29:43.942803 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-openstack-cell1\") pod \"df9a8fa7-2924-4579-8728-2fdba5c829ea\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " Mar 10 11:29:43 crc kubenswrapper[4794]: I0310 11:29:43.942891 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvw2t\" (UniqueName: \"kubernetes.io/projected/df9a8fa7-2924-4579-8728-2fdba5c829ea-kube-api-access-nvw2t\") pod \"df9a8fa7-2924-4579-8728-2fdba5c829ea\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " Mar 10 11:29:43 crc kubenswrapper[4794]: I0310 11:29:43.943081 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-dns-svc\") pod \"df9a8fa7-2924-4579-8728-2fdba5c829ea\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " Mar 10 11:29:43 crc kubenswrapper[4794]: I0310 11:29:43.943151 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-config\") pod \"df9a8fa7-2924-4579-8728-2fdba5c829ea\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " Mar 10 11:29:43 crc kubenswrapper[4794]: I0310 11:29:43.943179 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-ovsdbserver-nb\") pod \"df9a8fa7-2924-4579-8728-2fdba5c829ea\" (UID: \"df9a8fa7-2924-4579-8728-2fdba5c829ea\") " Mar 10 11:29:43 crc kubenswrapper[4794]: I0310 11:29:43.952542 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df9a8fa7-2924-4579-8728-2fdba5c829ea-kube-api-access-nvw2t" (OuterVolumeSpecName: "kube-api-access-nvw2t") pod "df9a8fa7-2924-4579-8728-2fdba5c829ea" (UID: "df9a8fa7-2924-4579-8728-2fdba5c829ea"). InnerVolumeSpecName "kube-api-access-nvw2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:29:43 crc kubenswrapper[4794]: I0310 11:29:43.999848 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df9a8fa7-2924-4579-8728-2fdba5c829ea" (UID: "df9a8fa7-2924-4579-8728-2fdba5c829ea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.009652 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df9a8fa7-2924-4579-8728-2fdba5c829ea" (UID: "df9a8fa7-2924-4579-8728-2fdba5c829ea"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.010129 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df9a8fa7-2924-4579-8728-2fdba5c829ea" (UID: "df9a8fa7-2924-4579-8728-2fdba5c829ea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.011036 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-config" (OuterVolumeSpecName: "config") pod "df9a8fa7-2924-4579-8728-2fdba5c829ea" (UID: "df9a8fa7-2924-4579-8728-2fdba5c829ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.014104 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "df9a8fa7-2924-4579-8728-2fdba5c829ea" (UID: "df9a8fa7-2924-4579-8728-2fdba5c829ea"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.045437 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-config\") on node \"crc\" DevicePath \"\"" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.045497 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.045511 4794 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.045547 4794 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.045562 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvw2t\" (UniqueName: \"kubernetes.io/projected/df9a8fa7-2924-4579-8728-2fdba5c829ea-kube-api-access-nvw2t\") on node \"crc\" DevicePath \"\"" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.045573 4794 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9a8fa7-2924-4579-8728-2fdba5c829ea-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.521257 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" event={"ID":"df9a8fa7-2924-4579-8728-2fdba5c829ea","Type":"ContainerDied","Data":"15a5cf112e00e7f2705d6d98390dff06c85588b0d61fa17a24f5b6a1fc9f0e52"} Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.521744 4794 scope.go:117] "RemoveContainer" containerID="abc7007853a1092955051446ebe967264570b717f712ef4bfbbe1774ca9baba7" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.521301 4794 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d986dd4f7-j95pr" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.560192 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d986dd4f7-j95pr"] Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.568218 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d986dd4f7-j95pr"] Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.568641 4794 scope.go:117] "RemoveContainer" containerID="df7a6405e05a57c0583703670873e5ce73db52c1cce9d9c2ec1e3f90eda8998b" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.772069 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.840403 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:44 crc kubenswrapper[4794]: I0310 11:29:44.998704 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:29:44 crc kubenswrapper[4794]: E0310 11:29:44.998960 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:29:45 crc kubenswrapper[4794]: I0310 11:29:45.014597 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c57hw"] Mar 10 11:29:46 crc kubenswrapper[4794]: I0310 11:29:46.013168 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df9a8fa7-2924-4579-8728-2fdba5c829ea" path="/var/lib/kubelet/pods/df9a8fa7-2924-4579-8728-2fdba5c829ea/volumes" Mar 10 11:29:46 crc kubenswrapper[4794]: I0310 11:29:46.552075 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c57hw" podUID="faa599aa-6af6-418a-8db0-6e5b9cd859c7" containerName="registry-server" containerID="cri-o://b77011c2da3abead50d24bada80ce1fed0f3b51f42890393a1355f2d532a9f63" gracePeriod=2 Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.083044 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.215908 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa599aa-6af6-418a-8db0-6e5b9cd859c7-utilities\") pod \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\" (UID: \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\") " Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.215952 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa599aa-6af6-418a-8db0-6e5b9cd859c7-catalog-content\") pod \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\" (UID: \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\") " Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.216025 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkmqm\" (UniqueName: \"kubernetes.io/projected/faa599aa-6af6-418a-8db0-6e5b9cd859c7-kube-api-access-gkmqm\") pod \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\" (UID: \"faa599aa-6af6-418a-8db0-6e5b9cd859c7\") " Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.217591 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa599aa-6af6-418a-8db0-6e5b9cd859c7-utilities" (OuterVolumeSpecName: "utilities") pod "faa599aa-6af6-418a-8db0-6e5b9cd859c7" (UID: "faa599aa-6af6-418a-8db0-6e5b9cd859c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.221308 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faa599aa-6af6-418a-8db0-6e5b9cd859c7-kube-api-access-gkmqm" (OuterVolumeSpecName: "kube-api-access-gkmqm") pod "faa599aa-6af6-418a-8db0-6e5b9cd859c7" (UID: "faa599aa-6af6-418a-8db0-6e5b9cd859c7"). InnerVolumeSpecName "kube-api-access-gkmqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.306807 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa599aa-6af6-418a-8db0-6e5b9cd859c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "faa599aa-6af6-418a-8db0-6e5b9cd859c7" (UID: "faa599aa-6af6-418a-8db0-6e5b9cd859c7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.318920 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa599aa-6af6-418a-8db0-6e5b9cd859c7-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.318953 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa599aa-6af6-418a-8db0-6e5b9cd859c7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.318965 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkmqm\" (UniqueName: \"kubernetes.io/projected/faa599aa-6af6-418a-8db0-6e5b9cd859c7-kube-api-access-gkmqm\") on node \"crc\" DevicePath \"\"" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.570523 4794 generic.go:334] "Generic (PLEG): container finished" podID="faa599aa-6af6-418a-8db0-6e5b9cd859c7" containerID="b77011c2da3abead50d24bada80ce1fed0f3b51f42890393a1355f2d532a9f63" exitCode=0 Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.570577 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c57hw" event={"ID":"faa599aa-6af6-418a-8db0-6e5b9cd859c7","Type":"ContainerDied","Data":"b77011c2da3abead50d24bada80ce1fed0f3b51f42890393a1355f2d532a9f63"} Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.570586 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c57hw" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.570607 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c57hw" event={"ID":"faa599aa-6af6-418a-8db0-6e5b9cd859c7","Type":"ContainerDied","Data":"159cc4b3d95904882f1e750743891af512570d72eaa0665d713f01fb32fc4938"} Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.570624 4794 scope.go:117] "RemoveContainer" containerID="b77011c2da3abead50d24bada80ce1fed0f3b51f42890393a1355f2d532a9f63" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.649907 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c57hw"] Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.652554 4794 scope.go:117] "RemoveContainer" containerID="d36dc53bb8de6fcc73bfdf40201f001eefdc3818c61cdc1ef9f66f58214359c1" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.658505 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c57hw"] Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.682761 4794 scope.go:117] "RemoveContainer" containerID="fbf10a7a5657fde0e0454909c5825a7689c99ffc06543f5aa244b14b066ef454" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.749020 4794 scope.go:117] "RemoveContainer" containerID="b77011c2da3abead50d24bada80ce1fed0f3b51f42890393a1355f2d532a9f63" Mar 10 11:29:47 crc kubenswrapper[4794]: E0310 11:29:47.750596 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b77011c2da3abead50d24bada80ce1fed0f3b51f42890393a1355f2d532a9f63\": container with ID starting with b77011c2da3abead50d24bada80ce1fed0f3b51f42890393a1355f2d532a9f63 not found: ID does not exist" containerID="b77011c2da3abead50d24bada80ce1fed0f3b51f42890393a1355f2d532a9f63" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.750644 
4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77011c2da3abead50d24bada80ce1fed0f3b51f42890393a1355f2d532a9f63"} err="failed to get container status \"b77011c2da3abead50d24bada80ce1fed0f3b51f42890393a1355f2d532a9f63\": rpc error: code = NotFound desc = could not find container \"b77011c2da3abead50d24bada80ce1fed0f3b51f42890393a1355f2d532a9f63\": container with ID starting with b77011c2da3abead50d24bada80ce1fed0f3b51f42890393a1355f2d532a9f63 not found: ID does not exist" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.750677 4794 scope.go:117] "RemoveContainer" containerID="d36dc53bb8de6fcc73bfdf40201f001eefdc3818c61cdc1ef9f66f58214359c1" Mar 10 11:29:47 crc kubenswrapper[4794]: E0310 11:29:47.751030 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d36dc53bb8de6fcc73bfdf40201f001eefdc3818c61cdc1ef9f66f58214359c1\": container with ID starting with d36dc53bb8de6fcc73bfdf40201f001eefdc3818c61cdc1ef9f66f58214359c1 not found: ID does not exist" containerID="d36dc53bb8de6fcc73bfdf40201f001eefdc3818c61cdc1ef9f66f58214359c1" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.751094 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d36dc53bb8de6fcc73bfdf40201f001eefdc3818c61cdc1ef9f66f58214359c1"} err="failed to get container status \"d36dc53bb8de6fcc73bfdf40201f001eefdc3818c61cdc1ef9f66f58214359c1\": rpc error: code = NotFound desc = could not find container \"d36dc53bb8de6fcc73bfdf40201f001eefdc3818c61cdc1ef9f66f58214359c1\": container with ID starting with d36dc53bb8de6fcc73bfdf40201f001eefdc3818c61cdc1ef9f66f58214359c1 not found: ID does not exist" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.751137 4794 scope.go:117] "RemoveContainer" containerID="fbf10a7a5657fde0e0454909c5825a7689c99ffc06543f5aa244b14b066ef454" Mar 10 11:29:47 crc kubenswrapper[4794]: E0310 11:29:47.752126 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbf10a7a5657fde0e0454909c5825a7689c99ffc06543f5aa244b14b066ef454\": container with ID starting with fbf10a7a5657fde0e0454909c5825a7689c99ffc06543f5aa244b14b066ef454 not found: ID does not exist" containerID="fbf10a7a5657fde0e0454909c5825a7689c99ffc06543f5aa244b14b066ef454" Mar 10 11:29:47 crc kubenswrapper[4794]: I0310 11:29:47.752193 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbf10a7a5657fde0e0454909c5825a7689c99ffc06543f5aa244b14b066ef454"} err="failed to get container status \"fbf10a7a5657fde0e0454909c5825a7689c99ffc06543f5aa244b14b066ef454\": rpc error: code = NotFound desc = could not find container \"fbf10a7a5657fde0e0454909c5825a7689c99ffc06543f5aa244b14b066ef454\": container with ID starting with fbf10a7a5657fde0e0454909c5825a7689c99ffc06543f5aa244b14b066ef454 not found: ID does not exist" Mar 10 11:29:48 crc kubenswrapper[4794]: I0310 11:29:48.018424 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faa599aa-6af6-418a-8db0-6e5b9cd859c7" path="/var/lib/kubelet/pods/faa599aa-6af6-418a-8db0-6e5b9cd859c7/volumes" Mar 10 11:29:50 crc kubenswrapper[4794]: I0310 11:29:50.046973 4794 scope.go:117] "RemoveContainer" containerID="583deb61ce6ccbb2d89e6d8fe21d9f4a362de539c089d8c0c89c0fe43854b84e" Mar 10 11:29:50 crc kubenswrapper[4794]: I0310 11:29:50.080114 4794 scope.go:117] "RemoveContainer" 
containerID="361b45844d7fae61746e7ecdfd0e55c5c319bc07b9764362342b57f3c845d55b" Mar 10 11:29:50 crc kubenswrapper[4794]: I0310 11:29:50.176281 4794 scope.go:117] "RemoveContainer" containerID="1a8f8ed5da40dc30e6234ba884c5b3d621a1fd46132212b94d6b48cb58737cf7" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.101675 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf"] Mar 10 11:29:54 crc kubenswrapper[4794]: E0310 11:29:54.102825 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa599aa-6af6-418a-8db0-6e5b9cd859c7" containerName="extract-content" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.102845 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa599aa-6af6-418a-8db0-6e5b9cd859c7" containerName="extract-content" Mar 10 11:29:54 crc kubenswrapper[4794]: E0310 11:29:54.102912 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9a8fa7-2924-4579-8728-2fdba5c829ea" containerName="init" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.102926 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9a8fa7-2924-4579-8728-2fdba5c829ea" containerName="init" Mar 10 11:29:54 crc kubenswrapper[4794]: E0310 11:29:54.102953 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa599aa-6af6-418a-8db0-6e5b9cd859c7" containerName="extract-utilities" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.102965 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa599aa-6af6-418a-8db0-6e5b9cd859c7" containerName="extract-utilities" Mar 10 11:29:54 crc kubenswrapper[4794]: E0310 11:29:54.102993 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d32f18f-c413-4235-947d-181f43e86242" containerName="init" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.103004 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d32f18f-c413-4235-947d-181f43e86242" containerName="init" Mar 10 11:29:54 crc kubenswrapper[4794]: E0310 11:29:54.103039 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9a8fa7-2924-4579-8728-2fdba5c829ea" containerName="dnsmasq-dns" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.103053 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9a8fa7-2924-4579-8728-2fdba5c829ea" containerName="dnsmasq-dns" Mar 10 11:29:54 crc kubenswrapper[4794]: E0310 11:29:54.103078 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d32f18f-c413-4235-947d-181f43e86242" containerName="dnsmasq-dns" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.103091 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d32f18f-c413-4235-947d-181f43e86242" containerName="dnsmasq-dns" Mar 10 11:29:54 crc kubenswrapper[4794]: E0310 11:29:54.103107 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa599aa-6af6-418a-8db0-6e5b9cd859c7" containerName="registry-server" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.103119 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa599aa-6af6-418a-8db0-6e5b9cd859c7" containerName="registry-server" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.103467 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d32f18f-c413-4235-947d-181f43e86242" containerName="dnsmasq-dns" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.103500 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="df9a8fa7-2924-4579-8728-2fdba5c829ea" 
containerName="dnsmasq-dns" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.103520 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa599aa-6af6-418a-8db0-6e5b9cd859c7" containerName="registry-server" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.104669 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.106095 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.107161 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.107635 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.107803 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.160384 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf"] Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.205054 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.205129 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.205188 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tvmc\" (UniqueName: \"kubernetes.io/projected/5643a41f-c260-4e6a-881c-87f777fa58f3-kube-api-access-6tvmc\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.205320 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.205402 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.307683 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.307737 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.307820 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.307844 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.307890 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvmc\" (UniqueName: \"kubernetes.io/projected/5643a41f-c260-4e6a-881c-87f777fa58f3-kube-api-access-6tvmc\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.313904 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.314442 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.314981 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.315809 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.327617 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tvmc\" (UniqueName: \"kubernetes.io/projected/5643a41f-c260-4e6a-881c-87f777fa58f3-kube-api-access-6tvmc\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:54 crc kubenswrapper[4794]: I0310 11:29:54.449697 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:29:55 crc kubenswrapper[4794]: I0310 11:29:55.167648 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf"] Mar 10 11:29:55 crc kubenswrapper[4794]: W0310 11:29:55.189524 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5643a41f_c260_4e6a_881c_87f777fa58f3.slice/crio-412961cfef64be72d9aaa6aecccb5c55826dbde9902b2ec4a3ef82f89f91f2a0 WatchSource:0}: Error finding container 412961cfef64be72d9aaa6aecccb5c55826dbde9902b2ec4a3ef82f89f91f2a0: Status 404 returned error can't find the container with id 412961cfef64be72d9aaa6aecccb5c55826dbde9902b2ec4a3ef82f89f91f2a0 Mar 10 11:29:55 crc kubenswrapper[4794]: I0310 11:29:55.708877 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" event={"ID":"5643a41f-c260-4e6a-881c-87f777fa58f3","Type":"ContainerStarted","Data":"412961cfef64be72d9aaa6aecccb5c55826dbde9902b2ec4a3ef82f89f91f2a0"} Mar 10 11:29:56 crc kubenswrapper[4794]: I0310 11:29:56.000740 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:29:56 crc kubenswrapper[4794]: E0310 11:29:56.001525 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.141311 4794 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29552370-nphmk"] Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.143647 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552370-nphmk" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.145553 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.145879 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.146592 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.156730 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k"] Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.158131 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.161966 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.162547 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.169548 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552370-nphmk"] Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.184115 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k"] Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.244872 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24048515-cbfd-4a1a-a438-285f5e399cdf-config-volume\") pod \"collect-profiles-29552370-6qw5k\" (UID: \"24048515-cbfd-4a1a-a438-285f5e399cdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.245153 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldf78\" (UniqueName: \"kubernetes.io/projected/24048515-cbfd-4a1a-a438-285f5e399cdf-kube-api-access-ldf78\") pod \"collect-profiles-29552370-6qw5k\" (UID: \"24048515-cbfd-4a1a-a438-285f5e399cdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.245200 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzt24\" (UniqueName: \"kubernetes.io/projected/8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe-kube-api-access-hzt24\") pod \"auto-csr-approver-29552370-nphmk\" (UID: \"8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe\") " pod="openshift-infra/auto-csr-approver-29552370-nphmk" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.245528 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/24048515-cbfd-4a1a-a438-285f5e399cdf-secret-volume\") pod \"collect-profiles-29552370-6qw5k\" (UID: \"24048515-cbfd-4a1a-a438-285f5e399cdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.349183 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldf78\" (UniqueName: \"kubernetes.io/projected/24048515-cbfd-4a1a-a438-285f5e399cdf-kube-api-access-ldf78\") pod \"collect-profiles-29552370-6qw5k\" (UID: \"24048515-cbfd-4a1a-a438-285f5e399cdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.349516 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzt24\" (UniqueName: \"kubernetes.io/projected/8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe-kube-api-access-hzt24\") pod \"auto-csr-approver-29552370-nphmk\" (UID: \"8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe\") " pod="openshift-infra/auto-csr-approver-29552370-nphmk" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.349672 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24048515-cbfd-4a1a-a438-285f5e399cdf-secret-volume\") pod \"collect-profiles-29552370-6qw5k\" (UID: \"24048515-cbfd-4a1a-a438-285f5e399cdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.349730 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24048515-cbfd-4a1a-a438-285f5e399cdf-config-volume\") pod \"collect-profiles-29552370-6qw5k\" (UID: \"24048515-cbfd-4a1a-a438-285f5e399cdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.350445 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24048515-cbfd-4a1a-a438-285f5e399cdf-config-volume\") pod \"collect-profiles-29552370-6qw5k\" (UID: \"24048515-cbfd-4a1a-a438-285f5e399cdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.360737 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24048515-cbfd-4a1a-a438-285f5e399cdf-secret-volume\") pod \"collect-profiles-29552370-6qw5k\" (UID: \"24048515-cbfd-4a1a-a438-285f5e399cdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.367970 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzt24\" (UniqueName: \"kubernetes.io/projected/8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe-kube-api-access-hzt24\") pod \"auto-csr-approver-29552370-nphmk\" (UID: \"8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe\") " pod="openshift-infra/auto-csr-approver-29552370-nphmk" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.369126 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldf78\" (UniqueName: \"kubernetes.io/projected/24048515-cbfd-4a1a-a438-285f5e399cdf-kube-api-access-ldf78\") pod \"collect-profiles-29552370-6qw5k\" (UID: \"24048515-cbfd-4a1a-a438-285f5e399cdf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.490133 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552370-nphmk" Mar 10 11:30:00 crc kubenswrapper[4794]: I0310 11:30:00.496743 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" Mar 10 11:30:05 crc kubenswrapper[4794]: I0310 11:30:05.488375 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k"] Mar 10 11:30:05 crc kubenswrapper[4794]: W0310 11:30:05.493993 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24048515_cbfd_4a1a_a438_285f5e399cdf.slice/crio-77f8bf32e9528438c65676312db5012156b66b864bca0ad46dbb289d68699787 WatchSource:0}: Error finding container 77f8bf32e9528438c65676312db5012156b66b864bca0ad46dbb289d68699787: Status 404 returned error can't find the container with id 77f8bf32e9528438c65676312db5012156b66b864bca0ad46dbb289d68699787 Mar 10 11:30:05 crc kubenswrapper[4794]: I0310 11:30:05.523501 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552370-nphmk"] Mar 10 11:30:05 crc kubenswrapper[4794]: W0310 11:30:05.542734 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e9bd9f6_74c2_4c97_b3ac_e55e74c36dbe.slice/crio-638951ee07016208d3c1aaca4f6a25445ada19633ce71e2769debc058b7076d0 WatchSource:0}: Error finding container 638951ee07016208d3c1aaca4f6a25445ada19633ce71e2769debc058b7076d0: Status 404 returned error can't find the container with id 638951ee07016208d3c1aaca4f6a25445ada19633ce71e2769debc058b7076d0 Mar 10 11:30:05 crc kubenswrapper[4794]: I0310 11:30:05.809019 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" event={"ID":"24048515-cbfd-4a1a-a438-285f5e399cdf","Type":"ContainerStarted","Data":"0798d116e17d3efc93ee1b26ff856e25907d110be12a521a46e06544a669bbb8"} Mar 10 11:30:05 crc kubenswrapper[4794]: I0310 11:30:05.809103 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" event={"ID":"24048515-cbfd-4a1a-a438-285f5e399cdf","Type":"ContainerStarted","Data":"77f8bf32e9528438c65676312db5012156b66b864bca0ad46dbb289d68699787"} Mar 10 11:30:05 crc kubenswrapper[4794]: I0310 11:30:05.811433 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" event={"ID":"5643a41f-c260-4e6a-881c-87f777fa58f3","Type":"ContainerStarted","Data":"10da335c19797d0b1d1b598676745e8b24b4c086b81f49bf973c7cfc585daac5"} Mar 10 11:30:05 crc kubenswrapper[4794]: I0310 11:30:05.813062 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552370-nphmk" event={"ID":"8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe","Type":"ContainerStarted","Data":"638951ee07016208d3c1aaca4f6a25445ada19633ce71e2769debc058b7076d0"} Mar 10 11:30:05 crc kubenswrapper[4794]: I0310 11:30:05.825193 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" podStartSLOduration=5.825174939 
podStartE2EDuration="5.825174939s" podCreationTimestamp="2026-03-10 11:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 11:30:05.8242242 +0000 UTC m=+6354.580395018" watchObservedRunningTime="2026-03-10 11:30:05.825174939 +0000 UTC m=+6354.581345757" Mar 10 11:30:05 crc kubenswrapper[4794]: I0310 11:30:05.844391 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" podStartSLOduration=2.038559629 podStartE2EDuration="11.844371354s" podCreationTimestamp="2026-03-10 11:29:54 +0000 UTC" firstStartedPulling="2026-03-10 11:29:55.191662613 +0000 UTC m=+6343.947833431" lastFinishedPulling="2026-03-10 11:30:04.997474338 +0000 UTC m=+6353.753645156" observedRunningTime="2026-03-10 11:30:05.840091614 +0000 UTC m=+6354.596262432" watchObservedRunningTime="2026-03-10 11:30:05.844371354 +0000 UTC m=+6354.600542172" Mar 10 11:30:06 crc kubenswrapper[4794]: I0310 11:30:06.827159 4794 generic.go:334] "Generic (PLEG): container finished" podID="24048515-cbfd-4a1a-a438-285f5e399cdf" containerID="0798d116e17d3efc93ee1b26ff856e25907d110be12a521a46e06544a669bbb8" exitCode=0 Mar 10 11:30:06 crc kubenswrapper[4794]: I0310 11:30:06.827273 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" event={"ID":"24048515-cbfd-4a1a-a438-285f5e399cdf","Type":"ContainerDied","Data":"0798d116e17d3efc93ee1b26ff856e25907d110be12a521a46e06544a669bbb8"} Mar 10 11:30:07 crc kubenswrapper[4794]: I0310 11:30:07.852469 4794 generic.go:334] "Generic (PLEG): container finished" podID="8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe" containerID="b4bc681fb985346ee8a3631df85de0294c1091767fe0b861665a98494797a917" exitCode=0 Mar 10 11:30:07 crc kubenswrapper[4794]: I0310 11:30:07.852622 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552370-nphmk" event={"ID":"8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe","Type":"ContainerDied","Data":"b4bc681fb985346ee8a3631df85de0294c1091767fe0b861665a98494797a917"} Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.297795 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.422480 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldf78\" (UniqueName: \"kubernetes.io/projected/24048515-cbfd-4a1a-a438-285f5e399cdf-kube-api-access-ldf78\") pod \"24048515-cbfd-4a1a-a438-285f5e399cdf\" (UID: \"24048515-cbfd-4a1a-a438-285f5e399cdf\") " Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.422724 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24048515-cbfd-4a1a-a438-285f5e399cdf-config-volume\") pod \"24048515-cbfd-4a1a-a438-285f5e399cdf\" (UID: \"24048515-cbfd-4a1a-a438-285f5e399cdf\") " Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.422758 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24048515-cbfd-4a1a-a438-285f5e399cdf-secret-volume\") pod \"24048515-cbfd-4a1a-a438-285f5e399cdf\" (UID: \"24048515-cbfd-4a1a-a438-285f5e399cdf\") " Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.424008 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24048515-cbfd-4a1a-a438-285f5e399cdf-config-volume" (OuterVolumeSpecName: "config-volume") pod "24048515-cbfd-4a1a-a438-285f5e399cdf" (UID: "24048515-cbfd-4a1a-a438-285f5e399cdf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.429792 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24048515-cbfd-4a1a-a438-285f5e399cdf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "24048515-cbfd-4a1a-a438-285f5e399cdf" (UID: "24048515-cbfd-4a1a-a438-285f5e399cdf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.429980 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24048515-cbfd-4a1a-a438-285f5e399cdf-kube-api-access-ldf78" (OuterVolumeSpecName: "kube-api-access-ldf78") pod "24048515-cbfd-4a1a-a438-285f5e399cdf" (UID: "24048515-cbfd-4a1a-a438-285f5e399cdf"). InnerVolumeSpecName "kube-api-access-ldf78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.527995 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24048515-cbfd-4a1a-a438-285f5e399cdf-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.528036 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24048515-cbfd-4a1a-a438-285f5e399cdf-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.528048 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldf78\" (UniqueName: \"kubernetes.io/projected/24048515-cbfd-4a1a-a438-285f5e399cdf-kube-api-access-ldf78\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.580934 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7"] Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.589770 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552325-xx4p7"] Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.837631 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-79qm4"] Mar 10 11:30:08 crc kubenswrapper[4794]: E0310 11:30:08.838262 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24048515-cbfd-4a1a-a438-285f5e399cdf" containerName="collect-profiles" Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.838289 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="24048515-cbfd-4a1a-a438-285f5e399cdf" containerName="collect-profiles" Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.838567 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="24048515-cbfd-4a1a-a438-285f5e399cdf" containerName="collect-profiles" Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.840354 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.850450 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-79qm4"] Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.868595 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.870441 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k" event={"ID":"24048515-cbfd-4a1a-a438-285f5e399cdf","Type":"ContainerDied","Data":"77f8bf32e9528438c65676312db5012156b66b864bca0ad46dbb289d68699787"} Mar 10 11:30:08 crc kubenswrapper[4794]: I0310 11:30:08.870498 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77f8bf32e9528438c65676312db5012156b66b864bca0ad46dbb289d68699787" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.000991 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:30:09 crc kubenswrapper[4794]: E0310 11:30:09.001552 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.035840 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n77rd\" (UniqueName: \"kubernetes.io/projected/2efb1797-b8ed-46ea-95c4-b967633e07f3-kube-api-access-n77rd\") pod \"community-operators-79qm4\" (UID: \"2efb1797-b8ed-46ea-95c4-b967633e07f3\") " pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.036317 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2efb1797-b8ed-46ea-95c4-b967633e07f3-utilities\") pod \"community-operators-79qm4\" (UID: \"2efb1797-b8ed-46ea-95c4-b967633e07f3\") " pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.036602 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2efb1797-b8ed-46ea-95c4-b967633e07f3-catalog-content\") pod \"community-operators-79qm4\" (UID: \"2efb1797-b8ed-46ea-95c4-b967633e07f3\") " pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.140001 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2efb1797-b8ed-46ea-95c4-b967633e07f3-utilities\") pod \"community-operators-79qm4\" (UID: \"2efb1797-b8ed-46ea-95c4-b967633e07f3\") " pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.140129 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2efb1797-b8ed-46ea-95c4-b967633e07f3-catalog-content\") pod \"community-operators-79qm4\" (UID: \"2efb1797-b8ed-46ea-95c4-b967633e07f3\") " pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.140188 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n77rd\" (UniqueName: 
\"kubernetes.io/projected/2efb1797-b8ed-46ea-95c4-b967633e07f3-kube-api-access-n77rd\") pod \"community-operators-79qm4\" (UID: \"2efb1797-b8ed-46ea-95c4-b967633e07f3\") " pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.142153 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2efb1797-b8ed-46ea-95c4-b967633e07f3-utilities\") pod \"community-operators-79qm4\" (UID: \"2efb1797-b8ed-46ea-95c4-b967633e07f3\") " pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.142972 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2efb1797-b8ed-46ea-95c4-b967633e07f3-catalog-content\") pod \"community-operators-79qm4\" (UID: \"2efb1797-b8ed-46ea-95c4-b967633e07f3\") " pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.164219 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n77rd\" (UniqueName: \"kubernetes.io/projected/2efb1797-b8ed-46ea-95c4-b967633e07f3-kube-api-access-n77rd\") pod \"community-operators-79qm4\" (UID: \"2efb1797-b8ed-46ea-95c4-b967633e07f3\") " pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.174969 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552370-nphmk" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.343478 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzt24\" (UniqueName: \"kubernetes.io/projected/8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe-kube-api-access-hzt24\") pod \"8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe\" (UID: \"8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe\") " Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.347612 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe-kube-api-access-hzt24" (OuterVolumeSpecName: "kube-api-access-hzt24") pod "8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe" (UID: "8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe"). InnerVolumeSpecName "kube-api-access-hzt24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.446741 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzt24\" (UniqueName: \"kubernetes.io/projected/8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe-kube-api-access-hzt24\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.460479 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.884397 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552370-nphmk" event={"ID":"8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe","Type":"ContainerDied","Data":"638951ee07016208d3c1aaca4f6a25445ada19633ce71e2769debc058b7076d0"} Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.884834 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="638951ee07016208d3c1aaca4f6a25445ada19633ce71e2769debc058b7076d0" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.884697 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552370-nphmk" Mar 10 11:30:09 crc kubenswrapper[4794]: I0310 11:30:09.946537 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-79qm4"] Mar 10 11:30:09 crc kubenswrapper[4794]: W0310 11:30:09.958954 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2efb1797_b8ed_46ea_95c4_b967633e07f3.slice/crio-ce223c448984fa10bbbe2c0433c8c6c6f7457e72e9e7db90fef85850053aaf7b WatchSource:0}: Error finding container ce223c448984fa10bbbe2c0433c8c6c6f7457e72e9e7db90fef85850053aaf7b: Status 404 returned error can't find the container with id ce223c448984fa10bbbe2c0433c8c6c6f7457e72e9e7db90fef85850053aaf7b Mar 10 11:30:10 crc kubenswrapper[4794]: I0310 11:30:10.021888 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5f5e33-ec54-41b5-b46d-69f07fac6d87" path="/var/lib/kubelet/pods/bc5f5e33-ec54-41b5-b46d-69f07fac6d87/volumes" Mar 10 11:30:10 crc kubenswrapper[4794]: I0310 11:30:10.274792 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552364-4nfmf"] Mar 10 11:30:10 crc kubenswrapper[4794]: I0310 11:30:10.284056 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552364-4nfmf"] Mar 10 11:30:10 crc kubenswrapper[4794]: I0310 11:30:10.896042 4794 generic.go:334] "Generic (PLEG): container finished" podID="2efb1797-b8ed-46ea-95c4-b967633e07f3" containerID="7ecf35ad8b9e0a2847cfa0fd0d89fbc80538a55ace79b190fb4a278149636fd0" exitCode=0 Mar 10 11:30:10 crc kubenswrapper[4794]: I0310 11:30:10.896088 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-79qm4" event={"ID":"2efb1797-b8ed-46ea-95c4-b967633e07f3","Type":"ContainerDied","Data":"7ecf35ad8b9e0a2847cfa0fd0d89fbc80538a55ace79b190fb4a278149636fd0"} Mar 10 11:30:10 crc kubenswrapper[4794]: I0310 11:30:10.896118 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-79qm4" event={"ID":"2efb1797-b8ed-46ea-95c4-b967633e07f3","Type":"ContainerStarted","Data":"ce223c448984fa10bbbe2c0433c8c6c6f7457e72e9e7db90fef85850053aaf7b"} Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.241210 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lhhpq"] Mar 10 11:30:11 crc kubenswrapper[4794]: E0310 11:30:11.241786 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe" containerName="oc" Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.241807 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe" 
containerName="oc" Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.242086 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe" containerName="oc" Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.243994 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.260313 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lhhpq"] Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.396824 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905df928-7f1b-4b9e-b18a-d2d032e28bb3-utilities\") pod \"redhat-operators-lhhpq\" (UID: \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\") " pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.397272 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905df928-7f1b-4b9e-b18a-d2d032e28bb3-catalog-content\") pod \"redhat-operators-lhhpq\" (UID: \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\") " pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.397487 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzbgx\" (UniqueName: \"kubernetes.io/projected/905df928-7f1b-4b9e-b18a-d2d032e28bb3-kube-api-access-jzbgx\") pod \"redhat-operators-lhhpq\" (UID: \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\") " pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.499470 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905df928-7f1b-4b9e-b18a-d2d032e28bb3-catalog-content\") pod \"redhat-operators-lhhpq\" (UID: \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\") " pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.499538 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzbgx\" (UniqueName: \"kubernetes.io/projected/905df928-7f1b-4b9e-b18a-d2d032e28bb3-kube-api-access-jzbgx\") pod \"redhat-operators-lhhpq\" (UID: \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\") " pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.499672 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905df928-7f1b-4b9e-b18a-d2d032e28bb3-utilities\") pod \"redhat-operators-lhhpq\" (UID: \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\") " pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.500143 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905df928-7f1b-4b9e-b18a-d2d032e28bb3-utilities\") pod \"redhat-operators-lhhpq\" (UID: \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\") " pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.500445 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/905df928-7f1b-4b9e-b18a-d2d032e28bb3-catalog-content\") pod \"redhat-operators-lhhpq\" (UID: \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\") " pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.520612 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzbgx\" (UniqueName: \"kubernetes.io/projected/905df928-7f1b-4b9e-b18a-d2d032e28bb3-kube-api-access-jzbgx\") pod \"redhat-operators-lhhpq\" (UID: \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\") " pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.578012 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:11 crc kubenswrapper[4794]: I0310 11:30:11.928556 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-79qm4" event={"ID":"2efb1797-b8ed-46ea-95c4-b967633e07f3","Type":"ContainerStarted","Data":"a5eec08bfb99d996be8d8c9de23705978d261c8a441d39e894168f4c92c98348"} Mar 10 11:30:12 crc kubenswrapper[4794]: I0310 11:30:12.046130 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4483d52-5707-451b-bff3-9ba1e6984e82" path="/var/lib/kubelet/pods/d4483d52-5707-451b-bff3-9ba1e6984e82/volumes" Mar 10 11:30:12 crc kubenswrapper[4794]: I0310 11:30:12.137307 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lhhpq"] Mar 10 11:30:12 crc kubenswrapper[4794]: I0310 11:30:12.942938 4794 generic.go:334] "Generic (PLEG): container finished" podID="905df928-7f1b-4b9e-b18a-d2d032e28bb3" containerID="3a930ec0e45c904b6d91fe249fbbcb69114254a544689ecf6e60b7b330b03c8e" exitCode=0 Mar 10 11:30:12 crc kubenswrapper[4794]: I0310 11:30:12.943059 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhhpq" event={"ID":"905df928-7f1b-4b9e-b18a-d2d032e28bb3","Type":"ContainerDied","Data":"3a930ec0e45c904b6d91fe249fbbcb69114254a544689ecf6e60b7b330b03c8e"} Mar 10 11:30:12 crc kubenswrapper[4794]: I0310 11:30:12.943374 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhhpq" event={"ID":"905df928-7f1b-4b9e-b18a-d2d032e28bb3","Type":"ContainerStarted","Data":"b7d1cbfed3d9d8569b7f3884785b6128b77f45bb804ab7e2daf514a09e26df38"} Mar 10 11:30:13 crc kubenswrapper[4794]: I0310 11:30:13.959619 4794 generic.go:334] "Generic (PLEG): container finished" podID="2efb1797-b8ed-46ea-95c4-b967633e07f3" containerID="a5eec08bfb99d996be8d8c9de23705978d261c8a441d39e894168f4c92c98348" exitCode=0 Mar 10 11:30:13 crc kubenswrapper[4794]: I0310 11:30:13.959708 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-79qm4" event={"ID":"2efb1797-b8ed-46ea-95c4-b967633e07f3","Type":"ContainerDied","Data":"a5eec08bfb99d996be8d8c9de23705978d261c8a441d39e894168f4c92c98348"} Mar 10 11:30:14 crc kubenswrapper[4794]: I0310 11:30:14.972471 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhhpq" event={"ID":"905df928-7f1b-4b9e-b18a-d2d032e28bb3","Type":"ContainerStarted","Data":"3be7ccee2f0b7d1d415b467d0cfc1a3d2ba48a120b83ba8a590c728c6d683b36"} Mar 10 11:30:14 crc kubenswrapper[4794]: I0310 11:30:14.975643 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-79qm4" 
event={"ID":"2efb1797-b8ed-46ea-95c4-b967633e07f3","Type":"ContainerStarted","Data":"3b4de4a04ac0f76ea2113e85974f043ea8f704c8b3b87c0228a477a13c3f4f58"} Mar 10 11:30:15 crc kubenswrapper[4794]: I0310 11:30:15.021983 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-79qm4" podStartSLOduration=3.402845973 podStartE2EDuration="7.02196811s" podCreationTimestamp="2026-03-10 11:30:08 +0000 UTC" firstStartedPulling="2026-03-10 11:30:10.898239225 +0000 UTC m=+6359.654410063" lastFinishedPulling="2026-03-10 11:30:14.517361381 +0000 UTC m=+6363.273532200" observedRunningTime="2026-03-10 11:30:15.017231145 +0000 UTC m=+6363.773401993" watchObservedRunningTime="2026-03-10 11:30:15.02196811 +0000 UTC m=+6363.778138918" Mar 10 11:30:19 crc kubenswrapper[4794]: I0310 11:30:19.461861 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:19 crc kubenswrapper[4794]: I0310 11:30:19.462579 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:19 crc kubenswrapper[4794]: I0310 11:30:19.548836 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:20 crc kubenswrapper[4794]: I0310 11:30:20.020628 4794 generic.go:334] "Generic (PLEG): container finished" podID="5643a41f-c260-4e6a-881c-87f777fa58f3" containerID="10da335c19797d0b1d1b598676745e8b24b4c086b81f49bf973c7cfc585daac5" exitCode=0 Mar 10 11:30:20 crc kubenswrapper[4794]: I0310 11:30:20.020703 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" event={"ID":"5643a41f-c260-4e6a-881c-87f777fa58f3","Type":"ContainerDied","Data":"10da335c19797d0b1d1b598676745e8b24b4c086b81f49bf973c7cfc585daac5"} Mar 10 11:30:20 crc kubenswrapper[4794]: I0310 11:30:20.023140 4794 generic.go:334] "Generic (PLEG): container finished" podID="905df928-7f1b-4b9e-b18a-d2d032e28bb3" containerID="3be7ccee2f0b7d1d415b467d0cfc1a3d2ba48a120b83ba8a590c728c6d683b36" exitCode=0 Mar 10 11:30:20 crc kubenswrapper[4794]: I0310 11:30:20.023231 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhhpq" event={"ID":"905df928-7f1b-4b9e-b18a-d2d032e28bb3","Type":"ContainerDied","Data":"3be7ccee2f0b7d1d415b467d0cfc1a3d2ba48a120b83ba8a590c728c6d683b36"} Mar 10 11:30:20 crc kubenswrapper[4794]: I0310 11:30:20.089826 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:21 crc kubenswrapper[4794]: I0310 11:30:21.034622 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhhpq" event={"ID":"905df928-7f1b-4b9e-b18a-d2d032e28bb3","Type":"ContainerStarted","Data":"2e0742bac0bc239d4bbe8da77a1d1177e88bffbf833e44066c95c35056e6578a"} Mar 10 11:30:21 crc kubenswrapper[4794]: I0310 11:30:21.069966 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lhhpq" podStartSLOduration=2.560630715 podStartE2EDuration="10.069940006s" podCreationTimestamp="2026-03-10 11:30:11 +0000 UTC" firstStartedPulling="2026-03-10 11:30:12.946203038 +0000 UTC m=+6361.702373876" lastFinishedPulling="2026-03-10 11:30:20.455512349 +0000 UTC m=+6369.211683167" observedRunningTime="2026-03-10 
11:30:21.065527812 +0000 UTC m=+6369.821698650" watchObservedRunningTime="2026-03-10 11:30:21.069940006 +0000 UTC m=+6369.826110834" Mar 10 11:30:21 crc kubenswrapper[4794]: I0310 11:30:21.578573 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:21 crc kubenswrapper[4794]: I0310 11:30:21.578951 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:21 crc kubenswrapper[4794]: I0310 11:30:21.651890 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-79qm4"] Mar 10 11:30:21 crc kubenswrapper[4794]: I0310 11:30:21.810215 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:30:21 crc kubenswrapper[4794]: I0310 11:30:21.969853 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-ceph\") pod \"5643a41f-c260-4e6a-881c-87f777fa58f3\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " Mar 10 11:30:21 crc kubenswrapper[4794]: I0310 11:30:21.969959 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-pre-adoption-validation-combined-ca-bundle\") pod \"5643a41f-c260-4e6a-881c-87f777fa58f3\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " Mar 10 11:30:21 crc kubenswrapper[4794]: I0310 11:30:21.970032 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-inventory\") pod \"5643a41f-c260-4e6a-881c-87f777fa58f3\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " Mar 10 11:30:21 crc kubenswrapper[4794]: I0310 11:30:21.970060 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tvmc\" (UniqueName: \"kubernetes.io/projected/5643a41f-c260-4e6a-881c-87f777fa58f3-kube-api-access-6tvmc\") pod \"5643a41f-c260-4e6a-881c-87f777fa58f3\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " Mar 10 11:30:21 crc kubenswrapper[4794]: I0310 11:30:21.970114 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-ssh-key-openstack-cell1\") pod \"5643a41f-c260-4e6a-881c-87f777fa58f3\" (UID: \"5643a41f-c260-4e6a-881c-87f777fa58f3\") " Mar 10 11:30:21 crc kubenswrapper[4794]: I0310 11:30:21.979727 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "5643a41f-c260-4e6a-881c-87f777fa58f3" (UID: "5643a41f-c260-4e6a-881c-87f777fa58f3"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:30:21 crc kubenswrapper[4794]: I0310 11:30:21.979801 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-ceph" (OuterVolumeSpecName: "ceph") pod "5643a41f-c260-4e6a-881c-87f777fa58f3" (UID: "5643a41f-c260-4e6a-881c-87f777fa58f3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:30:21 crc kubenswrapper[4794]: I0310 11:30:21.980519 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5643a41f-c260-4e6a-881c-87f777fa58f3-kube-api-access-6tvmc" (OuterVolumeSpecName: "kube-api-access-6tvmc") pod "5643a41f-c260-4e6a-881c-87f777fa58f3" (UID: "5643a41f-c260-4e6a-881c-87f777fa58f3"). InnerVolumeSpecName "kube-api-access-6tvmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.001786 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5643a41f-c260-4e6a-881c-87f777fa58f3" (UID: "5643a41f-c260-4e6a-881c-87f777fa58f3"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.008513 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-inventory" (OuterVolumeSpecName: "inventory") pod "5643a41f-c260-4e6a-881c-87f777fa58f3" (UID: "5643a41f-c260-4e6a-881c-87f777fa58f3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.044003 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" event={"ID":"5643a41f-c260-4e6a-881c-87f777fa58f3","Type":"ContainerDied","Data":"412961cfef64be72d9aaa6aecccb5c55826dbde9902b2ec4a3ef82f89f91f2a0"} Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.044059 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="412961cfef64be72d9aaa6aecccb5c55826dbde9902b2ec4a3ef82f89f91f2a0" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.044159 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-79qm4" podUID="2efb1797-b8ed-46ea-95c4-b967633e07f3" containerName="registry-server" containerID="cri-o://3b4de4a04ac0f76ea2113e85974f043ea8f704c8b3b87c0228a477a13c3f4f58" gracePeriod=2 Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.044182 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.072461 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.072501 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tvmc\" (UniqueName: \"kubernetes.io/projected/5643a41f-c260-4e6a-881c-87f777fa58f3-kube-api-access-6tvmc\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.072516 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.072527 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.072539 4794 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5643a41f-c260-4e6a-881c-87f777fa58f3-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.431243 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.583470 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n77rd\" (UniqueName: \"kubernetes.io/projected/2efb1797-b8ed-46ea-95c4-b967633e07f3-kube-api-access-n77rd\") pod \"2efb1797-b8ed-46ea-95c4-b967633e07f3\" (UID: \"2efb1797-b8ed-46ea-95c4-b967633e07f3\") " Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.583633 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2efb1797-b8ed-46ea-95c4-b967633e07f3-catalog-content\") pod \"2efb1797-b8ed-46ea-95c4-b967633e07f3\" (UID: \"2efb1797-b8ed-46ea-95c4-b967633e07f3\") " Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.583845 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2efb1797-b8ed-46ea-95c4-b967633e07f3-utilities\") pod \"2efb1797-b8ed-46ea-95c4-b967633e07f3\" (UID: \"2efb1797-b8ed-46ea-95c4-b967633e07f3\") " Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.584507 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2efb1797-b8ed-46ea-95c4-b967633e07f3-utilities" (OuterVolumeSpecName: "utilities") pod "2efb1797-b8ed-46ea-95c4-b967633e07f3" (UID: "2efb1797-b8ed-46ea-95c4-b967633e07f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.588737 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2efb1797-b8ed-46ea-95c4-b967633e07f3-kube-api-access-n77rd" (OuterVolumeSpecName: "kube-api-access-n77rd") pod "2efb1797-b8ed-46ea-95c4-b967633e07f3" (UID: "2efb1797-b8ed-46ea-95c4-b967633e07f3"). 
InnerVolumeSpecName "kube-api-access-n77rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.635120 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2efb1797-b8ed-46ea-95c4-b967633e07f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2efb1797-b8ed-46ea-95c4-b967633e07f3" (UID: "2efb1797-b8ed-46ea-95c4-b967633e07f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.686389 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2efb1797-b8ed-46ea-95c4-b967633e07f3-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.686636 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n77rd\" (UniqueName: \"kubernetes.io/projected/2efb1797-b8ed-46ea-95c4-b967633e07f3-kube-api-access-n77rd\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.686717 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2efb1797-b8ed-46ea-95c4-b967633e07f3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:22 crc kubenswrapper[4794]: I0310 11:30:22.709096 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lhhpq" podUID="905df928-7f1b-4b9e-b18a-d2d032e28bb3" containerName="registry-server" probeResult="failure" output=< Mar 10 11:30:22 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 11:30:22 crc kubenswrapper[4794]: > Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.000021 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:30:23 crc kubenswrapper[4794]: E0310 11:30:23.000496 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.057184 4794 generic.go:334] "Generic (PLEG): container finished" podID="2efb1797-b8ed-46ea-95c4-b967633e07f3" containerID="3b4de4a04ac0f76ea2113e85974f043ea8f704c8b3b87c0228a477a13c3f4f58" exitCode=0 Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.057257 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-79qm4" Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.057257 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-79qm4" event={"ID":"2efb1797-b8ed-46ea-95c4-b967633e07f3","Type":"ContainerDied","Data":"3b4de4a04ac0f76ea2113e85974f043ea8f704c8b3b87c0228a477a13c3f4f58"} Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.057405 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-79qm4" event={"ID":"2efb1797-b8ed-46ea-95c4-b967633e07f3","Type":"ContainerDied","Data":"ce223c448984fa10bbbe2c0433c8c6c6f7457e72e9e7db90fef85850053aaf7b"} Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.057451 4794 scope.go:117] "RemoveContainer" containerID="3b4de4a04ac0f76ea2113e85974f043ea8f704c8b3b87c0228a477a13c3f4f58" Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.118750 4794 scope.go:117] "RemoveContainer" containerID="a5eec08bfb99d996be8d8c9de23705978d261c8a441d39e894168f4c92c98348" Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.132061 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-79qm4"] Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.142136 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-79qm4"] Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.147298 4794 scope.go:117] "RemoveContainer" containerID="7ecf35ad8b9e0a2847cfa0fd0d89fbc80538a55ace79b190fb4a278149636fd0" Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.192166 4794 scope.go:117] "RemoveContainer" containerID="3b4de4a04ac0f76ea2113e85974f043ea8f704c8b3b87c0228a477a13c3f4f58" Mar 10 11:30:23 crc kubenswrapper[4794]: E0310 11:30:23.192808 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b4de4a04ac0f76ea2113e85974f043ea8f704c8b3b87c0228a477a13c3f4f58\": container with ID starting with 3b4de4a04ac0f76ea2113e85974f043ea8f704c8b3b87c0228a477a13c3f4f58 not found: ID does not exist" containerID="3b4de4a04ac0f76ea2113e85974f043ea8f704c8b3b87c0228a477a13c3f4f58" Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.192863 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b4de4a04ac0f76ea2113e85974f043ea8f704c8b3b87c0228a477a13c3f4f58"} err="failed to get container status \"3b4de4a04ac0f76ea2113e85974f043ea8f704c8b3b87c0228a477a13c3f4f58\": rpc error: code = NotFound desc = could not find container \"3b4de4a04ac0f76ea2113e85974f043ea8f704c8b3b87c0228a477a13c3f4f58\": container with ID starting with 3b4de4a04ac0f76ea2113e85974f043ea8f704c8b3b87c0228a477a13c3f4f58 not found: ID does not exist" Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.192896 4794 scope.go:117] "RemoveContainer" containerID="a5eec08bfb99d996be8d8c9de23705978d261c8a441d39e894168f4c92c98348" Mar 10 11:30:23 crc kubenswrapper[4794]: E0310 11:30:23.193156 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5eec08bfb99d996be8d8c9de23705978d261c8a441d39e894168f4c92c98348\": container with ID starting with a5eec08bfb99d996be8d8c9de23705978d261c8a441d39e894168f4c92c98348 not found: ID does not exist" containerID="a5eec08bfb99d996be8d8c9de23705978d261c8a441d39e894168f4c92c98348" Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.193184 4794 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5eec08bfb99d996be8d8c9de23705978d261c8a441d39e894168f4c92c98348"} err="failed to get container status \"a5eec08bfb99d996be8d8c9de23705978d261c8a441d39e894168f4c92c98348\": rpc error: code = NotFound desc = could not find container \"a5eec08bfb99d996be8d8c9de23705978d261c8a441d39e894168f4c92c98348\": container with ID starting with a5eec08bfb99d996be8d8c9de23705978d261c8a441d39e894168f4c92c98348 not found: ID does not exist" Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.193203 4794 scope.go:117] "RemoveContainer" containerID="7ecf35ad8b9e0a2847cfa0fd0d89fbc80538a55ace79b190fb4a278149636fd0" Mar 10 11:30:23 crc kubenswrapper[4794]: E0310 11:30:23.193557 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ecf35ad8b9e0a2847cfa0fd0d89fbc80538a55ace79b190fb4a278149636fd0\": container with ID starting with 7ecf35ad8b9e0a2847cfa0fd0d89fbc80538a55ace79b190fb4a278149636fd0 not found: ID does not exist" containerID="7ecf35ad8b9e0a2847cfa0fd0d89fbc80538a55ace79b190fb4a278149636fd0" Mar 10 11:30:23 crc kubenswrapper[4794]: I0310 11:30:23.193589 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ecf35ad8b9e0a2847cfa0fd0d89fbc80538a55ace79b190fb4a278149636fd0"} err="failed to get container status \"7ecf35ad8b9e0a2847cfa0fd0d89fbc80538a55ace79b190fb4a278149636fd0\": rpc error: code = NotFound desc = could not find container \"7ecf35ad8b9e0a2847cfa0fd0d89fbc80538a55ace79b190fb4a278149636fd0\": container with ID starting with 7ecf35ad8b9e0a2847cfa0fd0d89fbc80538a55ace79b190fb4a278149636fd0 not found: ID does not exist" Mar 10 11:30:24 crc kubenswrapper[4794]: I0310 11:30:24.012277 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2efb1797-b8ed-46ea-95c4-b967633e07f3" path="/var/lib/kubelet/pods/2efb1797-b8ed-46ea-95c4-b967633e07f3/volumes" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.945046 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv"] Mar 10 11:30:26 crc kubenswrapper[4794]: E0310 11:30:26.945915 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efb1797-b8ed-46ea-95c4-b967633e07f3" containerName="extract-content" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.945934 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efb1797-b8ed-46ea-95c4-b967633e07f3" containerName="extract-content" Mar 10 11:30:26 crc kubenswrapper[4794]: E0310 11:30:26.945952 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efb1797-b8ed-46ea-95c4-b967633e07f3" containerName="registry-server" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.945962 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efb1797-b8ed-46ea-95c4-b967633e07f3" containerName="registry-server" Mar 10 11:30:26 crc kubenswrapper[4794]: E0310 11:30:26.945982 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5643a41f-c260-4e6a-881c-87f777fa58f3" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.945994 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5643a41f-c260-4e6a-881c-87f777fa58f3" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 10 11:30:26 crc kubenswrapper[4794]: E0310 11:30:26.946052 4794 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efb1797-b8ed-46ea-95c4-b967633e07f3" containerName="extract-utilities" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.946065 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efb1797-b8ed-46ea-95c4-b967633e07f3" containerName="extract-utilities" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.946313 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5643a41f-c260-4e6a-881c-87f777fa58f3" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.946357 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2efb1797-b8ed-46ea-95c4-b967633e07f3" containerName="registry-server" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.947374 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.950060 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.950135 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.950309 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.950728 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.957095 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv"] Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.979361 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.979426 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8ztj\" (UniqueName: \"kubernetes.io/projected/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-kube-api-access-s8ztj\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.979448 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.979517 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-tripleo-cleanup-combined-ca-bundle\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:26 crc kubenswrapper[4794]: I0310 11:30:26.979557 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:27 crc kubenswrapper[4794]: I0310 11:30:27.081227 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:27 crc kubenswrapper[4794]: I0310 11:30:27.081589 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:27 crc kubenswrapper[4794]: I0310 11:30:27.081697 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:27 crc kubenswrapper[4794]: I0310 11:30:27.081775 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8ztj\" (UniqueName: \"kubernetes.io/projected/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-kube-api-access-s8ztj\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:27 crc kubenswrapper[4794]: I0310 11:30:27.081804 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:27 crc kubenswrapper[4794]: I0310 11:30:27.087160 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:27 crc kubenswrapper[4794]: I0310 11:30:27.087602 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:27 crc kubenswrapper[4794]: I0310 11:30:27.087795 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:27 crc kubenswrapper[4794]: I0310 11:30:27.088475 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:27 crc kubenswrapper[4794]: I0310 11:30:27.098087 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8ztj\" (UniqueName: \"kubernetes.io/projected/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-kube-api-access-s8ztj\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:27 crc kubenswrapper[4794]: I0310 11:30:27.266450 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:30:27 crc kubenswrapper[4794]: I0310 11:30:27.984106 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv"] Mar 10 11:30:28 crc kubenswrapper[4794]: I0310 11:30:28.103584 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" event={"ID":"2281ee85-db79-4fec-bb5e-ce0a4a4c61de","Type":"ContainerStarted","Data":"bffd2d9d8c902591e0dc00b8a85ce093cdd6f05563d4fb9c56a905f2d7b95c77"} Mar 10 11:30:29 crc kubenswrapper[4794]: I0310 11:30:29.115779 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" event={"ID":"2281ee85-db79-4fec-bb5e-ce0a4a4c61de","Type":"ContainerStarted","Data":"b8eb446da5078a00f3fa88320aa3cc297e48edc9bf62d3bd2190cb461495a458"} Mar 10 11:30:29 crc kubenswrapper[4794]: I0310 11:30:29.139849 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" podStartSLOduration=2.717970479 podStartE2EDuration="3.139831663s" podCreationTimestamp="2026-03-10 11:30:26 +0000 UTC" firstStartedPulling="2026-03-10 11:30:27.975692102 +0000 UTC m=+6376.731862920" lastFinishedPulling="2026-03-10 11:30:28.397553286 +0000 UTC m=+6377.153724104" observedRunningTime="2026-03-10 11:30:29.139388779 +0000 UTC m=+6377.895559617" watchObservedRunningTime="2026-03-10 11:30:29.139831663 +0000 UTC m=+6377.896002481" Mar 10 11:30:32 crc kubenswrapper[4794]: I0310 11:30:32.632389 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lhhpq" podUID="905df928-7f1b-4b9e-b18a-d2d032e28bb3" 
containerName="registry-server" probeResult="failure" output=< Mar 10 11:30:32 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 11:30:32 crc kubenswrapper[4794]: > Mar 10 11:30:33 crc kubenswrapper[4794]: I0310 11:30:33.999407 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:30:34 crc kubenswrapper[4794]: E0310 11:30:33.999686 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:30:41 crc kubenswrapper[4794]: I0310 11:30:41.642497 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:41 crc kubenswrapper[4794]: I0310 11:30:41.699035 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:42 crc kubenswrapper[4794]: I0310 11:30:42.430646 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lhhpq"] Mar 10 11:30:43 crc kubenswrapper[4794]: I0310 11:30:43.257933 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lhhpq" podUID="905df928-7f1b-4b9e-b18a-d2d032e28bb3" containerName="registry-server" containerID="cri-o://2e0742bac0bc239d4bbe8da77a1d1177e88bffbf833e44066c95c35056e6578a" gracePeriod=2 Mar 10 11:30:43 crc kubenswrapper[4794]: I0310 11:30:43.739080 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:43 crc kubenswrapper[4794]: I0310 11:30:43.898367 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905df928-7f1b-4b9e-b18a-d2d032e28bb3-utilities\") pod \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\" (UID: \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\") " Mar 10 11:30:43 crc kubenswrapper[4794]: I0310 11:30:43.898573 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzbgx\" (UniqueName: \"kubernetes.io/projected/905df928-7f1b-4b9e-b18a-d2d032e28bb3-kube-api-access-jzbgx\") pod \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\" (UID: \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\") " Mar 10 11:30:43 crc kubenswrapper[4794]: I0310 11:30:43.898635 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905df928-7f1b-4b9e-b18a-d2d032e28bb3-catalog-content\") pod \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\" (UID: \"905df928-7f1b-4b9e-b18a-d2d032e28bb3\") " Mar 10 11:30:43 crc kubenswrapper[4794]: I0310 11:30:43.899608 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/905df928-7f1b-4b9e-b18a-d2d032e28bb3-utilities" (OuterVolumeSpecName: "utilities") pod "905df928-7f1b-4b9e-b18a-d2d032e28bb3" (UID: "905df928-7f1b-4b9e-b18a-d2d032e28bb3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:30:43 crc kubenswrapper[4794]: I0310 11:30:43.906812 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/905df928-7f1b-4b9e-b18a-d2d032e28bb3-kube-api-access-jzbgx" (OuterVolumeSpecName: "kube-api-access-jzbgx") pod "905df928-7f1b-4b9e-b18a-d2d032e28bb3" (UID: "905df928-7f1b-4b9e-b18a-d2d032e28bb3"). InnerVolumeSpecName "kube-api-access-jzbgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.017912 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905df928-7f1b-4b9e-b18a-d2d032e28bb3-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.017950 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzbgx\" (UniqueName: \"kubernetes.io/projected/905df928-7f1b-4b9e-b18a-d2d032e28bb3-kube-api-access-jzbgx\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.031030 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/905df928-7f1b-4b9e-b18a-d2d032e28bb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "905df928-7f1b-4b9e-b18a-d2d032e28bb3" (UID: "905df928-7f1b-4b9e-b18a-d2d032e28bb3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.120150 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905df928-7f1b-4b9e-b18a-d2d032e28bb3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.269622 4794 generic.go:334] "Generic (PLEG): container finished" podID="905df928-7f1b-4b9e-b18a-d2d032e28bb3" containerID="2e0742bac0bc239d4bbe8da77a1d1177e88bffbf833e44066c95c35056e6578a" exitCode=0 Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.269674 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhhpq" event={"ID":"905df928-7f1b-4b9e-b18a-d2d032e28bb3","Type":"ContainerDied","Data":"2e0742bac0bc239d4bbe8da77a1d1177e88bffbf833e44066c95c35056e6578a"} Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.269714 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhhpq" event={"ID":"905df928-7f1b-4b9e-b18a-d2d032e28bb3","Type":"ContainerDied","Data":"b7d1cbfed3d9d8569b7f3884785b6128b77f45bb804ab7e2daf514a09e26df38"} Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.269721 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lhhpq" Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.269736 4794 scope.go:117] "RemoveContainer" containerID="2e0742bac0bc239d4bbe8da77a1d1177e88bffbf833e44066c95c35056e6578a" Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.299779 4794 scope.go:117] "RemoveContainer" containerID="3be7ccee2f0b7d1d415b467d0cfc1a3d2ba48a120b83ba8a590c728c6d683b36" Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.311745 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lhhpq"] Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.329893 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lhhpq"] Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.354465 4794 scope.go:117] "RemoveContainer" containerID="3a930ec0e45c904b6d91fe249fbbcb69114254a544689ecf6e60b7b330b03c8e" Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.393264 4794 scope.go:117] "RemoveContainer" containerID="2e0742bac0bc239d4bbe8da77a1d1177e88bffbf833e44066c95c35056e6578a" Mar 10 11:30:44 crc kubenswrapper[4794]: E0310 11:30:44.393904 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e0742bac0bc239d4bbe8da77a1d1177e88bffbf833e44066c95c35056e6578a\": container with ID starting with 2e0742bac0bc239d4bbe8da77a1d1177e88bffbf833e44066c95c35056e6578a not found: ID does not exist" containerID="2e0742bac0bc239d4bbe8da77a1d1177e88bffbf833e44066c95c35056e6578a" Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.393953 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0742bac0bc239d4bbe8da77a1d1177e88bffbf833e44066c95c35056e6578a"} err="failed to get container status \"2e0742bac0bc239d4bbe8da77a1d1177e88bffbf833e44066c95c35056e6578a\": rpc error: code = NotFound desc = could not find container \"2e0742bac0bc239d4bbe8da77a1d1177e88bffbf833e44066c95c35056e6578a\": container with ID starting with 2e0742bac0bc239d4bbe8da77a1d1177e88bffbf833e44066c95c35056e6578a not found: ID does not exist" Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.393980 4794 scope.go:117] "RemoveContainer" containerID="3be7ccee2f0b7d1d415b467d0cfc1a3d2ba48a120b83ba8a590c728c6d683b36" Mar 10 11:30:44 crc kubenswrapper[4794]: E0310 11:30:44.394505 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be7ccee2f0b7d1d415b467d0cfc1a3d2ba48a120b83ba8a590c728c6d683b36\": container with ID starting with 3be7ccee2f0b7d1d415b467d0cfc1a3d2ba48a120b83ba8a590c728c6d683b36 not found: ID does not exist" containerID="3be7ccee2f0b7d1d415b467d0cfc1a3d2ba48a120b83ba8a590c728c6d683b36" Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.394542 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be7ccee2f0b7d1d415b467d0cfc1a3d2ba48a120b83ba8a590c728c6d683b36"} err="failed to get container status \"3be7ccee2f0b7d1d415b467d0cfc1a3d2ba48a120b83ba8a590c728c6d683b36\": rpc error: code = NotFound desc = could not find container \"3be7ccee2f0b7d1d415b467d0cfc1a3d2ba48a120b83ba8a590c728c6d683b36\": container with ID starting with 3be7ccee2f0b7d1d415b467d0cfc1a3d2ba48a120b83ba8a590c728c6d683b36 not found: ID does not exist" Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.394568 4794 scope.go:117] "RemoveContainer" 
containerID="3a930ec0e45c904b6d91fe249fbbcb69114254a544689ecf6e60b7b330b03c8e" Mar 10 11:30:44 crc kubenswrapper[4794]: E0310 11:30:44.394962 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a930ec0e45c904b6d91fe249fbbcb69114254a544689ecf6e60b7b330b03c8e\": container with ID starting with 3a930ec0e45c904b6d91fe249fbbcb69114254a544689ecf6e60b7b330b03c8e not found: ID does not exist" containerID="3a930ec0e45c904b6d91fe249fbbcb69114254a544689ecf6e60b7b330b03c8e" Mar 10 11:30:44 crc kubenswrapper[4794]: I0310 11:30:44.394980 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a930ec0e45c904b6d91fe249fbbcb69114254a544689ecf6e60b7b330b03c8e"} err="failed to get container status \"3a930ec0e45c904b6d91fe249fbbcb69114254a544689ecf6e60b7b330b03c8e\": rpc error: code = NotFound desc = could not find container \"3a930ec0e45c904b6d91fe249fbbcb69114254a544689ecf6e60b7b330b03c8e\": container with ID starting with 3a930ec0e45c904b6d91fe249fbbcb69114254a544689ecf6e60b7b330b03c8e not found: ID does not exist" Mar 10 11:30:46 crc kubenswrapper[4794]: I0310 11:30:46.021204 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="905df928-7f1b-4b9e-b18a-d2d032e28bb3" path="/var/lib/kubelet/pods/905df928-7f1b-4b9e-b18a-d2d032e28bb3/volumes" Mar 10 11:30:47 crc kubenswrapper[4794]: I0310 11:30:47.999084 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:30:48 crc kubenswrapper[4794]: E0310 11:30:47.999629 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:30:50 crc kubenswrapper[4794]: I0310 11:30:50.348402 4794 scope.go:117] "RemoveContainer" containerID="1b58417f5c7b9a2e3a08274cc6025c4ba19bca341dff9c5ddab3b3e64e531cdf" Mar 10 11:30:50 crc kubenswrapper[4794]: I0310 11:30:50.389524 4794 scope.go:117] "RemoveContainer" containerID="c6781b4d2bc760d5d7461019b839e5c688a2f9c9be8d1c23c04d943703a000f9" Mar 10 11:31:02 crc kubenswrapper[4794]: I0310 11:31:02.005830 4794 scope.go:117] "RemoveContainer" containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:31:02 crc kubenswrapper[4794]: I0310 11:31:02.491066 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"ff9ef84b5f8498d327987989405e5a00267015a3b87f089c53653d6eff6df838"} Mar 10 11:32:00 crc kubenswrapper[4794]: I0310 11:32:00.187694 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552372-5hvh5"] Mar 10 11:32:00 crc kubenswrapper[4794]: E0310 11:32:00.188956 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905df928-7f1b-4b9e-b18a-d2d032e28bb3" containerName="extract-content" Mar 10 11:32:00 crc kubenswrapper[4794]: I0310 11:32:00.188976 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="905df928-7f1b-4b9e-b18a-d2d032e28bb3" containerName="extract-content" Mar 10 11:32:00 crc kubenswrapper[4794]: E0310 
11:32:00.189006 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905df928-7f1b-4b9e-b18a-d2d032e28bb3" containerName="extract-utilities" Mar 10 11:32:00 crc kubenswrapper[4794]: I0310 11:32:00.189014 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="905df928-7f1b-4b9e-b18a-d2d032e28bb3" containerName="extract-utilities" Mar 10 11:32:00 crc kubenswrapper[4794]: E0310 11:32:00.189036 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905df928-7f1b-4b9e-b18a-d2d032e28bb3" containerName="registry-server" Mar 10 11:32:00 crc kubenswrapper[4794]: I0310 11:32:00.189044 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="905df928-7f1b-4b9e-b18a-d2d032e28bb3" containerName="registry-server" Mar 10 11:32:00 crc kubenswrapper[4794]: I0310 11:32:00.189358 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="905df928-7f1b-4b9e-b18a-d2d032e28bb3" containerName="registry-server" Mar 10 11:32:00 crc kubenswrapper[4794]: I0310 11:32:00.190467 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552372-5hvh5" Mar 10 11:32:00 crc kubenswrapper[4794]: I0310 11:32:00.194133 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:32:00 crc kubenswrapper[4794]: I0310 11:32:00.194253 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:32:00 crc kubenswrapper[4794]: I0310 11:32:00.194918 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:32:00 crc kubenswrapper[4794]: I0310 11:32:00.202878 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552372-5hvh5"] Mar 10 11:32:00 crc kubenswrapper[4794]: I0310 11:32:00.381762 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9q94\" (UniqueName: \"kubernetes.io/projected/ba694751-93ab-49ca-a8b9-0e73c87ea5f6-kube-api-access-g9q94\") pod \"auto-csr-approver-29552372-5hvh5\" (UID: \"ba694751-93ab-49ca-a8b9-0e73c87ea5f6\") " pod="openshift-infra/auto-csr-approver-29552372-5hvh5" Mar 10 11:32:00 crc kubenswrapper[4794]: I0310 11:32:00.484841 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9q94\" (UniqueName: \"kubernetes.io/projected/ba694751-93ab-49ca-a8b9-0e73c87ea5f6-kube-api-access-g9q94\") pod \"auto-csr-approver-29552372-5hvh5\" (UID: \"ba694751-93ab-49ca-a8b9-0e73c87ea5f6\") " pod="openshift-infra/auto-csr-approver-29552372-5hvh5" Mar 10 11:32:00 crc kubenswrapper[4794]: I0310 11:32:00.522097 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9q94\" (UniqueName: \"kubernetes.io/projected/ba694751-93ab-49ca-a8b9-0e73c87ea5f6-kube-api-access-g9q94\") pod \"auto-csr-approver-29552372-5hvh5\" (UID: \"ba694751-93ab-49ca-a8b9-0e73c87ea5f6\") " pod="openshift-infra/auto-csr-approver-29552372-5hvh5" Mar 10 11:32:00 crc kubenswrapper[4794]: I0310 11:32:00.819758 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552372-5hvh5" Mar 10 11:32:01 crc kubenswrapper[4794]: I0310 11:32:01.427228 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552372-5hvh5"] Mar 10 11:32:02 crc kubenswrapper[4794]: I0310 11:32:02.229032 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552372-5hvh5" event={"ID":"ba694751-93ab-49ca-a8b9-0e73c87ea5f6","Type":"ContainerStarted","Data":"405b13b6529af4ba0cc541c2ee387af21ed1c6f22087e114271b0460613fecfc"} Mar 10 11:32:03 crc kubenswrapper[4794]: I0310 11:32:03.238810 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552372-5hvh5" event={"ID":"ba694751-93ab-49ca-a8b9-0e73c87ea5f6","Type":"ContainerStarted","Data":"07047adba663cffdd80297e275f95ed54327ed72f931ad8fffa72191e13a1a1c"} Mar 10 11:32:03 crc kubenswrapper[4794]: I0310 11:32:03.258510 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552372-5hvh5" podStartSLOduration=2.199325494 podStartE2EDuration="3.258493725s" podCreationTimestamp="2026-03-10 11:32:00 +0000 UTC" firstStartedPulling="2026-03-10 11:32:01.440107452 +0000 UTC m=+6470.196278280" lastFinishedPulling="2026-03-10 11:32:02.499275683 +0000 UTC m=+6471.255446511" observedRunningTime="2026-03-10 11:32:03.254627627 +0000 UTC m=+6472.010798445" watchObservedRunningTime="2026-03-10 11:32:03.258493725 +0000 UTC m=+6472.014664543" Mar 10 11:32:04 crc kubenswrapper[4794]: I0310 11:32:04.253696 4794 generic.go:334] "Generic (PLEG): container finished" podID="ba694751-93ab-49ca-a8b9-0e73c87ea5f6" containerID="07047adba663cffdd80297e275f95ed54327ed72f931ad8fffa72191e13a1a1c" exitCode=0 Mar 10 11:32:04 crc kubenswrapper[4794]: I0310 11:32:04.253739 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552372-5hvh5" event={"ID":"ba694751-93ab-49ca-a8b9-0e73c87ea5f6","Type":"ContainerDied","Data":"07047adba663cffdd80297e275f95ed54327ed72f931ad8fffa72191e13a1a1c"} Mar 10 11:32:05 crc kubenswrapper[4794]: I0310 11:32:05.714588 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552372-5hvh5" Mar 10 11:32:05 crc kubenswrapper[4794]: I0310 11:32:05.827635 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9q94\" (UniqueName: \"kubernetes.io/projected/ba694751-93ab-49ca-a8b9-0e73c87ea5f6-kube-api-access-g9q94\") pod \"ba694751-93ab-49ca-a8b9-0e73c87ea5f6\" (UID: \"ba694751-93ab-49ca-a8b9-0e73c87ea5f6\") " Mar 10 11:32:05 crc kubenswrapper[4794]: I0310 11:32:05.835204 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba694751-93ab-49ca-a8b9-0e73c87ea5f6-kube-api-access-g9q94" (OuterVolumeSpecName: "kube-api-access-g9q94") pod "ba694751-93ab-49ca-a8b9-0e73c87ea5f6" (UID: "ba694751-93ab-49ca-a8b9-0e73c87ea5f6"). InnerVolumeSpecName "kube-api-access-g9q94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:32:05 crc kubenswrapper[4794]: I0310 11:32:05.930148 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9q94\" (UniqueName: \"kubernetes.io/projected/ba694751-93ab-49ca-a8b9-0e73c87ea5f6-kube-api-access-g9q94\") on node \"crc\" DevicePath \"\"" Mar 10 11:32:06 crc kubenswrapper[4794]: I0310 11:32:06.276809 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552372-5hvh5" event={"ID":"ba694751-93ab-49ca-a8b9-0e73c87ea5f6","Type":"ContainerDied","Data":"405b13b6529af4ba0cc541c2ee387af21ed1c6f22087e114271b0460613fecfc"} Mar 10 11:32:06 crc kubenswrapper[4794]: I0310 11:32:06.276861 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="405b13b6529af4ba0cc541c2ee387af21ed1c6f22087e114271b0460613fecfc" Mar 10 11:32:06 crc kubenswrapper[4794]: I0310 11:32:06.276937 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552372-5hvh5" Mar 10 11:32:06 crc kubenswrapper[4794]: I0310 11:32:06.347701 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552366-zhtn9"] Mar 10 11:32:06 crc kubenswrapper[4794]: I0310 11:32:06.358922 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552366-zhtn9"] Mar 10 11:32:08 crc kubenswrapper[4794]: I0310 11:32:08.019896 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faf5136a-197d-4f67-9a0d-01ab38902d79" path="/var/lib/kubelet/pods/faf5136a-197d-4f67-9a0d-01ab38902d79/volumes" Mar 10 11:32:11 crc kubenswrapper[4794]: I0310 11:32:11.040711 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-qzvlg"] Mar 10 11:32:11 crc kubenswrapper[4794]: I0310 11:32:11.059262 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-qzvlg"] Mar 10 11:32:12 crc kubenswrapper[4794]: I0310 11:32:12.020189 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19facb05-a5ec-4544-9a36-3b5973303b73" path="/var/lib/kubelet/pods/19facb05-a5ec-4544-9a36-3b5973303b73/volumes" Mar 10 11:32:12 crc kubenswrapper[4794]: I0310 11:32:12.036363 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-4b48-account-create-update-q6xqr"] Mar 10 11:32:12 crc kubenswrapper[4794]: I0310 11:32:12.047006 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-4b48-account-create-update-q6xqr"] Mar 10 11:32:14 crc kubenswrapper[4794]: I0310 11:32:14.025068 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de94bae2-ff28-4041-935f-3c8e27931e98" path="/var/lib/kubelet/pods/de94bae2-ff28-4041-935f-3c8e27931e98/volumes" Mar 10 11:32:18 crc kubenswrapper[4794]: I0310 11:32:18.065886 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-d138-account-create-update-bwck7"] Mar 10 11:32:18 crc kubenswrapper[4794]: I0310 11:32:18.084420 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-lkmqd"] Mar 10 11:32:18 crc kubenswrapper[4794]: I0310 11:32:18.096467 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-d138-account-create-update-bwck7"] Mar 10 11:32:18 crc kubenswrapper[4794]: I0310 11:32:18.114411 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-lkmqd"] Mar 10 
11:32:20 crc kubenswrapper[4794]: I0310 11:32:20.014650 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46be95d5-5770-4daa-bba9-cf040d10d55a" path="/var/lib/kubelet/pods/46be95d5-5770-4daa-bba9-cf040d10d55a/volumes" Mar 10 11:32:20 crc kubenswrapper[4794]: I0310 11:32:20.017997 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="959ece64-6ede-48e4-8137-2b8cfce20471" path="/var/lib/kubelet/pods/959ece64-6ede-48e4-8137-2b8cfce20471/volumes" Mar 10 11:32:50 crc kubenswrapper[4794]: I0310 11:32:50.632586 4794 scope.go:117] "RemoveContainer" containerID="17c48bb8a9ff4ccf504437b9ec1b5a3f9af0e2b5338966f3e576ac71ce107cd5" Mar 10 11:32:50 crc kubenswrapper[4794]: I0310 11:32:50.678226 4794 scope.go:117] "RemoveContainer" containerID="9a1bdfc1045b898ae6c793757db86876a6990c9eadc6368c86650d52a20001fb" Mar 10 11:32:50 crc kubenswrapper[4794]: I0310 11:32:50.766637 4794 scope.go:117] "RemoveContainer" containerID="759f811718ed7a63696b8290009c4141bf27f60b4855aec49dfe381031aa9a86" Mar 10 11:32:50 crc kubenswrapper[4794]: I0310 11:32:50.820175 4794 scope.go:117] "RemoveContainer" containerID="b41be85390c1e60a649cb2f1053d837910650d40dfb00dab3397d1ab8f2ad778" Mar 10 11:32:50 crc kubenswrapper[4794]: I0310 11:32:50.914360 4794 scope.go:117] "RemoveContainer" containerID="9b899dd0f88d6c7f75831bfba61cbf085a8fa25aac1a5019660fd10149e337d4" Mar 10 11:33:00 crc kubenswrapper[4794]: I0310 11:33:00.058851 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-7st8t"] Mar 10 11:33:00 crc kubenswrapper[4794]: I0310 11:33:00.070237 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-7st8t"] Mar 10 11:33:02 crc kubenswrapper[4794]: I0310 11:33:02.011768 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22779500-ca5f-49be-8578-d0f7b28196fa" path="/var/lib/kubelet/pods/22779500-ca5f-49be-8578-d0f7b28196fa/volumes" Mar 10 11:33:22 crc kubenswrapper[4794]: I0310 11:33:22.968197 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:33:22 crc kubenswrapper[4794]: I0310 11:33:22.969102 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:33:51 crc kubenswrapper[4794]: I0310 11:33:51.092315 4794 scope.go:117] "RemoveContainer" containerID="f09f0c0d99a49400d7bd8787bf667338d715ae276d00619d2ebaeb027a81eb9e" Mar 10 11:33:51 crc kubenswrapper[4794]: I0310 11:33:51.150881 4794 scope.go:117] "RemoveContainer" containerID="515b2db4ef251605bbd6913ce03071a1c79cd8d744a50a654f697f6e6b0a490c" Mar 10 11:33:52 crc kubenswrapper[4794]: I0310 11:33:52.967991 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:33:52 crc kubenswrapper[4794]: I0310 11:33:52.968595 4794 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:34:00 crc kubenswrapper[4794]: I0310 11:34:00.181786 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552374-bhqcg"] Mar 10 11:34:00 crc kubenswrapper[4794]: E0310 11:34:00.182927 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba694751-93ab-49ca-a8b9-0e73c87ea5f6" containerName="oc" Mar 10 11:34:00 crc kubenswrapper[4794]: I0310 11:34:00.182944 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba694751-93ab-49ca-a8b9-0e73c87ea5f6" containerName="oc" Mar 10 11:34:00 crc kubenswrapper[4794]: I0310 11:34:00.183200 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba694751-93ab-49ca-a8b9-0e73c87ea5f6" containerName="oc" Mar 10 11:34:00 crc kubenswrapper[4794]: I0310 11:34:00.184027 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552374-bhqcg" Mar 10 11:34:00 crc kubenswrapper[4794]: I0310 11:34:00.186910 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:34:00 crc kubenswrapper[4794]: I0310 11:34:00.188688 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:34:00 crc kubenswrapper[4794]: I0310 11:34:00.189487 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:34:00 crc kubenswrapper[4794]: I0310 11:34:00.254456 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552374-bhqcg"] Mar 10 11:34:00 crc kubenswrapper[4794]: I0310 11:34:00.260208 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84k4w\" (UniqueName: \"kubernetes.io/projected/83e37ea6-ebb9-4617-a214-f2cac72dad1e-kube-api-access-84k4w\") pod \"auto-csr-approver-29552374-bhqcg\" (UID: \"83e37ea6-ebb9-4617-a214-f2cac72dad1e\") " pod="openshift-infra/auto-csr-approver-29552374-bhqcg" Mar 10 11:34:00 crc kubenswrapper[4794]: I0310 11:34:00.362788 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84k4w\" (UniqueName: \"kubernetes.io/projected/83e37ea6-ebb9-4617-a214-f2cac72dad1e-kube-api-access-84k4w\") pod \"auto-csr-approver-29552374-bhqcg\" (UID: \"83e37ea6-ebb9-4617-a214-f2cac72dad1e\") " pod="openshift-infra/auto-csr-approver-29552374-bhqcg" Mar 10 11:34:00 crc kubenswrapper[4794]: I0310 11:34:00.386586 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84k4w\" (UniqueName: \"kubernetes.io/projected/83e37ea6-ebb9-4617-a214-f2cac72dad1e-kube-api-access-84k4w\") pod \"auto-csr-approver-29552374-bhqcg\" (UID: \"83e37ea6-ebb9-4617-a214-f2cac72dad1e\") " pod="openshift-infra/auto-csr-approver-29552374-bhqcg" Mar 10 11:34:00 crc kubenswrapper[4794]: I0310 11:34:00.517793 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552374-bhqcg" Mar 10 11:34:01 crc kubenswrapper[4794]: I0310 11:34:01.114418 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552374-bhqcg"] Mar 10 11:34:01 crc kubenswrapper[4794]: W0310 11:34:01.122168 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83e37ea6_ebb9_4617_a214_f2cac72dad1e.slice/crio-5e6edda971c66cc665817f4c2abae2ffbd827963cc28fbd122d789c8025a86b8 WatchSource:0}: Error finding container 5e6edda971c66cc665817f4c2abae2ffbd827963cc28fbd122d789c8025a86b8: Status 404 returned error can't find the container with id 5e6edda971c66cc665817f4c2abae2ffbd827963cc28fbd122d789c8025a86b8 Mar 10 11:34:01 crc kubenswrapper[4794]: I0310 11:34:01.126141 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 11:34:01 crc kubenswrapper[4794]: I0310 11:34:01.633029 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552374-bhqcg" event={"ID":"83e37ea6-ebb9-4617-a214-f2cac72dad1e","Type":"ContainerStarted","Data":"5e6edda971c66cc665817f4c2abae2ffbd827963cc28fbd122d789c8025a86b8"} Mar 10 11:34:02 crc kubenswrapper[4794]: I0310 11:34:02.645793 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552374-bhqcg" event={"ID":"83e37ea6-ebb9-4617-a214-f2cac72dad1e","Type":"ContainerStarted","Data":"20b9158f59c025a54bcc5b25df9575f4823d9b34316afe73fb47fe76823d2a39"} Mar 10 11:34:02 crc kubenswrapper[4794]: I0310 11:34:02.681074 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552374-bhqcg" podStartSLOduration=1.6531188540000001 podStartE2EDuration="2.681053462s" podCreationTimestamp="2026-03-10 11:34:00 +0000 UTC" firstStartedPulling="2026-03-10 11:34:01.125734132 +0000 UTC m=+6589.881904980" lastFinishedPulling="2026-03-10 11:34:02.15366874 +0000 UTC m=+6590.909839588" observedRunningTime="2026-03-10 11:34:02.667074676 +0000 UTC m=+6591.423245534" watchObservedRunningTime="2026-03-10 11:34:02.681053462 +0000 UTC m=+6591.437224290" Mar 10 11:34:03 crc kubenswrapper[4794]: I0310 11:34:03.678186 4794 generic.go:334] "Generic (PLEG): container finished" podID="83e37ea6-ebb9-4617-a214-f2cac72dad1e" containerID="20b9158f59c025a54bcc5b25df9575f4823d9b34316afe73fb47fe76823d2a39" exitCode=0 Mar 10 11:34:03 crc kubenswrapper[4794]: I0310 11:34:03.678229 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552374-bhqcg" event={"ID":"83e37ea6-ebb9-4617-a214-f2cac72dad1e","Type":"ContainerDied","Data":"20b9158f59c025a54bcc5b25df9575f4823d9b34316afe73fb47fe76823d2a39"} Mar 10 11:34:05 crc kubenswrapper[4794]: I0310 11:34:05.054614 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552374-bhqcg" Mar 10 11:34:05 crc kubenswrapper[4794]: I0310 11:34:05.179619 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84k4w\" (UniqueName: \"kubernetes.io/projected/83e37ea6-ebb9-4617-a214-f2cac72dad1e-kube-api-access-84k4w\") pod \"83e37ea6-ebb9-4617-a214-f2cac72dad1e\" (UID: \"83e37ea6-ebb9-4617-a214-f2cac72dad1e\") " Mar 10 11:34:05 crc kubenswrapper[4794]: I0310 11:34:05.190787 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e37ea6-ebb9-4617-a214-f2cac72dad1e-kube-api-access-84k4w" (OuterVolumeSpecName: "kube-api-access-84k4w") pod "83e37ea6-ebb9-4617-a214-f2cac72dad1e" (UID: "83e37ea6-ebb9-4617-a214-f2cac72dad1e"). InnerVolumeSpecName "kube-api-access-84k4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:34:05 crc kubenswrapper[4794]: I0310 11:34:05.283808 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84k4w\" (UniqueName: \"kubernetes.io/projected/83e37ea6-ebb9-4617-a214-f2cac72dad1e-kube-api-access-84k4w\") on node \"crc\" DevicePath \"\"" Mar 10 11:34:05 crc kubenswrapper[4794]: I0310 11:34:05.706533 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552374-bhqcg" event={"ID":"83e37ea6-ebb9-4617-a214-f2cac72dad1e","Type":"ContainerDied","Data":"5e6edda971c66cc665817f4c2abae2ffbd827963cc28fbd122d789c8025a86b8"} Mar 10 11:34:05 crc kubenswrapper[4794]: I0310 11:34:05.706597 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e6edda971c66cc665817f4c2abae2ffbd827963cc28fbd122d789c8025a86b8" Mar 10 11:34:05 crc kubenswrapper[4794]: I0310 11:34:05.706596 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552374-bhqcg" Mar 10 11:34:06 crc kubenswrapper[4794]: I0310 11:34:06.147726 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552368-98mgx"] Mar 10 11:34:06 crc kubenswrapper[4794]: I0310 11:34:06.165959 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552368-98mgx"] Mar 10 11:34:08 crc kubenswrapper[4794]: I0310 11:34:08.017754 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deafc0de-3f00-45c6-b5d7-cea8a3a66ce7" path="/var/lib/kubelet/pods/deafc0de-3f00-45c6-b5d7-cea8a3a66ce7/volumes" Mar 10 11:34:22 crc kubenswrapper[4794]: I0310 11:34:22.967550 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:34:22 crc kubenswrapper[4794]: I0310 11:34:22.968140 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:34:22 crc kubenswrapper[4794]: I0310 11:34:22.968190 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 11:34:22 crc kubenswrapper[4794]: I0310 11:34:22.969030 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ff9ef84b5f8498d327987989405e5a00267015a3b87f089c53653d6eff6df838"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 11:34:22 crc kubenswrapper[4794]: I0310 11:34:22.969089 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://ff9ef84b5f8498d327987989405e5a00267015a3b87f089c53653d6eff6df838" gracePeriod=600 Mar 10 11:34:23 crc kubenswrapper[4794]: I0310 11:34:23.931998 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="ff9ef84b5f8498d327987989405e5a00267015a3b87f089c53653d6eff6df838" exitCode=0 Mar 10 11:34:23 crc kubenswrapper[4794]: I0310 11:34:23.932541 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"ff9ef84b5f8498d327987989405e5a00267015a3b87f089c53653d6eff6df838"} Mar 10 11:34:23 crc kubenswrapper[4794]: I0310 11:34:23.932890 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e"} Mar 10 11:34:23 crc kubenswrapper[4794]: I0310 11:34:23.932934 4794 scope.go:117] "RemoveContainer" 
containerID="7035d4646bbb6a23dd802a9c8c9f05a3c6c945d4e5a6da8586d2bb005fcfad77" Mar 10 11:34:51 crc kubenswrapper[4794]: I0310 11:34:51.267686 4794 scope.go:117] "RemoveContainer" containerID="1075b91d496197f6e02538e923217d308f6944a7353740d9f4d4fc912b3a1597" Mar 10 11:35:07 crc kubenswrapper[4794]: I0310 11:35:07.042155 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-3230-account-create-update-8576n"] Mar 10 11:35:07 crc kubenswrapper[4794]: I0310 11:35:07.057075 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-4rqg8"] Mar 10 11:35:07 crc kubenswrapper[4794]: I0310 11:35:07.065750 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-3230-account-create-update-8576n"] Mar 10 11:35:07 crc kubenswrapper[4794]: I0310 11:35:07.073883 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-4rqg8"] Mar 10 11:35:08 crc kubenswrapper[4794]: I0310 11:35:08.014664 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b17c00fd-9e0c-4ddc-b961-30d84e94b34b" path="/var/lib/kubelet/pods/b17c00fd-9e0c-4ddc-b961-30d84e94b34b/volumes" Mar 10 11:35:08 crc kubenswrapper[4794]: I0310 11:35:08.015286 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca7f66a-0319-4b41-a820-706bab3a4898" path="/var/lib/kubelet/pods/eca7f66a-0319-4b41-a820-706bab3a4898/volumes" Mar 10 11:35:21 crc kubenswrapper[4794]: I0310 11:35:21.060713 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-84hf6"] Mar 10 11:35:21 crc kubenswrapper[4794]: I0310 11:35:21.070404 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-84hf6"] Mar 10 11:35:22 crc kubenswrapper[4794]: I0310 11:35:22.032988 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bdf038e-4f84-46d5-b1af-a95d083befbb" path="/var/lib/kubelet/pods/1bdf038e-4f84-46d5-b1af-a95d083befbb/volumes" Mar 10 11:35:51 crc kubenswrapper[4794]: I0310 11:35:51.391608 4794 scope.go:117] "RemoveContainer" containerID="e2adb5705bed8b7641cd0255c8510c9d1ddf5ef99d6c5b7d13eabbbcb14db9e3" Mar 10 11:35:51 crc kubenswrapper[4794]: I0310 11:35:51.433882 4794 scope.go:117] "RemoveContainer" containerID="459ff8ff80a916813fc22b8f98294568f8a6287d2cd9e78a538b29e33712353c" Mar 10 11:35:51 crc kubenswrapper[4794]: I0310 11:35:51.507988 4794 scope.go:117] "RemoveContainer" containerID="867999d3e8a9525ffaf32ea5935511308d1a624934fa60e704d6dc4d18b8688d" Mar 10 11:36:00 crc kubenswrapper[4794]: I0310 11:36:00.161938 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552376-dttgm"] Mar 10 11:36:00 crc kubenswrapper[4794]: E0310 11:36:00.163623 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e37ea6-ebb9-4617-a214-f2cac72dad1e" containerName="oc" Mar 10 11:36:00 crc kubenswrapper[4794]: I0310 11:36:00.163650 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e37ea6-ebb9-4617-a214-f2cac72dad1e" containerName="oc" Mar 10 11:36:00 crc kubenswrapper[4794]: I0310 11:36:00.164019 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="83e37ea6-ebb9-4617-a214-f2cac72dad1e" containerName="oc" Mar 10 11:36:00 crc kubenswrapper[4794]: I0310 11:36:00.165211 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552376-dttgm" Mar 10 11:36:00 crc kubenswrapper[4794]: I0310 11:36:00.170986 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552376-dttgm"] Mar 10 11:36:00 crc kubenswrapper[4794]: I0310 11:36:00.172329 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:36:00 crc kubenswrapper[4794]: I0310 11:36:00.178233 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:36:00 crc kubenswrapper[4794]: I0310 11:36:00.178766 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:36:00 crc kubenswrapper[4794]: I0310 11:36:00.241324 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k76xf\" (UniqueName: \"kubernetes.io/projected/d95950e4-bccd-4145-b0fa-22b114129be1-kube-api-access-k76xf\") pod \"auto-csr-approver-29552376-dttgm\" (UID: \"d95950e4-bccd-4145-b0fa-22b114129be1\") " pod="openshift-infra/auto-csr-approver-29552376-dttgm" Mar 10 11:36:00 crc kubenswrapper[4794]: I0310 11:36:00.343893 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k76xf\" (UniqueName: \"kubernetes.io/projected/d95950e4-bccd-4145-b0fa-22b114129be1-kube-api-access-k76xf\") pod \"auto-csr-approver-29552376-dttgm\" (UID: \"d95950e4-bccd-4145-b0fa-22b114129be1\") " pod="openshift-infra/auto-csr-approver-29552376-dttgm" Mar 10 11:36:00 crc kubenswrapper[4794]: I0310 11:36:00.369202 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k76xf\" (UniqueName: \"kubernetes.io/projected/d95950e4-bccd-4145-b0fa-22b114129be1-kube-api-access-k76xf\") pod \"auto-csr-approver-29552376-dttgm\" (UID: \"d95950e4-bccd-4145-b0fa-22b114129be1\") " pod="openshift-infra/auto-csr-approver-29552376-dttgm" Mar 10 11:36:00 crc kubenswrapper[4794]: I0310 11:36:00.504922 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552376-dttgm" Mar 10 11:36:01 crc kubenswrapper[4794]: W0310 11:36:01.041639 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd95950e4_bccd_4145_b0fa_22b114129be1.slice/crio-673d5944f1fee9c1e09c3c9ae6145771cae2cdce11296cd397961c3d2649320b WatchSource:0}: Error finding container 673d5944f1fee9c1e09c3c9ae6145771cae2cdce11296cd397961c3d2649320b: Status 404 returned error can't find the container with id 673d5944f1fee9c1e09c3c9ae6145771cae2cdce11296cd397961c3d2649320b Mar 10 11:36:01 crc kubenswrapper[4794]: I0310 11:36:01.044027 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552376-dttgm"] Mar 10 11:36:01 crc kubenswrapper[4794]: I0310 11:36:01.238684 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552376-dttgm" event={"ID":"d95950e4-bccd-4145-b0fa-22b114129be1","Type":"ContainerStarted","Data":"673d5944f1fee9c1e09c3c9ae6145771cae2cdce11296cd397961c3d2649320b"} Mar 10 11:36:03 crc kubenswrapper[4794]: I0310 11:36:03.258059 4794 generic.go:334] "Generic (PLEG): container finished" podID="d95950e4-bccd-4145-b0fa-22b114129be1" containerID="fda49684f956b7d3fb137604c388e6ebb7ae52aaabeb12fd4ef7f7836a8d8f35" exitCode=0 Mar 10 11:36:03 crc kubenswrapper[4794]: I0310 11:36:03.258131 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552376-dttgm" event={"ID":"d95950e4-bccd-4145-b0fa-22b114129be1","Type":"ContainerDied","Data":"fda49684f956b7d3fb137604c388e6ebb7ae52aaabeb12fd4ef7f7836a8d8f35"} Mar 10 11:36:04 crc kubenswrapper[4794]: I0310 11:36:04.716518 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552376-dttgm" Mar 10 11:36:04 crc kubenswrapper[4794]: I0310 11:36:04.838091 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k76xf\" (UniqueName: \"kubernetes.io/projected/d95950e4-bccd-4145-b0fa-22b114129be1-kube-api-access-k76xf\") pod \"d95950e4-bccd-4145-b0fa-22b114129be1\" (UID: \"d95950e4-bccd-4145-b0fa-22b114129be1\") " Mar 10 11:36:04 crc kubenswrapper[4794]: I0310 11:36:04.847523 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d95950e4-bccd-4145-b0fa-22b114129be1-kube-api-access-k76xf" (OuterVolumeSpecName: "kube-api-access-k76xf") pod "d95950e4-bccd-4145-b0fa-22b114129be1" (UID: "d95950e4-bccd-4145-b0fa-22b114129be1"). InnerVolumeSpecName "kube-api-access-k76xf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:36:04 crc kubenswrapper[4794]: I0310 11:36:04.941172 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k76xf\" (UniqueName: \"kubernetes.io/projected/d95950e4-bccd-4145-b0fa-22b114129be1-kube-api-access-k76xf\") on node \"crc\" DevicePath \"\"" Mar 10 11:36:05 crc kubenswrapper[4794]: I0310 11:36:05.284690 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552376-dttgm" event={"ID":"d95950e4-bccd-4145-b0fa-22b114129be1","Type":"ContainerDied","Data":"673d5944f1fee9c1e09c3c9ae6145771cae2cdce11296cd397961c3d2649320b"} Mar 10 11:36:05 crc kubenswrapper[4794]: I0310 11:36:05.284735 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="673d5944f1fee9c1e09c3c9ae6145771cae2cdce11296cd397961c3d2649320b" Mar 10 11:36:05 crc kubenswrapper[4794]: I0310 11:36:05.284747 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552376-dttgm" Mar 10 11:36:05 crc kubenswrapper[4794]: I0310 11:36:05.819761 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552370-nphmk"] Mar 10 11:36:05 crc kubenswrapper[4794]: I0310 11:36:05.838363 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552370-nphmk"] Mar 10 11:36:06 crc kubenswrapper[4794]: I0310 11:36:06.012909 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe" path="/var/lib/kubelet/pods/8e9bd9f6-74c2-4c97-b3ac-e55e74c36dbe/volumes" Mar 10 11:36:51 crc kubenswrapper[4794]: I0310 11:36:51.669925 4794 scope.go:117] "RemoveContainer" containerID="b4bc681fb985346ee8a3631df85de0294c1091767fe0b861665a98494797a917" Mar 10 11:36:52 crc kubenswrapper[4794]: I0310 11:36:52.968317 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:36:52 crc kubenswrapper[4794]: I0310 11:36:52.968833 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:37:22 crc kubenswrapper[4794]: I0310 11:37:22.967623 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:37:22 crc kubenswrapper[4794]: I0310 11:37:22.968224 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:37:30 crc kubenswrapper[4794]: I0310 11:37:30.084837 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-8c84-account-create-update-r5vsv"] Mar 10 11:37:30 crc 
kubenswrapper[4794]: I0310 11:37:30.101724 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-f8bl4"] Mar 10 11:37:30 crc kubenswrapper[4794]: I0310 11:37:30.112919 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-8c84-account-create-update-r5vsv"] Mar 10 11:37:30 crc kubenswrapper[4794]: I0310 11:37:30.123492 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-f8bl4"] Mar 10 11:37:32 crc kubenswrapper[4794]: I0310 11:37:32.782822 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="907874fb-cbaf-44b2-8fa6-2fc67601929e" path="/var/lib/kubelet/pods/907874fb-cbaf-44b2-8fa6-2fc67601929e/volumes" Mar 10 11:37:32 crc kubenswrapper[4794]: I0310 11:37:32.784795 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50" path="/var/lib/kubelet/pods/c9a7f680-5d9b-47a8-ba2b-9bdd1ddc8c50/volumes" Mar 10 11:37:43 crc kubenswrapper[4794]: I0310 11:37:43.037053 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-lxf6c"] Mar 10 11:37:43 crc kubenswrapper[4794]: I0310 11:37:43.055778 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-lxf6c"] Mar 10 11:37:44 crc kubenswrapper[4794]: I0310 11:37:44.019933 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb1116e-257f-487b-91f8-f72da443ced5" path="/var/lib/kubelet/pods/bcb1116e-257f-487b-91f8-f72da443ced5/volumes" Mar 10 11:37:51 crc kubenswrapper[4794]: I0310 11:37:51.793973 4794 scope.go:117] "RemoveContainer" containerID="df4c64b5f42d494a8c18fa9d668b31c1c660e3dc676ec247b706c06bbb972256" Mar 10 11:37:51 crc kubenswrapper[4794]: I0310 11:37:51.846178 4794 scope.go:117] "RemoveContainer" containerID="5748208f75481a6d6f6879054091990ba41585f9207f37bf95a510dbb7244b06" Mar 10 11:37:51 crc kubenswrapper[4794]: I0310 11:37:51.885425 4794 scope.go:117] "RemoveContainer" containerID="41a112b424ec84b7af820cce3394a5a3a48ff208147e1fcbcfd7e84b4384e475" Mar 10 11:37:52 crc kubenswrapper[4794]: I0310 11:37:52.967270 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:37:52 crc kubenswrapper[4794]: I0310 11:37:52.967517 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:37:52 crc kubenswrapper[4794]: I0310 11:37:52.967602 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 11:37:52 crc kubenswrapper[4794]: I0310 11:37:52.968568 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 11:37:52 crc kubenswrapper[4794]: I0310 11:37:52.968678 4794 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" gracePeriod=600 Mar 10 11:37:53 crc kubenswrapper[4794]: E0310 11:37:53.092655 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:37:53 crc kubenswrapper[4794]: I0310 11:37:53.540320 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" exitCode=0 Mar 10 11:37:53 crc kubenswrapper[4794]: I0310 11:37:53.540393 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e"} Mar 10 11:37:53 crc kubenswrapper[4794]: I0310 11:37:53.540465 4794 scope.go:117] "RemoveContainer" containerID="ff9ef84b5f8498d327987989405e5a00267015a3b87f089c53653d6eff6df838" Mar 10 11:37:53 crc kubenswrapper[4794]: I0310 11:37:53.541172 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:37:53 crc kubenswrapper[4794]: E0310 11:37:53.541930 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:38:00 crc kubenswrapper[4794]: I0310 11:38:00.187376 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552378-wxkq4"] Mar 10 11:38:00 crc kubenswrapper[4794]: E0310 11:38:00.188601 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95950e4-bccd-4145-b0fa-22b114129be1" containerName="oc" Mar 10 11:38:00 crc kubenswrapper[4794]: I0310 11:38:00.188621 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95950e4-bccd-4145-b0fa-22b114129be1" containerName="oc" Mar 10 11:38:00 crc kubenswrapper[4794]: I0310 11:38:00.188896 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d95950e4-bccd-4145-b0fa-22b114129be1" containerName="oc" Mar 10 11:38:00 crc kubenswrapper[4794]: I0310 11:38:00.189866 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552378-wxkq4" Mar 10 11:38:00 crc kubenswrapper[4794]: I0310 11:38:00.192579 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:38:00 crc kubenswrapper[4794]: I0310 11:38:00.192743 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:38:00 crc kubenswrapper[4794]: I0310 11:38:00.194133 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:38:00 crc kubenswrapper[4794]: I0310 11:38:00.205860 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552378-wxkq4"] Mar 10 11:38:00 crc kubenswrapper[4794]: I0310 11:38:00.240416 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r59vp\" (UniqueName: \"kubernetes.io/projected/3961fb69-7762-45fa-af1a-c5be94b67ed2-kube-api-access-r59vp\") pod \"auto-csr-approver-29552378-wxkq4\" (UID: \"3961fb69-7762-45fa-af1a-c5be94b67ed2\") " pod="openshift-infra/auto-csr-approver-29552378-wxkq4" Mar 10 11:38:00 crc kubenswrapper[4794]: I0310 11:38:00.343170 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r59vp\" (UniqueName: \"kubernetes.io/projected/3961fb69-7762-45fa-af1a-c5be94b67ed2-kube-api-access-r59vp\") pod \"auto-csr-approver-29552378-wxkq4\" (UID: \"3961fb69-7762-45fa-af1a-c5be94b67ed2\") " pod="openshift-infra/auto-csr-approver-29552378-wxkq4" Mar 10 11:38:00 crc kubenswrapper[4794]: I0310 11:38:00.375466 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r59vp\" (UniqueName: \"kubernetes.io/projected/3961fb69-7762-45fa-af1a-c5be94b67ed2-kube-api-access-r59vp\") pod \"auto-csr-approver-29552378-wxkq4\" (UID: \"3961fb69-7762-45fa-af1a-c5be94b67ed2\") " pod="openshift-infra/auto-csr-approver-29552378-wxkq4" Mar 10 11:38:00 crc kubenswrapper[4794]: I0310 11:38:00.525284 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552378-wxkq4" Mar 10 11:38:01 crc kubenswrapper[4794]: I0310 11:38:01.000763 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552378-wxkq4"] Mar 10 11:38:01 crc kubenswrapper[4794]: W0310 11:38:01.002577 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3961fb69_7762_45fa_af1a_c5be94b67ed2.slice/crio-00dc96cb848a40739d6755cd44ed897c4ef2341c27905db4992b7804e62445ab WatchSource:0}: Error finding container 00dc96cb848a40739d6755cd44ed897c4ef2341c27905db4992b7804e62445ab: Status 404 returned error can't find the container with id 00dc96cb848a40739d6755cd44ed897c4ef2341c27905db4992b7804e62445ab Mar 10 11:38:01 crc kubenswrapper[4794]: I0310 11:38:01.646248 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552378-wxkq4" event={"ID":"3961fb69-7762-45fa-af1a-c5be94b67ed2","Type":"ContainerStarted","Data":"00dc96cb848a40739d6755cd44ed897c4ef2341c27905db4992b7804e62445ab"} Mar 10 11:38:02 crc kubenswrapper[4794]: I0310 11:38:02.049329 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-gspnw"] Mar 10 11:38:02 crc kubenswrapper[4794]: I0310 11:38:02.058843 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-gspnw"] Mar 10 11:38:03 crc kubenswrapper[4794]: I0310 11:38:03.042627 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-1caa-account-create-update-8zbdf"] Mar 10 11:38:03 crc kubenswrapper[4794]: I0310 11:38:03.057944 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-1caa-account-create-update-8zbdf"] Mar 10 11:38:04 crc kubenswrapper[4794]: I0310 11:38:04.010866 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f15d67e-f384-4070-806e-6a213e97b3d9" path="/var/lib/kubelet/pods/3f15d67e-f384-4070-806e-6a213e97b3d9/volumes" Mar 10 11:38:04 crc kubenswrapper[4794]: I0310 11:38:04.012814 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a835411b-024f-4d3b-a8ae-dcba59606ca6" path="/var/lib/kubelet/pods/a835411b-024f-4d3b-a8ae-dcba59606ca6/volumes" Mar 10 11:38:04 crc kubenswrapper[4794]: I0310 11:38:04.680317 4794 generic.go:334] "Generic (PLEG): container finished" podID="3961fb69-7762-45fa-af1a-c5be94b67ed2" containerID="42e00f07cf0a1488747cc4384ba4b156c6672d1516f0866cd94b4ae1638808d4" exitCode=0 Mar 10 11:38:04 crc kubenswrapper[4794]: I0310 11:38:04.680540 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552378-wxkq4" event={"ID":"3961fb69-7762-45fa-af1a-c5be94b67ed2","Type":"ContainerDied","Data":"42e00f07cf0a1488747cc4384ba4b156c6672d1516f0866cd94b4ae1638808d4"} Mar 10 11:38:06 crc kubenswrapper[4794]: I0310 11:38:06.000583 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:38:06 crc kubenswrapper[4794]: E0310 11:38:06.001473 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:38:06 
crc kubenswrapper[4794]: I0310 11:38:06.097270 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552378-wxkq4" Mar 10 11:38:06 crc kubenswrapper[4794]: I0310 11:38:06.206097 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r59vp\" (UniqueName: \"kubernetes.io/projected/3961fb69-7762-45fa-af1a-c5be94b67ed2-kube-api-access-r59vp\") pod \"3961fb69-7762-45fa-af1a-c5be94b67ed2\" (UID: \"3961fb69-7762-45fa-af1a-c5be94b67ed2\") " Mar 10 11:38:06 crc kubenswrapper[4794]: I0310 11:38:06.212479 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3961fb69-7762-45fa-af1a-c5be94b67ed2-kube-api-access-r59vp" (OuterVolumeSpecName: "kube-api-access-r59vp") pod "3961fb69-7762-45fa-af1a-c5be94b67ed2" (UID: "3961fb69-7762-45fa-af1a-c5be94b67ed2"). InnerVolumeSpecName "kube-api-access-r59vp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:38:06 crc kubenswrapper[4794]: I0310 11:38:06.309649 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r59vp\" (UniqueName: \"kubernetes.io/projected/3961fb69-7762-45fa-af1a-c5be94b67ed2-kube-api-access-r59vp\") on node \"crc\" DevicePath \"\"" Mar 10 11:38:06 crc kubenswrapper[4794]: I0310 11:38:06.710444 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552378-wxkq4" Mar 10 11:38:06 crc kubenswrapper[4794]: I0310 11:38:06.710321 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552378-wxkq4" event={"ID":"3961fb69-7762-45fa-af1a-c5be94b67ed2","Type":"ContainerDied","Data":"00dc96cb848a40739d6755cd44ed897c4ef2341c27905db4992b7804e62445ab"} Mar 10 11:38:06 crc kubenswrapper[4794]: I0310 11:38:06.710568 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00dc96cb848a40739d6755cd44ed897c4ef2341c27905db4992b7804e62445ab" Mar 10 11:38:07 crc kubenswrapper[4794]: I0310 11:38:07.173863 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552372-5hvh5"] Mar 10 11:38:07 crc kubenswrapper[4794]: I0310 11:38:07.184625 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552372-5hvh5"] Mar 10 11:38:08 crc kubenswrapper[4794]: I0310 11:38:08.013616 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba694751-93ab-49ca-a8b9-0e73c87ea5f6" path="/var/lib/kubelet/pods/ba694751-93ab-49ca-a8b9-0e73c87ea5f6/volumes" Mar 10 11:38:15 crc kubenswrapper[4794]: I0310 11:38:15.078102 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-gdd98"] Mar 10 11:38:15 crc kubenswrapper[4794]: I0310 11:38:15.098859 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-gdd98"] Mar 10 11:38:16 crc kubenswrapper[4794]: I0310 11:38:16.015120 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31990e7b-b9b5-408f-aa66-e066c7b58fd4" path="/var/lib/kubelet/pods/31990e7b-b9b5-408f-aa66-e066c7b58fd4/volumes" Mar 10 11:38:20 crc kubenswrapper[4794]: I0310 11:38:20.999622 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:38:21 crc kubenswrapper[4794]: E0310 11:38:21.000180 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:38:29 crc kubenswrapper[4794]: I0310 11:38:29.860194 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ndcfk"] Mar 10 11:38:29 crc kubenswrapper[4794]: E0310 11:38:29.861846 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3961fb69-7762-45fa-af1a-c5be94b67ed2" containerName="oc" Mar 10 11:38:29 crc kubenswrapper[4794]: I0310 11:38:29.861875 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3961fb69-7762-45fa-af1a-c5be94b67ed2" containerName="oc" Mar 10 11:38:29 crc kubenswrapper[4794]: I0310 11:38:29.862321 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3961fb69-7762-45fa-af1a-c5be94b67ed2" containerName="oc" Mar 10 11:38:29 crc kubenswrapper[4794]: I0310 11:38:29.866013 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:29 crc kubenswrapper[4794]: I0310 11:38:29.884504 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndcfk"] Mar 10 11:38:29 crc kubenswrapper[4794]: I0310 11:38:29.962169 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d293ca9-c989-4099-92d7-c3ea41285317-catalog-content\") pod \"redhat-marketplace-ndcfk\" (UID: \"4d293ca9-c989-4099-92d7-c3ea41285317\") " pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:29 crc kubenswrapper[4794]: I0310 11:38:29.962294 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84d4g\" (UniqueName: \"kubernetes.io/projected/4d293ca9-c989-4099-92d7-c3ea41285317-kube-api-access-84d4g\") pod \"redhat-marketplace-ndcfk\" (UID: \"4d293ca9-c989-4099-92d7-c3ea41285317\") " pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:29 crc kubenswrapper[4794]: I0310 11:38:29.962475 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d293ca9-c989-4099-92d7-c3ea41285317-utilities\") pod \"redhat-marketplace-ndcfk\" (UID: \"4d293ca9-c989-4099-92d7-c3ea41285317\") " pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:30 crc kubenswrapper[4794]: I0310 11:38:30.064874 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84d4g\" (UniqueName: \"kubernetes.io/projected/4d293ca9-c989-4099-92d7-c3ea41285317-kube-api-access-84d4g\") pod \"redhat-marketplace-ndcfk\" (UID: \"4d293ca9-c989-4099-92d7-c3ea41285317\") " pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:30 crc kubenswrapper[4794]: I0310 11:38:30.065022 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d293ca9-c989-4099-92d7-c3ea41285317-utilities\") pod \"redhat-marketplace-ndcfk\" (UID: \"4d293ca9-c989-4099-92d7-c3ea41285317\") " pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:30 crc kubenswrapper[4794]: I0310 11:38:30.065307 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d293ca9-c989-4099-92d7-c3ea41285317-catalog-content\") pod \"redhat-marketplace-ndcfk\" (UID: \"4d293ca9-c989-4099-92d7-c3ea41285317\") " pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:30 crc kubenswrapper[4794]: I0310 11:38:30.065783 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d293ca9-c989-4099-92d7-c3ea41285317-utilities\") pod \"redhat-marketplace-ndcfk\" (UID: \"4d293ca9-c989-4099-92d7-c3ea41285317\") " pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:30 crc kubenswrapper[4794]: I0310 11:38:30.065913 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d293ca9-c989-4099-92d7-c3ea41285317-catalog-content\") pod \"redhat-marketplace-ndcfk\" (UID: \"4d293ca9-c989-4099-92d7-c3ea41285317\") " pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:30 crc kubenswrapper[4794]: I0310 11:38:30.087294 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84d4g\" (UniqueName: \"kubernetes.io/projected/4d293ca9-c989-4099-92d7-c3ea41285317-kube-api-access-84d4g\") pod \"redhat-marketplace-ndcfk\" (UID: \"4d293ca9-c989-4099-92d7-c3ea41285317\") " pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:30 crc kubenswrapper[4794]: I0310 11:38:30.201039 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:30 crc kubenswrapper[4794]: I0310 11:38:30.752435 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndcfk"] Mar 10 11:38:31 crc kubenswrapper[4794]: I0310 11:38:31.001688 4794 generic.go:334] "Generic (PLEG): container finished" podID="4d293ca9-c989-4099-92d7-c3ea41285317" containerID="cce0d9417b7e32bbbe1d2d8f5486d6a0ea49ed459b6d15565f03ec69447e86da" exitCode=0 Mar 10 11:38:31 crc kubenswrapper[4794]: I0310 11:38:31.001857 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndcfk" event={"ID":"4d293ca9-c989-4099-92d7-c3ea41285317","Type":"ContainerDied","Data":"cce0d9417b7e32bbbe1d2d8f5486d6a0ea49ed459b6d15565f03ec69447e86da"} Mar 10 11:38:31 crc kubenswrapper[4794]: I0310 11:38:31.001974 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndcfk" event={"ID":"4d293ca9-c989-4099-92d7-c3ea41285317","Type":"ContainerStarted","Data":"55bcc8d4b28f0d0deb1d1382879e4f1d7931aa593a42eaa2a8b1d7af8fafef6e"} Mar 10 11:38:32 crc kubenswrapper[4794]: I0310 11:38:32.015488 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndcfk" event={"ID":"4d293ca9-c989-4099-92d7-c3ea41285317","Type":"ContainerStarted","Data":"a32d27527332de0bba80a82dcf033da1bc6e7470615a970140887126093f6a49"} Mar 10 11:38:33 crc kubenswrapper[4794]: I0310 11:38:33.026488 4794 generic.go:334] "Generic (PLEG): container finished" podID="4d293ca9-c989-4099-92d7-c3ea41285317" containerID="a32d27527332de0bba80a82dcf033da1bc6e7470615a970140887126093f6a49" exitCode=0 Mar 10 11:38:33 crc kubenswrapper[4794]: I0310 11:38:33.026689 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndcfk" 
event={"ID":"4d293ca9-c989-4099-92d7-c3ea41285317","Type":"ContainerDied","Data":"a32d27527332de0bba80a82dcf033da1bc6e7470615a970140887126093f6a49"} Mar 10 11:38:34 crc kubenswrapper[4794]: I0310 11:38:34.037907 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndcfk" event={"ID":"4d293ca9-c989-4099-92d7-c3ea41285317","Type":"ContainerStarted","Data":"4ab9bdd0a8e8b3578bf4d533756d17c0ea7bf57978cb8697b8cac9bc04e91770"} Mar 10 11:38:34 crc kubenswrapper[4794]: I0310 11:38:34.057827 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ndcfk" podStartSLOduration=2.6250517650000003 podStartE2EDuration="5.057803823s" podCreationTimestamp="2026-03-10 11:38:29 +0000 UTC" firstStartedPulling="2026-03-10 11:38:31.003052296 +0000 UTC m=+6859.759223104" lastFinishedPulling="2026-03-10 11:38:33.435804314 +0000 UTC m=+6862.191975162" observedRunningTime="2026-03-10 11:38:34.052233613 +0000 UTC m=+6862.808404461" watchObservedRunningTime="2026-03-10 11:38:34.057803823 +0000 UTC m=+6862.813974651" Mar 10 11:38:36 crc kubenswrapper[4794]: I0310 11:38:36.027672 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:38:36 crc kubenswrapper[4794]: E0310 11:38:36.029580 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:38:40 crc kubenswrapper[4794]: I0310 11:38:40.201281 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:40 crc kubenswrapper[4794]: I0310 11:38:40.202103 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:40 crc kubenswrapper[4794]: I0310 11:38:40.268255 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:41 crc kubenswrapper[4794]: I0310 11:38:41.152631 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:42 crc kubenswrapper[4794]: I0310 11:38:42.837745 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndcfk"] Mar 10 11:38:43 crc kubenswrapper[4794]: I0310 11:38:43.123514 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ndcfk" podUID="4d293ca9-c989-4099-92d7-c3ea41285317" containerName="registry-server" containerID="cri-o://4ab9bdd0a8e8b3578bf4d533756d17c0ea7bf57978cb8697b8cac9bc04e91770" gracePeriod=2 Mar 10 11:38:43 crc kubenswrapper[4794]: I0310 11:38:43.642534 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:43 crc kubenswrapper[4794]: I0310 11:38:43.796535 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d293ca9-c989-4099-92d7-c3ea41285317-catalog-content\") pod \"4d293ca9-c989-4099-92d7-c3ea41285317\" (UID: \"4d293ca9-c989-4099-92d7-c3ea41285317\") " Mar 10 11:38:43 crc kubenswrapper[4794]: I0310 11:38:43.796857 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d293ca9-c989-4099-92d7-c3ea41285317-utilities\") pod \"4d293ca9-c989-4099-92d7-c3ea41285317\" (UID: \"4d293ca9-c989-4099-92d7-c3ea41285317\") " Mar 10 11:38:43 crc kubenswrapper[4794]: I0310 11:38:43.797058 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84d4g\" (UniqueName: \"kubernetes.io/projected/4d293ca9-c989-4099-92d7-c3ea41285317-kube-api-access-84d4g\") pod \"4d293ca9-c989-4099-92d7-c3ea41285317\" (UID: \"4d293ca9-c989-4099-92d7-c3ea41285317\") " Mar 10 11:38:43 crc kubenswrapper[4794]: I0310 11:38:43.798260 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d293ca9-c989-4099-92d7-c3ea41285317-utilities" (OuterVolumeSpecName: "utilities") pod "4d293ca9-c989-4099-92d7-c3ea41285317" (UID: "4d293ca9-c989-4099-92d7-c3ea41285317"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:38:43 crc kubenswrapper[4794]: I0310 11:38:43.803390 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d293ca9-c989-4099-92d7-c3ea41285317-kube-api-access-84d4g" (OuterVolumeSpecName: "kube-api-access-84d4g") pod "4d293ca9-c989-4099-92d7-c3ea41285317" (UID: "4d293ca9-c989-4099-92d7-c3ea41285317"). InnerVolumeSpecName "kube-api-access-84d4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:38:43 crc kubenswrapper[4794]: I0310 11:38:43.830794 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d293ca9-c989-4099-92d7-c3ea41285317-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d293ca9-c989-4099-92d7-c3ea41285317" (UID: "4d293ca9-c989-4099-92d7-c3ea41285317"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:38:43 crc kubenswrapper[4794]: I0310 11:38:43.900141 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d293ca9-c989-4099-92d7-c3ea41285317-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:38:43 crc kubenswrapper[4794]: I0310 11:38:43.900170 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84d4g\" (UniqueName: \"kubernetes.io/projected/4d293ca9-c989-4099-92d7-c3ea41285317-kube-api-access-84d4g\") on node \"crc\" DevicePath \"\"" Mar 10 11:38:43 crc kubenswrapper[4794]: I0310 11:38:43.900181 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d293ca9-c989-4099-92d7-c3ea41285317-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.136253 4794 generic.go:334] "Generic (PLEG): container finished" podID="4d293ca9-c989-4099-92d7-c3ea41285317" containerID="4ab9bdd0a8e8b3578bf4d533756d17c0ea7bf57978cb8697b8cac9bc04e91770" exitCode=0 Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.136298 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndcfk" event={"ID":"4d293ca9-c989-4099-92d7-c3ea41285317","Type":"ContainerDied","Data":"4ab9bdd0a8e8b3578bf4d533756d17c0ea7bf57978cb8697b8cac9bc04e91770"} Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.136350 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndcfk" Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.136365 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndcfk" event={"ID":"4d293ca9-c989-4099-92d7-c3ea41285317","Type":"ContainerDied","Data":"55bcc8d4b28f0d0deb1d1382879e4f1d7931aa593a42eaa2a8b1d7af8fafef6e"} Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.136389 4794 scope.go:117] "RemoveContainer" containerID="4ab9bdd0a8e8b3578bf4d533756d17c0ea7bf57978cb8697b8cac9bc04e91770" Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.163674 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndcfk"] Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.173682 4794 scope.go:117] "RemoveContainer" containerID="a32d27527332de0bba80a82dcf033da1bc6e7470615a970140887126093f6a49" Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.176548 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndcfk"] Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.194856 4794 scope.go:117] "RemoveContainer" containerID="cce0d9417b7e32bbbe1d2d8f5486d6a0ea49ed459b6d15565f03ec69447e86da" Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.248547 4794 scope.go:117] "RemoveContainer" containerID="4ab9bdd0a8e8b3578bf4d533756d17c0ea7bf57978cb8697b8cac9bc04e91770" Mar 10 11:38:44 crc kubenswrapper[4794]: E0310 11:38:44.248956 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ab9bdd0a8e8b3578bf4d533756d17c0ea7bf57978cb8697b8cac9bc04e91770\": container with ID starting with 4ab9bdd0a8e8b3578bf4d533756d17c0ea7bf57978cb8697b8cac9bc04e91770 not found: ID does not exist" containerID="4ab9bdd0a8e8b3578bf4d533756d17c0ea7bf57978cb8697b8cac9bc04e91770" Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.248988 4794 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ab9bdd0a8e8b3578bf4d533756d17c0ea7bf57978cb8697b8cac9bc04e91770"} err="failed to get container status \"4ab9bdd0a8e8b3578bf4d533756d17c0ea7bf57978cb8697b8cac9bc04e91770\": rpc error: code = NotFound desc = could not find container \"4ab9bdd0a8e8b3578bf4d533756d17c0ea7bf57978cb8697b8cac9bc04e91770\": container with ID starting with 4ab9bdd0a8e8b3578bf4d533756d17c0ea7bf57978cb8697b8cac9bc04e91770 not found: ID does not exist" Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.249009 4794 scope.go:117] "RemoveContainer" containerID="a32d27527332de0bba80a82dcf033da1bc6e7470615a970140887126093f6a49" Mar 10 11:38:44 crc kubenswrapper[4794]: E0310 11:38:44.249403 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a32d27527332de0bba80a82dcf033da1bc6e7470615a970140887126093f6a49\": container with ID starting with a32d27527332de0bba80a82dcf033da1bc6e7470615a970140887126093f6a49 not found: ID does not exist" containerID="a32d27527332de0bba80a82dcf033da1bc6e7470615a970140887126093f6a49" Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.249454 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a32d27527332de0bba80a82dcf033da1bc6e7470615a970140887126093f6a49"} err="failed to get container status \"a32d27527332de0bba80a82dcf033da1bc6e7470615a970140887126093f6a49\": rpc error: code = NotFound desc = could not find container \"a32d27527332de0bba80a82dcf033da1bc6e7470615a970140887126093f6a49\": container with ID starting with a32d27527332de0bba80a82dcf033da1bc6e7470615a970140887126093f6a49 not found: ID does not exist" Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.249520 4794 scope.go:117] "RemoveContainer" containerID="cce0d9417b7e32bbbe1d2d8f5486d6a0ea49ed459b6d15565f03ec69447e86da" Mar 10 11:38:44 crc kubenswrapper[4794]: E0310 11:38:44.249825 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cce0d9417b7e32bbbe1d2d8f5486d6a0ea49ed459b6d15565f03ec69447e86da\": container with ID starting with cce0d9417b7e32bbbe1d2d8f5486d6a0ea49ed459b6d15565f03ec69447e86da not found: ID does not exist" containerID="cce0d9417b7e32bbbe1d2d8f5486d6a0ea49ed459b6d15565f03ec69447e86da" Mar 10 11:38:44 crc kubenswrapper[4794]: I0310 11:38:44.249863 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cce0d9417b7e32bbbe1d2d8f5486d6a0ea49ed459b6d15565f03ec69447e86da"} err="failed to get container status \"cce0d9417b7e32bbbe1d2d8f5486d6a0ea49ed459b6d15565f03ec69447e86da\": rpc error: code = NotFound desc = could not find container \"cce0d9417b7e32bbbe1d2d8f5486d6a0ea49ed459b6d15565f03ec69447e86da\": container with ID starting with cce0d9417b7e32bbbe1d2d8f5486d6a0ea49ed459b6d15565f03ec69447e86da not found: ID does not exist" Mar 10 11:38:46 crc kubenswrapper[4794]: I0310 11:38:46.022753 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d293ca9-c989-4099-92d7-c3ea41285317" path="/var/lib/kubelet/pods/4d293ca9-c989-4099-92d7-c3ea41285317/volumes" Mar 10 11:38:47 crc kubenswrapper[4794]: I0310 11:38:47.000268 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:38:47 crc kubenswrapper[4794]: E0310 11:38:47.000772 4794 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:38:52 crc kubenswrapper[4794]: I0310 11:38:52.046554 4794 scope.go:117] "RemoveContainer" containerID="7de966b2e69162def51fa7897e8a587d06aedd23363e5d0a71598dabf97a75d3" Mar 10 11:38:52 crc kubenswrapper[4794]: I0310 11:38:52.095726 4794 scope.go:117] "RemoveContainer" containerID="2fed274478277fb3a33d47601fb13570f7076a27f2eeb0cf6854d67b70670990" Mar 10 11:38:52 crc kubenswrapper[4794]: I0310 11:38:52.147549 4794 scope.go:117] "RemoveContainer" containerID="f8f54ccae0f40c98d9e23426c2df80e4ca6f3481cf1cb4619461559ebeb54161" Mar 10 11:38:52 crc kubenswrapper[4794]: I0310 11:38:52.205642 4794 scope.go:117] "RemoveContainer" containerID="07047adba663cffdd80297e275f95ed54327ed72f931ad8fffa72191e13a1a1c" Mar 10 11:39:01 crc kubenswrapper[4794]: I0310 11:39:01.000192 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:39:01 crc kubenswrapper[4794]: E0310 11:39:01.001410 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:39:15 crc kubenswrapper[4794]: I0310 11:39:15.998931 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:39:16 crc kubenswrapper[4794]: E0310 11:39:16.000565 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:39:29 crc kubenswrapper[4794]: I0310 11:39:29.999087 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:39:30 crc kubenswrapper[4794]: E0310 11:39:30.000097 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.155681 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vsjrq"] Mar 10 11:39:31 crc kubenswrapper[4794]: E0310 11:39:31.156566 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d293ca9-c989-4099-92d7-c3ea41285317" containerName="extract-utilities" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.156905 4794 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4d293ca9-c989-4099-92d7-c3ea41285317" containerName="extract-utilities" Mar 10 11:39:31 crc kubenswrapper[4794]: E0310 11:39:31.156972 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d293ca9-c989-4099-92d7-c3ea41285317" containerName="extract-content" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.156986 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d293ca9-c989-4099-92d7-c3ea41285317" containerName="extract-content" Mar 10 11:39:31 crc kubenswrapper[4794]: E0310 11:39:31.157018 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d293ca9-c989-4099-92d7-c3ea41285317" containerName="registry-server" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.157032 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d293ca9-c989-4099-92d7-c3ea41285317" containerName="registry-server" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.157447 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d293ca9-c989-4099-92d7-c3ea41285317" containerName="registry-server" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.160440 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.178936 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vsjrq"] Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.294099 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-utilities\") pod \"certified-operators-vsjrq\" (UID: \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\") " pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.294169 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-catalog-content\") pod \"certified-operators-vsjrq\" (UID: \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\") " pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.294241 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thjtr\" (UniqueName: \"kubernetes.io/projected/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-kube-api-access-thjtr\") pod \"certified-operators-vsjrq\" (UID: \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\") " pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.396350 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-utilities\") pod \"certified-operators-vsjrq\" (UID: \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\") " pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.396409 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-catalog-content\") pod \"certified-operators-vsjrq\" (UID: \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\") " pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:31 crc 
kubenswrapper[4794]: I0310 11:39:31.396461 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thjtr\" (UniqueName: \"kubernetes.io/projected/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-kube-api-access-thjtr\") pod \"certified-operators-vsjrq\" (UID: \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\") " pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.397236 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-utilities\") pod \"certified-operators-vsjrq\" (UID: \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\") " pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.397465 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-catalog-content\") pod \"certified-operators-vsjrq\" (UID: \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\") " pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.415265 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thjtr\" (UniqueName: \"kubernetes.io/projected/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-kube-api-access-thjtr\") pod \"certified-operators-vsjrq\" (UID: \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\") " pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:31 crc kubenswrapper[4794]: I0310 11:39:31.506610 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:32 crc kubenswrapper[4794]: I0310 11:39:32.057588 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vsjrq"] Mar 10 11:39:32 crc kubenswrapper[4794]: I0310 11:39:32.829004 4794 generic.go:334] "Generic (PLEG): container finished" podID="fbb6febf-b3e9-4385-b9e9-447d1cca66b5" containerID="cf615687632cf437931ebaa879f34d1f3d873b03fa669662a5c649162e5e81e7" exitCode=0 Mar 10 11:39:32 crc kubenswrapper[4794]: I0310 11:39:32.829152 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsjrq" event={"ID":"fbb6febf-b3e9-4385-b9e9-447d1cca66b5","Type":"ContainerDied","Data":"cf615687632cf437931ebaa879f34d1f3d873b03fa669662a5c649162e5e81e7"} Mar 10 11:39:32 crc kubenswrapper[4794]: I0310 11:39:32.829352 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsjrq" event={"ID":"fbb6febf-b3e9-4385-b9e9-447d1cca66b5","Type":"ContainerStarted","Data":"bf914b71963c3c310b138fd75a53adc501ba6bf04f0d41c3e48590015c796549"} Mar 10 11:39:32 crc kubenswrapper[4794]: I0310 11:39:32.832636 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 11:39:33 crc kubenswrapper[4794]: I0310 11:39:33.846204 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsjrq" event={"ID":"fbb6febf-b3e9-4385-b9e9-447d1cca66b5","Type":"ContainerStarted","Data":"2da9d63345b82277527ef642ae3827d03341c7540ceecb7181eda68fcbf6de21"} Mar 10 11:39:35 crc kubenswrapper[4794]: I0310 11:39:35.880997 4794 generic.go:334] "Generic (PLEG): container finished" podID="fbb6febf-b3e9-4385-b9e9-447d1cca66b5" 
containerID="2da9d63345b82277527ef642ae3827d03341c7540ceecb7181eda68fcbf6de21" exitCode=0 Mar 10 11:39:35 crc kubenswrapper[4794]: I0310 11:39:35.881149 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsjrq" event={"ID":"fbb6febf-b3e9-4385-b9e9-447d1cca66b5","Type":"ContainerDied","Data":"2da9d63345b82277527ef642ae3827d03341c7540ceecb7181eda68fcbf6de21"} Mar 10 11:39:36 crc kubenswrapper[4794]: I0310 11:39:36.900689 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsjrq" event={"ID":"fbb6febf-b3e9-4385-b9e9-447d1cca66b5","Type":"ContainerStarted","Data":"ae1ed6753c96cb815fae94b9f626a10d2251d0cbebb2d3d6863a6d9e59e73e7b"} Mar 10 11:39:36 crc kubenswrapper[4794]: I0310 11:39:36.939059 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vsjrq" podStartSLOduration=2.454721933 podStartE2EDuration="5.93903511s" podCreationTimestamp="2026-03-10 11:39:31 +0000 UTC" firstStartedPulling="2026-03-10 11:39:32.832162929 +0000 UTC m=+6921.588333757" lastFinishedPulling="2026-03-10 11:39:36.316476106 +0000 UTC m=+6925.072646934" observedRunningTime="2026-03-10 11:39:36.923899609 +0000 UTC m=+6925.680070487" watchObservedRunningTime="2026-03-10 11:39:36.93903511 +0000 UTC m=+6925.695205948" Mar 10 11:39:41 crc kubenswrapper[4794]: I0310 11:39:41.507664 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:41 crc kubenswrapper[4794]: I0310 11:39:41.508563 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:41 crc kubenswrapper[4794]: I0310 11:39:41.584494 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:42 crc kubenswrapper[4794]: I0310 11:39:42.026589 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:42 crc kubenswrapper[4794]: I0310 11:39:42.092281 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vsjrq"] Mar 10 11:39:43 crc kubenswrapper[4794]: I0310 11:39:43.983263 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vsjrq" podUID="fbb6febf-b3e9-4385-b9e9-447d1cca66b5" containerName="registry-server" containerID="cri-o://ae1ed6753c96cb815fae94b9f626a10d2251d0cbebb2d3d6863a6d9e59e73e7b" gracePeriod=2 Mar 10 11:39:44 crc kubenswrapper[4794]: I0310 11:39:44.611307 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:44 crc kubenswrapper[4794]: I0310 11:39:44.729474 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thjtr\" (UniqueName: \"kubernetes.io/projected/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-kube-api-access-thjtr\") pod \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\" (UID: \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\") " Mar 10 11:39:44 crc kubenswrapper[4794]: I0310 11:39:44.729622 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-utilities\") pod \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\" (UID: \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\") " Mar 10 11:39:44 crc kubenswrapper[4794]: I0310 11:39:44.729757 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-catalog-content\") pod \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\" (UID: \"fbb6febf-b3e9-4385-b9e9-447d1cca66b5\") " Mar 10 11:39:44 crc kubenswrapper[4794]: I0310 11:39:44.731149 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-utilities" (OuterVolumeSpecName: "utilities") pod "fbb6febf-b3e9-4385-b9e9-447d1cca66b5" (UID: "fbb6febf-b3e9-4385-b9e9-447d1cca66b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:39:44 crc kubenswrapper[4794]: I0310 11:39:44.738597 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-kube-api-access-thjtr" (OuterVolumeSpecName: "kube-api-access-thjtr") pod "fbb6febf-b3e9-4385-b9e9-447d1cca66b5" (UID: "fbb6febf-b3e9-4385-b9e9-447d1cca66b5"). InnerVolumeSpecName "kube-api-access-thjtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:39:44 crc kubenswrapper[4794]: I0310 11:39:44.827510 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbb6febf-b3e9-4385-b9e9-447d1cca66b5" (UID: "fbb6febf-b3e9-4385-b9e9-447d1cca66b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:39:44 crc kubenswrapper[4794]: I0310 11:39:44.832293 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:39:44 crc kubenswrapper[4794]: I0310 11:39:44.832360 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:39:44 crc kubenswrapper[4794]: I0310 11:39:44.832383 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thjtr\" (UniqueName: \"kubernetes.io/projected/fbb6febf-b3e9-4385-b9e9-447d1cca66b5-kube-api-access-thjtr\") on node \"crc\" DevicePath \"\"" Mar 10 11:39:44 crc kubenswrapper[4794]: I0310 11:39:44.999680 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:39:45 crc kubenswrapper[4794]: I0310 11:39:44.999915 4794 generic.go:334] "Generic (PLEG): container finished" podID="fbb6febf-b3e9-4385-b9e9-447d1cca66b5" containerID="ae1ed6753c96cb815fae94b9f626a10d2251d0cbebb2d3d6863a6d9e59e73e7b" exitCode=0 Mar 10 11:39:45 crc kubenswrapper[4794]: I0310 11:39:44.999960 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsjrq" Mar 10 11:39:45 crc kubenswrapper[4794]: I0310 11:39:44.999978 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsjrq" event={"ID":"fbb6febf-b3e9-4385-b9e9-447d1cca66b5","Type":"ContainerDied","Data":"ae1ed6753c96cb815fae94b9f626a10d2251d0cbebb2d3d6863a6d9e59e73e7b"} Mar 10 11:39:45 crc kubenswrapper[4794]: I0310 11:39:45.000042 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsjrq" event={"ID":"fbb6febf-b3e9-4385-b9e9-447d1cca66b5","Type":"ContainerDied","Data":"bf914b71963c3c310b138fd75a53adc501ba6bf04f0d41c3e48590015c796549"} Mar 10 11:39:45 crc kubenswrapper[4794]: I0310 11:39:45.000080 4794 scope.go:117] "RemoveContainer" containerID="ae1ed6753c96cb815fae94b9f626a10d2251d0cbebb2d3d6863a6d9e59e73e7b" Mar 10 11:39:45 crc kubenswrapper[4794]: E0310 11:39:45.000303 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:39:45 crc kubenswrapper[4794]: I0310 11:39:45.047389 4794 scope.go:117] "RemoveContainer" containerID="2da9d63345b82277527ef642ae3827d03341c7540ceecb7181eda68fcbf6de21" Mar 10 11:39:45 crc kubenswrapper[4794]: I0310 11:39:45.074053 4794 scope.go:117] "RemoveContainer" containerID="cf615687632cf437931ebaa879f34d1f3d873b03fa669662a5c649162e5e81e7" Mar 10 11:39:45 crc kubenswrapper[4794]: I0310 11:39:45.074191 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vsjrq"] Mar 10 11:39:45 crc kubenswrapper[4794]: I0310 11:39:45.088083 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vsjrq"] Mar 10 11:39:45 crc 
kubenswrapper[4794]: I0310 11:39:45.131433 4794 scope.go:117] "RemoveContainer" containerID="ae1ed6753c96cb815fae94b9f626a10d2251d0cbebb2d3d6863a6d9e59e73e7b" Mar 10 11:39:45 crc kubenswrapper[4794]: E0310 11:39:45.131966 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1ed6753c96cb815fae94b9f626a10d2251d0cbebb2d3d6863a6d9e59e73e7b\": container with ID starting with ae1ed6753c96cb815fae94b9f626a10d2251d0cbebb2d3d6863a6d9e59e73e7b not found: ID does not exist" containerID="ae1ed6753c96cb815fae94b9f626a10d2251d0cbebb2d3d6863a6d9e59e73e7b" Mar 10 11:39:45 crc kubenswrapper[4794]: I0310 11:39:45.132013 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1ed6753c96cb815fae94b9f626a10d2251d0cbebb2d3d6863a6d9e59e73e7b"} err="failed to get container status \"ae1ed6753c96cb815fae94b9f626a10d2251d0cbebb2d3d6863a6d9e59e73e7b\": rpc error: code = NotFound desc = could not find container \"ae1ed6753c96cb815fae94b9f626a10d2251d0cbebb2d3d6863a6d9e59e73e7b\": container with ID starting with ae1ed6753c96cb815fae94b9f626a10d2251d0cbebb2d3d6863a6d9e59e73e7b not found: ID does not exist" Mar 10 11:39:45 crc kubenswrapper[4794]: I0310 11:39:45.132043 4794 scope.go:117] "RemoveContainer" containerID="2da9d63345b82277527ef642ae3827d03341c7540ceecb7181eda68fcbf6de21" Mar 10 11:39:45 crc kubenswrapper[4794]: E0310 11:39:45.132350 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2da9d63345b82277527ef642ae3827d03341c7540ceecb7181eda68fcbf6de21\": container with ID starting with 2da9d63345b82277527ef642ae3827d03341c7540ceecb7181eda68fcbf6de21 not found: ID does not exist" containerID="2da9d63345b82277527ef642ae3827d03341c7540ceecb7181eda68fcbf6de21" Mar 10 11:39:45 crc kubenswrapper[4794]: I0310 11:39:45.132396 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2da9d63345b82277527ef642ae3827d03341c7540ceecb7181eda68fcbf6de21"} err="failed to get container status \"2da9d63345b82277527ef642ae3827d03341c7540ceecb7181eda68fcbf6de21\": rpc error: code = NotFound desc = could not find container \"2da9d63345b82277527ef642ae3827d03341c7540ceecb7181eda68fcbf6de21\": container with ID starting with 2da9d63345b82277527ef642ae3827d03341c7540ceecb7181eda68fcbf6de21 not found: ID does not exist" Mar 10 11:39:45 crc kubenswrapper[4794]: I0310 11:39:45.132450 4794 scope.go:117] "RemoveContainer" containerID="cf615687632cf437931ebaa879f34d1f3d873b03fa669662a5c649162e5e81e7" Mar 10 11:39:45 crc kubenswrapper[4794]: E0310 11:39:45.132662 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf615687632cf437931ebaa879f34d1f3d873b03fa669662a5c649162e5e81e7\": container with ID starting with cf615687632cf437931ebaa879f34d1f3d873b03fa669662a5c649162e5e81e7 not found: ID does not exist" containerID="cf615687632cf437931ebaa879f34d1f3d873b03fa669662a5c649162e5e81e7" Mar 10 11:39:45 crc kubenswrapper[4794]: I0310 11:39:45.132691 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf615687632cf437931ebaa879f34d1f3d873b03fa669662a5c649162e5e81e7"} err="failed to get container status \"cf615687632cf437931ebaa879f34d1f3d873b03fa669662a5c649162e5e81e7\": rpc error: code = NotFound desc = could not find container 
\"cf615687632cf437931ebaa879f34d1f3d873b03fa669662a5c649162e5e81e7\": container with ID starting with cf615687632cf437931ebaa879f34d1f3d873b03fa669662a5c649162e5e81e7 not found: ID does not exist" Mar 10 11:39:46 crc kubenswrapper[4794]: I0310 11:39:46.031516 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbb6febf-b3e9-4385-b9e9-447d1cca66b5" path="/var/lib/kubelet/pods/fbb6febf-b3e9-4385-b9e9-447d1cca66b5/volumes" Mar 10 11:39:57 crc kubenswrapper[4794]: I0310 11:39:57.000095 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:39:57 crc kubenswrapper[4794]: E0310 11:39:57.001289 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:40:00 crc kubenswrapper[4794]: I0310 11:40:00.180659 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552380-z5wsx"] Mar 10 11:40:00 crc kubenswrapper[4794]: E0310 11:40:00.182330 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbb6febf-b3e9-4385-b9e9-447d1cca66b5" containerName="extract-content" Mar 10 11:40:00 crc kubenswrapper[4794]: I0310 11:40:00.182387 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbb6febf-b3e9-4385-b9e9-447d1cca66b5" containerName="extract-content" Mar 10 11:40:00 crc kubenswrapper[4794]: E0310 11:40:00.182416 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbb6febf-b3e9-4385-b9e9-447d1cca66b5" containerName="registry-server" Mar 10 11:40:00 crc kubenswrapper[4794]: I0310 11:40:00.182429 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbb6febf-b3e9-4385-b9e9-447d1cca66b5" containerName="registry-server" Mar 10 11:40:00 crc kubenswrapper[4794]: E0310 11:40:00.182448 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbb6febf-b3e9-4385-b9e9-447d1cca66b5" containerName="extract-utilities" Mar 10 11:40:00 crc kubenswrapper[4794]: I0310 11:40:00.182461 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbb6febf-b3e9-4385-b9e9-447d1cca66b5" containerName="extract-utilities" Mar 10 11:40:00 crc kubenswrapper[4794]: I0310 11:40:00.182935 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbb6febf-b3e9-4385-b9e9-447d1cca66b5" containerName="registry-server" Mar 10 11:40:00 crc kubenswrapper[4794]: I0310 11:40:00.184066 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552380-z5wsx" Mar 10 11:40:00 crc kubenswrapper[4794]: I0310 11:40:00.186849 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:40:00 crc kubenswrapper[4794]: I0310 11:40:00.187402 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:40:00 crc kubenswrapper[4794]: I0310 11:40:00.189062 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:40:00 crc kubenswrapper[4794]: I0310 11:40:00.195911 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552380-z5wsx"] Mar 10 11:40:00 crc kubenswrapper[4794]: I0310 11:40:00.295592 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c74c4\" (UniqueName: \"kubernetes.io/projected/69f1ae00-aeb3-44f4-9105-e0d7b3c6189b-kube-api-access-c74c4\") pod \"auto-csr-approver-29552380-z5wsx\" (UID: \"69f1ae00-aeb3-44f4-9105-e0d7b3c6189b\") " pod="openshift-infra/auto-csr-approver-29552380-z5wsx" Mar 10 11:40:00 crc kubenswrapper[4794]: I0310 11:40:00.397651 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c74c4\" (UniqueName: \"kubernetes.io/projected/69f1ae00-aeb3-44f4-9105-e0d7b3c6189b-kube-api-access-c74c4\") pod \"auto-csr-approver-29552380-z5wsx\" (UID: \"69f1ae00-aeb3-44f4-9105-e0d7b3c6189b\") " pod="openshift-infra/auto-csr-approver-29552380-z5wsx" Mar 10 11:40:00 crc kubenswrapper[4794]: I0310 11:40:00.423045 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c74c4\" (UniqueName: \"kubernetes.io/projected/69f1ae00-aeb3-44f4-9105-e0d7b3c6189b-kube-api-access-c74c4\") pod \"auto-csr-approver-29552380-z5wsx\" (UID: \"69f1ae00-aeb3-44f4-9105-e0d7b3c6189b\") " pod="openshift-infra/auto-csr-approver-29552380-z5wsx" Mar 10 11:40:00 crc kubenswrapper[4794]: I0310 11:40:00.509801 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552380-z5wsx" Mar 10 11:40:01 crc kubenswrapper[4794]: I0310 11:40:01.052555 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552380-z5wsx"] Mar 10 11:40:01 crc kubenswrapper[4794]: I0310 11:40:01.188292 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552380-z5wsx" event={"ID":"69f1ae00-aeb3-44f4-9105-e0d7b3c6189b","Type":"ContainerStarted","Data":"6909a82cc1026e89e6a89efc0e0412d72fa6edb69424dfce81f69e999b133675"} Mar 10 11:40:03 crc kubenswrapper[4794]: I0310 11:40:03.214734 4794 generic.go:334] "Generic (PLEG): container finished" podID="69f1ae00-aeb3-44f4-9105-e0d7b3c6189b" containerID="b0ef429d6f985bb876ca63de27cca82abeb92b5287e19ca548faf478bba6a66c" exitCode=0 Mar 10 11:40:03 crc kubenswrapper[4794]: I0310 11:40:03.215271 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552380-z5wsx" event={"ID":"69f1ae00-aeb3-44f4-9105-e0d7b3c6189b","Type":"ContainerDied","Data":"b0ef429d6f985bb876ca63de27cca82abeb92b5287e19ca548faf478bba6a66c"} Mar 10 11:40:04 crc kubenswrapper[4794]: I0310 11:40:04.698747 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552380-z5wsx" Mar 10 11:40:04 crc kubenswrapper[4794]: I0310 11:40:04.814308 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c74c4\" (UniqueName: \"kubernetes.io/projected/69f1ae00-aeb3-44f4-9105-e0d7b3c6189b-kube-api-access-c74c4\") pod \"69f1ae00-aeb3-44f4-9105-e0d7b3c6189b\" (UID: \"69f1ae00-aeb3-44f4-9105-e0d7b3c6189b\") " Mar 10 11:40:04 crc kubenswrapper[4794]: I0310 11:40:04.820945 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69f1ae00-aeb3-44f4-9105-e0d7b3c6189b-kube-api-access-c74c4" (OuterVolumeSpecName: "kube-api-access-c74c4") pod "69f1ae00-aeb3-44f4-9105-e0d7b3c6189b" (UID: "69f1ae00-aeb3-44f4-9105-e0d7b3c6189b"). InnerVolumeSpecName "kube-api-access-c74c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:40:04 crc kubenswrapper[4794]: I0310 11:40:04.917171 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c74c4\" (UniqueName: \"kubernetes.io/projected/69f1ae00-aeb3-44f4-9105-e0d7b3c6189b-kube-api-access-c74c4\") on node \"crc\" DevicePath \"\"" Mar 10 11:40:05 crc kubenswrapper[4794]: I0310 11:40:05.253240 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552380-z5wsx" event={"ID":"69f1ae00-aeb3-44f4-9105-e0d7b3c6189b","Type":"ContainerDied","Data":"6909a82cc1026e89e6a89efc0e0412d72fa6edb69424dfce81f69e999b133675"} Mar 10 11:40:05 crc kubenswrapper[4794]: I0310 11:40:05.254083 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6909a82cc1026e89e6a89efc0e0412d72fa6edb69424dfce81f69e999b133675" Mar 10 11:40:05 crc kubenswrapper[4794]: I0310 11:40:05.254218 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552380-z5wsx" Mar 10 11:40:05 crc kubenswrapper[4794]: I0310 11:40:05.795629 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552374-bhqcg"] Mar 10 11:40:05 crc kubenswrapper[4794]: I0310 11:40:05.807319 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552374-bhqcg"] Mar 10 11:40:06 crc kubenswrapper[4794]: I0310 11:40:06.013087 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e37ea6-ebb9-4617-a214-f2cac72dad1e" path="/var/lib/kubelet/pods/83e37ea6-ebb9-4617-a214-f2cac72dad1e/volumes" Mar 10 11:40:10 crc kubenswrapper[4794]: I0310 11:40:10.999319 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:40:11 crc kubenswrapper[4794]: E0310 11:40:11.000188 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:40:24 crc kubenswrapper[4794]: I0310 11:40:23.999578 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:40:24 crc kubenswrapper[4794]: E0310 11:40:24.001157 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.211965 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h7qm2"] Mar 10 11:40:33 crc kubenswrapper[4794]: E0310 11:40:33.213650 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f1ae00-aeb3-44f4-9105-e0d7b3c6189b" containerName="oc" Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.213670 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f1ae00-aeb3-44f4-9105-e0d7b3c6189b" containerName="oc" Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.214290 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f1ae00-aeb3-44f4-9105-e0d7b3c6189b" containerName="oc" Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.217059 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.242582 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h7qm2"] Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.334317 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-catalog-content\") pod \"redhat-operators-h7qm2\" (UID: \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\") " pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.334468 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-utilities\") pod \"redhat-operators-h7qm2\" (UID: \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\") " pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.334543 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnwdk\" (UniqueName: \"kubernetes.io/projected/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-kube-api-access-lnwdk\") pod \"redhat-operators-h7qm2\" (UID: \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\") " pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.436559 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-utilities\") pod \"redhat-operators-h7qm2\" (UID: \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\") " pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.436664 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnwdk\" (UniqueName: \"kubernetes.io/projected/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-kube-api-access-lnwdk\") pod \"redhat-operators-h7qm2\" (UID: \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\") " pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.436702 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-catalog-content\") pod \"redhat-operators-h7qm2\" (UID: \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\") " pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.437031 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-utilities\") pod \"redhat-operators-h7qm2\" (UID: \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\") " pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.437051 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-catalog-content\") pod \"redhat-operators-h7qm2\" (UID: \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\") " pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.457778 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lnwdk\" (UniqueName: \"kubernetes.io/projected/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-kube-api-access-lnwdk\") pod \"redhat-operators-h7qm2\" (UID: \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\") " pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:40:33 crc kubenswrapper[4794]: I0310 11:40:33.560124 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:40:34 crc kubenswrapper[4794]: I0310 11:40:34.031842 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h7qm2"] Mar 10 11:40:34 crc kubenswrapper[4794]: I0310 11:40:34.642675 4794 generic.go:334] "Generic (PLEG): container finished" podID="489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" containerID="36a7b7b64e45c5d9215806d666513ab99f9c90f49612ff5115329cc7708ddd7a" exitCode=0 Mar 10 11:40:34 crc kubenswrapper[4794]: I0310 11:40:34.642786 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7qm2" event={"ID":"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba","Type":"ContainerDied","Data":"36a7b7b64e45c5d9215806d666513ab99f9c90f49612ff5115329cc7708ddd7a"} Mar 10 11:40:34 crc kubenswrapper[4794]: I0310 11:40:34.642985 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7qm2" event={"ID":"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba","Type":"ContainerStarted","Data":"e568009306381a6a5d2dad785ccf2ab2fbe207d2f01e94610fc1b507b283d0e0"} Mar 10 11:40:35 crc kubenswrapper[4794]: I0310 11:40:35.651975 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7qm2" event={"ID":"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba","Type":"ContainerStarted","Data":"3af716d5ac765c0ac73053da5cb6be56fac153c0a05b120b3ac28c34681b466e"} Mar 10 11:40:38 crc kubenswrapper[4794]: I0310 11:40:38.374785 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mrc7n"] Mar 10 11:40:38 crc kubenswrapper[4794]: I0310 11:40:38.379467 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:38 crc kubenswrapper[4794]: I0310 11:40:38.403265 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mrc7n"] Mar 10 11:40:38 crc kubenswrapper[4794]: I0310 11:40:38.542153 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1107c3c-6645-424a-bba4-5db5da4d2717-catalog-content\") pod \"community-operators-mrc7n\" (UID: \"f1107c3c-6645-424a-bba4-5db5da4d2717\") " pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:38 crc kubenswrapper[4794]: I0310 11:40:38.542619 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64k29\" (UniqueName: \"kubernetes.io/projected/f1107c3c-6645-424a-bba4-5db5da4d2717-kube-api-access-64k29\") pod \"community-operators-mrc7n\" (UID: \"f1107c3c-6645-424a-bba4-5db5da4d2717\") " pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:38 crc kubenswrapper[4794]: I0310 11:40:38.542691 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1107c3c-6645-424a-bba4-5db5da4d2717-utilities\") pod \"community-operators-mrc7n\" (UID: \"f1107c3c-6645-424a-bba4-5db5da4d2717\") " pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:38 crc kubenswrapper[4794]: I0310 11:40:38.644217 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64k29\" (UniqueName: \"kubernetes.io/projected/f1107c3c-6645-424a-bba4-5db5da4d2717-kube-api-access-64k29\") pod \"community-operators-mrc7n\" (UID: \"f1107c3c-6645-424a-bba4-5db5da4d2717\") " pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:38 crc kubenswrapper[4794]: I0310 11:40:38.644310 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1107c3c-6645-424a-bba4-5db5da4d2717-utilities\") pod \"community-operators-mrc7n\" (UID: \"f1107c3c-6645-424a-bba4-5db5da4d2717\") " pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:38 crc kubenswrapper[4794]: I0310 11:40:38.644423 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1107c3c-6645-424a-bba4-5db5da4d2717-catalog-content\") pod \"community-operators-mrc7n\" (UID: \"f1107c3c-6645-424a-bba4-5db5da4d2717\") " pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:38 crc kubenswrapper[4794]: I0310 11:40:38.645031 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1107c3c-6645-424a-bba4-5db5da4d2717-catalog-content\") pod \"community-operators-mrc7n\" (UID: \"f1107c3c-6645-424a-bba4-5db5da4d2717\") " pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:38 crc kubenswrapper[4794]: I0310 11:40:38.645110 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1107c3c-6645-424a-bba4-5db5da4d2717-utilities\") pod \"community-operators-mrc7n\" (UID: \"f1107c3c-6645-424a-bba4-5db5da4d2717\") " pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:38 crc kubenswrapper[4794]: I0310 11:40:38.680844 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-64k29\" (UniqueName: \"kubernetes.io/projected/f1107c3c-6645-424a-bba4-5db5da4d2717-kube-api-access-64k29\") pod \"community-operators-mrc7n\" (UID: \"f1107c3c-6645-424a-bba4-5db5da4d2717\") " pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:38 crc kubenswrapper[4794]: I0310 11:40:38.704275 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:39 crc kubenswrapper[4794]: I0310 11:40:39.000151 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:40:39 crc kubenswrapper[4794]: E0310 11:40:39.000755 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:40:39 crc kubenswrapper[4794]: I0310 11:40:39.290294 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mrc7n"] Mar 10 11:40:39 crc kubenswrapper[4794]: I0310 11:40:39.700944 4794 generic.go:334] "Generic (PLEG): container finished" podID="f1107c3c-6645-424a-bba4-5db5da4d2717" containerID="e72af649e8b8d291254a54c8adf8e6435c6959589da0e65d2b9ee8f28b10250d" exitCode=0 Mar 10 11:40:39 crc kubenswrapper[4794]: I0310 11:40:39.701029 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrc7n" event={"ID":"f1107c3c-6645-424a-bba4-5db5da4d2717","Type":"ContainerDied","Data":"e72af649e8b8d291254a54c8adf8e6435c6959589da0e65d2b9ee8f28b10250d"} Mar 10 11:40:39 crc kubenswrapper[4794]: I0310 11:40:39.701271 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrc7n" event={"ID":"f1107c3c-6645-424a-bba4-5db5da4d2717","Type":"ContainerStarted","Data":"4c5443e0d8bf87f208e7f284da5a324b66b072eb831ddbf65eb5d456600e370e"} Mar 10 11:40:40 crc kubenswrapper[4794]: I0310 11:40:40.725939 4794 generic.go:334] "Generic (PLEG): container finished" podID="489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" containerID="3af716d5ac765c0ac73053da5cb6be56fac153c0a05b120b3ac28c34681b466e" exitCode=0 Mar 10 11:40:40 crc kubenswrapper[4794]: I0310 11:40:40.726030 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7qm2" event={"ID":"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba","Type":"ContainerDied","Data":"3af716d5ac765c0ac73053da5cb6be56fac153c0a05b120b3ac28c34681b466e"} Mar 10 11:40:40 crc kubenswrapper[4794]: I0310 11:40:40.728153 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrc7n" event={"ID":"f1107c3c-6645-424a-bba4-5db5da4d2717","Type":"ContainerStarted","Data":"77589dc6abf1f5034bb738d1882c788cb4a8ff699d84f9055f32e5779764d976"} Mar 10 11:40:41 crc kubenswrapper[4794]: I0310 11:40:41.772504 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7qm2" event={"ID":"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba","Type":"ContainerStarted","Data":"52e3179f36782c0f41b80927d831290c05451f75edbbce252142b1d2a4f1edac"} Mar 10 11:40:41 crc kubenswrapper[4794]: I0310 11:40:41.810845 4794 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h7qm2" podStartSLOduration=2.330996679 podStartE2EDuration="8.810823892s" podCreationTimestamp="2026-03-10 11:40:33 +0000 UTC" firstStartedPulling="2026-03-10 11:40:34.645113852 +0000 UTC m=+6983.401284660" lastFinishedPulling="2026-03-10 11:40:41.124941055 +0000 UTC m=+6989.881111873" observedRunningTime="2026-03-10 11:40:41.802746262 +0000 UTC m=+6990.558917090" watchObservedRunningTime="2026-03-10 11:40:41.810823892 +0000 UTC m=+6990.566994720" Mar 10 11:40:42 crc kubenswrapper[4794]: I0310 11:40:42.781884 4794 generic.go:334] "Generic (PLEG): container finished" podID="f1107c3c-6645-424a-bba4-5db5da4d2717" containerID="77589dc6abf1f5034bb738d1882c788cb4a8ff699d84f9055f32e5779764d976" exitCode=0 Mar 10 11:40:42 crc kubenswrapper[4794]: I0310 11:40:42.781973 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrc7n" event={"ID":"f1107c3c-6645-424a-bba4-5db5da4d2717","Type":"ContainerDied","Data":"77589dc6abf1f5034bb738d1882c788cb4a8ff699d84f9055f32e5779764d976"} Mar 10 11:40:43 crc kubenswrapper[4794]: I0310 11:40:43.560628 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:40:43 crc kubenswrapper[4794]: I0310 11:40:43.561150 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:40:43 crc kubenswrapper[4794]: I0310 11:40:43.793773 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrc7n" event={"ID":"f1107c3c-6645-424a-bba4-5db5da4d2717","Type":"ContainerStarted","Data":"b68f3b648951a3ebefc67d7959063cd314b837c870647baf7f4aa52e0de54e7c"} Mar 10 11:40:43 crc kubenswrapper[4794]: I0310 11:40:43.822947 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mrc7n" podStartSLOduration=2.345999226 podStartE2EDuration="5.82292663s" podCreationTimestamp="2026-03-10 11:40:38 +0000 UTC" firstStartedPulling="2026-03-10 11:40:39.702774724 +0000 UTC m=+6988.458945542" lastFinishedPulling="2026-03-10 11:40:43.179702128 +0000 UTC m=+6991.935872946" observedRunningTime="2026-03-10 11:40:43.811545357 +0000 UTC m=+6992.567716195" watchObservedRunningTime="2026-03-10 11:40:43.82292663 +0000 UTC m=+6992.579097458" Mar 10 11:40:44 crc kubenswrapper[4794]: I0310 11:40:44.627639 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h7qm2" podUID="489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" containerName="registry-server" probeResult="failure" output=< Mar 10 11:40:44 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 11:40:44 crc kubenswrapper[4794]: > Mar 10 11:40:48 crc kubenswrapper[4794]: I0310 11:40:48.705432 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:48 crc kubenswrapper[4794]: I0310 11:40:48.706125 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:48 crc kubenswrapper[4794]: I0310 11:40:48.750890 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:48 crc kubenswrapper[4794]: I0310 11:40:48.899317 4794 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:48 crc kubenswrapper[4794]: I0310 11:40:48.985129 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mrc7n"] Mar 10 11:40:50 crc kubenswrapper[4794]: I0310 11:40:50.865392 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mrc7n" podUID="f1107c3c-6645-424a-bba4-5db5da4d2717" containerName="registry-server" containerID="cri-o://b68f3b648951a3ebefc67d7959063cd314b837c870647baf7f4aa52e0de54e7c" gracePeriod=2 Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.363280 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.518219 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1107c3c-6645-424a-bba4-5db5da4d2717-utilities\") pod \"f1107c3c-6645-424a-bba4-5db5da4d2717\" (UID: \"f1107c3c-6645-424a-bba4-5db5da4d2717\") " Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.518619 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64k29\" (UniqueName: \"kubernetes.io/projected/f1107c3c-6645-424a-bba4-5db5da4d2717-kube-api-access-64k29\") pod \"f1107c3c-6645-424a-bba4-5db5da4d2717\" (UID: \"f1107c3c-6645-424a-bba4-5db5da4d2717\") " Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.518656 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1107c3c-6645-424a-bba4-5db5da4d2717-catalog-content\") pod \"f1107c3c-6645-424a-bba4-5db5da4d2717\" (UID: \"f1107c3c-6645-424a-bba4-5db5da4d2717\") " Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.519378 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1107c3c-6645-424a-bba4-5db5da4d2717-utilities" (OuterVolumeSpecName: "utilities") pod "f1107c3c-6645-424a-bba4-5db5da4d2717" (UID: "f1107c3c-6645-424a-bba4-5db5da4d2717"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.525573 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1107c3c-6645-424a-bba4-5db5da4d2717-kube-api-access-64k29" (OuterVolumeSpecName: "kube-api-access-64k29") pod "f1107c3c-6645-424a-bba4-5db5da4d2717" (UID: "f1107c3c-6645-424a-bba4-5db5da4d2717"). InnerVolumeSpecName "kube-api-access-64k29". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.566955 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1107c3c-6645-424a-bba4-5db5da4d2717-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1107c3c-6645-424a-bba4-5db5da4d2717" (UID: "f1107c3c-6645-424a-bba4-5db5da4d2717"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.621662 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1107c3c-6645-424a-bba4-5db5da4d2717-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.621694 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64k29\" (UniqueName: \"kubernetes.io/projected/f1107c3c-6645-424a-bba4-5db5da4d2717-kube-api-access-64k29\") on node \"crc\" DevicePath \"\"" Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.621705 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1107c3c-6645-424a-bba4-5db5da4d2717-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.877637 4794 generic.go:334] "Generic (PLEG): container finished" podID="f1107c3c-6645-424a-bba4-5db5da4d2717" containerID="b68f3b648951a3ebefc67d7959063cd314b837c870647baf7f4aa52e0de54e7c" exitCode=0 Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.877686 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mrc7n" Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.877687 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrc7n" event={"ID":"f1107c3c-6645-424a-bba4-5db5da4d2717","Type":"ContainerDied","Data":"b68f3b648951a3ebefc67d7959063cd314b837c870647baf7f4aa52e0de54e7c"} Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.877806 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrc7n" event={"ID":"f1107c3c-6645-424a-bba4-5db5da4d2717","Type":"ContainerDied","Data":"4c5443e0d8bf87f208e7f284da5a324b66b072eb831ddbf65eb5d456600e370e"} Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.877829 4794 scope.go:117] "RemoveContainer" containerID="b68f3b648951a3ebefc67d7959063cd314b837c870647baf7f4aa52e0de54e7c" Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.917783 4794 scope.go:117] "RemoveContainer" containerID="77589dc6abf1f5034bb738d1882c788cb4a8ff699d84f9055f32e5779764d976" Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.918438 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mrc7n"] Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.931594 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mrc7n"] Mar 10 11:40:51 crc kubenswrapper[4794]: I0310 11:40:51.962781 4794 scope.go:117] "RemoveContainer" containerID="e72af649e8b8d291254a54c8adf8e6435c6959589da0e65d2b9ee8f28b10250d" Mar 10 11:40:52 crc kubenswrapper[4794]: I0310 11:40:52.019785 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:40:52 crc kubenswrapper[4794]: E0310 11:40:52.020135 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:40:52 crc 
kubenswrapper[4794]: I0310 11:40:52.020843 4794 scope.go:117] "RemoveContainer" containerID="b68f3b648951a3ebefc67d7959063cd314b837c870647baf7f4aa52e0de54e7c" Mar 10 11:40:52 crc kubenswrapper[4794]: E0310 11:40:52.022988 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b68f3b648951a3ebefc67d7959063cd314b837c870647baf7f4aa52e0de54e7c\": container with ID starting with b68f3b648951a3ebefc67d7959063cd314b837c870647baf7f4aa52e0de54e7c not found: ID does not exist" containerID="b68f3b648951a3ebefc67d7959063cd314b837c870647baf7f4aa52e0de54e7c" Mar 10 11:40:52 crc kubenswrapper[4794]: I0310 11:40:52.023023 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b68f3b648951a3ebefc67d7959063cd314b837c870647baf7f4aa52e0de54e7c"} err="failed to get container status \"b68f3b648951a3ebefc67d7959063cd314b837c870647baf7f4aa52e0de54e7c\": rpc error: code = NotFound desc = could not find container \"b68f3b648951a3ebefc67d7959063cd314b837c870647baf7f4aa52e0de54e7c\": container with ID starting with b68f3b648951a3ebefc67d7959063cd314b837c870647baf7f4aa52e0de54e7c not found: ID does not exist" Mar 10 11:40:52 crc kubenswrapper[4794]: I0310 11:40:52.023107 4794 scope.go:117] "RemoveContainer" containerID="77589dc6abf1f5034bb738d1882c788cb4a8ff699d84f9055f32e5779764d976" Mar 10 11:40:52 crc kubenswrapper[4794]: E0310 11:40:52.023501 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77589dc6abf1f5034bb738d1882c788cb4a8ff699d84f9055f32e5779764d976\": container with ID starting with 77589dc6abf1f5034bb738d1882c788cb4a8ff699d84f9055f32e5779764d976 not found: ID does not exist" containerID="77589dc6abf1f5034bb738d1882c788cb4a8ff699d84f9055f32e5779764d976" Mar 10 11:40:52 crc kubenswrapper[4794]: I0310 11:40:52.023559 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77589dc6abf1f5034bb738d1882c788cb4a8ff699d84f9055f32e5779764d976"} err="failed to get container status \"77589dc6abf1f5034bb738d1882c788cb4a8ff699d84f9055f32e5779764d976\": rpc error: code = NotFound desc = could not find container \"77589dc6abf1f5034bb738d1882c788cb4a8ff699d84f9055f32e5779764d976\": container with ID starting with 77589dc6abf1f5034bb738d1882c788cb4a8ff699d84f9055f32e5779764d976 not found: ID does not exist" Mar 10 11:40:52 crc kubenswrapper[4794]: I0310 11:40:52.023594 4794 scope.go:117] "RemoveContainer" containerID="e72af649e8b8d291254a54c8adf8e6435c6959589da0e65d2b9ee8f28b10250d" Mar 10 11:40:52 crc kubenswrapper[4794]: E0310 11:40:52.023860 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e72af649e8b8d291254a54c8adf8e6435c6959589da0e65d2b9ee8f28b10250d\": container with ID starting with e72af649e8b8d291254a54c8adf8e6435c6959589da0e65d2b9ee8f28b10250d not found: ID does not exist" containerID="e72af649e8b8d291254a54c8adf8e6435c6959589da0e65d2b9ee8f28b10250d" Mar 10 11:40:52 crc kubenswrapper[4794]: I0310 11:40:52.023896 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e72af649e8b8d291254a54c8adf8e6435c6959589da0e65d2b9ee8f28b10250d"} err="failed to get container status \"e72af649e8b8d291254a54c8adf8e6435c6959589da0e65d2b9ee8f28b10250d\": rpc error: code = NotFound desc = could not find container 
\"e72af649e8b8d291254a54c8adf8e6435c6959589da0e65d2b9ee8f28b10250d\": container with ID starting with e72af649e8b8d291254a54c8adf8e6435c6959589da0e65d2b9ee8f28b10250d not found: ID does not exist" Mar 10 11:40:52 crc kubenswrapper[4794]: I0310 11:40:52.033930 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1107c3c-6645-424a-bba4-5db5da4d2717" path="/var/lib/kubelet/pods/f1107c3c-6645-424a-bba4-5db5da4d2717/volumes" Mar 10 11:40:52 crc kubenswrapper[4794]: I0310 11:40:52.411721 4794 scope.go:117] "RemoveContainer" containerID="20b9158f59c025a54bcc5b25df9575f4823d9b34316afe73fb47fe76823d2a39" Mar 10 11:40:54 crc kubenswrapper[4794]: I0310 11:40:54.609639 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h7qm2" podUID="489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" containerName="registry-server" probeResult="failure" output=< Mar 10 11:40:54 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 11:40:54 crc kubenswrapper[4794]: > Mar 10 11:41:03 crc kubenswrapper[4794]: I0310 11:41:03.620908 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:41:03 crc kubenswrapper[4794]: I0310 11:41:03.675450 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h7qm2" Mar 10 11:41:03 crc kubenswrapper[4794]: I0310 11:41:03.999412 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:41:03 crc kubenswrapper[4794]: E0310 11:41:03.999729 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:41:04 crc kubenswrapper[4794]: I0310 11:41:04.389934 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h7qm2"] Mar 10 11:41:05 crc kubenswrapper[4794]: I0310 11:41:05.019132 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h7qm2" podUID="489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" containerName="registry-server" containerID="cri-o://52e3179f36782c0f41b80927d831290c05451f75edbbce252142b1d2a4f1edac" gracePeriod=2 Mar 10 11:41:05 crc kubenswrapper[4794]: I0310 11:41:05.545496 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h7qm2"
Mar 10 11:41:05 crc kubenswrapper[4794]: I0310 11:41:05.681939 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-utilities\") pod \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\" (UID: \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\") "
Mar 10 11:41:05 crc kubenswrapper[4794]: I0310 11:41:05.682168 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnwdk\" (UniqueName: \"kubernetes.io/projected/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-kube-api-access-lnwdk\") pod \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\" (UID: \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\") "
Mar 10 11:41:05 crc kubenswrapper[4794]: I0310 11:41:05.682209 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-catalog-content\") pod \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\" (UID: \"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba\") "
Mar 10 11:41:05 crc kubenswrapper[4794]: I0310 11:41:05.682846 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-utilities" (OuterVolumeSpecName: "utilities") pod "489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" (UID: "489ffc4f-7c1e-49c9-8a8c-510fcdea7dba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 11:41:05 crc kubenswrapper[4794]: I0310 11:41:05.687113 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-kube-api-access-lnwdk" (OuterVolumeSpecName: "kube-api-access-lnwdk") pod "489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" (UID: "489ffc4f-7c1e-49c9-8a8c-510fcdea7dba"). InnerVolumeSpecName "kube-api-access-lnwdk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 11:41:05 crc kubenswrapper[4794]: I0310 11:41:05.784456 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 11:41:05 crc kubenswrapper[4794]: I0310 11:41:05.784487 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnwdk\" (UniqueName: \"kubernetes.io/projected/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-kube-api-access-lnwdk\") on node \"crc\" DevicePath \"\""
Mar 10 11:41:05 crc kubenswrapper[4794]: I0310 11:41:05.819913 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" (UID: "489ffc4f-7c1e-49c9-8a8c-510fcdea7dba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 11:41:05 crc kubenswrapper[4794]: I0310 11:41:05.886531 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.033659 4794 generic.go:334] "Generic (PLEG): container finished" podID="489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" containerID="52e3179f36782c0f41b80927d831290c05451f75edbbce252142b1d2a4f1edac" exitCode=0
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.033699 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7qm2" event={"ID":"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba","Type":"ContainerDied","Data":"52e3179f36782c0f41b80927d831290c05451f75edbbce252142b1d2a4f1edac"}
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.033722 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7qm2"
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.033733 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7qm2" event={"ID":"489ffc4f-7c1e-49c9-8a8c-510fcdea7dba","Type":"ContainerDied","Data":"e568009306381a6a5d2dad785ccf2ab2fbe207d2f01e94610fc1b507b283d0e0"}
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.033755 4794 scope.go:117] "RemoveContainer" containerID="52e3179f36782c0f41b80927d831290c05451f75edbbce252142b1d2a4f1edac"
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.062243 4794 scope.go:117] "RemoveContainer" containerID="3af716d5ac765c0ac73053da5cb6be56fac153c0a05b120b3ac28c34681b466e"
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.071161 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h7qm2"]
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.080491 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h7qm2"]
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.102191 4794 scope.go:117] "RemoveContainer" containerID="36a7b7b64e45c5d9215806d666513ab99f9c90f49612ff5115329cc7708ddd7a"
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.129535 4794 scope.go:117] "RemoveContainer" containerID="52e3179f36782c0f41b80927d831290c05451f75edbbce252142b1d2a4f1edac"
Mar 10 11:41:06 crc kubenswrapper[4794]: E0310 11:41:06.129891 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52e3179f36782c0f41b80927d831290c05451f75edbbce252142b1d2a4f1edac\": container with ID starting with 52e3179f36782c0f41b80927d831290c05451f75edbbce252142b1d2a4f1edac not found: ID does not exist" containerID="52e3179f36782c0f41b80927d831290c05451f75edbbce252142b1d2a4f1edac"
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.129924 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52e3179f36782c0f41b80927d831290c05451f75edbbce252142b1d2a4f1edac"} err="failed to get container status \"52e3179f36782c0f41b80927d831290c05451f75edbbce252142b1d2a4f1edac\": rpc error: code = NotFound desc = could not find container \"52e3179f36782c0f41b80927d831290c05451f75edbbce252142b1d2a4f1edac\": container with ID starting with 52e3179f36782c0f41b80927d831290c05451f75edbbce252142b1d2a4f1edac not found: ID does not exist"
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.129944 4794 scope.go:117] "RemoveContainer" containerID="3af716d5ac765c0ac73053da5cb6be56fac153c0a05b120b3ac28c34681b466e"
Mar 10 11:41:06 crc kubenswrapper[4794]: E0310 11:41:06.133552 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3af716d5ac765c0ac73053da5cb6be56fac153c0a05b120b3ac28c34681b466e\": container with ID starting with 3af716d5ac765c0ac73053da5cb6be56fac153c0a05b120b3ac28c34681b466e not found: ID does not exist" containerID="3af716d5ac765c0ac73053da5cb6be56fac153c0a05b120b3ac28c34681b466e"
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.133588 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3af716d5ac765c0ac73053da5cb6be56fac153c0a05b120b3ac28c34681b466e"} err="failed to get container status \"3af716d5ac765c0ac73053da5cb6be56fac153c0a05b120b3ac28c34681b466e\": rpc error: code = NotFound desc = could not find container \"3af716d5ac765c0ac73053da5cb6be56fac153c0a05b120b3ac28c34681b466e\": container with ID starting with 3af716d5ac765c0ac73053da5cb6be56fac153c0a05b120b3ac28c34681b466e not found: ID does not exist"
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.133602 4794 scope.go:117] "RemoveContainer" containerID="36a7b7b64e45c5d9215806d666513ab99f9c90f49612ff5115329cc7708ddd7a"
Mar 10 11:41:06 crc kubenswrapper[4794]: E0310 11:41:06.133822 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36a7b7b64e45c5d9215806d666513ab99f9c90f49612ff5115329cc7708ddd7a\": container with ID starting with 36a7b7b64e45c5d9215806d666513ab99f9c90f49612ff5115329cc7708ddd7a not found: ID does not exist" containerID="36a7b7b64e45c5d9215806d666513ab99f9c90f49612ff5115329cc7708ddd7a"
Mar 10 11:41:06 crc kubenswrapper[4794]: I0310 11:41:06.133847 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a7b7b64e45c5d9215806d666513ab99f9c90f49612ff5115329cc7708ddd7a"} err="failed to get container status \"36a7b7b64e45c5d9215806d666513ab99f9c90f49612ff5115329cc7708ddd7a\": rpc error: code = NotFound desc = could not find container \"36a7b7b64e45c5d9215806d666513ab99f9c90f49612ff5115329cc7708ddd7a\": container with ID starting with 36a7b7b64e45c5d9215806d666513ab99f9c90f49612ff5115329cc7708ddd7a not found: ID does not exist"
Mar 10 11:41:08 crc kubenswrapper[4794]: I0310 11:41:08.026668 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" path="/var/lib/kubelet/pods/489ffc4f-7c1e-49c9-8a8c-510fcdea7dba/volumes"
Mar 10 11:41:16 crc kubenswrapper[4794]: I0310 11:41:16.999317 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e"
Mar 10 11:41:17 crc kubenswrapper[4794]: E0310 11:41:17.000019 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 11:41:31 crc kubenswrapper[4794]: I0310 11:41:30.999949 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e"
Mar 10 11:41:31 crc kubenswrapper[4794]: E0310 11:41:31.000897 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:41:43 crc kubenswrapper[4794]: I0310 11:41:43.000124 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:41:43 crc kubenswrapper[4794]: E0310 11:41:43.001222 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:41:58 crc kubenswrapper[4794]: I0310 11:41:58.000016 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:41:58 crc kubenswrapper[4794]: E0310 11:41:58.001269 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.206034 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552382-p9pv6"] Mar 10 11:42:00 crc kubenswrapper[4794]: E0310 11:42:00.207078 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" containerName="registry-server" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.207098 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" containerName="registry-server" Mar 10 11:42:00 crc kubenswrapper[4794]: E0310 11:42:00.207111 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1107c3c-6645-424a-bba4-5db5da4d2717" containerName="registry-server" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.207119 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1107c3c-6645-424a-bba4-5db5da4d2717" containerName="registry-server" Mar 10 11:42:00 crc kubenswrapper[4794]: E0310 11:42:00.207157 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" containerName="extract-content" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.207166 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" containerName="extract-content" Mar 10 11:42:00 crc kubenswrapper[4794]: E0310 11:42:00.207174 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1107c3c-6645-424a-bba4-5db5da4d2717" containerName="extract-utilities" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.207183 4794 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f1107c3c-6645-424a-bba4-5db5da4d2717" containerName="extract-utilities" Mar 10 11:42:00 crc kubenswrapper[4794]: E0310 11:42:00.207193 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1107c3c-6645-424a-bba4-5db5da4d2717" containerName="extract-content" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.207200 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1107c3c-6645-424a-bba4-5db5da4d2717" containerName="extract-content" Mar 10 11:42:00 crc kubenswrapper[4794]: E0310 11:42:00.207213 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" containerName="extract-utilities" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.207221 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" containerName="extract-utilities" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.207464 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="489ffc4f-7c1e-49c9-8a8c-510fcdea7dba" containerName="registry-server" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.207501 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1107c3c-6645-424a-bba4-5db5da4d2717" containerName="registry-server" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.208384 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552382-p9pv6" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.216221 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.216461 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.220939 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.221350 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsgk9\" (UniqueName: \"kubernetes.io/projected/7a2ded44-27c0-4060-860e-d8e0e00f35f1-kube-api-access-wsgk9\") pod \"auto-csr-approver-29552382-p9pv6\" (UID: \"7a2ded44-27c0-4060-860e-d8e0e00f35f1\") " pod="openshift-infra/auto-csr-approver-29552382-p9pv6" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.235319 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552382-p9pv6"] Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.323737 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsgk9\" (UniqueName: \"kubernetes.io/projected/7a2ded44-27c0-4060-860e-d8e0e00f35f1-kube-api-access-wsgk9\") pod \"auto-csr-approver-29552382-p9pv6\" (UID: \"7a2ded44-27c0-4060-860e-d8e0e00f35f1\") " pod="openshift-infra/auto-csr-approver-29552382-p9pv6" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.353896 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsgk9\" (UniqueName: \"kubernetes.io/projected/7a2ded44-27c0-4060-860e-d8e0e00f35f1-kube-api-access-wsgk9\") pod \"auto-csr-approver-29552382-p9pv6\" (UID: \"7a2ded44-27c0-4060-860e-d8e0e00f35f1\") " pod="openshift-infra/auto-csr-approver-29552382-p9pv6" Mar 10 11:42:00 crc kubenswrapper[4794]: I0310 11:42:00.537135 4794 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552382-p9pv6" Mar 10 11:42:01 crc kubenswrapper[4794]: I0310 11:42:01.030007 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552382-p9pv6"] Mar 10 11:42:01 crc kubenswrapper[4794]: I0310 11:42:01.740159 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552382-p9pv6" event={"ID":"7a2ded44-27c0-4060-860e-d8e0e00f35f1","Type":"ContainerStarted","Data":"11f6ae8370a7f6aec4165cb523bfed4b71f22abffd89908c2ffbf42f1db951a0"} Mar 10 11:42:02 crc kubenswrapper[4794]: I0310 11:42:02.756633 4794 generic.go:334] "Generic (PLEG): container finished" podID="7a2ded44-27c0-4060-860e-d8e0e00f35f1" containerID="60e171925176b993cfbd5fc22a7d10e5273eff277b44bd3070313b6bb656361c" exitCode=0 Mar 10 11:42:02 crc kubenswrapper[4794]: I0310 11:42:02.756776 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552382-p9pv6" event={"ID":"7a2ded44-27c0-4060-860e-d8e0e00f35f1","Type":"ContainerDied","Data":"60e171925176b993cfbd5fc22a7d10e5273eff277b44bd3070313b6bb656361c"} Mar 10 11:42:04 crc kubenswrapper[4794]: I0310 11:42:04.190130 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552382-p9pv6" Mar 10 11:42:04 crc kubenswrapper[4794]: I0310 11:42:04.311651 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsgk9\" (UniqueName: \"kubernetes.io/projected/7a2ded44-27c0-4060-860e-d8e0e00f35f1-kube-api-access-wsgk9\") pod \"7a2ded44-27c0-4060-860e-d8e0e00f35f1\" (UID: \"7a2ded44-27c0-4060-860e-d8e0e00f35f1\") " Mar 10 11:42:04 crc kubenswrapper[4794]: I0310 11:42:04.317790 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2ded44-27c0-4060-860e-d8e0e00f35f1-kube-api-access-wsgk9" (OuterVolumeSpecName: "kube-api-access-wsgk9") pod "7a2ded44-27c0-4060-860e-d8e0e00f35f1" (UID: "7a2ded44-27c0-4060-860e-d8e0e00f35f1"). InnerVolumeSpecName "kube-api-access-wsgk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:42:04 crc kubenswrapper[4794]: I0310 11:42:04.415232 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsgk9\" (UniqueName: \"kubernetes.io/projected/7a2ded44-27c0-4060-860e-d8e0e00f35f1-kube-api-access-wsgk9\") on node \"crc\" DevicePath \"\"" Mar 10 11:42:04 crc kubenswrapper[4794]: I0310 11:42:04.779175 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552382-p9pv6" event={"ID":"7a2ded44-27c0-4060-860e-d8e0e00f35f1","Type":"ContainerDied","Data":"11f6ae8370a7f6aec4165cb523bfed4b71f22abffd89908c2ffbf42f1db951a0"} Mar 10 11:42:04 crc kubenswrapper[4794]: I0310 11:42:04.779238 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11f6ae8370a7f6aec4165cb523bfed4b71f22abffd89908c2ffbf42f1db951a0" Mar 10 11:42:04 crc kubenswrapper[4794]: I0310 11:42:04.779376 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552382-p9pv6" Mar 10 11:42:05 crc kubenswrapper[4794]: I0310 11:42:05.275706 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552376-dttgm"] Mar 10 11:42:05 crc kubenswrapper[4794]: I0310 11:42:05.291092 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552376-dttgm"] Mar 10 11:42:06 crc kubenswrapper[4794]: I0310 11:42:06.013937 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d95950e4-bccd-4145-b0fa-22b114129be1" path="/var/lib/kubelet/pods/d95950e4-bccd-4145-b0fa-22b114129be1/volumes" Mar 10 11:42:09 crc kubenswrapper[4794]: I0310 11:42:09.999201 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:42:10 crc kubenswrapper[4794]: E0310 11:42:09.999950 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:42:24 crc kubenswrapper[4794]: I0310 11:42:23.999534 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:42:24 crc kubenswrapper[4794]: E0310 11:42:24.000221 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:42:39 crc kubenswrapper[4794]: I0310 11:42:38.999591 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:42:39 crc kubenswrapper[4794]: E0310 11:42:39.003879 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:42:50 crc kubenswrapper[4794]: I0310 11:42:50.322924 4794 generic.go:334] "Generic (PLEG): container finished" podID="2281ee85-db79-4fec-bb5e-ce0a4a4c61de" containerID="b8eb446da5078a00f3fa88320aa3cc297e48edc9bf62d3bd2190cb461495a458" exitCode=0 Mar 10 11:42:50 crc kubenswrapper[4794]: I0310 11:42:50.323020 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" event={"ID":"2281ee85-db79-4fec-bb5e-ce0a4a4c61de","Type":"ContainerDied","Data":"b8eb446da5078a00f3fa88320aa3cc297e48edc9bf62d3bd2190cb461495a458"} Mar 10 11:42:51 crc kubenswrapper[4794]: I0310 11:42:51.000841 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:42:51 crc kubenswrapper[4794]: E0310 11:42:51.001526 4794 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:42:51 crc kubenswrapper[4794]: I0310 11:42:51.928874 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.081092 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-ceph\") pod \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.081548 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-tripleo-cleanup-combined-ca-bundle\") pod \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.081587 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-ssh-key-openstack-cell1\") pod \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.081704 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-inventory\") pod \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.081777 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8ztj\" (UniqueName: \"kubernetes.io/projected/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-kube-api-access-s8ztj\") pod \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\" (UID: \"2281ee85-db79-4fec-bb5e-ce0a4a4c61de\") " Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.086878 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "2281ee85-db79-4fec-bb5e-ce0a4a4c61de" (UID: "2281ee85-db79-4fec-bb5e-ce0a4a4c61de"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.087106 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-kube-api-access-s8ztj" (OuterVolumeSpecName: "kube-api-access-s8ztj") pod "2281ee85-db79-4fec-bb5e-ce0a4a4c61de" (UID: "2281ee85-db79-4fec-bb5e-ce0a4a4c61de"). InnerVolumeSpecName "kube-api-access-s8ztj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.087256 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-ceph" (OuterVolumeSpecName: "ceph") pod "2281ee85-db79-4fec-bb5e-ce0a4a4c61de" (UID: "2281ee85-db79-4fec-bb5e-ce0a4a4c61de"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.128429 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-inventory" (OuterVolumeSpecName: "inventory") pod "2281ee85-db79-4fec-bb5e-ce0a4a4c61de" (UID: "2281ee85-db79-4fec-bb5e-ce0a4a4c61de"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.136829 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "2281ee85-db79-4fec-bb5e-ce0a4a4c61de" (UID: "2281ee85-db79-4fec-bb5e-ce0a4a4c61de"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.185089 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8ztj\" (UniqueName: \"kubernetes.io/projected/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-kube-api-access-s8ztj\") on node \"crc\" DevicePath \"\"" Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.185136 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.185154 4794 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.185171 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.185190 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2281ee85-db79-4fec-bb5e-ce0a4a4c61de-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.340455 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" event={"ID":"2281ee85-db79-4fec-bb5e-ce0a4a4c61de","Type":"ContainerDied","Data":"bffd2d9d8c902591e0dc00b8a85ce093cdd6f05563d4fb9c56a905f2d7b95c77"} Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.340497 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bffd2d9d8c902591e0dc00b8a85ce093cdd6f05563d4fb9c56a905f2d7b95c77" Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.340515 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv" Mar 10 11:42:52 crc kubenswrapper[4794]: I0310 11:42:52.542996 4794 scope.go:117] "RemoveContainer" containerID="fda49684f956b7d3fb137604c388e6ebb7ae52aaabeb12fd4ef7f7836a8d8f35" Mar 10 11:42:59 crc kubenswrapper[4794]: I0310 11:42:59.824737 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-cl7pm"] Mar 10 11:42:59 crc kubenswrapper[4794]: E0310 11:42:59.825980 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2ded44-27c0-4060-860e-d8e0e00f35f1" containerName="oc" Mar 10 11:42:59 crc kubenswrapper[4794]: I0310 11:42:59.826002 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2ded44-27c0-4060-860e-d8e0e00f35f1" containerName="oc" Mar 10 11:42:59 crc kubenswrapper[4794]: E0310 11:42:59.826047 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2281ee85-db79-4fec-bb5e-ce0a4a4c61de" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 10 11:42:59 crc kubenswrapper[4794]: I0310 11:42:59.826059 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2281ee85-db79-4fec-bb5e-ce0a4a4c61de" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 10 11:42:59 crc kubenswrapper[4794]: I0310 11:42:59.826414 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2ded44-27c0-4060-860e-d8e0e00f35f1" containerName="oc" Mar 10 11:42:59 crc kubenswrapper[4794]: I0310 11:42:59.826440 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2281ee85-db79-4fec-bb5e-ce0a4a4c61de" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 10 11:42:59 crc kubenswrapper[4794]: I0310 11:42:59.827472 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:42:59 crc kubenswrapper[4794]: I0310 11:42:59.829865 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:42:59 crc kubenswrapper[4794]: I0310 11:42:59.830575 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:42:59 crc kubenswrapper[4794]: I0310 11:42:59.830824 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:42:59 crc kubenswrapper[4794]: I0310 11:42:59.831840 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:42:59 crc kubenswrapper[4794]: I0310 11:42:59.840134 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-cl7pm"] Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.015040 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-ceph\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.015521 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.015636 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-inventory\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.015909 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.016002 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5xc5\" (UniqueName: \"kubernetes.io/projected/b3ed4d47-7c95-4755-a631-092391d64b11-kube-api-access-w5xc5\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.118569 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " 
pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.118717 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-inventory\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.118796 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.118876 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5xc5\" (UniqueName: \"kubernetes.io/projected/b3ed4d47-7c95-4755-a631-092391d64b11-kube-api-access-w5xc5\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.119177 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-ceph\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.130812 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-ceph\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.137649 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.140407 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-inventory\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.145182 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5xc5\" (UniqueName: \"kubernetes.io/projected/b3ed4d47-7c95-4755-a631-092391d64b11-kube-api-access-w5xc5\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.150940 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-cl7pm\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.153786 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:43:00 crc kubenswrapper[4794]: I0310 11:43:00.771748 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-cl7pm"] Mar 10 11:43:01 crc kubenswrapper[4794]: I0310 11:43:01.462304 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" event={"ID":"b3ed4d47-7c95-4755-a631-092391d64b11","Type":"ContainerStarted","Data":"0c37e249088ccf8b9bdfc31fc486952b25740c4bdc316812ead6a5c2a86c8f6f"} Mar 10 11:43:02 crc kubenswrapper[4794]: I0310 11:43:02.472769 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" event={"ID":"b3ed4d47-7c95-4755-a631-092391d64b11","Type":"ContainerStarted","Data":"5488b0976598923b1bf4ce1d99e01ec9db4855f8329077728c4c9e5e50c30643"} Mar 10 11:43:02 crc kubenswrapper[4794]: I0310 11:43:02.502527 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" podStartSLOduration=2.987537357 podStartE2EDuration="3.502461314s" podCreationTimestamp="2026-03-10 11:42:59 +0000 UTC" firstStartedPulling="2026-03-10 11:43:00.778767756 +0000 UTC m=+7129.534938574" lastFinishedPulling="2026-03-10 11:43:01.293691693 +0000 UTC m=+7130.049862531" observedRunningTime="2026-03-10 11:43:02.489149181 +0000 UTC m=+7131.245320039" watchObservedRunningTime="2026-03-10 11:43:02.502461314 +0000 UTC m=+7131.258632162" Mar 10 11:43:06 crc kubenswrapper[4794]: I0310 11:43:06.000458 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:43:06 crc kubenswrapper[4794]: I0310 11:43:06.532036 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"cd7c71a98db62b40397f10a8c55d8c209b4e24fd8dfc1076cb0ee0625736a5b5"} Mar 10 11:44:00 crc kubenswrapper[4794]: I0310 11:44:00.183890 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552384-jhmlf"] Mar 10 11:44:00 crc kubenswrapper[4794]: I0310 11:44:00.190502 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552384-jhmlf" Mar 10 11:44:00 crc kubenswrapper[4794]: I0310 11:44:00.194560 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:44:00 crc kubenswrapper[4794]: I0310 11:44:00.195286 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:44:00 crc kubenswrapper[4794]: I0310 11:44:00.195329 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:44:00 crc kubenswrapper[4794]: I0310 11:44:00.197194 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552384-jhmlf"] Mar 10 11:44:00 crc kubenswrapper[4794]: I0310 11:44:00.299221 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbh8t\" (UniqueName: \"kubernetes.io/projected/fa5bff22-75c7-4ed7-8d83-8b5e79df536d-kube-api-access-tbh8t\") pod \"auto-csr-approver-29552384-jhmlf\" (UID: \"fa5bff22-75c7-4ed7-8d83-8b5e79df536d\") " pod="openshift-infra/auto-csr-approver-29552384-jhmlf" Mar 10 11:44:00 crc kubenswrapper[4794]: I0310 11:44:00.402631 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbh8t\" (UniqueName: \"kubernetes.io/projected/fa5bff22-75c7-4ed7-8d83-8b5e79df536d-kube-api-access-tbh8t\") pod \"auto-csr-approver-29552384-jhmlf\" (UID: \"fa5bff22-75c7-4ed7-8d83-8b5e79df536d\") " pod="openshift-infra/auto-csr-approver-29552384-jhmlf" Mar 10 11:44:00 crc kubenswrapper[4794]: I0310 11:44:00.437579 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbh8t\" (UniqueName: \"kubernetes.io/projected/fa5bff22-75c7-4ed7-8d83-8b5e79df536d-kube-api-access-tbh8t\") pod \"auto-csr-approver-29552384-jhmlf\" (UID: \"fa5bff22-75c7-4ed7-8d83-8b5e79df536d\") " pod="openshift-infra/auto-csr-approver-29552384-jhmlf" Mar 10 11:44:00 crc kubenswrapper[4794]: I0310 11:44:00.536057 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552384-jhmlf" Mar 10 11:44:01 crc kubenswrapper[4794]: I0310 11:44:01.067482 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552384-jhmlf"] Mar 10 11:44:01 crc kubenswrapper[4794]: I0310 11:44:01.203046 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552384-jhmlf" event={"ID":"fa5bff22-75c7-4ed7-8d83-8b5e79df536d","Type":"ContainerStarted","Data":"c33c5f9b2b2f6fec233d76399fd790ba3d2972917a4d4d0ecc84ba9eb5f02e19"} Mar 10 11:44:03 crc kubenswrapper[4794]: I0310 11:44:03.242435 4794 generic.go:334] "Generic (PLEG): container finished" podID="fa5bff22-75c7-4ed7-8d83-8b5e79df536d" containerID="0f995e27a29faebe7c322a344d0711d5ce3b9d27620d9da5d81502fd9510a8cf" exitCode=0 Mar 10 11:44:03 crc kubenswrapper[4794]: I0310 11:44:03.242554 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552384-jhmlf" event={"ID":"fa5bff22-75c7-4ed7-8d83-8b5e79df536d","Type":"ContainerDied","Data":"0f995e27a29faebe7c322a344d0711d5ce3b9d27620d9da5d81502fd9510a8cf"} Mar 10 11:44:04 crc kubenswrapper[4794]: I0310 11:44:04.687638 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552384-jhmlf" Mar 10 11:44:04 crc kubenswrapper[4794]: I0310 11:44:04.805503 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbh8t\" (UniqueName: \"kubernetes.io/projected/fa5bff22-75c7-4ed7-8d83-8b5e79df536d-kube-api-access-tbh8t\") pod \"fa5bff22-75c7-4ed7-8d83-8b5e79df536d\" (UID: \"fa5bff22-75c7-4ed7-8d83-8b5e79df536d\") " Mar 10 11:44:04 crc kubenswrapper[4794]: I0310 11:44:04.810781 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5bff22-75c7-4ed7-8d83-8b5e79df536d-kube-api-access-tbh8t" (OuterVolumeSpecName: "kube-api-access-tbh8t") pod "fa5bff22-75c7-4ed7-8d83-8b5e79df536d" (UID: "fa5bff22-75c7-4ed7-8d83-8b5e79df536d"). InnerVolumeSpecName "kube-api-access-tbh8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:44:04 crc kubenswrapper[4794]: I0310 11:44:04.909061 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbh8t\" (UniqueName: \"kubernetes.io/projected/fa5bff22-75c7-4ed7-8d83-8b5e79df536d-kube-api-access-tbh8t\") on node \"crc\" DevicePath \"\"" Mar 10 11:44:05 crc kubenswrapper[4794]: I0310 11:44:05.269289 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552384-jhmlf" event={"ID":"fa5bff22-75c7-4ed7-8d83-8b5e79df536d","Type":"ContainerDied","Data":"c33c5f9b2b2f6fec233d76399fd790ba3d2972917a4d4d0ecc84ba9eb5f02e19"} Mar 10 11:44:05 crc kubenswrapper[4794]: I0310 11:44:05.269393 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c33c5f9b2b2f6fec233d76399fd790ba3d2972917a4d4d0ecc84ba9eb5f02e19" Mar 10 11:44:05 crc kubenswrapper[4794]: I0310 11:44:05.269444 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552384-jhmlf" Mar 10 11:44:05 crc kubenswrapper[4794]: I0310 11:44:05.790628 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552378-wxkq4"] Mar 10 11:44:05 crc kubenswrapper[4794]: I0310 11:44:05.802089 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552378-wxkq4"] Mar 10 11:44:06 crc kubenswrapper[4794]: I0310 11:44:06.030941 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3961fb69-7762-45fa-af1a-c5be94b67ed2" path="/var/lib/kubelet/pods/3961fb69-7762-45fa-af1a-c5be94b67ed2/volumes" Mar 10 11:44:52 crc kubenswrapper[4794]: I0310 11:44:52.647820 4794 scope.go:117] "RemoveContainer" containerID="42e00f07cf0a1488747cc4384ba4b156c6672d1516f0866cd94b4ae1638808d4" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.179605 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx"] Mar 10 11:45:00 crc kubenswrapper[4794]: E0310 11:45:00.181294 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa5bff22-75c7-4ed7-8d83-8b5e79df536d" containerName="oc" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.181325 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5bff22-75c7-4ed7-8d83-8b5e79df536d" containerName="oc" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.181793 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa5bff22-75c7-4ed7-8d83-8b5e79df536d" containerName="oc" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.187216 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.190430 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.190952 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.194371 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx"] Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.311397 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca80d151-efc9-49bd-a807-a67542d2ffda-config-volume\") pod \"collect-profiles-29552385-rzmlx\" (UID: \"ca80d151-efc9-49bd-a807-a67542d2ffda\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.311587 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-686kc\" (UniqueName: \"kubernetes.io/projected/ca80d151-efc9-49bd-a807-a67542d2ffda-kube-api-access-686kc\") pod \"collect-profiles-29552385-rzmlx\" (UID: \"ca80d151-efc9-49bd-a807-a67542d2ffda\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.311692 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ca80d151-efc9-49bd-a807-a67542d2ffda-secret-volume\") pod \"collect-profiles-29552385-rzmlx\" (UID: \"ca80d151-efc9-49bd-a807-a67542d2ffda\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.414097 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca80d151-efc9-49bd-a807-a67542d2ffda-config-volume\") pod \"collect-profiles-29552385-rzmlx\" (UID: \"ca80d151-efc9-49bd-a807-a67542d2ffda\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.414213 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-686kc\" (UniqueName: \"kubernetes.io/projected/ca80d151-efc9-49bd-a807-a67542d2ffda-kube-api-access-686kc\") pod \"collect-profiles-29552385-rzmlx\" (UID: \"ca80d151-efc9-49bd-a807-a67542d2ffda\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.414276 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca80d151-efc9-49bd-a807-a67542d2ffda-secret-volume\") pod \"collect-profiles-29552385-rzmlx\" (UID: \"ca80d151-efc9-49bd-a807-a67542d2ffda\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.416234 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca80d151-efc9-49bd-a807-a67542d2ffda-config-volume\") pod \"collect-profiles-29552385-rzmlx\" (UID: \"ca80d151-efc9-49bd-a807-a67542d2ffda\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.421939 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca80d151-efc9-49bd-a807-a67542d2ffda-secret-volume\") pod \"collect-profiles-29552385-rzmlx\" (UID: \"ca80d151-efc9-49bd-a807-a67542d2ffda\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.434856 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-686kc\" (UniqueName: \"kubernetes.io/projected/ca80d151-efc9-49bd-a807-a67542d2ffda-kube-api-access-686kc\") pod \"collect-profiles-29552385-rzmlx\" (UID: \"ca80d151-efc9-49bd-a807-a67542d2ffda\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" Mar 10 11:45:00 crc kubenswrapper[4794]: I0310 11:45:00.522680 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" Mar 10 11:45:01 crc kubenswrapper[4794]: I0310 11:45:01.095746 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx"] Mar 10 11:45:01 crc kubenswrapper[4794]: W0310 11:45:01.102322 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca80d151_efc9_49bd_a807_a67542d2ffda.slice/crio-9350b7ac22ba117b1fffb279a8f7bfa106135752f31cec28075f7dd502a9ad01 WatchSource:0}: Error finding container 9350b7ac22ba117b1fffb279a8f7bfa106135752f31cec28075f7dd502a9ad01: Status 404 returned error can't find the container with id 9350b7ac22ba117b1fffb279a8f7bfa106135752f31cec28075f7dd502a9ad01 Mar 10 11:45:02 crc kubenswrapper[4794]: I0310 11:45:02.036569 4794 generic.go:334] "Generic (PLEG): container finished" podID="ca80d151-efc9-49bd-a807-a67542d2ffda" containerID="674a9343c7dff95ddc611ad5b2416b8a8f4e3c42fa3374bece4903b286fca876" exitCode=0 Mar 10 11:45:02 crc kubenswrapper[4794]: I0310 11:45:02.036769 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" event={"ID":"ca80d151-efc9-49bd-a807-a67542d2ffda","Type":"ContainerDied","Data":"674a9343c7dff95ddc611ad5b2416b8a8f4e3c42fa3374bece4903b286fca876"} Mar 10 11:45:02 crc kubenswrapper[4794]: I0310 11:45:02.037185 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" event={"ID":"ca80d151-efc9-49bd-a807-a67542d2ffda","Type":"ContainerStarted","Data":"9350b7ac22ba117b1fffb279a8f7bfa106135752f31cec28075f7dd502a9ad01"} Mar 10 11:45:03 crc kubenswrapper[4794]: I0310 11:45:03.504010 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" Mar 10 11:45:03 crc kubenswrapper[4794]: I0310 11:45:03.699653 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca80d151-efc9-49bd-a807-a67542d2ffda-config-volume\") pod \"ca80d151-efc9-49bd-a807-a67542d2ffda\" (UID: \"ca80d151-efc9-49bd-a807-a67542d2ffda\") " Mar 10 11:45:03 crc kubenswrapper[4794]: I0310 11:45:03.699776 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca80d151-efc9-49bd-a807-a67542d2ffda-secret-volume\") pod \"ca80d151-efc9-49bd-a807-a67542d2ffda\" (UID: \"ca80d151-efc9-49bd-a807-a67542d2ffda\") " Mar 10 11:45:03 crc kubenswrapper[4794]: I0310 11:45:03.699835 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-686kc\" (UniqueName: \"kubernetes.io/projected/ca80d151-efc9-49bd-a807-a67542d2ffda-kube-api-access-686kc\") pod \"ca80d151-efc9-49bd-a807-a67542d2ffda\" (UID: \"ca80d151-efc9-49bd-a807-a67542d2ffda\") " Mar 10 11:45:03 crc kubenswrapper[4794]: I0310 11:45:03.701024 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca80d151-efc9-49bd-a807-a67542d2ffda-config-volume" (OuterVolumeSpecName: "config-volume") pod "ca80d151-efc9-49bd-a807-a67542d2ffda" (UID: "ca80d151-efc9-49bd-a807-a67542d2ffda"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:45:03 crc kubenswrapper[4794]: I0310 11:45:03.708563 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca80d151-efc9-49bd-a807-a67542d2ffda-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ca80d151-efc9-49bd-a807-a67542d2ffda" (UID: "ca80d151-efc9-49bd-a807-a67542d2ffda"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:45:03 crc kubenswrapper[4794]: I0310 11:45:03.709496 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca80d151-efc9-49bd-a807-a67542d2ffda-kube-api-access-686kc" (OuterVolumeSpecName: "kube-api-access-686kc") pod "ca80d151-efc9-49bd-a807-a67542d2ffda" (UID: "ca80d151-efc9-49bd-a807-a67542d2ffda"). InnerVolumeSpecName "kube-api-access-686kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:45:03 crc kubenswrapper[4794]: I0310 11:45:03.802641 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-686kc\" (UniqueName: \"kubernetes.io/projected/ca80d151-efc9-49bd-a807-a67542d2ffda-kube-api-access-686kc\") on node \"crc\" DevicePath \"\"" Mar 10 11:45:03 crc kubenswrapper[4794]: I0310 11:45:03.802975 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca80d151-efc9-49bd-a807-a67542d2ffda-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 11:45:03 crc kubenswrapper[4794]: I0310 11:45:03.802992 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca80d151-efc9-49bd-a807-a67542d2ffda-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 11:45:04 crc kubenswrapper[4794]: I0310 11:45:04.069943 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" event={"ID":"ca80d151-efc9-49bd-a807-a67542d2ffda","Type":"ContainerDied","Data":"9350b7ac22ba117b1fffb279a8f7bfa106135752f31cec28075f7dd502a9ad01"} Mar 10 11:45:04 crc kubenswrapper[4794]: I0310 11:45:04.069984 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9350b7ac22ba117b1fffb279a8f7bfa106135752f31cec28075f7dd502a9ad01" Mar 10 11:45:04 crc kubenswrapper[4794]: I0310 11:45:04.070036 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx" Mar 10 11:45:04 crc kubenswrapper[4794]: I0310 11:45:04.613288 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f"] Mar 10 11:45:04 crc kubenswrapper[4794]: I0310 11:45:04.622619 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552340-94k6f"] Mar 10 11:45:06 crc kubenswrapper[4794]: I0310 11:45:06.015245 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84abb13a-cdb3-4eb8-8845-fa8e7db8ef02" path="/var/lib/kubelet/pods/84abb13a-cdb3-4eb8-8845-fa8e7db8ef02/volumes" Mar 10 11:45:22 crc kubenswrapper[4794]: I0310 11:45:22.967823 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:45:22 crc kubenswrapper[4794]: I0310 11:45:22.968765 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:45:52 crc kubenswrapper[4794]: I0310 11:45:52.771091 4794 scope.go:117] "RemoveContainer" containerID="21fad9483da30ee067fd585ceb3f41eb4f0448942f59671109057931e96c4a41" Mar 10 11:45:52 crc kubenswrapper[4794]: I0310 11:45:52.970469 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:45:52 crc kubenswrapper[4794]: I0310 11:45:52.970529 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:46:00 crc kubenswrapper[4794]: I0310 11:46:00.174998 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552386-zsvqc"] Mar 10 11:46:00 crc kubenswrapper[4794]: E0310 11:46:00.176490 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca80d151-efc9-49bd-a807-a67542d2ffda" containerName="collect-profiles" Mar 10 11:46:00 crc kubenswrapper[4794]: I0310 11:46:00.176519 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca80d151-efc9-49bd-a807-a67542d2ffda" containerName="collect-profiles" Mar 10 11:46:00 crc kubenswrapper[4794]: I0310 11:46:00.177051 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca80d151-efc9-49bd-a807-a67542d2ffda" containerName="collect-profiles" Mar 10 11:46:00 crc kubenswrapper[4794]: I0310 11:46:00.178492 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552386-zsvqc" Mar 10 11:46:00 crc kubenswrapper[4794]: I0310 11:46:00.181510 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:46:00 crc kubenswrapper[4794]: I0310 11:46:00.182293 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:46:00 crc kubenswrapper[4794]: I0310 11:46:00.182752 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:46:00 crc kubenswrapper[4794]: I0310 11:46:00.187199 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552386-zsvqc"] Mar 10 11:46:00 crc kubenswrapper[4794]: I0310 11:46:00.267838 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phsff\" (UniqueName: \"kubernetes.io/projected/9d78b47f-8a50-4433-9d51-a3037c59ea53-kube-api-access-phsff\") pod \"auto-csr-approver-29552386-zsvqc\" (UID: \"9d78b47f-8a50-4433-9d51-a3037c59ea53\") " pod="openshift-infra/auto-csr-approver-29552386-zsvqc" Mar 10 11:46:00 crc kubenswrapper[4794]: I0310 11:46:00.369992 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phsff\" (UniqueName: \"kubernetes.io/projected/9d78b47f-8a50-4433-9d51-a3037c59ea53-kube-api-access-phsff\") pod \"auto-csr-approver-29552386-zsvqc\" (UID: \"9d78b47f-8a50-4433-9d51-a3037c59ea53\") " pod="openshift-infra/auto-csr-approver-29552386-zsvqc" Mar 10 11:46:00 crc kubenswrapper[4794]: I0310 11:46:00.396015 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phsff\" (UniqueName: \"kubernetes.io/projected/9d78b47f-8a50-4433-9d51-a3037c59ea53-kube-api-access-phsff\") pod \"auto-csr-approver-29552386-zsvqc\" (UID: \"9d78b47f-8a50-4433-9d51-a3037c59ea53\") " pod="openshift-infra/auto-csr-approver-29552386-zsvqc" Mar 10 11:46:00 crc kubenswrapper[4794]: I0310 11:46:00.505923 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552386-zsvqc" Mar 10 11:46:01 crc kubenswrapper[4794]: I0310 11:46:01.033889 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552386-zsvqc"] Mar 10 11:46:01 crc kubenswrapper[4794]: I0310 11:46:01.041600 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 11:46:01 crc kubenswrapper[4794]: I0310 11:46:01.738247 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552386-zsvqc" event={"ID":"9d78b47f-8a50-4433-9d51-a3037c59ea53","Type":"ContainerStarted","Data":"2d04dfff5d004d1096578ae3aabd1207dc108e76243b59f5a05cdcc79bbbebc5"} Mar 10 11:46:03 crc kubenswrapper[4794]: I0310 11:46:03.762301 4794 generic.go:334] "Generic (PLEG): container finished" podID="9d78b47f-8a50-4433-9d51-a3037c59ea53" containerID="9741a19c5d507976adb605905d8a2d318512206acb99e58b66f31dad35d38b08" exitCode=0 Mar 10 11:46:03 crc kubenswrapper[4794]: I0310 11:46:03.762403 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552386-zsvqc" event={"ID":"9d78b47f-8a50-4433-9d51-a3037c59ea53","Type":"ContainerDied","Data":"9741a19c5d507976adb605905d8a2d318512206acb99e58b66f31dad35d38b08"} Mar 10 11:46:05 crc kubenswrapper[4794]: I0310 11:46:05.237000 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552386-zsvqc" Mar 10 11:46:05 crc kubenswrapper[4794]: I0310 11:46:05.382027 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phsff\" (UniqueName: \"kubernetes.io/projected/9d78b47f-8a50-4433-9d51-a3037c59ea53-kube-api-access-phsff\") pod \"9d78b47f-8a50-4433-9d51-a3037c59ea53\" (UID: \"9d78b47f-8a50-4433-9d51-a3037c59ea53\") " Mar 10 11:46:05 crc kubenswrapper[4794]: I0310 11:46:05.396825 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d78b47f-8a50-4433-9d51-a3037c59ea53-kube-api-access-phsff" (OuterVolumeSpecName: "kube-api-access-phsff") pod "9d78b47f-8a50-4433-9d51-a3037c59ea53" (UID: "9d78b47f-8a50-4433-9d51-a3037c59ea53"). InnerVolumeSpecName "kube-api-access-phsff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:46:05 crc kubenswrapper[4794]: I0310 11:46:05.484848 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phsff\" (UniqueName: \"kubernetes.io/projected/9d78b47f-8a50-4433-9d51-a3037c59ea53-kube-api-access-phsff\") on node \"crc\" DevicePath \"\"" Mar 10 11:46:05 crc kubenswrapper[4794]: I0310 11:46:05.787017 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552386-zsvqc" event={"ID":"9d78b47f-8a50-4433-9d51-a3037c59ea53","Type":"ContainerDied","Data":"2d04dfff5d004d1096578ae3aabd1207dc108e76243b59f5a05cdcc79bbbebc5"} Mar 10 11:46:05 crc kubenswrapper[4794]: I0310 11:46:05.787076 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d04dfff5d004d1096578ae3aabd1207dc108e76243b59f5a05cdcc79bbbebc5" Mar 10 11:46:05 crc kubenswrapper[4794]: I0310 11:46:05.787078 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552386-zsvqc" Mar 10 11:46:06 crc kubenswrapper[4794]: I0310 11:46:06.319408 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552380-z5wsx"] Mar 10 11:46:06 crc kubenswrapper[4794]: I0310 11:46:06.329271 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552380-z5wsx"] Mar 10 11:46:08 crc kubenswrapper[4794]: I0310 11:46:08.011008 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69f1ae00-aeb3-44f4-9105-e0d7b3c6189b" path="/var/lib/kubelet/pods/69f1ae00-aeb3-44f4-9105-e0d7b3c6189b/volumes" Mar 10 11:46:12 crc kubenswrapper[4794]: I0310 11:46:12.860917 4794 generic.go:334] "Generic (PLEG): container finished" podID="b3ed4d47-7c95-4755-a631-092391d64b11" containerID="5488b0976598923b1bf4ce1d99e01ec9db4855f8329077728c4c9e5e50c30643" exitCode=0 Mar 10 11:46:12 crc kubenswrapper[4794]: I0310 11:46:12.860972 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" event={"ID":"b3ed4d47-7c95-4755-a631-092391d64b11","Type":"ContainerDied","Data":"5488b0976598923b1bf4ce1d99e01ec9db4855f8329077728c4c9e5e50c30643"} Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.368094 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.506587 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-inventory\") pod \"b3ed4d47-7c95-4755-a631-092391d64b11\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.506988 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-ceph\") pod \"b3ed4d47-7c95-4755-a631-092391d64b11\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.507012 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-ssh-key-openstack-cell1\") pod \"b3ed4d47-7c95-4755-a631-092391d64b11\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.507080 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5xc5\" (UniqueName: \"kubernetes.io/projected/b3ed4d47-7c95-4755-a631-092391d64b11-kube-api-access-w5xc5\") pod \"b3ed4d47-7c95-4755-a631-092391d64b11\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.507125 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-bootstrap-combined-ca-bundle\") pod \"b3ed4d47-7c95-4755-a631-092391d64b11\" (UID: \"b3ed4d47-7c95-4755-a631-092391d64b11\") " Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.518576 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-ceph" (OuterVolumeSpecName: "ceph") pod "b3ed4d47-7c95-4755-a631-092391d64b11" (UID: 
"b3ed4d47-7c95-4755-a631-092391d64b11"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.518633 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ed4d47-7c95-4755-a631-092391d64b11-kube-api-access-w5xc5" (OuterVolumeSpecName: "kube-api-access-w5xc5") pod "b3ed4d47-7c95-4755-a631-092391d64b11" (UID: "b3ed4d47-7c95-4755-a631-092391d64b11"). InnerVolumeSpecName "kube-api-access-w5xc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.522687 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b3ed4d47-7c95-4755-a631-092391d64b11" (UID: "b3ed4d47-7c95-4755-a631-092391d64b11"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.542718 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-inventory" (OuterVolumeSpecName: "inventory") pod "b3ed4d47-7c95-4755-a631-092391d64b11" (UID: "b3ed4d47-7c95-4755-a631-092391d64b11"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.543361 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "b3ed4d47-7c95-4755-a631-092391d64b11" (UID: "b3ed4d47-7c95-4755-a631-092391d64b11"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.609938 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.609995 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.610018 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.610039 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5xc5\" (UniqueName: \"kubernetes.io/projected/b3ed4d47-7c95-4755-a631-092391d64b11-kube-api-access-w5xc5\") on node \"crc\" DevicePath \"\"" Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.610058 4794 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ed4d47-7c95-4755-a631-092391d64b11-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.891979 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" event={"ID":"b3ed4d47-7c95-4755-a631-092391d64b11","Type":"ContainerDied","Data":"0c37e249088ccf8b9bdfc31fc486952b25740c4bdc316812ead6a5c2a86c8f6f"} Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.892026 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c37e249088ccf8b9bdfc31fc486952b25740c4bdc316812ead6a5c2a86c8f6f" Mar 10 11:46:14 crc kubenswrapper[4794]: I0310 11:46:14.892158 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-cl7pm" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:14.987720 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-l9qnw"] Mar 10 11:46:15 crc kubenswrapper[4794]: E0310 11:46:14.988207 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ed4d47-7c95-4755-a631-092391d64b11" containerName="bootstrap-openstack-openstack-cell1" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:14.988238 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ed4d47-7c95-4755-a631-092391d64b11" containerName="bootstrap-openstack-openstack-cell1" Mar 10 11:46:15 crc kubenswrapper[4794]: E0310 11:46:14.988269 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d78b47f-8a50-4433-9d51-a3037c59ea53" containerName="oc" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:14.988281 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d78b47f-8a50-4433-9d51-a3037c59ea53" containerName="oc" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:14.988652 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d78b47f-8a50-4433-9d51-a3037c59ea53" containerName="oc" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:14.988682 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ed4d47-7c95-4755-a631-092391d64b11" containerName="bootstrap-openstack-openstack-cell1" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:14.989600 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:14.991937 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:14.992275 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:14.992545 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:14.992728 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.003702 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-l9qnw"] Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.141667 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-inventory\") pod \"download-cache-openstack-openstack-cell1-l9qnw\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.141775 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-ceph\") pod \"download-cache-openstack-openstack-cell1-l9qnw\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.141826 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-l9qnw\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.141944 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6pvw\" (UniqueName: \"kubernetes.io/projected/50dce0e2-2231-4163-8c82-8ee68e08cb57-kube-api-access-n6pvw\") pod \"download-cache-openstack-openstack-cell1-l9qnw\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.243501 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-inventory\") pod \"download-cache-openstack-openstack-cell1-l9qnw\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.243864 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-ceph\") pod \"download-cache-openstack-openstack-cell1-l9qnw\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.243922 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-l9qnw\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.244045 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6pvw\" (UniqueName: \"kubernetes.io/projected/50dce0e2-2231-4163-8c82-8ee68e08cb57-kube-api-access-n6pvw\") pod \"download-cache-openstack-openstack-cell1-l9qnw\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.248340 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-ceph\") pod \"download-cache-openstack-openstack-cell1-l9qnw\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.249841 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-l9qnw\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.253302 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-inventory\") pod \"download-cache-openstack-openstack-cell1-l9qnw\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.272739 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6pvw\" (UniqueName: \"kubernetes.io/projected/50dce0e2-2231-4163-8c82-8ee68e08cb57-kube-api-access-n6pvw\") pod \"download-cache-openstack-openstack-cell1-l9qnw\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.359745 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:46:15 crc kubenswrapper[4794]: I0310 11:46:15.972005 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-l9qnw"] Mar 10 11:46:15 crc kubenswrapper[4794]: W0310 11:46:15.981495 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50dce0e2_2231_4163_8c82_8ee68e08cb57.slice/crio-9a2777ac504a8c874bbbcec12916ef80f355832488c5b9019fe46473882f444c WatchSource:0}: Error finding container 9a2777ac504a8c874bbbcec12916ef80f355832488c5b9019fe46473882f444c: Status 404 returned error can't find the container with id 9a2777ac504a8c874bbbcec12916ef80f355832488c5b9019fe46473882f444c Mar 10 11:46:16 crc kubenswrapper[4794]: I0310 11:46:16.962840 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" event={"ID":"50dce0e2-2231-4163-8c82-8ee68e08cb57","Type":"ContainerStarted","Data":"a4a22ed2237ad24f6e5506cde8704dd461fe1d3d59de71628c19e13ecac50eaf"} Mar 10 11:46:16 crc kubenswrapper[4794]: I0310 11:46:16.963224 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" event={"ID":"50dce0e2-2231-4163-8c82-8ee68e08cb57","Type":"ContainerStarted","Data":"9a2777ac504a8c874bbbcec12916ef80f355832488c5b9019fe46473882f444c"} Mar 10 11:46:16 crc kubenswrapper[4794]: I0310 11:46:16.987652 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" podStartSLOduration=2.514714378 podStartE2EDuration="2.98763385s" podCreationTimestamp="2026-03-10 11:46:14 +0000 UTC" firstStartedPulling="2026-03-10 11:46:15.984964692 +0000 UTC m=+7324.741135550" lastFinishedPulling="2026-03-10 11:46:16.457884204 +0000 UTC m=+7325.214055022" observedRunningTime="2026-03-10 11:46:16.981276342 +0000 UTC m=+7325.737447160" watchObservedRunningTime="2026-03-10 11:46:16.98763385 +0000 UTC m=+7325.743804668" Mar 10 11:46:22 crc kubenswrapper[4794]: I0310 11:46:22.967413 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:46:22 crc kubenswrapper[4794]: I0310 11:46:22.967793 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:46:22 crc kubenswrapper[4794]: I0310 11:46:22.967849 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 11:46:22 crc kubenswrapper[4794]: I0310 11:46:22.968927 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd7c71a98db62b40397f10a8c55d8c209b4e24fd8dfc1076cb0ee0625736a5b5"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 11:46:22 crc kubenswrapper[4794]: I0310 11:46:22.968999 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://cd7c71a98db62b40397f10a8c55d8c209b4e24fd8dfc1076cb0ee0625736a5b5" gracePeriod=600 Mar 10 11:46:24 crc kubenswrapper[4794]: I0310 11:46:24.051861 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="cd7c71a98db62b40397f10a8c55d8c209b4e24fd8dfc1076cb0ee0625736a5b5" exitCode=0 Mar 10 11:46:24 crc kubenswrapper[4794]: I0310 11:46:24.051937 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"cd7c71a98db62b40397f10a8c55d8c209b4e24fd8dfc1076cb0ee0625736a5b5"} Mar 10 11:46:24 crc kubenswrapper[4794]: I0310 11:46:24.052567 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343"} Mar 10 11:46:24 crc kubenswrapper[4794]: I0310 11:46:24.052602 4794 scope.go:117] "RemoveContainer" containerID="33bc8fb5de7a6ee4431141216e52d779889bbf57219d1ad361c0b6ac9e985c0e" Mar 10 11:46:52 crc kubenswrapper[4794]: I0310 11:46:52.891942 4794 scope.go:117] "RemoveContainer" containerID="b0ef429d6f985bb876ca63de27cca82abeb92b5287e19ca548faf478bba6a66c" Mar 10 11:48:00 crc kubenswrapper[4794]: I0310 11:48:00.164852 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552388-6j89z"] Mar 10 11:48:00 crc kubenswrapper[4794]: I0310 11:48:00.167496 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552388-6j89z" Mar 10 11:48:00 crc kubenswrapper[4794]: I0310 11:48:00.170275 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:48:00 crc kubenswrapper[4794]: I0310 11:48:00.170947 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:48:00 crc kubenswrapper[4794]: I0310 11:48:00.172826 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:48:00 crc kubenswrapper[4794]: I0310 11:48:00.178501 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552388-6j89z"] Mar 10 11:48:00 crc kubenswrapper[4794]: I0310 11:48:00.236033 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4vcr\" (UniqueName: \"kubernetes.io/projected/bea8e430-5c78-4741-87d7-6f24aff9f849-kube-api-access-b4vcr\") pod \"auto-csr-approver-29552388-6j89z\" (UID: \"bea8e430-5c78-4741-87d7-6f24aff9f849\") " pod="openshift-infra/auto-csr-approver-29552388-6j89z" Mar 10 11:48:00 crc kubenswrapper[4794]: I0310 11:48:00.339536 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4vcr\" (UniqueName: \"kubernetes.io/projected/bea8e430-5c78-4741-87d7-6f24aff9f849-kube-api-access-b4vcr\") pod \"auto-csr-approver-29552388-6j89z\" (UID: \"bea8e430-5c78-4741-87d7-6f24aff9f849\") " pod="openshift-infra/auto-csr-approver-29552388-6j89z" Mar 10 11:48:00 crc kubenswrapper[4794]: I0310 11:48:00.363298 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4vcr\" (UniqueName: \"kubernetes.io/projected/bea8e430-5c78-4741-87d7-6f24aff9f849-kube-api-access-b4vcr\") pod \"auto-csr-approver-29552388-6j89z\" (UID: \"bea8e430-5c78-4741-87d7-6f24aff9f849\") " pod="openshift-infra/auto-csr-approver-29552388-6j89z" Mar 10 11:48:00 crc kubenswrapper[4794]: I0310 11:48:00.497094 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552388-6j89z" Mar 10 11:48:01 crc kubenswrapper[4794]: I0310 11:48:01.036681 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552388-6j89z"] Mar 10 11:48:01 crc kubenswrapper[4794]: I0310 11:48:01.206661 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552388-6j89z" event={"ID":"bea8e430-5c78-4741-87d7-6f24aff9f849","Type":"ContainerStarted","Data":"bbf6d74fe55e1e2654f70340f831dbb3b3764b656556f26828c02722fc9514cf"} Mar 10 11:48:03 crc kubenswrapper[4794]: I0310 11:48:03.230619 4794 generic.go:334] "Generic (PLEG): container finished" podID="bea8e430-5c78-4741-87d7-6f24aff9f849" containerID="c99d1bc6bc641f94707fa255193406b2dab936bb614ca1ec87bff4069ad3d077" exitCode=0 Mar 10 11:48:03 crc kubenswrapper[4794]: I0310 11:48:03.230759 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552388-6j89z" event={"ID":"bea8e430-5c78-4741-87d7-6f24aff9f849","Type":"ContainerDied","Data":"c99d1bc6bc641f94707fa255193406b2dab936bb614ca1ec87bff4069ad3d077"} Mar 10 11:48:04 crc kubenswrapper[4794]: I0310 11:48:04.654016 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552388-6j89z" Mar 10 11:48:04 crc kubenswrapper[4794]: I0310 11:48:04.772623 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4vcr\" (UniqueName: \"kubernetes.io/projected/bea8e430-5c78-4741-87d7-6f24aff9f849-kube-api-access-b4vcr\") pod \"bea8e430-5c78-4741-87d7-6f24aff9f849\" (UID: \"bea8e430-5c78-4741-87d7-6f24aff9f849\") " Mar 10 11:48:04 crc kubenswrapper[4794]: I0310 11:48:04.780602 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bea8e430-5c78-4741-87d7-6f24aff9f849-kube-api-access-b4vcr" (OuterVolumeSpecName: "kube-api-access-b4vcr") pod "bea8e430-5c78-4741-87d7-6f24aff9f849" (UID: "bea8e430-5c78-4741-87d7-6f24aff9f849"). InnerVolumeSpecName "kube-api-access-b4vcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:48:04 crc kubenswrapper[4794]: I0310 11:48:04.875217 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4vcr\" (UniqueName: \"kubernetes.io/projected/bea8e430-5c78-4741-87d7-6f24aff9f849-kube-api-access-b4vcr\") on node \"crc\" DevicePath \"\"" Mar 10 11:48:05 crc kubenswrapper[4794]: I0310 11:48:05.249850 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552388-6j89z" event={"ID":"bea8e430-5c78-4741-87d7-6f24aff9f849","Type":"ContainerDied","Data":"bbf6d74fe55e1e2654f70340f831dbb3b3764b656556f26828c02722fc9514cf"} Mar 10 11:48:05 crc kubenswrapper[4794]: I0310 11:48:05.250129 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbf6d74fe55e1e2654f70340f831dbb3b3764b656556f26828c02722fc9514cf" Mar 10 11:48:05 crc kubenswrapper[4794]: I0310 11:48:05.249910 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552388-6j89z" Mar 10 11:48:05 crc kubenswrapper[4794]: I0310 11:48:05.762796 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552382-p9pv6"] Mar 10 11:48:05 crc kubenswrapper[4794]: I0310 11:48:05.775543 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552382-p9pv6"] Mar 10 11:48:06 crc kubenswrapper[4794]: I0310 11:48:06.020955 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a2ded44-27c0-4060-860e-d8e0e00f35f1" path="/var/lib/kubelet/pods/7a2ded44-27c0-4060-860e-d8e0e00f35f1/volumes" Mar 10 11:48:16 crc kubenswrapper[4794]: I0310 11:48:16.386209 4794 generic.go:334] "Generic (PLEG): container finished" podID="50dce0e2-2231-4163-8c82-8ee68e08cb57" containerID="a4a22ed2237ad24f6e5506cde8704dd461fe1d3d59de71628c19e13ecac50eaf" exitCode=0 Mar 10 11:48:16 crc kubenswrapper[4794]: I0310 11:48:16.386422 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" event={"ID":"50dce0e2-2231-4163-8c82-8ee68e08cb57","Type":"ContainerDied","Data":"a4a22ed2237ad24f6e5506cde8704dd461fe1d3d59de71628c19e13ecac50eaf"} Mar 10 11:48:17 crc kubenswrapper[4794]: I0310 11:48:17.954212 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.047838 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-ssh-key-openstack-cell1\") pod \"50dce0e2-2231-4163-8c82-8ee68e08cb57\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.047970 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-inventory\") pod \"50dce0e2-2231-4163-8c82-8ee68e08cb57\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.048262 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6pvw\" (UniqueName: \"kubernetes.io/projected/50dce0e2-2231-4163-8c82-8ee68e08cb57-kube-api-access-n6pvw\") pod \"50dce0e2-2231-4163-8c82-8ee68e08cb57\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.048305 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-ceph\") pod \"50dce0e2-2231-4163-8c82-8ee68e08cb57\" (UID: \"50dce0e2-2231-4163-8c82-8ee68e08cb57\") " Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.055429 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50dce0e2-2231-4163-8c82-8ee68e08cb57-kube-api-access-n6pvw" (OuterVolumeSpecName: "kube-api-access-n6pvw") pod "50dce0e2-2231-4163-8c82-8ee68e08cb57" (UID: "50dce0e2-2231-4163-8c82-8ee68e08cb57"). InnerVolumeSpecName "kube-api-access-n6pvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.056881 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-ceph" (OuterVolumeSpecName: "ceph") pod "50dce0e2-2231-4163-8c82-8ee68e08cb57" (UID: "50dce0e2-2231-4163-8c82-8ee68e08cb57"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.081042 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-inventory" (OuterVolumeSpecName: "inventory") pod "50dce0e2-2231-4163-8c82-8ee68e08cb57" (UID: "50dce0e2-2231-4163-8c82-8ee68e08cb57"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.081917 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "50dce0e2-2231-4163-8c82-8ee68e08cb57" (UID: "50dce0e2-2231-4163-8c82-8ee68e08cb57"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.152649 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6pvw\" (UniqueName: \"kubernetes.io/projected/50dce0e2-2231-4163-8c82-8ee68e08cb57-kube-api-access-n6pvw\") on node \"crc\" DevicePath \"\"" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.152692 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.152705 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.152717 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50dce0e2-2231-4163-8c82-8ee68e08cb57-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.409367 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" event={"ID":"50dce0e2-2231-4163-8c82-8ee68e08cb57","Type":"ContainerDied","Data":"9a2777ac504a8c874bbbcec12916ef80f355832488c5b9019fe46473882f444c"} Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.409981 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a2777ac504a8c874bbbcec12916ef80f355832488c5b9019fe46473882f444c" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.409415 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-l9qnw" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.500479 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-rcwxx"] Mar 10 11:48:18 crc kubenswrapper[4794]: E0310 11:48:18.500896 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea8e430-5c78-4741-87d7-6f24aff9f849" containerName="oc" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.500915 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea8e430-5c78-4741-87d7-6f24aff9f849" containerName="oc" Mar 10 11:48:18 crc kubenswrapper[4794]: E0310 11:48:18.500942 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50dce0e2-2231-4163-8c82-8ee68e08cb57" containerName="download-cache-openstack-openstack-cell1" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.500949 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="50dce0e2-2231-4163-8c82-8ee68e08cb57" containerName="download-cache-openstack-openstack-cell1" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.501148 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="bea8e430-5c78-4741-87d7-6f24aff9f849" containerName="oc" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.501177 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="50dce0e2-2231-4163-8c82-8ee68e08cb57" containerName="download-cache-openstack-openstack-cell1" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.501885 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.504053 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.504452 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.504744 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.504905 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.526719 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-rcwxx"] Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.663519 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-ceph\") pod \"configure-network-openstack-openstack-cell1-rcwxx\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.663566 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-inventory\") pod \"configure-network-openstack-openstack-cell1-rcwxx\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.663665 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt5r5\" (UniqueName: \"kubernetes.io/projected/9938d61e-1f47-4555-a810-67e6c74dc947-kube-api-access-tt5r5\") pod \"configure-network-openstack-openstack-cell1-rcwxx\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.663828 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-rcwxx\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.765633 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-ceph\") pod \"configure-network-openstack-openstack-cell1-rcwxx\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.765680 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-inventory\") pod \"configure-network-openstack-openstack-cell1-rcwxx\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " 
pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.765763 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt5r5\" (UniqueName: \"kubernetes.io/projected/9938d61e-1f47-4555-a810-67e6c74dc947-kube-api-access-tt5r5\") pod \"configure-network-openstack-openstack-cell1-rcwxx\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.765892 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-rcwxx\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.771724 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-ceph\") pod \"configure-network-openstack-openstack-cell1-rcwxx\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.772123 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-inventory\") pod \"configure-network-openstack-openstack-cell1-rcwxx\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.773309 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-rcwxx\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.787360 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt5r5\" (UniqueName: \"kubernetes.io/projected/9938d61e-1f47-4555-a810-67e6c74dc947-kube-api-access-tt5r5\") pod \"configure-network-openstack-openstack-cell1-rcwxx\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:48:18 crc kubenswrapper[4794]: I0310 11:48:18.833600 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:48:19 crc kubenswrapper[4794]: I0310 11:48:19.421552 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-rcwxx"] Mar 10 11:48:20 crc kubenswrapper[4794]: I0310 11:48:20.443146 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" event={"ID":"9938d61e-1f47-4555-a810-67e6c74dc947","Type":"ContainerStarted","Data":"2cd37bb3f3ce91dd41fc4aaace110ed67b188c4163964886c12018dfe57d724f"} Mar 10 11:48:20 crc kubenswrapper[4794]: I0310 11:48:20.443767 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" event={"ID":"9938d61e-1f47-4555-a810-67e6c74dc947","Type":"ContainerStarted","Data":"a32751ba342e54dec895ef9865b3f3f8c08580857920669d7c232b0bb83d1ca4"} Mar 10 11:48:20 crc kubenswrapper[4794]: I0310 11:48:20.472976 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" podStartSLOduration=1.9555906379999999 podStartE2EDuration="2.47294909s" podCreationTimestamp="2026-03-10 11:48:18 +0000 UTC" firstStartedPulling="2026-03-10 11:48:19.430886799 +0000 UTC m=+7448.187057627" lastFinishedPulling="2026-03-10 11:48:19.948245251 +0000 UTC m=+7448.704416079" observedRunningTime="2026-03-10 11:48:20.461541955 +0000 UTC m=+7449.217712793" watchObservedRunningTime="2026-03-10 11:48:20.47294909 +0000 UTC m=+7449.229119948" Mar 10 11:48:38 crc kubenswrapper[4794]: I0310 11:48:38.098172 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7vhpp"] Mar 10 11:48:38 crc kubenswrapper[4794]: I0310 11:48:38.101768 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:38 crc kubenswrapper[4794]: I0310 11:48:38.125664 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vhpp"] Mar 10 11:48:38 crc kubenswrapper[4794]: I0310 11:48:38.214712 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj8kp\" (UniqueName: \"kubernetes.io/projected/0de8daea-2bd9-4c4d-8524-feea98896a09-kube-api-access-lj8kp\") pod \"redhat-marketplace-7vhpp\" (UID: \"0de8daea-2bd9-4c4d-8524-feea98896a09\") " pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:38 crc kubenswrapper[4794]: I0310 11:48:38.214866 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0de8daea-2bd9-4c4d-8524-feea98896a09-utilities\") pod \"redhat-marketplace-7vhpp\" (UID: \"0de8daea-2bd9-4c4d-8524-feea98896a09\") " pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:38 crc kubenswrapper[4794]: I0310 11:48:38.214928 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0de8daea-2bd9-4c4d-8524-feea98896a09-catalog-content\") pod \"redhat-marketplace-7vhpp\" (UID: \"0de8daea-2bd9-4c4d-8524-feea98896a09\") " pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:38 crc kubenswrapper[4794]: I0310 11:48:38.316907 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0de8daea-2bd9-4c4d-8524-feea98896a09-catalog-content\") pod \"redhat-marketplace-7vhpp\" (UID: \"0de8daea-2bd9-4c4d-8524-feea98896a09\") " pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:38 crc kubenswrapper[4794]: I0310 11:48:38.317249 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj8kp\" (UniqueName: \"kubernetes.io/projected/0de8daea-2bd9-4c4d-8524-feea98896a09-kube-api-access-lj8kp\") pod \"redhat-marketplace-7vhpp\" (UID: \"0de8daea-2bd9-4c4d-8524-feea98896a09\") " pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:38 crc kubenswrapper[4794]: I0310 11:48:38.317332 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0de8daea-2bd9-4c4d-8524-feea98896a09-catalog-content\") pod \"redhat-marketplace-7vhpp\" (UID: \"0de8daea-2bd9-4c4d-8524-feea98896a09\") " pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:38 crc kubenswrapper[4794]: I0310 11:48:38.317349 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0de8daea-2bd9-4c4d-8524-feea98896a09-utilities\") pod \"redhat-marketplace-7vhpp\" (UID: \"0de8daea-2bd9-4c4d-8524-feea98896a09\") " pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:38 crc kubenswrapper[4794]: I0310 11:48:38.317607 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0de8daea-2bd9-4c4d-8524-feea98896a09-utilities\") pod \"redhat-marketplace-7vhpp\" (UID: \"0de8daea-2bd9-4c4d-8524-feea98896a09\") " pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:38 crc kubenswrapper[4794]: I0310 11:48:38.343741 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lj8kp\" (UniqueName: \"kubernetes.io/projected/0de8daea-2bd9-4c4d-8524-feea98896a09-kube-api-access-lj8kp\") pod \"redhat-marketplace-7vhpp\" (UID: \"0de8daea-2bd9-4c4d-8524-feea98896a09\") " pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:38 crc kubenswrapper[4794]: I0310 11:48:38.441918 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:38 crc kubenswrapper[4794]: I0310 11:48:38.931242 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vhpp"] Mar 10 11:48:38 crc kubenswrapper[4794]: W0310 11:48:38.940480 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0de8daea_2bd9_4c4d_8524_feea98896a09.slice/crio-5779f40de2fea60a3ebde9cb54e9ac7de062d4a833fd421d7a825980deadd527 WatchSource:0}: Error finding container 5779f40de2fea60a3ebde9cb54e9ac7de062d4a833fd421d7a825980deadd527: Status 404 returned error can't find the container with id 5779f40de2fea60a3ebde9cb54e9ac7de062d4a833fd421d7a825980deadd527 Mar 10 11:48:39 crc kubenswrapper[4794]: I0310 11:48:39.677399 4794 generic.go:334] "Generic (PLEG): container finished" podID="0de8daea-2bd9-4c4d-8524-feea98896a09" containerID="166418dbd2d02e3b0200fd4e313aa84b572af7b075b7703a9ab2fa8b8fcce57c" exitCode=0 Mar 10 11:48:39 crc kubenswrapper[4794]: I0310 11:48:39.677468 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vhpp" event={"ID":"0de8daea-2bd9-4c4d-8524-feea98896a09","Type":"ContainerDied","Data":"166418dbd2d02e3b0200fd4e313aa84b572af7b075b7703a9ab2fa8b8fcce57c"} Mar 10 11:48:39 crc kubenswrapper[4794]: I0310 11:48:39.677771 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vhpp" event={"ID":"0de8daea-2bd9-4c4d-8524-feea98896a09","Type":"ContainerStarted","Data":"5779f40de2fea60a3ebde9cb54e9ac7de062d4a833fd421d7a825980deadd527"} Mar 10 11:48:40 crc kubenswrapper[4794]: I0310 11:48:40.705921 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vhpp" event={"ID":"0de8daea-2bd9-4c4d-8524-feea98896a09","Type":"ContainerStarted","Data":"0e96540e0883e7d9906328b20316d3d7625add90fafdd10b3088ab2b454fd94f"} Mar 10 11:48:41 crc kubenswrapper[4794]: I0310 11:48:41.724212 4794 generic.go:334] "Generic (PLEG): container finished" podID="0de8daea-2bd9-4c4d-8524-feea98896a09" containerID="0e96540e0883e7d9906328b20316d3d7625add90fafdd10b3088ab2b454fd94f" exitCode=0 Mar 10 11:48:41 crc kubenswrapper[4794]: I0310 11:48:41.724525 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vhpp" event={"ID":"0de8daea-2bd9-4c4d-8524-feea98896a09","Type":"ContainerDied","Data":"0e96540e0883e7d9906328b20316d3d7625add90fafdd10b3088ab2b454fd94f"} Mar 10 11:48:42 crc kubenswrapper[4794]: I0310 11:48:42.750252 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vhpp" event={"ID":"0de8daea-2bd9-4c4d-8524-feea98896a09","Type":"ContainerStarted","Data":"a91a04a378995809808ac7551ee9ef15224b0ef4361bc4a89e04d4db3da0a442"} Mar 10 11:48:42 crc kubenswrapper[4794]: I0310 11:48:42.794434 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7vhpp" podStartSLOduration=2.304548631 
podStartE2EDuration="4.794404447s" podCreationTimestamp="2026-03-10 11:48:38 +0000 UTC" firstStartedPulling="2026-03-10 11:48:39.681234361 +0000 UTC m=+7468.437405179" lastFinishedPulling="2026-03-10 11:48:42.171090137 +0000 UTC m=+7470.927260995" observedRunningTime="2026-03-10 11:48:42.789154985 +0000 UTC m=+7471.545325803" watchObservedRunningTime="2026-03-10 11:48:42.794404447 +0000 UTC m=+7471.550575325" Mar 10 11:48:48 crc kubenswrapper[4794]: I0310 11:48:48.442092 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:48 crc kubenswrapper[4794]: I0310 11:48:48.442763 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:48 crc kubenswrapper[4794]: I0310 11:48:48.514307 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:48 crc kubenswrapper[4794]: I0310 11:48:48.893227 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:48 crc kubenswrapper[4794]: I0310 11:48:48.952803 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vhpp"] Mar 10 11:48:50 crc kubenswrapper[4794]: I0310 11:48:50.842383 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7vhpp" podUID="0de8daea-2bd9-4c4d-8524-feea98896a09" containerName="registry-server" containerID="cri-o://a91a04a378995809808ac7551ee9ef15224b0ef4361bc4a89e04d4db3da0a442" gracePeriod=2 Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.457155 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.538594 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0de8daea-2bd9-4c4d-8524-feea98896a09-catalog-content\") pod \"0de8daea-2bd9-4c4d-8524-feea98896a09\" (UID: \"0de8daea-2bd9-4c4d-8524-feea98896a09\") " Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.538754 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0de8daea-2bd9-4c4d-8524-feea98896a09-utilities\") pod \"0de8daea-2bd9-4c4d-8524-feea98896a09\" (UID: \"0de8daea-2bd9-4c4d-8524-feea98896a09\") " Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.538787 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj8kp\" (UniqueName: \"kubernetes.io/projected/0de8daea-2bd9-4c4d-8524-feea98896a09-kube-api-access-lj8kp\") pod \"0de8daea-2bd9-4c4d-8524-feea98896a09\" (UID: \"0de8daea-2bd9-4c4d-8524-feea98896a09\") " Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.539731 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0de8daea-2bd9-4c4d-8524-feea98896a09-utilities" (OuterVolumeSpecName: "utilities") pod "0de8daea-2bd9-4c4d-8524-feea98896a09" (UID: "0de8daea-2bd9-4c4d-8524-feea98896a09"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.546321 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0de8daea-2bd9-4c4d-8524-feea98896a09-kube-api-access-lj8kp" (OuterVolumeSpecName: "kube-api-access-lj8kp") pod "0de8daea-2bd9-4c4d-8524-feea98896a09" (UID: "0de8daea-2bd9-4c4d-8524-feea98896a09"). InnerVolumeSpecName "kube-api-access-lj8kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.578677 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0de8daea-2bd9-4c4d-8524-feea98896a09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0de8daea-2bd9-4c4d-8524-feea98896a09" (UID: "0de8daea-2bd9-4c4d-8524-feea98896a09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.641467 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0de8daea-2bd9-4c4d-8524-feea98896a09-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.641504 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0de8daea-2bd9-4c4d-8524-feea98896a09-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.641519 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj8kp\" (UniqueName: \"kubernetes.io/projected/0de8daea-2bd9-4c4d-8524-feea98896a09-kube-api-access-lj8kp\") on node \"crc\" DevicePath \"\"" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.857704 4794 generic.go:334] "Generic (PLEG): container finished" podID="0de8daea-2bd9-4c4d-8524-feea98896a09" containerID="a91a04a378995809808ac7551ee9ef15224b0ef4361bc4a89e04d4db3da0a442" exitCode=0 Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.857757 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vhpp" event={"ID":"0de8daea-2bd9-4c4d-8524-feea98896a09","Type":"ContainerDied","Data":"a91a04a378995809808ac7551ee9ef15224b0ef4361bc4a89e04d4db3da0a442"} Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.858075 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vhpp" event={"ID":"0de8daea-2bd9-4c4d-8524-feea98896a09","Type":"ContainerDied","Data":"5779f40de2fea60a3ebde9cb54e9ac7de062d4a833fd421d7a825980deadd527"} Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.858109 4794 scope.go:117] "RemoveContainer" containerID="a91a04a378995809808ac7551ee9ef15224b0ef4361bc4a89e04d4db3da0a442" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.857845 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vhpp" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.886548 4794 scope.go:117] "RemoveContainer" containerID="0e96540e0883e7d9906328b20316d3d7625add90fafdd10b3088ab2b454fd94f" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.929389 4794 scope.go:117] "RemoveContainer" containerID="166418dbd2d02e3b0200fd4e313aa84b572af7b075b7703a9ab2fa8b8fcce57c" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.929522 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vhpp"] Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.941321 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vhpp"] Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.971605 4794 scope.go:117] "RemoveContainer" containerID="a91a04a378995809808ac7551ee9ef15224b0ef4361bc4a89e04d4db3da0a442" Mar 10 11:48:51 crc kubenswrapper[4794]: E0310 11:48:51.972144 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a91a04a378995809808ac7551ee9ef15224b0ef4361bc4a89e04d4db3da0a442\": container with ID starting with a91a04a378995809808ac7551ee9ef15224b0ef4361bc4a89e04d4db3da0a442 not found: ID does not exist" containerID="a91a04a378995809808ac7551ee9ef15224b0ef4361bc4a89e04d4db3da0a442" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.972210 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a91a04a378995809808ac7551ee9ef15224b0ef4361bc4a89e04d4db3da0a442"} err="failed to get container status \"a91a04a378995809808ac7551ee9ef15224b0ef4361bc4a89e04d4db3da0a442\": rpc error: code = NotFound desc = could not find container \"a91a04a378995809808ac7551ee9ef15224b0ef4361bc4a89e04d4db3da0a442\": container with ID starting with a91a04a378995809808ac7551ee9ef15224b0ef4361bc4a89e04d4db3da0a442 not found: ID does not exist" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.972249 4794 scope.go:117] "RemoveContainer" containerID="0e96540e0883e7d9906328b20316d3d7625add90fafdd10b3088ab2b454fd94f" Mar 10 11:48:51 crc kubenswrapper[4794]: E0310 11:48:51.972688 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e96540e0883e7d9906328b20316d3d7625add90fafdd10b3088ab2b454fd94f\": container with ID starting with 0e96540e0883e7d9906328b20316d3d7625add90fafdd10b3088ab2b454fd94f not found: ID does not exist" containerID="0e96540e0883e7d9906328b20316d3d7625add90fafdd10b3088ab2b454fd94f" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.972723 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e96540e0883e7d9906328b20316d3d7625add90fafdd10b3088ab2b454fd94f"} err="failed to get container status \"0e96540e0883e7d9906328b20316d3d7625add90fafdd10b3088ab2b454fd94f\": rpc error: code = NotFound desc = could not find container \"0e96540e0883e7d9906328b20316d3d7625add90fafdd10b3088ab2b454fd94f\": container with ID starting with 0e96540e0883e7d9906328b20316d3d7625add90fafdd10b3088ab2b454fd94f not found: ID does not exist" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.972750 4794 scope.go:117] "RemoveContainer" containerID="166418dbd2d02e3b0200fd4e313aa84b572af7b075b7703a9ab2fa8b8fcce57c" Mar 10 11:48:51 crc kubenswrapper[4794]: E0310 11:48:51.973037 4794 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"166418dbd2d02e3b0200fd4e313aa84b572af7b075b7703a9ab2fa8b8fcce57c\": container with ID starting with 166418dbd2d02e3b0200fd4e313aa84b572af7b075b7703a9ab2fa8b8fcce57c not found: ID does not exist" containerID="166418dbd2d02e3b0200fd4e313aa84b572af7b075b7703a9ab2fa8b8fcce57c" Mar 10 11:48:51 crc kubenswrapper[4794]: I0310 11:48:51.973073 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"166418dbd2d02e3b0200fd4e313aa84b572af7b075b7703a9ab2fa8b8fcce57c"} err="failed to get container status \"166418dbd2d02e3b0200fd4e313aa84b572af7b075b7703a9ab2fa8b8fcce57c\": rpc error: code = NotFound desc = could not find container \"166418dbd2d02e3b0200fd4e313aa84b572af7b075b7703a9ab2fa8b8fcce57c\": container with ID starting with 166418dbd2d02e3b0200fd4e313aa84b572af7b075b7703a9ab2fa8b8fcce57c not found: ID does not exist" Mar 10 11:48:52 crc kubenswrapper[4794]: I0310 11:48:52.013680 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0de8daea-2bd9-4c4d-8524-feea98896a09" path="/var/lib/kubelet/pods/0de8daea-2bd9-4c4d-8524-feea98896a09/volumes" Mar 10 11:48:52 crc kubenswrapper[4794]: I0310 11:48:52.967488 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:48:52 crc kubenswrapper[4794]: I0310 11:48:52.967770 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:48:53 crc kubenswrapper[4794]: I0310 11:48:53.025902 4794 scope.go:117] "RemoveContainer" containerID="60e171925176b993cfbd5fc22a7d10e5273eff277b44bd3070313b6bb656361c" Mar 10 11:49:22 crc kubenswrapper[4794]: I0310 11:49:22.967615 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:49:22 crc kubenswrapper[4794]: I0310 11:49:22.968201 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:49:44 crc kubenswrapper[4794]: I0310 11:49:44.523136 4794 generic.go:334] "Generic (PLEG): container finished" podID="9938d61e-1f47-4555-a810-67e6c74dc947" containerID="2cd37bb3f3ce91dd41fc4aaace110ed67b188c4163964886c12018dfe57d724f" exitCode=0 Mar 10 11:49:44 crc kubenswrapper[4794]: I0310 11:49:44.523202 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" event={"ID":"9938d61e-1f47-4555-a810-67e6c74dc947","Type":"ContainerDied","Data":"2cd37bb3f3ce91dd41fc4aaace110ed67b188c4163964886c12018dfe57d724f"} Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.217190 4794 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.385179 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-ceph\") pod \"9938d61e-1f47-4555-a810-67e6c74dc947\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.385305 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt5r5\" (UniqueName: \"kubernetes.io/projected/9938d61e-1f47-4555-a810-67e6c74dc947-kube-api-access-tt5r5\") pod \"9938d61e-1f47-4555-a810-67e6c74dc947\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.385698 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-ssh-key-openstack-cell1\") pod \"9938d61e-1f47-4555-a810-67e6c74dc947\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.385827 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-inventory\") pod \"9938d61e-1f47-4555-a810-67e6c74dc947\" (UID: \"9938d61e-1f47-4555-a810-67e6c74dc947\") " Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.393846 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9938d61e-1f47-4555-a810-67e6c74dc947-kube-api-access-tt5r5" (OuterVolumeSpecName: "kube-api-access-tt5r5") pod "9938d61e-1f47-4555-a810-67e6c74dc947" (UID: "9938d61e-1f47-4555-a810-67e6c74dc947"). InnerVolumeSpecName "kube-api-access-tt5r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.396559 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-ceph" (OuterVolumeSpecName: "ceph") pod "9938d61e-1f47-4555-a810-67e6c74dc947" (UID: "9938d61e-1f47-4555-a810-67e6c74dc947"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.416821 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "9938d61e-1f47-4555-a810-67e6c74dc947" (UID: "9938d61e-1f47-4555-a810-67e6c74dc947"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.431291 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-inventory" (OuterVolumeSpecName: "inventory") pod "9938d61e-1f47-4555-a810-67e6c74dc947" (UID: "9938d61e-1f47-4555-a810-67e6c74dc947"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.489873 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.489942 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt5r5\" (UniqueName: \"kubernetes.io/projected/9938d61e-1f47-4555-a810-67e6c74dc947-kube-api-access-tt5r5\") on node \"crc\" DevicePath \"\"" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.489965 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.489982 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9938d61e-1f47-4555-a810-67e6c74dc947-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.555812 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" event={"ID":"9938d61e-1f47-4555-a810-67e6c74dc947","Type":"ContainerDied","Data":"a32751ba342e54dec895ef9865b3f3f8c08580857920669d7c232b0bb83d1ca4"} Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.556475 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a32751ba342e54dec895ef9865b3f3f8c08580857920669d7c232b0bb83d1ca4" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.555853 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-rcwxx" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.676683 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-fv6vq"] Mar 10 11:49:46 crc kubenswrapper[4794]: E0310 11:49:46.678215 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0de8daea-2bd9-4c4d-8524-feea98896a09" containerName="extract-content" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.678250 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0de8daea-2bd9-4c4d-8524-feea98896a09" containerName="extract-content" Mar 10 11:49:46 crc kubenswrapper[4794]: E0310 11:49:46.678289 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0de8daea-2bd9-4c4d-8524-feea98896a09" containerName="registry-server" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.678302 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0de8daea-2bd9-4c4d-8524-feea98896a09" containerName="registry-server" Mar 10 11:49:46 crc kubenswrapper[4794]: E0310 11:49:46.678334 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0de8daea-2bd9-4c4d-8524-feea98896a09" containerName="extract-utilities" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.678594 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0de8daea-2bd9-4c4d-8524-feea98896a09" containerName="extract-utilities" Mar 10 11:49:46 crc kubenswrapper[4794]: E0310 11:49:46.678888 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9938d61e-1f47-4555-a810-67e6c74dc947" containerName="configure-network-openstack-openstack-cell1" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.678963 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9938d61e-1f47-4555-a810-67e6c74dc947" containerName="configure-network-openstack-openstack-cell1" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.682397 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0de8daea-2bd9-4c4d-8524-feea98896a09" containerName="registry-server" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.682481 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9938d61e-1f47-4555-a810-67e6c74dc947" containerName="configure-network-openstack-openstack-cell1" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.683910 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.690684 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.696577 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.696835 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.696865 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.725557 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-fv6vq"] Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.800014 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-inventory\") pod \"validate-network-openstack-openstack-cell1-fv6vq\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.800121 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-ceph\") pod \"validate-network-openstack-openstack-cell1-fv6vq\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.800205 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-fv6vq\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.800433 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmwf7\" (UniqueName: \"kubernetes.io/projected/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-kube-api-access-zmwf7\") pod \"validate-network-openstack-openstack-cell1-fv6vq\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.903044 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-ceph\") pod \"validate-network-openstack-openstack-cell1-fv6vq\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.903129 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-fv6vq\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " 
pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.903733 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmwf7\" (UniqueName: \"kubernetes.io/projected/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-kube-api-access-zmwf7\") pod \"validate-network-openstack-openstack-cell1-fv6vq\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.903823 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-inventory\") pod \"validate-network-openstack-openstack-cell1-fv6vq\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.908739 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-fv6vq\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.911146 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-inventory\") pod \"validate-network-openstack-openstack-cell1-fv6vq\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.914040 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-ceph\") pod \"validate-network-openstack-openstack-cell1-fv6vq\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:46 crc kubenswrapper[4794]: I0310 11:49:46.949207 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmwf7\" (UniqueName: \"kubernetes.io/projected/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-kube-api-access-zmwf7\") pod \"validate-network-openstack-openstack-cell1-fv6vq\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:47 crc kubenswrapper[4794]: I0310 11:49:47.019654 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:47 crc kubenswrapper[4794]: I0310 11:49:47.651805 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-fv6vq"] Mar 10 11:49:48 crc kubenswrapper[4794]: I0310 11:49:48.581838 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" event={"ID":"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a","Type":"ContainerStarted","Data":"239e5e9d3e55e8272932591b98729bf930469ecedc8624301061cf6c2bc95db8"} Mar 10 11:49:48 crc kubenswrapper[4794]: I0310 11:49:48.582318 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" event={"ID":"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a","Type":"ContainerStarted","Data":"c5c3121f7faacb6f31d900d75465b6c189e310d332662ee3e49fe25d3075425d"} Mar 10 11:49:48 crc kubenswrapper[4794]: I0310 11:49:48.606951 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" podStartSLOduration=2.167118786 podStartE2EDuration="2.60693442s" podCreationTimestamp="2026-03-10 11:49:46 +0000 UTC" firstStartedPulling="2026-03-10 11:49:47.666321499 +0000 UTC m=+7536.422492347" lastFinishedPulling="2026-03-10 11:49:48.106137123 +0000 UTC m=+7536.862307981" observedRunningTime="2026-03-10 11:49:48.601856622 +0000 UTC m=+7537.358027450" watchObservedRunningTime="2026-03-10 11:49:48.60693442 +0000 UTC m=+7537.363105238" Mar 10 11:49:52 crc kubenswrapper[4794]: I0310 11:49:52.968368 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:49:52 crc kubenswrapper[4794]: I0310 11:49:52.969084 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:49:52 crc kubenswrapper[4794]: I0310 11:49:52.969156 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 11:49:52 crc kubenswrapper[4794]: I0310 11:49:52.970191 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 11:49:52 crc kubenswrapper[4794]: I0310 11:49:52.970267 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" gracePeriod=600 Mar 10 11:49:53 crc kubenswrapper[4794]: E0310 11:49:53.110799 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:49:53 crc kubenswrapper[4794]: I0310 11:49:53.659862 4794 generic.go:334] "Generic (PLEG): container finished" podID="dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a" containerID="239e5e9d3e55e8272932591b98729bf930469ecedc8624301061cf6c2bc95db8" exitCode=0 Mar 10 11:49:53 crc kubenswrapper[4794]: I0310 11:49:53.659968 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" event={"ID":"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a","Type":"ContainerDied","Data":"239e5e9d3e55e8272932591b98729bf930469ecedc8624301061cf6c2bc95db8"} Mar 10 11:49:53 crc kubenswrapper[4794]: I0310 11:49:53.665961 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" exitCode=0 Mar 10 11:49:53 crc kubenswrapper[4794]: I0310 11:49:53.666031 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343"} Mar 10 11:49:53 crc kubenswrapper[4794]: I0310 11:49:53.666078 4794 scope.go:117] "RemoveContainer" containerID="cd7c71a98db62b40397f10a8c55d8c209b4e24fd8dfc1076cb0ee0625736a5b5" Mar 10 11:49:53 crc kubenswrapper[4794]: I0310 11:49:53.667207 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:49:53 crc kubenswrapper[4794]: E0310 11:49:53.667972 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.339457 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.433765 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmwf7\" (UniqueName: \"kubernetes.io/projected/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-kube-api-access-zmwf7\") pod \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.433855 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-ceph\") pod \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.434072 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-inventory\") pod \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.434155 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-ssh-key-openstack-cell1\") pod \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\" (UID: \"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a\") " Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.439777 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-ceph" (OuterVolumeSpecName: "ceph") pod "dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a" (UID: "dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.441167 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-kube-api-access-zmwf7" (OuterVolumeSpecName: "kube-api-access-zmwf7") pod "dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a" (UID: "dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a"). InnerVolumeSpecName "kube-api-access-zmwf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.464284 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-inventory" (OuterVolumeSpecName: "inventory") pod "dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a" (UID: "dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.472459 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a" (UID: "dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.538530 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmwf7\" (UniqueName: \"kubernetes.io/projected/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-kube-api-access-zmwf7\") on node \"crc\" DevicePath \"\"" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.538590 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.538648 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.538669 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.711374 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" event={"ID":"dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a","Type":"ContainerDied","Data":"c5c3121f7faacb6f31d900d75465b6c189e310d332662ee3e49fe25d3075425d"} Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.711451 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5c3121f7faacb6f31d900d75465b6c189e310d332662ee3e49fe25d3075425d" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.711536 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-fv6vq" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.787025 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-p6cdg"] Mar 10 11:49:55 crc kubenswrapper[4794]: E0310 11:49:55.787568 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a" containerName="validate-network-openstack-openstack-cell1" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.787595 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a" containerName="validate-network-openstack-openstack-cell1" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.787818 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a" containerName="validate-network-openstack-openstack-cell1" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.788777 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.792612 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.792743 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.793824 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.794062 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.815409 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-p6cdg"] Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.948219 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq76x\" (UniqueName: \"kubernetes.io/projected/d2743912-a485-45ae-b5ec-0addae4b7861-kube-api-access-mq76x\") pod \"install-os-openstack-openstack-cell1-p6cdg\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.948545 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-ceph\") pod \"install-os-openstack-openstack-cell1-p6cdg\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.948567 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-inventory\") pod \"install-os-openstack-openstack-cell1-p6cdg\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:49:55 crc kubenswrapper[4794]: I0310 11:49:55.948590 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-p6cdg\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:49:56 crc kubenswrapper[4794]: I0310 11:49:56.050974 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq76x\" (UniqueName: \"kubernetes.io/projected/d2743912-a485-45ae-b5ec-0addae4b7861-kube-api-access-mq76x\") pod \"install-os-openstack-openstack-cell1-p6cdg\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:49:56 crc kubenswrapper[4794]: I0310 11:49:56.051045 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-ceph\") pod \"install-os-openstack-openstack-cell1-p6cdg\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:49:56 crc 
kubenswrapper[4794]: I0310 11:49:56.051073 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-inventory\") pod \"install-os-openstack-openstack-cell1-p6cdg\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:49:56 crc kubenswrapper[4794]: I0310 11:49:56.051102 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-p6cdg\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:49:56 crc kubenswrapper[4794]: I0310 11:49:56.057713 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-ceph\") pod \"install-os-openstack-openstack-cell1-p6cdg\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:49:56 crc kubenswrapper[4794]: I0310 11:49:56.059032 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-p6cdg\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:49:56 crc kubenswrapper[4794]: I0310 11:49:56.068427 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-inventory\") pod \"install-os-openstack-openstack-cell1-p6cdg\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:49:56 crc kubenswrapper[4794]: I0310 11:49:56.073974 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq76x\" (UniqueName: \"kubernetes.io/projected/d2743912-a485-45ae-b5ec-0addae4b7861-kube-api-access-mq76x\") pod \"install-os-openstack-openstack-cell1-p6cdg\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:49:56 crc kubenswrapper[4794]: I0310 11:49:56.112581 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:49:56 crc kubenswrapper[4794]: I0310 11:49:56.699319 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-p6cdg"] Mar 10 11:49:56 crc kubenswrapper[4794]: I0310 11:49:56.734149 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-p6cdg" event={"ID":"d2743912-a485-45ae-b5ec-0addae4b7861","Type":"ContainerStarted","Data":"d7eaef5a4fae0c4a1fba53258a48275a0cb985db7e0eb5fa397f4d580ecf38df"} Mar 10 11:49:57 crc kubenswrapper[4794]: I0310 11:49:57.747307 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-p6cdg" event={"ID":"d2743912-a485-45ae-b5ec-0addae4b7861","Type":"ContainerStarted","Data":"b9a0ce8c3da9a67ee30c1043060ac6a0fde64a0182bb50192cf145050d409bce"} Mar 10 11:49:57 crc kubenswrapper[4794]: I0310 11:49:57.780236 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-p6cdg" podStartSLOduration=2.248959106 podStartE2EDuration="2.780216378s" podCreationTimestamp="2026-03-10 11:49:55 +0000 UTC" firstStartedPulling="2026-03-10 11:49:56.705245037 +0000 UTC m=+7545.461415875" lastFinishedPulling="2026-03-10 11:49:57.236502289 +0000 UTC m=+7545.992673147" observedRunningTime="2026-03-10 11:49:57.777858475 +0000 UTC m=+7546.534029293" watchObservedRunningTime="2026-03-10 11:49:57.780216378 +0000 UTC m=+7546.536387216" Mar 10 11:50:00 crc kubenswrapper[4794]: I0310 11:50:00.160179 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552390-xk452"] Mar 10 11:50:00 crc kubenswrapper[4794]: I0310 11:50:00.162934 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552390-xk452" Mar 10 11:50:00 crc kubenswrapper[4794]: I0310 11:50:00.166739 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:50:00 crc kubenswrapper[4794]: I0310 11:50:00.167144 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:50:00 crc kubenswrapper[4794]: I0310 11:50:00.168209 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:50:00 crc kubenswrapper[4794]: I0310 11:50:00.174597 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552390-xk452"] Mar 10 11:50:00 crc kubenswrapper[4794]: I0310 11:50:00.258513 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvsx6\" (UniqueName: \"kubernetes.io/projected/24ea97d9-e465-4b6c-8c1d-8d63a45f57c2-kube-api-access-mvsx6\") pod \"auto-csr-approver-29552390-xk452\" (UID: \"24ea97d9-e465-4b6c-8c1d-8d63a45f57c2\") " pod="openshift-infra/auto-csr-approver-29552390-xk452" Mar 10 11:50:00 crc kubenswrapper[4794]: I0310 11:50:00.360326 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvsx6\" (UniqueName: \"kubernetes.io/projected/24ea97d9-e465-4b6c-8c1d-8d63a45f57c2-kube-api-access-mvsx6\") pod \"auto-csr-approver-29552390-xk452\" (UID: \"24ea97d9-e465-4b6c-8c1d-8d63a45f57c2\") " pod="openshift-infra/auto-csr-approver-29552390-xk452" Mar 10 11:50:00 crc kubenswrapper[4794]: I0310 11:50:00.382278 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvsx6\" (UniqueName: \"kubernetes.io/projected/24ea97d9-e465-4b6c-8c1d-8d63a45f57c2-kube-api-access-mvsx6\") pod \"auto-csr-approver-29552390-xk452\" (UID: \"24ea97d9-e465-4b6c-8c1d-8d63a45f57c2\") " pod="openshift-infra/auto-csr-approver-29552390-xk452" Mar 10 11:50:00 crc kubenswrapper[4794]: I0310 11:50:00.495893 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552390-xk452" Mar 10 11:50:00 crc kubenswrapper[4794]: I0310 11:50:00.977686 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552390-xk452"] Mar 10 11:50:00 crc kubenswrapper[4794]: W0310 11:50:00.980200 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24ea97d9_e465_4b6c_8c1d_8d63a45f57c2.slice/crio-989093f6f7a505d83cdca4939a7f2418420176fde8a3e4a437d7837ce7bdc2c3 WatchSource:0}: Error finding container 989093f6f7a505d83cdca4939a7f2418420176fde8a3e4a437d7837ce7bdc2c3: Status 404 returned error can't find the container with id 989093f6f7a505d83cdca4939a7f2418420176fde8a3e4a437d7837ce7bdc2c3 Mar 10 11:50:01 crc kubenswrapper[4794]: I0310 11:50:01.824546 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552390-xk452" event={"ID":"24ea97d9-e465-4b6c-8c1d-8d63a45f57c2","Type":"ContainerStarted","Data":"989093f6f7a505d83cdca4939a7f2418420176fde8a3e4a437d7837ce7bdc2c3"} Mar 10 11:50:02 crc kubenswrapper[4794]: I0310 11:50:02.840475 4794 generic.go:334] "Generic (PLEG): container finished" podID="24ea97d9-e465-4b6c-8c1d-8d63a45f57c2" containerID="0fb5ca443710d4ec6f3b14bc6ed29962121d42088d2e935e1f00897923df1429" exitCode=0 Mar 10 11:50:02 crc kubenswrapper[4794]: I0310 11:50:02.840543 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552390-xk452" event={"ID":"24ea97d9-e465-4b6c-8c1d-8d63a45f57c2","Type":"ContainerDied","Data":"0fb5ca443710d4ec6f3b14bc6ed29962121d42088d2e935e1f00897923df1429"} Mar 10 11:50:04 crc kubenswrapper[4794]: I0310 11:50:04.384360 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552390-xk452" Mar 10 11:50:04 crc kubenswrapper[4794]: I0310 11:50:04.458597 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvsx6\" (UniqueName: \"kubernetes.io/projected/24ea97d9-e465-4b6c-8c1d-8d63a45f57c2-kube-api-access-mvsx6\") pod \"24ea97d9-e465-4b6c-8c1d-8d63a45f57c2\" (UID: \"24ea97d9-e465-4b6c-8c1d-8d63a45f57c2\") " Mar 10 11:50:04 crc kubenswrapper[4794]: I0310 11:50:04.465675 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24ea97d9-e465-4b6c-8c1d-8d63a45f57c2-kube-api-access-mvsx6" (OuterVolumeSpecName: "kube-api-access-mvsx6") pod "24ea97d9-e465-4b6c-8c1d-8d63a45f57c2" (UID: "24ea97d9-e465-4b6c-8c1d-8d63a45f57c2"). InnerVolumeSpecName "kube-api-access-mvsx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:50:04 crc kubenswrapper[4794]: I0310 11:50:04.561884 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvsx6\" (UniqueName: \"kubernetes.io/projected/24ea97d9-e465-4b6c-8c1d-8d63a45f57c2-kube-api-access-mvsx6\") on node \"crc\" DevicePath \"\"" Mar 10 11:50:04 crc kubenswrapper[4794]: I0310 11:50:04.889784 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552390-xk452" event={"ID":"24ea97d9-e465-4b6c-8c1d-8d63a45f57c2","Type":"ContainerDied","Data":"989093f6f7a505d83cdca4939a7f2418420176fde8a3e4a437d7837ce7bdc2c3"} Mar 10 11:50:04 crc kubenswrapper[4794]: I0310 11:50:04.889822 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552390-xk452" Mar 10 11:50:04 crc kubenswrapper[4794]: I0310 11:50:04.889844 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="989093f6f7a505d83cdca4939a7f2418420176fde8a3e4a437d7837ce7bdc2c3" Mar 10 11:50:05 crc kubenswrapper[4794]: I0310 11:50:05.455582 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552384-jhmlf"] Mar 10 11:50:05 crc kubenswrapper[4794]: I0310 11:50:05.465934 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552384-jhmlf"] Mar 10 11:50:06 crc kubenswrapper[4794]: I0310 11:50:06.001349 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:50:06 crc kubenswrapper[4794]: E0310 11:50:06.001675 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:50:06 crc kubenswrapper[4794]: I0310 11:50:06.013259 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa5bff22-75c7-4ed7-8d83-8b5e79df536d" path="/var/lib/kubelet/pods/fa5bff22-75c7-4ed7-8d83-8b5e79df536d/volumes" Mar 10 11:50:21 crc kubenswrapper[4794]: I0310 11:50:21.001608 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:50:21 crc kubenswrapper[4794]: E0310 11:50:21.002841 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:50:33 crc kubenswrapper[4794]: I0310 11:50:32.999932 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:50:33 crc kubenswrapper[4794]: E0310 11:50:33.000928 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:50:43 crc kubenswrapper[4794]: I0310 11:50:43.372390 4794 generic.go:334] "Generic (PLEG): container finished" podID="d2743912-a485-45ae-b5ec-0addae4b7861" containerID="b9a0ce8c3da9a67ee30c1043060ac6a0fde64a0182bb50192cf145050d409bce" exitCode=0 Mar 10 11:50:43 crc kubenswrapper[4794]: I0310 11:50:43.372437 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-p6cdg" event={"ID":"d2743912-a485-45ae-b5ec-0addae4b7861","Type":"ContainerDied","Data":"b9a0ce8c3da9a67ee30c1043060ac6a0fde64a0182bb50192cf145050d409bce"} Mar 10 11:50:43 crc kubenswrapper[4794]: I0310 
11:50:43.999198 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:50:43 crc kubenswrapper[4794]: E0310 11:50:43.999498 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:50:44 crc kubenswrapper[4794]: I0310 11:50:44.899565 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:50:44 crc kubenswrapper[4794]: I0310 11:50:44.914891 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq76x\" (UniqueName: \"kubernetes.io/projected/d2743912-a485-45ae-b5ec-0addae4b7861-kube-api-access-mq76x\") pod \"d2743912-a485-45ae-b5ec-0addae4b7861\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " Mar 10 11:50:44 crc kubenswrapper[4794]: I0310 11:50:44.915119 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-inventory\") pod \"d2743912-a485-45ae-b5ec-0addae4b7861\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " Mar 10 11:50:44 crc kubenswrapper[4794]: I0310 11:50:44.915226 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-ssh-key-openstack-cell1\") pod \"d2743912-a485-45ae-b5ec-0addae4b7861\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " Mar 10 11:50:44 crc kubenswrapper[4794]: I0310 11:50:44.915264 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-ceph\") pod \"d2743912-a485-45ae-b5ec-0addae4b7861\" (UID: \"d2743912-a485-45ae-b5ec-0addae4b7861\") " Mar 10 11:50:44 crc kubenswrapper[4794]: I0310 11:50:44.926614 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-ceph" (OuterVolumeSpecName: "ceph") pod "d2743912-a485-45ae-b5ec-0addae4b7861" (UID: "d2743912-a485-45ae-b5ec-0addae4b7861"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:50:44 crc kubenswrapper[4794]: I0310 11:50:44.926731 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2743912-a485-45ae-b5ec-0addae4b7861-kube-api-access-mq76x" (OuterVolumeSpecName: "kube-api-access-mq76x") pod "d2743912-a485-45ae-b5ec-0addae4b7861" (UID: "d2743912-a485-45ae-b5ec-0addae4b7861"). InnerVolumeSpecName "kube-api-access-mq76x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:50:44 crc kubenswrapper[4794]: I0310 11:50:44.959525 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-inventory" (OuterVolumeSpecName: "inventory") pod "d2743912-a485-45ae-b5ec-0addae4b7861" (UID: "d2743912-a485-45ae-b5ec-0addae4b7861"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:50:44 crc kubenswrapper[4794]: I0310 11:50:44.977425 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d2743912-a485-45ae-b5ec-0addae4b7861" (UID: "d2743912-a485-45ae-b5ec-0addae4b7861"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.017192 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.017258 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.017274 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2743912-a485-45ae-b5ec-0addae4b7861-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.017288 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq76x\" (UniqueName: \"kubernetes.io/projected/d2743912-a485-45ae-b5ec-0addae4b7861-kube-api-access-mq76x\") on node \"crc\" DevicePath \"\"" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.397691 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-p6cdg" event={"ID":"d2743912-a485-45ae-b5ec-0addae4b7861","Type":"ContainerDied","Data":"d7eaef5a4fae0c4a1fba53258a48275a0cb985db7e0eb5fa397f4d580ecf38df"} Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.397926 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7eaef5a4fae0c4a1fba53258a48275a0cb985db7e0eb5fa397f4d580ecf38df" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.397979 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-p6cdg" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.499711 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-cms62"] Mar 10 11:50:45 crc kubenswrapper[4794]: E0310 11:50:45.500273 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2743912-a485-45ae-b5ec-0addae4b7861" containerName="install-os-openstack-openstack-cell1" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.500299 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2743912-a485-45ae-b5ec-0addae4b7861" containerName="install-os-openstack-openstack-cell1" Mar 10 11:50:45 crc kubenswrapper[4794]: E0310 11:50:45.500332 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24ea97d9-e465-4b6c-8c1d-8d63a45f57c2" containerName="oc" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.500363 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="24ea97d9-e465-4b6c-8c1d-8d63a45f57c2" containerName="oc" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.500624 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2743912-a485-45ae-b5ec-0addae4b7861" containerName="install-os-openstack-openstack-cell1" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.500644 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="24ea97d9-e465-4b6c-8c1d-8d63a45f57c2" containerName="oc" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.502766 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.506008 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.506389 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.506414 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.506085 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.518815 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-cms62"] Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.629787 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-cms62\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.629838 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdl6p\" (UniqueName: \"kubernetes.io/projected/4a19915b-ffd3-4b57-a806-0dc4f67e3003-kube-api-access-bdl6p\") pod \"configure-os-openstack-openstack-cell1-cms62\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.629878 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-inventory\") pod \"configure-os-openstack-openstack-cell1-cms62\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.629983 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-ceph\") pod \"configure-os-openstack-openstack-cell1-cms62\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.732057 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-cms62\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.732366 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdl6p\" (UniqueName: \"kubernetes.io/projected/4a19915b-ffd3-4b57-a806-0dc4f67e3003-kube-api-access-bdl6p\") pod \"configure-os-openstack-openstack-cell1-cms62\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.732497 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-inventory\") pod \"configure-os-openstack-openstack-cell1-cms62\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.732657 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-ceph\") pod \"configure-os-openstack-openstack-cell1-cms62\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.736493 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-ceph\") pod \"configure-os-openstack-openstack-cell1-cms62\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.736570 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-inventory\") pod \"configure-os-openstack-openstack-cell1-cms62\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.736691 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-ssh-key-openstack-cell1\") pod 
\"configure-os-openstack-openstack-cell1-cms62\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.749378 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdl6p\" (UniqueName: \"kubernetes.io/projected/4a19915b-ffd3-4b57-a806-0dc4f67e3003-kube-api-access-bdl6p\") pod \"configure-os-openstack-openstack-cell1-cms62\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:50:45 crc kubenswrapper[4794]: I0310 11:50:45.819528 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:50:46 crc kubenswrapper[4794]: I0310 11:50:46.389951 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-cms62"] Mar 10 11:50:46 crc kubenswrapper[4794]: W0310 11:50:46.393944 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a19915b_ffd3_4b57_a806_0dc4f67e3003.slice/crio-a90eaf08313920d3073f50721bd4694a5a06038d5039614969939271dc73ab78 WatchSource:0}: Error finding container a90eaf08313920d3073f50721bd4694a5a06038d5039614969939271dc73ab78: Status 404 returned error can't find the container with id a90eaf08313920d3073f50721bd4694a5a06038d5039614969939271dc73ab78 Mar 10 11:50:46 crc kubenswrapper[4794]: I0310 11:50:46.415361 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-cms62" event={"ID":"4a19915b-ffd3-4b57-a806-0dc4f67e3003","Type":"ContainerStarted","Data":"a90eaf08313920d3073f50721bd4694a5a06038d5039614969939271dc73ab78"} Mar 10 11:50:47 crc kubenswrapper[4794]: I0310 11:50:47.425568 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-cms62" event={"ID":"4a19915b-ffd3-4b57-a806-0dc4f67e3003","Type":"ContainerStarted","Data":"6e43199d622059bf96781753d35331c9e0fd118f0ec29642792871ad0549e112"} Mar 10 11:50:47 crc kubenswrapper[4794]: I0310 11:50:47.466630 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-cms62" podStartSLOduration=2.027498463 podStartE2EDuration="2.466605874s" podCreationTimestamp="2026-03-10 11:50:45 +0000 UTC" firstStartedPulling="2026-03-10 11:50:46.406310699 +0000 UTC m=+7595.162481517" lastFinishedPulling="2026-03-10 11:50:46.84541811 +0000 UTC m=+7595.601588928" observedRunningTime="2026-03-10 11:50:47.461773605 +0000 UTC m=+7596.217944433" watchObservedRunningTime="2026-03-10 11:50:47.466605874 +0000 UTC m=+7596.222776702" Mar 10 11:50:47 crc kubenswrapper[4794]: I0310 11:50:47.827268 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8w6jl"] Mar 10 11:50:47 crc kubenswrapper[4794]: I0310 11:50:47.829965 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:47 crc kubenswrapper[4794]: I0310 11:50:47.843876 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8w6jl"] Mar 10 11:50:47 crc kubenswrapper[4794]: I0310 11:50:47.979738 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c617148-7474-4963-b02f-a43204c9594a-catalog-content\") pod \"certified-operators-8w6jl\" (UID: \"1c617148-7474-4963-b02f-a43204c9594a\") " pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:47 crc kubenswrapper[4794]: I0310 11:50:47.980226 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c617148-7474-4963-b02f-a43204c9594a-utilities\") pod \"certified-operators-8w6jl\" (UID: \"1c617148-7474-4963-b02f-a43204c9594a\") " pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:47 crc kubenswrapper[4794]: I0310 11:50:47.980333 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hlsr\" (UniqueName: \"kubernetes.io/projected/1c617148-7474-4963-b02f-a43204c9594a-kube-api-access-4hlsr\") pod \"certified-operators-8w6jl\" (UID: \"1c617148-7474-4963-b02f-a43204c9594a\") " pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:48 crc kubenswrapper[4794]: I0310 11:50:48.082217 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c617148-7474-4963-b02f-a43204c9594a-utilities\") pod \"certified-operators-8w6jl\" (UID: \"1c617148-7474-4963-b02f-a43204c9594a\") " pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:48 crc kubenswrapper[4794]: I0310 11:50:48.082598 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hlsr\" (UniqueName: \"kubernetes.io/projected/1c617148-7474-4963-b02f-a43204c9594a-kube-api-access-4hlsr\") pod \"certified-operators-8w6jl\" (UID: \"1c617148-7474-4963-b02f-a43204c9594a\") " pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:48 crc kubenswrapper[4794]: I0310 11:50:48.082701 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c617148-7474-4963-b02f-a43204c9594a-catalog-content\") pod \"certified-operators-8w6jl\" (UID: \"1c617148-7474-4963-b02f-a43204c9594a\") " pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:48 crc kubenswrapper[4794]: I0310 11:50:48.082880 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c617148-7474-4963-b02f-a43204c9594a-utilities\") pod \"certified-operators-8w6jl\" (UID: \"1c617148-7474-4963-b02f-a43204c9594a\") " pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:48 crc kubenswrapper[4794]: I0310 11:50:48.083394 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c617148-7474-4963-b02f-a43204c9594a-catalog-content\") pod \"certified-operators-8w6jl\" (UID: \"1c617148-7474-4963-b02f-a43204c9594a\") " pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:48 crc kubenswrapper[4794]: I0310 11:50:48.107511 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4hlsr\" (UniqueName: \"kubernetes.io/projected/1c617148-7474-4963-b02f-a43204c9594a-kube-api-access-4hlsr\") pod \"certified-operators-8w6jl\" (UID: \"1c617148-7474-4963-b02f-a43204c9594a\") " pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:48 crc kubenswrapper[4794]: I0310 11:50:48.169435 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:48 crc kubenswrapper[4794]: I0310 11:50:48.765593 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8w6jl"] Mar 10 11:50:49 crc kubenswrapper[4794]: I0310 11:50:49.454758 4794 generic.go:334] "Generic (PLEG): container finished" podID="1c617148-7474-4963-b02f-a43204c9594a" containerID="24ab763b764bd0eecaac4b1ac285065e3c1b7b8320012bb2eb505c734f35bcda" exitCode=0 Mar 10 11:50:49 crc kubenswrapper[4794]: I0310 11:50:49.454824 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8w6jl" event={"ID":"1c617148-7474-4963-b02f-a43204c9594a","Type":"ContainerDied","Data":"24ab763b764bd0eecaac4b1ac285065e3c1b7b8320012bb2eb505c734f35bcda"} Mar 10 11:50:49 crc kubenswrapper[4794]: I0310 11:50:49.455436 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8w6jl" event={"ID":"1c617148-7474-4963-b02f-a43204c9594a","Type":"ContainerStarted","Data":"f423a4f1d04d0e1c0a46b090d2228b6fa0200e61cf4806c14e8cc2e6f2b12764"} Mar 10 11:50:50 crc kubenswrapper[4794]: I0310 11:50:50.466019 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8w6jl" event={"ID":"1c617148-7474-4963-b02f-a43204c9594a","Type":"ContainerStarted","Data":"e1d9628f3af08ba8cf39a9223ecaed9cdcf0597b106e2ce6f9a5c61920b3e324"} Mar 10 11:50:52 crc kubenswrapper[4794]: I0310 11:50:52.487846 4794 generic.go:334] "Generic (PLEG): container finished" podID="1c617148-7474-4963-b02f-a43204c9594a" containerID="e1d9628f3af08ba8cf39a9223ecaed9cdcf0597b106e2ce6f9a5c61920b3e324" exitCode=0 Mar 10 11:50:52 crc kubenswrapper[4794]: I0310 11:50:52.487920 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8w6jl" event={"ID":"1c617148-7474-4963-b02f-a43204c9594a","Type":"ContainerDied","Data":"e1d9628f3af08ba8cf39a9223ecaed9cdcf0597b106e2ce6f9a5c61920b3e324"} Mar 10 11:50:53 crc kubenswrapper[4794]: I0310 11:50:53.208746 4794 scope.go:117] "RemoveContainer" containerID="0f995e27a29faebe7c322a344d0711d5ce3b9d27620d9da5d81502fd9510a8cf" Mar 10 11:50:53 crc kubenswrapper[4794]: I0310 11:50:53.499797 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8w6jl" event={"ID":"1c617148-7474-4963-b02f-a43204c9594a","Type":"ContainerStarted","Data":"852b9a4d0b3ad819ba27c98d17b806223100bf53cb8f6d7e62572ee66855a559"} Mar 10 11:50:53 crc kubenswrapper[4794]: I0310 11:50:53.523662 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8w6jl" podStartSLOduration=2.938983958 podStartE2EDuration="6.523640422s" podCreationTimestamp="2026-03-10 11:50:47 +0000 UTC" firstStartedPulling="2026-03-10 11:50:49.45683059 +0000 UTC m=+7598.213001408" lastFinishedPulling="2026-03-10 11:50:53.041487054 +0000 UTC m=+7601.797657872" observedRunningTime="2026-03-10 11:50:53.517819581 +0000 UTC m=+7602.273990399" 
watchObservedRunningTime="2026-03-10 11:50:53.523640422 +0000 UTC m=+7602.279811240" Mar 10 11:50:55 crc kubenswrapper[4794]: I0310 11:50:54.999871 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:50:55 crc kubenswrapper[4794]: E0310 11:50:55.000576 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:50:58 crc kubenswrapper[4794]: I0310 11:50:58.169953 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:58 crc kubenswrapper[4794]: I0310 11:50:58.170677 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:58 crc kubenswrapper[4794]: I0310 11:50:58.278743 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:58 crc kubenswrapper[4794]: I0310 11:50:58.586392 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:50:58 crc kubenswrapper[4794]: I0310 11:50:58.634314 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8w6jl"] Mar 10 11:51:00 crc kubenswrapper[4794]: I0310 11:51:00.579297 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8w6jl" podUID="1c617148-7474-4963-b02f-a43204c9594a" containerName="registry-server" containerID="cri-o://852b9a4d0b3ad819ba27c98d17b806223100bf53cb8f6d7e62572ee66855a559" gracePeriod=2 Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.231783 4794 util.go:48] "No ready sandbox for pod can be found. 
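
The pod_startup_latency_tracker records above are internally consistent: podStartSLOduration is the end-to-end startup time minus the image-pull window, i.e. podStartE2EDuration − (lastFinishedPulling − firstStartedPulling). A quick check against the certified-operators-8w6jl values reproduces the logged figure exactly (the monotonic m=+ offsets subtract cleanly); a minimal sketch:

```python
# Values copied from the pod_startup_latency_tracker record above for
# pod openshift-marketplace/certified-operators-8w6jl.
first_started_pulling = 7598.213001408   # m=+ offset of firstStartedPulling, seconds
last_finished_pulling = 7601.797657872   # m=+ offset of lastFinishedPulling, seconds
pod_start_e2e = 6.523640422              # podStartE2EDuration, seconds

image_pull_window = last_finished_pulling - first_started_pulling   # ~3.584656464 s
pod_start_slo = pod_start_e2e - image_pull_window                   # ~2.938983958 s

print(f"pull window:         {image_pull_window:.9f}s")
print(f"podStartSLOduration: {pod_start_slo:.9f}s")  # matches the logged 2.938983958
```

The same relation holds for the earlier configure-os-openstack-openstack-cell1-cms62 record (2.466605874s − 0.439107411s = 2.027498463s), so the SLO figure is the startup latency with image pulling excluded.
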
Need to start a new one" pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.325362 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hlsr\" (UniqueName: \"kubernetes.io/projected/1c617148-7474-4963-b02f-a43204c9594a-kube-api-access-4hlsr\") pod \"1c617148-7474-4963-b02f-a43204c9594a\" (UID: \"1c617148-7474-4963-b02f-a43204c9594a\") " Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.325488 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c617148-7474-4963-b02f-a43204c9594a-catalog-content\") pod \"1c617148-7474-4963-b02f-a43204c9594a\" (UID: \"1c617148-7474-4963-b02f-a43204c9594a\") " Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.326181 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c617148-7474-4963-b02f-a43204c9594a-utilities\") pod \"1c617148-7474-4963-b02f-a43204c9594a\" (UID: \"1c617148-7474-4963-b02f-a43204c9594a\") " Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.327024 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c617148-7474-4963-b02f-a43204c9594a-utilities" (OuterVolumeSpecName: "utilities") pod "1c617148-7474-4963-b02f-a43204c9594a" (UID: "1c617148-7474-4963-b02f-a43204c9594a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.332150 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c617148-7474-4963-b02f-a43204c9594a-kube-api-access-4hlsr" (OuterVolumeSpecName: "kube-api-access-4hlsr") pod "1c617148-7474-4963-b02f-a43204c9594a" (UID: "1c617148-7474-4963-b02f-a43204c9594a"). InnerVolumeSpecName "kube-api-access-4hlsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.388615 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c617148-7474-4963-b02f-a43204c9594a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c617148-7474-4963-b02f-a43204c9594a" (UID: "1c617148-7474-4963-b02f-a43204c9594a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.429464 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hlsr\" (UniqueName: \"kubernetes.io/projected/1c617148-7474-4963-b02f-a43204c9594a-kube-api-access-4hlsr\") on node \"crc\" DevicePath \"\"" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.429505 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c617148-7474-4963-b02f-a43204c9594a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.429519 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c617148-7474-4963-b02f-a43204c9594a-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.593534 4794 generic.go:334] "Generic (PLEG): container finished" podID="1c617148-7474-4963-b02f-a43204c9594a" containerID="852b9a4d0b3ad819ba27c98d17b806223100bf53cb8f6d7e62572ee66855a559" exitCode=0 Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.593597 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8w6jl" event={"ID":"1c617148-7474-4963-b02f-a43204c9594a","Type":"ContainerDied","Data":"852b9a4d0b3ad819ba27c98d17b806223100bf53cb8f6d7e62572ee66855a559"} Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.593612 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8w6jl" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.593651 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8w6jl" event={"ID":"1c617148-7474-4963-b02f-a43204c9594a","Type":"ContainerDied","Data":"f423a4f1d04d0e1c0a46b090d2228b6fa0200e61cf4806c14e8cc2e6f2b12764"} Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.593683 4794 scope.go:117] "RemoveContainer" containerID="852b9a4d0b3ad819ba27c98d17b806223100bf53cb8f6d7e62572ee66855a559" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.619755 4794 scope.go:117] "RemoveContainer" containerID="e1d9628f3af08ba8cf39a9223ecaed9cdcf0597b106e2ce6f9a5c61920b3e324" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.643349 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8w6jl"] Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.666516 4794 scope.go:117] "RemoveContainer" containerID="24ab763b764bd0eecaac4b1ac285065e3c1b7b8320012bb2eb505c734f35bcda" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.670526 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8w6jl"] Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.720611 4794 scope.go:117] "RemoveContainer" containerID="852b9a4d0b3ad819ba27c98d17b806223100bf53cb8f6d7e62572ee66855a559" Mar 10 11:51:01 crc kubenswrapper[4794]: E0310 11:51:01.721239 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"852b9a4d0b3ad819ba27c98d17b806223100bf53cb8f6d7e62572ee66855a559\": container with ID starting with 852b9a4d0b3ad819ba27c98d17b806223100bf53cb8f6d7e62572ee66855a559 not found: ID does not exist" containerID="852b9a4d0b3ad819ba27c98d17b806223100bf53cb8f6d7e62572ee66855a559" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.721275 
4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852b9a4d0b3ad819ba27c98d17b806223100bf53cb8f6d7e62572ee66855a559"} err="failed to get container status \"852b9a4d0b3ad819ba27c98d17b806223100bf53cb8f6d7e62572ee66855a559\": rpc error: code = NotFound desc = could not find container \"852b9a4d0b3ad819ba27c98d17b806223100bf53cb8f6d7e62572ee66855a559\": container with ID starting with 852b9a4d0b3ad819ba27c98d17b806223100bf53cb8f6d7e62572ee66855a559 not found: ID does not exist" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.721300 4794 scope.go:117] "RemoveContainer" containerID="e1d9628f3af08ba8cf39a9223ecaed9cdcf0597b106e2ce6f9a5c61920b3e324" Mar 10 11:51:01 crc kubenswrapper[4794]: E0310 11:51:01.724718 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1d9628f3af08ba8cf39a9223ecaed9cdcf0597b106e2ce6f9a5c61920b3e324\": container with ID starting with e1d9628f3af08ba8cf39a9223ecaed9cdcf0597b106e2ce6f9a5c61920b3e324 not found: ID does not exist" containerID="e1d9628f3af08ba8cf39a9223ecaed9cdcf0597b106e2ce6f9a5c61920b3e324" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.724783 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1d9628f3af08ba8cf39a9223ecaed9cdcf0597b106e2ce6f9a5c61920b3e324"} err="failed to get container status \"e1d9628f3af08ba8cf39a9223ecaed9cdcf0597b106e2ce6f9a5c61920b3e324\": rpc error: code = NotFound desc = could not find container \"e1d9628f3af08ba8cf39a9223ecaed9cdcf0597b106e2ce6f9a5c61920b3e324\": container with ID starting with e1d9628f3af08ba8cf39a9223ecaed9cdcf0597b106e2ce6f9a5c61920b3e324 not found: ID does not exist" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.724813 4794 scope.go:117] "RemoveContainer" containerID="24ab763b764bd0eecaac4b1ac285065e3c1b7b8320012bb2eb505c734f35bcda" Mar 10 11:51:01 crc kubenswrapper[4794]: E0310 11:51:01.728055 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24ab763b764bd0eecaac4b1ac285065e3c1b7b8320012bb2eb505c734f35bcda\": container with ID starting with 24ab763b764bd0eecaac4b1ac285065e3c1b7b8320012bb2eb505c734f35bcda not found: ID does not exist" containerID="24ab763b764bd0eecaac4b1ac285065e3c1b7b8320012bb2eb505c734f35bcda" Mar 10 11:51:01 crc kubenswrapper[4794]: I0310 11:51:01.728087 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24ab763b764bd0eecaac4b1ac285065e3c1b7b8320012bb2eb505c734f35bcda"} err="failed to get container status \"24ab763b764bd0eecaac4b1ac285065e3c1b7b8320012bb2eb505c734f35bcda\": rpc error: code = NotFound desc = could not find container \"24ab763b764bd0eecaac4b1ac285065e3c1b7b8320012bb2eb505c734f35bcda\": container with ID starting with 24ab763b764bd0eecaac4b1ac285065e3c1b7b8320012bb2eb505c734f35bcda not found: ID does not exist" Mar 10 11:51:02 crc kubenswrapper[4794]: I0310 11:51:02.013046 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c617148-7474-4963-b02f-a43204c9594a" path="/var/lib/kubelet/pods/1c617148-7474-4963-b02f-a43204c9594a/volumes" Mar 10 11:51:10 crc kubenswrapper[4794]: I0310 11:51:10.000250 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:51:10 crc kubenswrapper[4794]: E0310 11:51:10.000978 4794 pod_workers.go:1301] "Error syncing pod, skipping" 
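
The RemoveContainer / NotFound exchanges above are a benign race: by the time the kubelet asks the runtime for the container's status, CRI-O has already deleted it, so the runtime answers NotFound and the kubelet logs the error but proceeds as if the removal succeeded. The pattern is ordinary idempotent cleanup; a minimal sketch with hypothetical names, not kubelet's actual code:

```python
class NotFoundError(Exception):
    """Stand-in for the gRPC NotFound status returned by the runtime."""

class FakeRuntime:
    def __init__(self, containers):
        self.containers = set(containers)
    def remove(self, cid):
        if cid not in self.containers:
            raise NotFoundError(cid)
        self.containers.remove(cid)

def remove_container(runtime, cid: str) -> None:
    # Idempotent delete: a container that is already gone counts as removed,
    # mirroring the "DeleteContainer returned error ... not found" records above.
    try:
        runtime.remove(cid)
    except NotFoundError:
        print(f"container {cid[:12]} already gone; treating removal as complete")

remove_container(FakeRuntime(containers=[]), "852b9a4d0b3a")  # logs and returns normally
```
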
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:51:24 crc kubenswrapper[4794]: I0310 11:51:24.000794 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:51:24 crc kubenswrapper[4794]: E0310 11:51:24.001958 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:51:34 crc kubenswrapper[4794]: I0310 11:51:34.949657 4794 generic.go:334] "Generic (PLEG): container finished" podID="4a19915b-ffd3-4b57-a806-0dc4f67e3003" containerID="6e43199d622059bf96781753d35331c9e0fd118f0ec29642792871ad0549e112" exitCode=0 Mar 10 11:51:34 crc kubenswrapper[4794]: I0310 11:51:34.949746 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-cms62" event={"ID":"4a19915b-ffd3-4b57-a806-0dc4f67e3003","Type":"ContainerDied","Data":"6e43199d622059bf96781753d35331c9e0fd118f0ec29642792871ad0549e112"} Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.559381 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l8qfw"] Mar 10 11:51:36 crc kubenswrapper[4794]: E0310 11:51:36.560158 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c617148-7474-4963-b02f-a43204c9594a" containerName="registry-server" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.560175 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c617148-7474-4963-b02f-a43204c9594a" containerName="registry-server" Mar 10 11:51:36 crc kubenswrapper[4794]: E0310 11:51:36.560205 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c617148-7474-4963-b02f-a43204c9594a" containerName="extract-content" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.560215 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c617148-7474-4963-b02f-a43204c9594a" containerName="extract-content" Mar 10 11:51:36 crc kubenswrapper[4794]: E0310 11:51:36.560252 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c617148-7474-4963-b02f-a43204c9594a" containerName="extract-utilities" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.560260 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c617148-7474-4963-b02f-a43204c9594a" containerName="extract-utilities" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.560530 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c617148-7474-4963-b02f-a43204c9594a" containerName="registry-server" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.562572 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.578394 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l8qfw"] Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.582938 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.702092 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-ceph\") pod \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.702481 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdl6p\" (UniqueName: \"kubernetes.io/projected/4a19915b-ffd3-4b57-a806-0dc4f67e3003-kube-api-access-bdl6p\") pod \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.702612 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-inventory\") pod \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.702634 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-ssh-key-openstack-cell1\") pod \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\" (UID: \"4a19915b-ffd3-4b57-a806-0dc4f67e3003\") " Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.702886 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b98e5e74-15a0-4564-80cc-3bb7810a8682-utilities\") pod \"redhat-operators-l8qfw\" (UID: \"b98e5e74-15a0-4564-80cc-3bb7810a8682\") " pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.703063 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b98e5e74-15a0-4564-80cc-3bb7810a8682-catalog-content\") pod \"redhat-operators-l8qfw\" (UID: \"b98e5e74-15a0-4564-80cc-3bb7810a8682\") " pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.703096 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zztdf\" (UniqueName: \"kubernetes.io/projected/b98e5e74-15a0-4564-80cc-3bb7810a8682-kube-api-access-zztdf\") pod \"redhat-operators-l8qfw\" (UID: \"b98e5e74-15a0-4564-80cc-3bb7810a8682\") " pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.708566 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a19915b-ffd3-4b57-a806-0dc4f67e3003-kube-api-access-bdl6p" (OuterVolumeSpecName: "kube-api-access-bdl6p") pod "4a19915b-ffd3-4b57-a806-0dc4f67e3003" (UID: "4a19915b-ffd3-4b57-a806-0dc4f67e3003"). InnerVolumeSpecName "kube-api-access-bdl6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.713055 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-ceph" (OuterVolumeSpecName: "ceph") pod "4a19915b-ffd3-4b57-a806-0dc4f67e3003" (UID: "4a19915b-ffd3-4b57-a806-0dc4f67e3003"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.732777 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-inventory" (OuterVolumeSpecName: "inventory") pod "4a19915b-ffd3-4b57-a806-0dc4f67e3003" (UID: "4a19915b-ffd3-4b57-a806-0dc4f67e3003"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.765451 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "4a19915b-ffd3-4b57-a806-0dc4f67e3003" (UID: "4a19915b-ffd3-4b57-a806-0dc4f67e3003"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.804952 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b98e5e74-15a0-4564-80cc-3bb7810a8682-catalog-content\") pod \"redhat-operators-l8qfw\" (UID: \"b98e5e74-15a0-4564-80cc-3bb7810a8682\") " pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.805024 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zztdf\" (UniqueName: \"kubernetes.io/projected/b98e5e74-15a0-4564-80cc-3bb7810a8682-kube-api-access-zztdf\") pod \"redhat-operators-l8qfw\" (UID: \"b98e5e74-15a0-4564-80cc-3bb7810a8682\") " pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.805127 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b98e5e74-15a0-4564-80cc-3bb7810a8682-utilities\") pod \"redhat-operators-l8qfw\" (UID: \"b98e5e74-15a0-4564-80cc-3bb7810a8682\") " pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.805260 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.805277 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.805288 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4a19915b-ffd3-4b57-a806-0dc4f67e3003-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.805297 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdl6p\" (UniqueName: \"kubernetes.io/projected/4a19915b-ffd3-4b57-a806-0dc4f67e3003-kube-api-access-bdl6p\") on 
node \"crc\" DevicePath \"\"" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.805747 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b98e5e74-15a0-4564-80cc-3bb7810a8682-utilities\") pod \"redhat-operators-l8qfw\" (UID: \"b98e5e74-15a0-4564-80cc-3bb7810a8682\") " pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.805753 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b98e5e74-15a0-4564-80cc-3bb7810a8682-catalog-content\") pod \"redhat-operators-l8qfw\" (UID: \"b98e5e74-15a0-4564-80cc-3bb7810a8682\") " pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.820708 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zztdf\" (UniqueName: \"kubernetes.io/projected/b98e5e74-15a0-4564-80cc-3bb7810a8682-kube-api-access-zztdf\") pod \"redhat-operators-l8qfw\" (UID: \"b98e5e74-15a0-4564-80cc-3bb7810a8682\") " pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.900886 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.979116 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-cms62" event={"ID":"4a19915b-ffd3-4b57-a806-0dc4f67e3003","Type":"ContainerDied","Data":"a90eaf08313920d3073f50721bd4694a5a06038d5039614969939271dc73ab78"} Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.979157 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a90eaf08313920d3073f50721bd4694a5a06038d5039614969939271dc73ab78" Mar 10 11:51:36 crc kubenswrapper[4794]: I0310 11:51:36.979232 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-cms62" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.104681 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-px9pl"] Mar 10 11:51:37 crc kubenswrapper[4794]: E0310 11:51:37.105077 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a19915b-ffd3-4b57-a806-0dc4f67e3003" containerName="configure-os-openstack-openstack-cell1" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.105097 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a19915b-ffd3-4b57-a806-0dc4f67e3003" containerName="configure-os-openstack-openstack-cell1" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.105306 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a19915b-ffd3-4b57-a806-0dc4f67e3003" containerName="configure-os-openstack-openstack-cell1" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.106139 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.116856 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.117608 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.117797 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.118067 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.143273 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-px9pl"] Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.215927 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-px9pl\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.215974 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-ceph\") pod \"ssh-known-hosts-openstack-px9pl\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.216081 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95vkf\" (UniqueName: \"kubernetes.io/projected/f8422031-955c-4264-86dc-c633abfa5290-kube-api-access-95vkf\") pod \"ssh-known-hosts-openstack-px9pl\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.216526 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-inventory-0\") pod \"ssh-known-hosts-openstack-px9pl\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.318838 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-inventory-0\") pod \"ssh-known-hosts-openstack-px9pl\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.318948 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-px9pl\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.318969 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-ceph\") pod \"ssh-known-hosts-openstack-px9pl\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.319003 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95vkf\" (UniqueName: \"kubernetes.io/projected/f8422031-955c-4264-86dc-c633abfa5290-kube-api-access-95vkf\") pod \"ssh-known-hosts-openstack-px9pl\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.335875 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-px9pl\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.337171 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-inventory-0\") pod \"ssh-known-hosts-openstack-px9pl\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.339930 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-ceph\") pod \"ssh-known-hosts-openstack-px9pl\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.342643 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95vkf\" (UniqueName: \"kubernetes.io/projected/f8422031-955c-4264-86dc-c633abfa5290-kube-api-access-95vkf\") pod \"ssh-known-hosts-openstack-px9pl\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.426162 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l8qfw"] Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.439870 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:37 crc kubenswrapper[4794]: I0310 11:51:37.998744 4794 generic.go:334] "Generic (PLEG): container finished" podID="b98e5e74-15a0-4564-80cc-3bb7810a8682" containerID="c304f7e19f6f597e4c6440a87edb58e497c83db14f15433a872cd1a12abe3b61" exitCode=0 Mar 10 11:51:38 crc kubenswrapper[4794]: I0310 11:51:38.001136 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 11:51:38 crc kubenswrapper[4794]: I0310 11:51:38.018531 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8qfw" event={"ID":"b98e5e74-15a0-4564-80cc-3bb7810a8682","Type":"ContainerDied","Data":"c304f7e19f6f597e4c6440a87edb58e497c83db14f15433a872cd1a12abe3b61"} Mar 10 11:51:38 crc kubenswrapper[4794]: I0310 11:51:38.018572 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8qfw" event={"ID":"b98e5e74-15a0-4564-80cc-3bb7810a8682","Type":"ContainerStarted","Data":"aca98efabdfee756f4df7e6b320a22777033bc65021dbc12c76187a4fdada3c1"} Mar 10 11:51:38 crc kubenswrapper[4794]: W0310 11:51:38.065511 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8422031_955c_4264_86dc_c633abfa5290.slice/crio-1d9154c8a1a953adcf2bb64f6b6139b605c14f66d2fdf8e1f0fa8cafe6499757 WatchSource:0}: Error finding container 1d9154c8a1a953adcf2bb64f6b6139b605c14f66d2fdf8e1f0fa8cafe6499757: Status 404 returned error can't find the container with id 1d9154c8a1a953adcf2bb64f6b6139b605c14f66d2fdf8e1f0fa8cafe6499757 Mar 10 11:51:38 crc kubenswrapper[4794]: I0310 11:51:38.077699 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-px9pl"] Mar 10 11:51:39 crc kubenswrapper[4794]: I0310 11:51:38.999874 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:51:39 crc kubenswrapper[4794]: E0310 11:51:39.000556 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:51:39 crc kubenswrapper[4794]: I0310 11:51:39.020439 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-px9pl" event={"ID":"f8422031-955c-4264-86dc-c633abfa5290","Type":"ContainerStarted","Data":"2914bbb2c920c9b65033838726c5098a6fb7ad295ac851472b6638033e0b9b9f"} Mar 10 11:51:39 crc kubenswrapper[4794]: I0310 11:51:39.020485 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-px9pl" event={"ID":"f8422031-955c-4264-86dc-c633abfa5290","Type":"ContainerStarted","Data":"1d9154c8a1a953adcf2bb64f6b6139b605c14f66d2fdf8e1f0fa8cafe6499757"} Mar 10 11:51:39 crc kubenswrapper[4794]: I0310 11:51:39.041475 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-px9pl" podStartSLOduration=1.630353134 podStartE2EDuration="2.041448216s" podCreationTimestamp="2026-03-10 11:51:37 +0000 UTC" firstStartedPulling="2026-03-10 11:51:38.068219753 +0000 UTC m=+7646.824390571" 
lastFinishedPulling="2026-03-10 11:51:38.479314835 +0000 UTC m=+7647.235485653" observedRunningTime="2026-03-10 11:51:39.039170576 +0000 UTC m=+7647.795341444" watchObservedRunningTime="2026-03-10 11:51:39.041448216 +0000 UTC m=+7647.797619044" Mar 10 11:51:40 crc kubenswrapper[4794]: I0310 11:51:40.053003 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8qfw" event={"ID":"b98e5e74-15a0-4564-80cc-3bb7810a8682","Type":"ContainerStarted","Data":"2c63331275dc3a632159b2fde0d96293208650446479e1d72624ca59fe947e8d"} Mar 10 11:51:44 crc kubenswrapper[4794]: I0310 11:51:44.104995 4794 generic.go:334] "Generic (PLEG): container finished" podID="b98e5e74-15a0-4564-80cc-3bb7810a8682" containerID="2c63331275dc3a632159b2fde0d96293208650446479e1d72624ca59fe947e8d" exitCode=0 Mar 10 11:51:44 crc kubenswrapper[4794]: I0310 11:51:44.105107 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8qfw" event={"ID":"b98e5e74-15a0-4564-80cc-3bb7810a8682","Type":"ContainerDied","Data":"2c63331275dc3a632159b2fde0d96293208650446479e1d72624ca59fe947e8d"} Mar 10 11:51:45 crc kubenswrapper[4794]: I0310 11:51:45.117421 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8qfw" event={"ID":"b98e5e74-15a0-4564-80cc-3bb7810a8682","Type":"ContainerStarted","Data":"3dd74ca70a7da8cf9804f0c3d1f985ea84144056394f50d6094de1b4b9d6e53d"} Mar 10 11:51:45 crc kubenswrapper[4794]: I0310 11:51:45.138601 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l8qfw" podStartSLOduration=2.60734879 podStartE2EDuration="9.138585648s" podCreationTimestamp="2026-03-10 11:51:36 +0000 UTC" firstStartedPulling="2026-03-10 11:51:38.000907864 +0000 UTC m=+7646.757078692" lastFinishedPulling="2026-03-10 11:51:44.532144712 +0000 UTC m=+7653.288315550" observedRunningTime="2026-03-10 11:51:45.134817031 +0000 UTC m=+7653.890987849" watchObservedRunningTime="2026-03-10 11:51:45.138585648 +0000 UTC m=+7653.894756466" Mar 10 11:51:46 crc kubenswrapper[4794]: I0310 11:51:46.902138 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:51:46 crc kubenswrapper[4794]: I0310 11:51:46.902637 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:51:47 crc kubenswrapper[4794]: I0310 11:51:47.975322 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l8qfw" podUID="b98e5e74-15a0-4564-80cc-3bb7810a8682" containerName="registry-server" probeResult="failure" output=< Mar 10 11:51:47 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 11:51:47 crc kubenswrapper[4794]: > Mar 10 11:51:48 crc kubenswrapper[4794]: I0310 11:51:48.166533 4794 generic.go:334] "Generic (PLEG): container finished" podID="f8422031-955c-4264-86dc-c633abfa5290" containerID="2914bbb2c920c9b65033838726c5098a6fb7ad295ac851472b6638033e0b9b9f" exitCode=0 Mar 10 11:51:48 crc kubenswrapper[4794]: I0310 11:51:48.166578 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-px9pl" event={"ID":"f8422031-955c-4264-86dc-c633abfa5290","Type":"ContainerDied","Data":"2914bbb2c920c9b65033838726c5098a6fb7ad295ac851472b6638033e0b9b9f"} Mar 10 11:51:49 crc kubenswrapper[4794]: I0310 11:51:49.731467 4794 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:49 crc kubenswrapper[4794]: I0310 11:51:49.801074 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-ssh-key-openstack-cell1\") pod \"f8422031-955c-4264-86dc-c633abfa5290\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " Mar 10 11:51:49 crc kubenswrapper[4794]: I0310 11:51:49.801205 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-ceph\") pod \"f8422031-955c-4264-86dc-c633abfa5290\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " Mar 10 11:51:49 crc kubenswrapper[4794]: I0310 11:51:49.801227 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95vkf\" (UniqueName: \"kubernetes.io/projected/f8422031-955c-4264-86dc-c633abfa5290-kube-api-access-95vkf\") pod \"f8422031-955c-4264-86dc-c633abfa5290\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " Mar 10 11:51:49 crc kubenswrapper[4794]: I0310 11:51:49.801294 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-inventory-0\") pod \"f8422031-955c-4264-86dc-c633abfa5290\" (UID: \"f8422031-955c-4264-86dc-c633abfa5290\") " Mar 10 11:51:49 crc kubenswrapper[4794]: I0310 11:51:49.809574 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8422031-955c-4264-86dc-c633abfa5290-kube-api-access-95vkf" (OuterVolumeSpecName: "kube-api-access-95vkf") pod "f8422031-955c-4264-86dc-c633abfa5290" (UID: "f8422031-955c-4264-86dc-c633abfa5290"). InnerVolumeSpecName "kube-api-access-95vkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:51:49 crc kubenswrapper[4794]: I0310 11:51:49.809672 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-ceph" (OuterVolumeSpecName: "ceph") pod "f8422031-955c-4264-86dc-c633abfa5290" (UID: "f8422031-955c-4264-86dc-c633abfa5290"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:51:49 crc kubenswrapper[4794]: I0310 11:51:49.849021 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "f8422031-955c-4264-86dc-c633abfa5290" (UID: "f8422031-955c-4264-86dc-c633abfa5290"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:51:49 crc kubenswrapper[4794]: I0310 11:51:49.852250 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "f8422031-955c-4264-86dc-c633abfa5290" (UID: "f8422031-955c-4264-86dc-c633abfa5290"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
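
The startup-probe failure for redhat-operators-l8qfw a few records up reported "timeout: failed to connect service \":50051\" within 1s", i.e. the registry-server's health port was not yet accepting connections inside the 1-second budget while the catalog was still loading. A rough equivalent of that check, reduced to a plain TCP connect with a deadline (the real probe is gRPC-based, so this is only an approximation):

```python
import socket

def port_ready(host: str, port: int, timeout_s: float = 1.0) -> bool:
    # Succeed only if the port accepts a connection within the deadline,
    # mirroring the 1s budget in the probe output above.
    try:
        with socket.create_connection((host, port), timeout=timeout_s):
            return True
    except OSError:
        return False

print(port_ready("127.0.0.1", 50051))  # False while registry-server is still starting
```
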
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:51:49 crc kubenswrapper[4794]: I0310 11:51:49.904769 4794 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 10 11:51:49 crc kubenswrapper[4794]: I0310 11:51:49.904813 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:51:49 crc kubenswrapper[4794]: I0310 11:51:49.904829 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f8422031-955c-4264-86dc-c633abfa5290-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:51:49 crc kubenswrapper[4794]: I0310 11:51:49.904845 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95vkf\" (UniqueName: \"kubernetes.io/projected/f8422031-955c-4264-86dc-c633abfa5290-kube-api-access-95vkf\") on node \"crc\" DevicePath \"\"" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.190223 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-px9pl" event={"ID":"f8422031-955c-4264-86dc-c633abfa5290","Type":"ContainerDied","Data":"1d9154c8a1a953adcf2bb64f6b6139b605c14f66d2fdf8e1f0fa8cafe6499757"} Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.190272 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d9154c8a1a953adcf2bb64f6b6139b605c14f66d2fdf8e1f0fa8cafe6499757" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.190276 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-px9pl" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.344522 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-msjrc"] Mar 10 11:51:50 crc kubenswrapper[4794]: E0310 11:51:50.344926 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8422031-955c-4264-86dc-c633abfa5290" containerName="ssh-known-hosts-openstack" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.344941 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8422031-955c-4264-86dc-c633abfa5290" containerName="ssh-known-hosts-openstack" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.345142 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8422031-955c-4264-86dc-c633abfa5290" containerName="ssh-known-hosts-openstack" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.345852 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.350068 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.350230 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.354670 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.354866 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.358803 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-msjrc"] Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.417017 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-msjrc\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.417101 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-ceph\") pod \"run-os-openstack-openstack-cell1-msjrc\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.417143 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-inventory\") pod \"run-os-openstack-openstack-cell1-msjrc\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.417196 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcvfq\" (UniqueName: \"kubernetes.io/projected/f756112e-b33c-4c53-aecd-408a3f22f8cf-kube-api-access-xcvfq\") pod \"run-os-openstack-openstack-cell1-msjrc\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.519666 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-msjrc\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.519989 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-ceph\") pod \"run-os-openstack-openstack-cell1-msjrc\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.520088 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-inventory\") pod \"run-os-openstack-openstack-cell1-msjrc\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.520220 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcvfq\" (UniqueName: \"kubernetes.io/projected/f756112e-b33c-4c53-aecd-408a3f22f8cf-kube-api-access-xcvfq\") pod \"run-os-openstack-openstack-cell1-msjrc\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.524684 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-inventory\") pod \"run-os-openstack-openstack-cell1-msjrc\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.524911 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-ceph\") pod \"run-os-openstack-openstack-cell1-msjrc\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.524994 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-msjrc\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.541422 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcvfq\" (UniqueName: \"kubernetes.io/projected/f756112e-b33c-4c53-aecd-408a3f22f8cf-kube-api-access-xcvfq\") pod \"run-os-openstack-openstack-cell1-msjrc\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:51:50 crc kubenswrapper[4794]: I0310 11:51:50.677581 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:51:51 crc kubenswrapper[4794]: I0310 11:51:51.248188 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-msjrc"] Mar 10 11:51:51 crc kubenswrapper[4794]: W0310 11:51:51.250990 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf756112e_b33c_4c53_aecd_408a3f22f8cf.slice/crio-fe549c0dc0eb20cd6f6e4c6a7eae6e59161c2806b2992c5ce4538464355906f6 WatchSource:0}: Error finding container fe549c0dc0eb20cd6f6e4c6a7eae6e59161c2806b2992c5ce4538464355906f6: Status 404 returned error can't find the container with id fe549c0dc0eb20cd6f6e4c6a7eae6e59161c2806b2992c5ce4538464355906f6 Mar 10 11:51:52 crc kubenswrapper[4794]: I0310 11:51:52.210300 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-msjrc" event={"ID":"f756112e-b33c-4c53-aecd-408a3f22f8cf","Type":"ContainerStarted","Data":"3d192c4250a4dba32757f0b36619b3866b900342e26b18e53edc29d860d1afd7"} Mar 10 11:51:52 crc kubenswrapper[4794]: I0310 11:51:52.210622 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-msjrc" event={"ID":"f756112e-b33c-4c53-aecd-408a3f22f8cf","Type":"ContainerStarted","Data":"fe549c0dc0eb20cd6f6e4c6a7eae6e59161c2806b2992c5ce4538464355906f6"} Mar 10 11:51:52 crc kubenswrapper[4794]: I0310 11:51:52.229823 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-msjrc" podStartSLOduration=1.765518518 podStartE2EDuration="2.229806991s" podCreationTimestamp="2026-03-10 11:51:50 +0000 UTC" firstStartedPulling="2026-03-10 11:51:51.254276387 +0000 UTC m=+7660.010447225" lastFinishedPulling="2026-03-10 11:51:51.71856487 +0000 UTC m=+7660.474735698" observedRunningTime="2026-03-10 11:51:52.224554218 +0000 UTC m=+7660.980725046" watchObservedRunningTime="2026-03-10 11:51:52.229806991 +0000 UTC m=+7660.985977809" Mar 10 11:51:52 crc kubenswrapper[4794]: I0310 11:51:52.999113 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:51:53 crc kubenswrapper[4794]: E0310 11:51:52.999588 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:51:53 crc kubenswrapper[4794]: I0310 11:51:53.434858 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f9bxd"] Mar 10 11:51:53 crc kubenswrapper[4794]: I0310 11:51:53.437969 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:51:53 crc kubenswrapper[4794]: I0310 11:51:53.444926 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9bxd"] Mar 10 11:51:53 crc kubenswrapper[4794]: I0310 11:51:53.493116 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0afd8144-f691-4c04-a437-efc0b7352cee-catalog-content\") pod \"community-operators-f9bxd\" (UID: \"0afd8144-f691-4c04-a437-efc0b7352cee\") " pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:51:53 crc kubenswrapper[4794]: I0310 11:51:53.493377 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msht8\" (UniqueName: \"kubernetes.io/projected/0afd8144-f691-4c04-a437-efc0b7352cee-kube-api-access-msht8\") pod \"community-operators-f9bxd\" (UID: \"0afd8144-f691-4c04-a437-efc0b7352cee\") " pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:51:53 crc kubenswrapper[4794]: I0310 11:51:53.493428 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afd8144-f691-4c04-a437-efc0b7352cee-utilities\") pod \"community-operators-f9bxd\" (UID: \"0afd8144-f691-4c04-a437-efc0b7352cee\") " pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:51:53 crc kubenswrapper[4794]: I0310 11:51:53.595467 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msht8\" (UniqueName: \"kubernetes.io/projected/0afd8144-f691-4c04-a437-efc0b7352cee-kube-api-access-msht8\") pod \"community-operators-f9bxd\" (UID: \"0afd8144-f691-4c04-a437-efc0b7352cee\") " pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:51:53 crc kubenswrapper[4794]: I0310 11:51:53.595528 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afd8144-f691-4c04-a437-efc0b7352cee-utilities\") pod \"community-operators-f9bxd\" (UID: \"0afd8144-f691-4c04-a437-efc0b7352cee\") " pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:51:53 crc kubenswrapper[4794]: I0310 11:51:53.595661 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0afd8144-f691-4c04-a437-efc0b7352cee-catalog-content\") pod \"community-operators-f9bxd\" (UID: \"0afd8144-f691-4c04-a437-efc0b7352cee\") " pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:51:53 crc kubenswrapper[4794]: I0310 11:51:53.596288 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0afd8144-f691-4c04-a437-efc0b7352cee-catalog-content\") pod \"community-operators-f9bxd\" (UID: \"0afd8144-f691-4c04-a437-efc0b7352cee\") " pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:51:53 crc kubenswrapper[4794]: I0310 11:51:53.596939 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afd8144-f691-4c04-a437-efc0b7352cee-utilities\") pod \"community-operators-f9bxd\" (UID: \"0afd8144-f691-4c04-a437-efc0b7352cee\") " pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:51:53 crc kubenswrapper[4794]: I0310 11:51:53.623323 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-msht8\" (UniqueName: \"kubernetes.io/projected/0afd8144-f691-4c04-a437-efc0b7352cee-kube-api-access-msht8\") pod \"community-operators-f9bxd\" (UID: \"0afd8144-f691-4c04-a437-efc0b7352cee\") " pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:51:53 crc kubenswrapper[4794]: I0310 11:51:53.816223 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:51:54 crc kubenswrapper[4794]: W0310 11:51:54.399783 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0afd8144_f691_4c04_a437_efc0b7352cee.slice/crio-ee53435424025d7c88830efbdc82dde9d297bf338377db023f8f69723e007737 WatchSource:0}: Error finding container ee53435424025d7c88830efbdc82dde9d297bf338377db023f8f69723e007737: Status 404 returned error can't find the container with id ee53435424025d7c88830efbdc82dde9d297bf338377db023f8f69723e007737 Mar 10 11:51:54 crc kubenswrapper[4794]: I0310 11:51:54.401902 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9bxd"] Mar 10 11:51:55 crc kubenswrapper[4794]: I0310 11:51:55.242533 4794 generic.go:334] "Generic (PLEG): container finished" podID="0afd8144-f691-4c04-a437-efc0b7352cee" containerID="dbe6aab43f607e3929fc25c919a80db56b70319e0dcc236f33aeed788fa1ef8d" exitCode=0 Mar 10 11:51:55 crc kubenswrapper[4794]: I0310 11:51:55.242648 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxd" event={"ID":"0afd8144-f691-4c04-a437-efc0b7352cee","Type":"ContainerDied","Data":"dbe6aab43f607e3929fc25c919a80db56b70319e0dcc236f33aeed788fa1ef8d"} Mar 10 11:51:55 crc kubenswrapper[4794]: I0310 11:51:55.242848 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxd" event={"ID":"0afd8144-f691-4c04-a437-efc0b7352cee","Type":"ContainerStarted","Data":"ee53435424025d7c88830efbdc82dde9d297bf338377db023f8f69723e007737"} Mar 10 11:51:56 crc kubenswrapper[4794]: I0310 11:51:56.256459 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxd" event={"ID":"0afd8144-f691-4c04-a437-efc0b7352cee","Type":"ContainerStarted","Data":"88334a089d41d0472b7178b211cfd3b4336fb9435b94bec576c1c46c91afe6f2"} Mar 10 11:51:58 crc kubenswrapper[4794]: I0310 11:51:58.040288 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l8qfw" podUID="b98e5e74-15a0-4564-80cc-3bb7810a8682" containerName="registry-server" probeResult="failure" output=< Mar 10 11:51:58 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 11:51:58 crc kubenswrapper[4794]: > Mar 10 11:51:59 crc kubenswrapper[4794]: I0310 11:51:59.286652 4794 generic.go:334] "Generic (PLEG): container finished" podID="0afd8144-f691-4c04-a437-efc0b7352cee" containerID="88334a089d41d0472b7178b211cfd3b4336fb9435b94bec576c1c46c91afe6f2" exitCode=0 Mar 10 11:51:59 crc kubenswrapper[4794]: I0310 11:51:59.286713 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxd" event={"ID":"0afd8144-f691-4c04-a437-efc0b7352cee","Type":"ContainerDied","Data":"88334a089d41d0472b7178b211cfd3b4336fb9435b94bec576c1c46c91afe6f2"} Mar 10 11:52:00 crc kubenswrapper[4794]: I0310 11:52:00.139938 4794 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29552392-wkv8z"] Mar 10 11:52:00 crc kubenswrapper[4794]: I0310 11:52:00.142590 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552392-wkv8z" Mar 10 11:52:00 crc kubenswrapper[4794]: I0310 11:52:00.146366 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:52:00 crc kubenswrapper[4794]: I0310 11:52:00.146574 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:52:00 crc kubenswrapper[4794]: I0310 11:52:00.148426 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:52:00 crc kubenswrapper[4794]: I0310 11:52:00.152842 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552392-wkv8z"] Mar 10 11:52:00 crc kubenswrapper[4794]: I0310 11:52:00.240422 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b6cr\" (UniqueName: \"kubernetes.io/projected/784198eb-66e6-4c0e-aa8b-c8a2b9dbde07-kube-api-access-8b6cr\") pod \"auto-csr-approver-29552392-wkv8z\" (UID: \"784198eb-66e6-4c0e-aa8b-c8a2b9dbde07\") " pod="openshift-infra/auto-csr-approver-29552392-wkv8z" Mar 10 11:52:00 crc kubenswrapper[4794]: I0310 11:52:00.342114 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b6cr\" (UniqueName: \"kubernetes.io/projected/784198eb-66e6-4c0e-aa8b-c8a2b9dbde07-kube-api-access-8b6cr\") pod \"auto-csr-approver-29552392-wkv8z\" (UID: \"784198eb-66e6-4c0e-aa8b-c8a2b9dbde07\") " pod="openshift-infra/auto-csr-approver-29552392-wkv8z" Mar 10 11:52:00 crc kubenswrapper[4794]: I0310 11:52:00.365123 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b6cr\" (UniqueName: \"kubernetes.io/projected/784198eb-66e6-4c0e-aa8b-c8a2b9dbde07-kube-api-access-8b6cr\") pod \"auto-csr-approver-29552392-wkv8z\" (UID: \"784198eb-66e6-4c0e-aa8b-c8a2b9dbde07\") " pod="openshift-infra/auto-csr-approver-29552392-wkv8z" Mar 10 11:52:00 crc kubenswrapper[4794]: I0310 11:52:00.476076 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552392-wkv8z" Mar 10 11:52:00 crc kubenswrapper[4794]: I0310 11:52:00.988842 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552392-wkv8z"] Mar 10 11:52:00 crc kubenswrapper[4794]: W0310 11:52:00.989583 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod784198eb_66e6_4c0e_aa8b_c8a2b9dbde07.slice/crio-74659d388ec5153dc394f4d0f0fba891fd3cb1815bf6a3ff2fc6332a4c6c951f WatchSource:0}: Error finding container 74659d388ec5153dc394f4d0f0fba891fd3cb1815bf6a3ff2fc6332a4c6c951f: Status 404 returned error can't find the container with id 74659d388ec5153dc394f4d0f0fba891fd3cb1815bf6a3ff2fc6332a4c6c951f Mar 10 11:52:01 crc kubenswrapper[4794]: I0310 11:52:01.306062 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552392-wkv8z" event={"ID":"784198eb-66e6-4c0e-aa8b-c8a2b9dbde07","Type":"ContainerStarted","Data":"74659d388ec5153dc394f4d0f0fba891fd3cb1815bf6a3ff2fc6332a4c6c951f"} Mar 10 11:52:01 crc kubenswrapper[4794]: I0310 11:52:01.308873 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxd" event={"ID":"0afd8144-f691-4c04-a437-efc0b7352cee","Type":"ContainerStarted","Data":"9f1c57c7ed9d9e0a4380262ff334fbc39992e1e7fc7c189050d825ce59b831ce"} Mar 10 11:52:01 crc kubenswrapper[4794]: I0310 11:52:01.337694 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f9bxd" podStartSLOduration=3.265036332 podStartE2EDuration="8.337670779s" podCreationTimestamp="2026-03-10 11:51:53 +0000 UTC" firstStartedPulling="2026-03-10 11:51:55.245001116 +0000 UTC m=+7664.001171944" lastFinishedPulling="2026-03-10 11:52:00.317635573 +0000 UTC m=+7669.073806391" observedRunningTime="2026-03-10 11:52:01.327919846 +0000 UTC m=+7670.084090684" watchObservedRunningTime="2026-03-10 11:52:01.337670779 +0000 UTC m=+7670.093841607" Mar 10 11:52:02 crc kubenswrapper[4794]: I0310 11:52:02.320969 4794 generic.go:334] "Generic (PLEG): container finished" podID="f756112e-b33c-4c53-aecd-408a3f22f8cf" containerID="3d192c4250a4dba32757f0b36619b3866b900342e26b18e53edc29d860d1afd7" exitCode=0 Mar 10 11:52:02 crc kubenswrapper[4794]: I0310 11:52:02.321035 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-msjrc" event={"ID":"f756112e-b33c-4c53-aecd-408a3f22f8cf","Type":"ContainerDied","Data":"3d192c4250a4dba32757f0b36619b3866b900342e26b18e53edc29d860d1afd7"} Mar 10 11:52:03 crc kubenswrapper[4794]: I0310 11:52:03.335156 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552392-wkv8z" event={"ID":"784198eb-66e6-4c0e-aa8b-c8a2b9dbde07","Type":"ContainerStarted","Data":"18403c1366ff205ad2bdefb355ebf1c08929ba25828291b9ab5a8d36a9c54321"} Mar 10 11:52:03 crc kubenswrapper[4794]: I0310 11:52:03.359159 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552392-wkv8z" podStartSLOduration=1.937264393 podStartE2EDuration="3.359139155s" podCreationTimestamp="2026-03-10 11:52:00 +0000 UTC" firstStartedPulling="2026-03-10 11:52:00.994477274 +0000 UTC m=+7669.750648132" lastFinishedPulling="2026-03-10 11:52:02.416352066 +0000 UTC m=+7671.172522894" observedRunningTime="2026-03-10 11:52:03.351530078 +0000 UTC m=+7672.107700906" 
watchObservedRunningTime="2026-03-10 11:52:03.359139155 +0000 UTC m=+7672.115309983" Mar 10 11:52:03 crc kubenswrapper[4794]: I0310 11:52:03.791251 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:52:03 crc kubenswrapper[4794]: I0310 11:52:03.817240 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:52:03 crc kubenswrapper[4794]: I0310 11:52:03.817275 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:52:03 crc kubenswrapper[4794]: I0310 11:52:03.919676 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-inventory\") pod \"f756112e-b33c-4c53-aecd-408a3f22f8cf\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " Mar 10 11:52:03 crc kubenswrapper[4794]: I0310 11:52:03.919724 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-ceph\") pod \"f756112e-b33c-4c53-aecd-408a3f22f8cf\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " Mar 10 11:52:03 crc kubenswrapper[4794]: I0310 11:52:03.919916 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-ssh-key-openstack-cell1\") pod \"f756112e-b33c-4c53-aecd-408a3f22f8cf\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " Mar 10 11:52:03 crc kubenswrapper[4794]: I0310 11:52:03.920154 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcvfq\" (UniqueName: \"kubernetes.io/projected/f756112e-b33c-4c53-aecd-408a3f22f8cf-kube-api-access-xcvfq\") pod \"f756112e-b33c-4c53-aecd-408a3f22f8cf\" (UID: \"f756112e-b33c-4c53-aecd-408a3f22f8cf\") " Mar 10 11:52:03 crc kubenswrapper[4794]: I0310 11:52:03.927443 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-ceph" (OuterVolumeSpecName: "ceph") pod "f756112e-b33c-4c53-aecd-408a3f22f8cf" (UID: "f756112e-b33c-4c53-aecd-408a3f22f8cf"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:03 crc kubenswrapper[4794]: I0310 11:52:03.927795 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f756112e-b33c-4c53-aecd-408a3f22f8cf-kube-api-access-xcvfq" (OuterVolumeSpecName: "kube-api-access-xcvfq") pod "f756112e-b33c-4c53-aecd-408a3f22f8cf" (UID: "f756112e-b33c-4c53-aecd-408a3f22f8cf"). InnerVolumeSpecName "kube-api-access-xcvfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:52:03 crc kubenswrapper[4794]: I0310 11:52:03.958486 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-inventory" (OuterVolumeSpecName: "inventory") pod "f756112e-b33c-4c53-aecd-408a3f22f8cf" (UID: "f756112e-b33c-4c53-aecd-408a3f22f8cf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:03 crc kubenswrapper[4794]: I0310 11:52:03.961581 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "f756112e-b33c-4c53-aecd-408a3f22f8cf" (UID: "f756112e-b33c-4c53-aecd-408a3f22f8cf"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.022242 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.022269 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcvfq\" (UniqueName: \"kubernetes.io/projected/f756112e-b33c-4c53-aecd-408a3f22f8cf-kube-api-access-xcvfq\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.022279 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.022287 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f756112e-b33c-4c53-aecd-408a3f22f8cf-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.346446 4794 generic.go:334] "Generic (PLEG): container finished" podID="784198eb-66e6-4c0e-aa8b-c8a2b9dbde07" containerID="18403c1366ff205ad2bdefb355ebf1c08929ba25828291b9ab5a8d36a9c54321" exitCode=0 Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.346536 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552392-wkv8z" event={"ID":"784198eb-66e6-4c0e-aa8b-c8a2b9dbde07","Type":"ContainerDied","Data":"18403c1366ff205ad2bdefb355ebf1c08929ba25828291b9ab5a8d36a9c54321"} Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.348965 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-msjrc" event={"ID":"f756112e-b33c-4c53-aecd-408a3f22f8cf","Type":"ContainerDied","Data":"fe549c0dc0eb20cd6f6e4c6a7eae6e59161c2806b2992c5ce4538464355906f6"} Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.349013 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe549c0dc0eb20cd6f6e4c6a7eae6e59161c2806b2992c5ce4538464355906f6" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.349060 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-msjrc" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.422938 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-xrfp9"] Mar 10 11:52:04 crc kubenswrapper[4794]: E0310 11:52:04.423419 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f756112e-b33c-4c53-aecd-408a3f22f8cf" containerName="run-os-openstack-openstack-cell1" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.423449 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f756112e-b33c-4c53-aecd-408a3f22f8cf" containerName="run-os-openstack-openstack-cell1" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.423720 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f756112e-b33c-4c53-aecd-408a3f22f8cf" containerName="run-os-openstack-openstack-cell1" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.425315 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.428546 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.428930 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.429148 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.429558 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.435941 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-xrfp9"] Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.533239 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-inventory\") pod \"reboot-os-openstack-openstack-cell1-xrfp9\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.533566 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-xrfp9\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.533655 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-ceph\") pod \"reboot-os-openstack-openstack-cell1-xrfp9\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.533749 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l22n\" (UniqueName: \"kubernetes.io/projected/a9adb25a-7a73-4147-869e-f5bbe20c230a-kube-api-access-6l22n\") pod 
\"reboot-os-openstack-openstack-cell1-xrfp9\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.635766 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-xrfp9\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.635840 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-ceph\") pod \"reboot-os-openstack-openstack-cell1-xrfp9\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.635902 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l22n\" (UniqueName: \"kubernetes.io/projected/a9adb25a-7a73-4147-869e-f5bbe20c230a-kube-api-access-6l22n\") pod \"reboot-os-openstack-openstack-cell1-xrfp9\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.636085 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-inventory\") pod \"reboot-os-openstack-openstack-cell1-xrfp9\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.641010 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-ceph\") pod \"reboot-os-openstack-openstack-cell1-xrfp9\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.641252 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-inventory\") pod \"reboot-os-openstack-openstack-cell1-xrfp9\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.643240 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-xrfp9\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.657735 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l22n\" (UniqueName: \"kubernetes.io/projected/a9adb25a-7a73-4147-869e-f5bbe20c230a-kube-api-access-6l22n\") pod \"reboot-os-openstack-openstack-cell1-xrfp9\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.794552 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:04 crc kubenswrapper[4794]: I0310 11:52:04.883659 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-f9bxd" podUID="0afd8144-f691-4c04-a437-efc0b7352cee" containerName="registry-server" probeResult="failure" output=< Mar 10 11:52:04 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 11:52:04 crc kubenswrapper[4794]: > Mar 10 11:52:05 crc kubenswrapper[4794]: I0310 11:52:05.423214 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-xrfp9"] Mar 10 11:52:06 crc kubenswrapper[4794]: I0310 11:52:05.763995 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552392-wkv8z" Mar 10 11:52:06 crc kubenswrapper[4794]: I0310 11:52:05.874731 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b6cr\" (UniqueName: \"kubernetes.io/projected/784198eb-66e6-4c0e-aa8b-c8a2b9dbde07-kube-api-access-8b6cr\") pod \"784198eb-66e6-4c0e-aa8b-c8a2b9dbde07\" (UID: \"784198eb-66e6-4c0e-aa8b-c8a2b9dbde07\") " Mar 10 11:52:06 crc kubenswrapper[4794]: I0310 11:52:05.884785 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784198eb-66e6-4c0e-aa8b-c8a2b9dbde07-kube-api-access-8b6cr" (OuterVolumeSpecName: "kube-api-access-8b6cr") pod "784198eb-66e6-4c0e-aa8b-c8a2b9dbde07" (UID: "784198eb-66e6-4c0e-aa8b-c8a2b9dbde07"). InnerVolumeSpecName "kube-api-access-8b6cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:52:06 crc kubenswrapper[4794]: I0310 11:52:05.977984 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b6cr\" (UniqueName: \"kubernetes.io/projected/784198eb-66e6-4c0e-aa8b-c8a2b9dbde07-kube-api-access-8b6cr\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:06 crc kubenswrapper[4794]: I0310 11:52:06.366285 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" event={"ID":"a9adb25a-7a73-4147-869e-f5bbe20c230a","Type":"ContainerStarted","Data":"e07a0699b4b9e1af3a6e1b172ddf2edca8a332f53dfbf932b42d21ab0a614657"} Mar 10 11:52:06 crc kubenswrapper[4794]: I0310 11:52:06.366327 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" event={"ID":"a9adb25a-7a73-4147-869e-f5bbe20c230a","Type":"ContainerStarted","Data":"1e577fe8e0553d989ffb5a1a4954620bdf5ce3663c35872d736ddf1cb7f0b907"} Mar 10 11:52:06 crc kubenswrapper[4794]: I0310 11:52:06.368565 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552392-wkv8z" event={"ID":"784198eb-66e6-4c0e-aa8b-c8a2b9dbde07","Type":"ContainerDied","Data":"74659d388ec5153dc394f4d0f0fba891fd3cb1815bf6a3ff2fc6332a4c6c951f"} Mar 10 11:52:06 crc kubenswrapper[4794]: I0310 11:52:06.368614 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74659d388ec5153dc394f4d0f0fba891fd3cb1815bf6a3ff2fc6332a4c6c951f" Mar 10 11:52:06 crc kubenswrapper[4794]: I0310 11:52:06.368617 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552392-wkv8z" Mar 10 11:52:06 crc kubenswrapper[4794]: I0310 11:52:06.406774 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" podStartSLOduration=1.950620725 podStartE2EDuration="2.406751806s" podCreationTimestamp="2026-03-10 11:52:04 +0000 UTC" firstStartedPulling="2026-03-10 11:52:05.428521597 +0000 UTC m=+7674.184692415" lastFinishedPulling="2026-03-10 11:52:05.884652678 +0000 UTC m=+7674.640823496" observedRunningTime="2026-03-10 11:52:06.397997514 +0000 UTC m=+7675.154168342" watchObservedRunningTime="2026-03-10 11:52:06.406751806 +0000 UTC m=+7675.162922624" Mar 10 11:52:06 crc kubenswrapper[4794]: I0310 11:52:06.455376 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552386-zsvqc"] Mar 10 11:52:06 crc kubenswrapper[4794]: I0310 11:52:06.467484 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552386-zsvqc"] Mar 10 11:52:07 crc kubenswrapper[4794]: I0310 11:52:07.945237 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l8qfw" podUID="b98e5e74-15a0-4564-80cc-3bb7810a8682" containerName="registry-server" probeResult="failure" output=< Mar 10 11:52:07 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 11:52:07 crc kubenswrapper[4794]: > Mar 10 11:52:07 crc kubenswrapper[4794]: I0310 11:52:07.999266 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:52:07 crc kubenswrapper[4794]: E0310 11:52:07.999594 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:52:08 crc kubenswrapper[4794]: I0310 11:52:08.019497 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d78b47f-8a50-4433-9d51-a3037c59ea53" path="/var/lib/kubelet/pods/9d78b47f-8a50-4433-9d51-a3037c59ea53/volumes" Mar 10 11:52:13 crc kubenswrapper[4794]: I0310 11:52:13.870443 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:52:13 crc kubenswrapper[4794]: I0310 11:52:13.961125 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:52:14 crc kubenswrapper[4794]: I0310 11:52:14.119219 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9bxd"] Mar 10 11:52:15 crc kubenswrapper[4794]: I0310 11:52:15.498715 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f9bxd" podUID="0afd8144-f691-4c04-a437-efc0b7352cee" containerName="registry-server" containerID="cri-o://9f1c57c7ed9d9e0a4380262ff334fbc39992e1e7fc7c189050d825ce59b831ce" gracePeriod=2 Mar 10 11:52:15 crc kubenswrapper[4794]: I0310 11:52:15.958735 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.104693 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msht8\" (UniqueName: \"kubernetes.io/projected/0afd8144-f691-4c04-a437-efc0b7352cee-kube-api-access-msht8\") pod \"0afd8144-f691-4c04-a437-efc0b7352cee\" (UID: \"0afd8144-f691-4c04-a437-efc0b7352cee\") " Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.104837 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0afd8144-f691-4c04-a437-efc0b7352cee-catalog-content\") pod \"0afd8144-f691-4c04-a437-efc0b7352cee\" (UID: \"0afd8144-f691-4c04-a437-efc0b7352cee\") " Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.104891 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afd8144-f691-4c04-a437-efc0b7352cee-utilities\") pod \"0afd8144-f691-4c04-a437-efc0b7352cee\" (UID: \"0afd8144-f691-4c04-a437-efc0b7352cee\") " Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.106048 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0afd8144-f691-4c04-a437-efc0b7352cee-utilities" (OuterVolumeSpecName: "utilities") pod "0afd8144-f691-4c04-a437-efc0b7352cee" (UID: "0afd8144-f691-4c04-a437-efc0b7352cee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.118542 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0afd8144-f691-4c04-a437-efc0b7352cee-kube-api-access-msht8" (OuterVolumeSpecName: "kube-api-access-msht8") pod "0afd8144-f691-4c04-a437-efc0b7352cee" (UID: "0afd8144-f691-4c04-a437-efc0b7352cee"). InnerVolumeSpecName "kube-api-access-msht8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.155207 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0afd8144-f691-4c04-a437-efc0b7352cee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0afd8144-f691-4c04-a437-efc0b7352cee" (UID: "0afd8144-f691-4c04-a437-efc0b7352cee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.207731 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0afd8144-f691-4c04-a437-efc0b7352cee-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.207760 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afd8144-f691-4c04-a437-efc0b7352cee-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.207769 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msht8\" (UniqueName: \"kubernetes.io/projected/0afd8144-f691-4c04-a437-efc0b7352cee-kube-api-access-msht8\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.514764 4794 generic.go:334] "Generic (PLEG): container finished" podID="0afd8144-f691-4c04-a437-efc0b7352cee" containerID="9f1c57c7ed9d9e0a4380262ff334fbc39992e1e7fc7c189050d825ce59b831ce" exitCode=0 Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.514811 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxd" event={"ID":"0afd8144-f691-4c04-a437-efc0b7352cee","Type":"ContainerDied","Data":"9f1c57c7ed9d9e0a4380262ff334fbc39992e1e7fc7c189050d825ce59b831ce"} Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.514850 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxd" event={"ID":"0afd8144-f691-4c04-a437-efc0b7352cee","Type":"ContainerDied","Data":"ee53435424025d7c88830efbdc82dde9d297bf338377db023f8f69723e007737"} Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.514856 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9bxd" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.514870 4794 scope.go:117] "RemoveContainer" containerID="9f1c57c7ed9d9e0a4380262ff334fbc39992e1e7fc7c189050d825ce59b831ce" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.554534 4794 scope.go:117] "RemoveContainer" containerID="88334a089d41d0472b7178b211cfd3b4336fb9435b94bec576c1c46c91afe6f2" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.557059 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9bxd"] Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.567737 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f9bxd"] Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.582196 4794 scope.go:117] "RemoveContainer" containerID="dbe6aab43f607e3929fc25c919a80db56b70319e0dcc236f33aeed788fa1ef8d" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.631956 4794 scope.go:117] "RemoveContainer" containerID="9f1c57c7ed9d9e0a4380262ff334fbc39992e1e7fc7c189050d825ce59b831ce" Mar 10 11:52:16 crc kubenswrapper[4794]: E0310 11:52:16.632380 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f1c57c7ed9d9e0a4380262ff334fbc39992e1e7fc7c189050d825ce59b831ce\": container with ID starting with 9f1c57c7ed9d9e0a4380262ff334fbc39992e1e7fc7c189050d825ce59b831ce not found: ID does not exist" containerID="9f1c57c7ed9d9e0a4380262ff334fbc39992e1e7fc7c189050d825ce59b831ce" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.632437 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f1c57c7ed9d9e0a4380262ff334fbc39992e1e7fc7c189050d825ce59b831ce"} err="failed to get container status \"9f1c57c7ed9d9e0a4380262ff334fbc39992e1e7fc7c189050d825ce59b831ce\": rpc error: code = NotFound desc = could not find container \"9f1c57c7ed9d9e0a4380262ff334fbc39992e1e7fc7c189050d825ce59b831ce\": container with ID starting with 9f1c57c7ed9d9e0a4380262ff334fbc39992e1e7fc7c189050d825ce59b831ce not found: ID does not exist" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.632473 4794 scope.go:117] "RemoveContainer" containerID="88334a089d41d0472b7178b211cfd3b4336fb9435b94bec576c1c46c91afe6f2" Mar 10 11:52:16 crc kubenswrapper[4794]: E0310 11:52:16.633274 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88334a089d41d0472b7178b211cfd3b4336fb9435b94bec576c1c46c91afe6f2\": container with ID starting with 88334a089d41d0472b7178b211cfd3b4336fb9435b94bec576c1c46c91afe6f2 not found: ID does not exist" containerID="88334a089d41d0472b7178b211cfd3b4336fb9435b94bec576c1c46c91afe6f2" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.633346 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88334a089d41d0472b7178b211cfd3b4336fb9435b94bec576c1c46c91afe6f2"} err="failed to get container status \"88334a089d41d0472b7178b211cfd3b4336fb9435b94bec576c1c46c91afe6f2\": rpc error: code = NotFound desc = could not find container \"88334a089d41d0472b7178b211cfd3b4336fb9435b94bec576c1c46c91afe6f2\": container with ID starting with 88334a089d41d0472b7178b211cfd3b4336fb9435b94bec576c1c46c91afe6f2 not found: ID does not exist" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.633381 4794 scope.go:117] "RemoveContainer" 
containerID="dbe6aab43f607e3929fc25c919a80db56b70319e0dcc236f33aeed788fa1ef8d" Mar 10 11:52:16 crc kubenswrapper[4794]: E0310 11:52:16.633767 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbe6aab43f607e3929fc25c919a80db56b70319e0dcc236f33aeed788fa1ef8d\": container with ID starting with dbe6aab43f607e3929fc25c919a80db56b70319e0dcc236f33aeed788fa1ef8d not found: ID does not exist" containerID="dbe6aab43f607e3929fc25c919a80db56b70319e0dcc236f33aeed788fa1ef8d" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.633808 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbe6aab43f607e3929fc25c919a80db56b70319e0dcc236f33aeed788fa1ef8d"} err="failed to get container status \"dbe6aab43f607e3929fc25c919a80db56b70319e0dcc236f33aeed788fa1ef8d\": rpc error: code = NotFound desc = could not find container \"dbe6aab43f607e3929fc25c919a80db56b70319e0dcc236f33aeed788fa1ef8d\": container with ID starting with dbe6aab43f607e3929fc25c919a80db56b70319e0dcc236f33aeed788fa1ef8d not found: ID does not exist" Mar 10 11:52:16 crc kubenswrapper[4794]: I0310 11:52:16.979951 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:52:17 crc kubenswrapper[4794]: I0310 11:52:17.041622 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:52:18 crc kubenswrapper[4794]: I0310 11:52:18.018978 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0afd8144-f691-4c04-a437-efc0b7352cee" path="/var/lib/kubelet/pods/0afd8144-f691-4c04-a437-efc0b7352cee/volumes" Mar 10 11:52:19 crc kubenswrapper[4794]: I0310 11:52:19.320850 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l8qfw"] Mar 10 11:52:19 crc kubenswrapper[4794]: I0310 11:52:19.321533 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l8qfw" podUID="b98e5e74-15a0-4564-80cc-3bb7810a8682" containerName="registry-server" containerID="cri-o://3dd74ca70a7da8cf9804f0c3d1f985ea84144056394f50d6094de1b4b9d6e53d" gracePeriod=2 Mar 10 11:52:19 crc kubenswrapper[4794]: I0310 11:52:19.553727 4794 generic.go:334] "Generic (PLEG): container finished" podID="b98e5e74-15a0-4564-80cc-3bb7810a8682" containerID="3dd74ca70a7da8cf9804f0c3d1f985ea84144056394f50d6094de1b4b9d6e53d" exitCode=0 Mar 10 11:52:19 crc kubenswrapper[4794]: I0310 11:52:19.553767 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8qfw" event={"ID":"b98e5e74-15a0-4564-80cc-3bb7810a8682","Type":"ContainerDied","Data":"3dd74ca70a7da8cf9804f0c3d1f985ea84144056394f50d6094de1b4b9d6e53d"} Mar 10 11:52:19 crc kubenswrapper[4794]: I0310 11:52:19.930775 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.088796 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b98e5e74-15a0-4564-80cc-3bb7810a8682-utilities\") pod \"b98e5e74-15a0-4564-80cc-3bb7810a8682\" (UID: \"b98e5e74-15a0-4564-80cc-3bb7810a8682\") " Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.088919 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b98e5e74-15a0-4564-80cc-3bb7810a8682-catalog-content\") pod \"b98e5e74-15a0-4564-80cc-3bb7810a8682\" (UID: \"b98e5e74-15a0-4564-80cc-3bb7810a8682\") " Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.088949 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zztdf\" (UniqueName: \"kubernetes.io/projected/b98e5e74-15a0-4564-80cc-3bb7810a8682-kube-api-access-zztdf\") pod \"b98e5e74-15a0-4564-80cc-3bb7810a8682\" (UID: \"b98e5e74-15a0-4564-80cc-3bb7810a8682\") " Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.089892 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b98e5e74-15a0-4564-80cc-3bb7810a8682-utilities" (OuterVolumeSpecName: "utilities") pod "b98e5e74-15a0-4564-80cc-3bb7810a8682" (UID: "b98e5e74-15a0-4564-80cc-3bb7810a8682"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.094910 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b98e5e74-15a0-4564-80cc-3bb7810a8682-kube-api-access-zztdf" (OuterVolumeSpecName: "kube-api-access-zztdf") pod "b98e5e74-15a0-4564-80cc-3bb7810a8682" (UID: "b98e5e74-15a0-4564-80cc-3bb7810a8682"). InnerVolumeSpecName "kube-api-access-zztdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.191729 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b98e5e74-15a0-4564-80cc-3bb7810a8682-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.191768 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zztdf\" (UniqueName: \"kubernetes.io/projected/b98e5e74-15a0-4564-80cc-3bb7810a8682-kube-api-access-zztdf\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.239505 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b98e5e74-15a0-4564-80cc-3bb7810a8682-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b98e5e74-15a0-4564-80cc-3bb7810a8682" (UID: "b98e5e74-15a0-4564-80cc-3bb7810a8682"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.294267 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b98e5e74-15a0-4564-80cc-3bb7810a8682-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.564215 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8qfw" event={"ID":"b98e5e74-15a0-4564-80cc-3bb7810a8682","Type":"ContainerDied","Data":"aca98efabdfee756f4df7e6b320a22777033bc65021dbc12c76187a4fdada3c1"} Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.564530 4794 scope.go:117] "RemoveContainer" containerID="3dd74ca70a7da8cf9804f0c3d1f985ea84144056394f50d6094de1b4b9d6e53d" Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.564715 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l8qfw" Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.603387 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l8qfw"] Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.605867 4794 scope.go:117] "RemoveContainer" containerID="2c63331275dc3a632159b2fde0d96293208650446479e1d72624ca59fe947e8d" Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.610970 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l8qfw"] Mar 10 11:52:20 crc kubenswrapper[4794]: I0310 11:52:20.646765 4794 scope.go:117] "RemoveContainer" containerID="c304f7e19f6f597e4c6440a87edb58e497c83db14f15433a872cd1a12abe3b61" Mar 10 11:52:22 crc kubenswrapper[4794]: I0310 11:52:22.008874 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:52:22 crc kubenswrapper[4794]: E0310 11:52:22.009810 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:52:22 crc kubenswrapper[4794]: I0310 11:52:22.012427 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b98e5e74-15a0-4564-80cc-3bb7810a8682" path="/var/lib/kubelet/pods/b98e5e74-15a0-4564-80cc-3bb7810a8682/volumes" Mar 10 11:52:23 crc kubenswrapper[4794]: I0310 11:52:23.592097 4794 generic.go:334] "Generic (PLEG): container finished" podID="a9adb25a-7a73-4147-869e-f5bbe20c230a" containerID="e07a0699b4b9e1af3a6e1b172ddf2edca8a332f53dfbf932b42d21ab0a614657" exitCode=0 Mar 10 11:52:23 crc kubenswrapper[4794]: I0310 11:52:23.592147 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" event={"ID":"a9adb25a-7a73-4147-869e-f5bbe20c230a","Type":"ContainerDied","Data":"e07a0699b4b9e1af3a6e1b172ddf2edca8a332f53dfbf932b42d21ab0a614657"} Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.120978 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.306366 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-ssh-key-openstack-cell1\") pod \"a9adb25a-7a73-4147-869e-f5bbe20c230a\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.306788 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-ceph\") pod \"a9adb25a-7a73-4147-869e-f5bbe20c230a\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.306893 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l22n\" (UniqueName: \"kubernetes.io/projected/a9adb25a-7a73-4147-869e-f5bbe20c230a-kube-api-access-6l22n\") pod \"a9adb25a-7a73-4147-869e-f5bbe20c230a\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.307162 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-inventory\") pod \"a9adb25a-7a73-4147-869e-f5bbe20c230a\" (UID: \"a9adb25a-7a73-4147-869e-f5bbe20c230a\") " Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.314034 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-ceph" (OuterVolumeSpecName: "ceph") pod "a9adb25a-7a73-4147-869e-f5bbe20c230a" (UID: "a9adb25a-7a73-4147-869e-f5bbe20c230a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.314769 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9adb25a-7a73-4147-869e-f5bbe20c230a-kube-api-access-6l22n" (OuterVolumeSpecName: "kube-api-access-6l22n") pod "a9adb25a-7a73-4147-869e-f5bbe20c230a" (UID: "a9adb25a-7a73-4147-869e-f5bbe20c230a"). InnerVolumeSpecName "kube-api-access-6l22n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.354232 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a9adb25a-7a73-4147-869e-f5bbe20c230a" (UID: "a9adb25a-7a73-4147-869e-f5bbe20c230a"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.356227 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-inventory" (OuterVolumeSpecName: "inventory") pod "a9adb25a-7a73-4147-869e-f5bbe20c230a" (UID: "a9adb25a-7a73-4147-869e-f5bbe20c230a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.409830 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.409861 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.409872 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9adb25a-7a73-4147-869e-f5bbe20c230a-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.409882 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l22n\" (UniqueName: \"kubernetes.io/projected/a9adb25a-7a73-4147-869e-f5bbe20c230a-kube-api-access-6l22n\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.616794 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" event={"ID":"a9adb25a-7a73-4147-869e-f5bbe20c230a","Type":"ContainerDied","Data":"1e577fe8e0553d989ffb5a1a4954620bdf5ce3663c35872d736ddf1cb7f0b907"} Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.616870 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e577fe8e0553d989ffb5a1a4954620bdf5ce3663c35872d736ddf1cb7f0b907" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.616917 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xrfp9" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.742726 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-qbzgk"] Mar 10 11:52:25 crc kubenswrapper[4794]: E0310 11:52:25.743210 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b98e5e74-15a0-4564-80cc-3bb7810a8682" containerName="extract-utilities" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.743223 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b98e5e74-15a0-4564-80cc-3bb7810a8682" containerName="extract-utilities" Mar 10 11:52:25 crc kubenswrapper[4794]: E0310 11:52:25.743243 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afd8144-f691-4c04-a437-efc0b7352cee" containerName="extract-utilities" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.743252 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afd8144-f691-4c04-a437-efc0b7352cee" containerName="extract-utilities" Mar 10 11:52:25 crc kubenswrapper[4794]: E0310 11:52:25.743275 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afd8144-f691-4c04-a437-efc0b7352cee" containerName="registry-server" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.743282 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afd8144-f691-4c04-a437-efc0b7352cee" containerName="registry-server" Mar 10 11:52:25 crc kubenswrapper[4794]: E0310 11:52:25.743290 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784198eb-66e6-4c0e-aa8b-c8a2b9dbde07" containerName="oc" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.743296 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="784198eb-66e6-4c0e-aa8b-c8a2b9dbde07" containerName="oc" Mar 10 11:52:25 crc kubenswrapper[4794]: E0310 11:52:25.743310 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b98e5e74-15a0-4564-80cc-3bb7810a8682" containerName="extract-content" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.743317 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b98e5e74-15a0-4564-80cc-3bb7810a8682" containerName="extract-content" Mar 10 11:52:25 crc kubenswrapper[4794]: E0310 11:52:25.743367 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b98e5e74-15a0-4564-80cc-3bb7810a8682" containerName="registry-server" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.743375 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b98e5e74-15a0-4564-80cc-3bb7810a8682" containerName="registry-server" Mar 10 11:52:25 crc kubenswrapper[4794]: E0310 11:52:25.743392 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afd8144-f691-4c04-a437-efc0b7352cee" containerName="extract-content" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.743399 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afd8144-f691-4c04-a437-efc0b7352cee" containerName="extract-content" Mar 10 11:52:25 crc kubenswrapper[4794]: E0310 11:52:25.743412 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9adb25a-7a73-4147-869e-f5bbe20c230a" containerName="reboot-os-openstack-openstack-cell1" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.743418 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9adb25a-7a73-4147-869e-f5bbe20c230a" containerName="reboot-os-openstack-openstack-cell1" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.743621 4794 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b98e5e74-15a0-4564-80cc-3bb7810a8682" containerName="registry-server" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.743638 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9adb25a-7a73-4147-869e-f5bbe20c230a" containerName="reboot-os-openstack-openstack-cell1" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.743650 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0afd8144-f691-4c04-a437-efc0b7352cee" containerName="registry-server" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.743657 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="784198eb-66e6-4c0e-aa8b-c8a2b9dbde07" containerName="oc" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.744426 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.746709 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.747699 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.747868 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.748171 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.754487 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-qbzgk"] Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.919196 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.919245 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.919265 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.919295 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-inventory\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: 
\"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.920144 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.920214 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.920263 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48cmz\" (UniqueName: \"kubernetes.io/projected/3922f569-e641-48f2-a436-015113e439ee-kube-api-access-48cmz\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.920470 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ceph\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.920529 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.920567 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.920659 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:25 crc kubenswrapper[4794]: I0310 11:52:25.920702 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.022929 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.022976 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.023019 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.023045 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.023068 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.023093 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.023112 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.023141 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-inventory\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.023165 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.023193 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.023213 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48cmz\" (UniqueName: \"kubernetes.io/projected/3922f569-e641-48f2-a436-015113e439ee-kube-api-access-48cmz\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.023310 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ceph\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.028594 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.029839 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.030817 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.031053 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.031476 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.032828 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.033281 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.034072 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ceph\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.034854 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.042236 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-inventory\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.046767 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48cmz\" (UniqueName: \"kubernetes.io/projected/3922f569-e641-48f2-a436-015113e439ee-kube-api-access-48cmz\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.054781 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qbzgk\" (UID: 
\"3922f569-e641-48f2-a436-015113e439ee\") " pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.108859 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:26 crc kubenswrapper[4794]: I0310 11:52:26.722158 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-qbzgk"] Mar 10 11:52:27 crc kubenswrapper[4794]: I0310 11:52:27.637829 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" event={"ID":"3922f569-e641-48f2-a436-015113e439ee","Type":"ContainerStarted","Data":"6342d492fac245ad5d828e1c22fb0ec107b7fb6e5c889b4efba909505d5b8e05"} Mar 10 11:52:27 crc kubenswrapper[4794]: I0310 11:52:27.638419 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" event={"ID":"3922f569-e641-48f2-a436-015113e439ee","Type":"ContainerStarted","Data":"208ca4dcdebc8fbf586dae99c58e433c10f7b7cb01948e7948ec8f81948256da"} Mar 10 11:52:27 crc kubenswrapper[4794]: I0310 11:52:27.664695 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" podStartSLOduration=2.178533004 podStartE2EDuration="2.664667217s" podCreationTimestamp="2026-03-10 11:52:25 +0000 UTC" firstStartedPulling="2026-03-10 11:52:26.7445168 +0000 UTC m=+7695.500687628" lastFinishedPulling="2026-03-10 11:52:27.230650983 +0000 UTC m=+7695.986821841" observedRunningTime="2026-03-10 11:52:27.656791122 +0000 UTC m=+7696.412961950" watchObservedRunningTime="2026-03-10 11:52:27.664667217 +0000 UTC m=+7696.420838075" Mar 10 11:52:35 crc kubenswrapper[4794]: I0310 11:52:34.999733 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:52:35 crc kubenswrapper[4794]: E0310 11:52:35.000306 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:52:46 crc kubenswrapper[4794]: I0310 11:52:46.000288 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:52:46 crc kubenswrapper[4794]: E0310 11:52:46.001406 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:52:48 crc kubenswrapper[4794]: I0310 11:52:48.897106 4794 generic.go:334] "Generic (PLEG): container finished" podID="3922f569-e641-48f2-a436-015113e439ee" containerID="6342d492fac245ad5d828e1c22fb0ec107b7fb6e5c889b4efba909505d5b8e05" exitCode=0 Mar 10 11:52:48 crc kubenswrapper[4794]: I0310 11:52:48.897466 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" event={"ID":"3922f569-e641-48f2-a436-015113e439ee","Type":"ContainerDied","Data":"6342d492fac245ad5d828e1c22fb0ec107b7fb6e5c889b4efba909505d5b8e05"} Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.321602 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.436996 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-inventory\") pod \"3922f569-e641-48f2-a436-015113e439ee\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.437270 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ssh-key-openstack-cell1\") pod \"3922f569-e641-48f2-a436-015113e439ee\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.437403 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-dhcp-combined-ca-bundle\") pod \"3922f569-e641-48f2-a436-015113e439ee\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.437513 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-telemetry-combined-ca-bundle\") pod \"3922f569-e641-48f2-a436-015113e439ee\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.437649 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-bootstrap-combined-ca-bundle\") pod \"3922f569-e641-48f2-a436-015113e439ee\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.437769 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ceph\") pod \"3922f569-e641-48f2-a436-015113e439ee\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.437877 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-metadata-combined-ca-bundle\") pod \"3922f569-e641-48f2-a436-015113e439ee\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.437939 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-sriov-combined-ca-bundle\") pod \"3922f569-e641-48f2-a436-015113e439ee\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.438005 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-nova-combined-ca-bundle\") pod \"3922f569-e641-48f2-a436-015113e439ee\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.438074 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48cmz\" (UniqueName: \"kubernetes.io/projected/3922f569-e641-48f2-a436-015113e439ee-kube-api-access-48cmz\") pod \"3922f569-e641-48f2-a436-015113e439ee\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.438152 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-libvirt-combined-ca-bundle\") pod \"3922f569-e641-48f2-a436-015113e439ee\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.438219 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ovn-combined-ca-bundle\") pod \"3922f569-e641-48f2-a436-015113e439ee\" (UID: \"3922f569-e641-48f2-a436-015113e439ee\") " Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.442724 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3922f569-e641-48f2-a436-015113e439ee" (UID: "3922f569-e641-48f2-a436-015113e439ee"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.442803 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "3922f569-e641-48f2-a436-015113e439ee" (UID: "3922f569-e641-48f2-a436-015113e439ee"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.443052 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3922f569-e641-48f2-a436-015113e439ee" (UID: "3922f569-e641-48f2-a436-015113e439ee"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.443804 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ceph" (OuterVolumeSpecName: "ceph") pod "3922f569-e641-48f2-a436-015113e439ee" (UID: "3922f569-e641-48f2-a436-015113e439ee"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.445006 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "3922f569-e641-48f2-a436-015113e439ee" (UID: "3922f569-e641-48f2-a436-015113e439ee"). 
InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.445091 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "3922f569-e641-48f2-a436-015113e439ee" (UID: "3922f569-e641-48f2-a436-015113e439ee"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.445239 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3922f569-e641-48f2-a436-015113e439ee" (UID: "3922f569-e641-48f2-a436-015113e439ee"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.446165 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3922f569-e641-48f2-a436-015113e439ee-kube-api-access-48cmz" (OuterVolumeSpecName: "kube-api-access-48cmz") pod "3922f569-e641-48f2-a436-015113e439ee" (UID: "3922f569-e641-48f2-a436-015113e439ee"). InnerVolumeSpecName "kube-api-access-48cmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.446621 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "3922f569-e641-48f2-a436-015113e439ee" (UID: "3922f569-e641-48f2-a436-015113e439ee"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.446821 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3922f569-e641-48f2-a436-015113e439ee" (UID: "3922f569-e641-48f2-a436-015113e439ee"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.466129 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-inventory" (OuterVolumeSpecName: "inventory") pod "3922f569-e641-48f2-a436-015113e439ee" (UID: "3922f569-e641-48f2-a436-015113e439ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.494366 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "3922f569-e641-48f2-a436-015113e439ee" (UID: "3922f569-e641-48f2-a436-015113e439ee"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.540661 4794 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.540698 4794 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.540711 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.540723 4794 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.540735 4794 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.540748 4794 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.540758 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48cmz\" (UniqueName: \"kubernetes.io/projected/3922f569-e641-48f2-a436-015113e439ee-kube-api-access-48cmz\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.540768 4794 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.540779 4794 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.540788 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.540799 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.540809 4794 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922f569-e641-48f2-a436-015113e439ee-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.933431 4794 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" event={"ID":"3922f569-e641-48f2-a436-015113e439ee","Type":"ContainerDied","Data":"208ca4dcdebc8fbf586dae99c58e433c10f7b7cb01948e7948ec8f81948256da"} Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.933489 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="208ca4dcdebc8fbf586dae99c58e433c10f7b7cb01948e7948ec8f81948256da" Mar 10 11:52:50 crc kubenswrapper[4794]: I0310 11:52:50.933806 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-qbzgk" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.068151 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-lfjcb"] Mar 10 11:52:51 crc kubenswrapper[4794]: E0310 11:52:51.068914 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3922f569-e641-48f2-a436-015113e439ee" containerName="install-certs-openstack-openstack-cell1" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.068933 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3922f569-e641-48f2-a436-015113e439ee" containerName="install-certs-openstack-openstack-cell1" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.069127 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3922f569-e641-48f2-a436-015113e439ee" containerName="install-certs-openstack-openstack-cell1" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.069913 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.071956 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.072774 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.072993 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.073182 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.093268 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-lfjcb"] Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.155388 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-ceph\") pod \"ceph-client-openstack-openstack-cell1-lfjcb\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.156223 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xk4m\" (UniqueName: \"kubernetes.io/projected/9f97d295-b543-4b60-920f-37840bef42c1-kube-api-access-6xk4m\") pod \"ceph-client-openstack-openstack-cell1-lfjcb\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.156378 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-inventory\") pod \"ceph-client-openstack-openstack-cell1-lfjcb\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.156466 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-lfjcb\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.259021 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xk4m\" (UniqueName: \"kubernetes.io/projected/9f97d295-b543-4b60-920f-37840bef42c1-kube-api-access-6xk4m\") pod \"ceph-client-openstack-openstack-cell1-lfjcb\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.259097 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-inventory\") pod \"ceph-client-openstack-openstack-cell1-lfjcb\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.259118 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-lfjcb\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.259154 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-ceph\") pod \"ceph-client-openstack-openstack-cell1-lfjcb\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.265621 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-inventory\") pod \"ceph-client-openstack-openstack-cell1-lfjcb\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.265980 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-ceph\") pod \"ceph-client-openstack-openstack-cell1-lfjcb\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.266515 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-lfjcb\" 
(UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.277651 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xk4m\" (UniqueName: \"kubernetes.io/projected/9f97d295-b543-4b60-920f-37840bef42c1-kube-api-access-6xk4m\") pod \"ceph-client-openstack-openstack-cell1-lfjcb\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:51 crc kubenswrapper[4794]: I0310 11:52:51.423702 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:52 crc kubenswrapper[4794]: I0310 11:52:52.033966 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-lfjcb"] Mar 10 11:52:52 crc kubenswrapper[4794]: I0310 11:52:52.980010 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" event={"ID":"9f97d295-b543-4b60-920f-37840bef42c1","Type":"ContainerStarted","Data":"c58275cbe55d120d2c4a09f9dff243870b226ef63b25290ee5da35a2389388ad"} Mar 10 11:52:52 crc kubenswrapper[4794]: I0310 11:52:52.980361 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" event={"ID":"9f97d295-b543-4b60-920f-37840bef42c1","Type":"ContainerStarted","Data":"5eec8731041d724376a32f9781891f3fcaf079f8409ae80fd69506b9281255a0"} Mar 10 11:52:52 crc kubenswrapper[4794]: I0310 11:52:52.999644 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" podStartSLOduration=1.512185343 podStartE2EDuration="1.999624296s" podCreationTimestamp="2026-03-10 11:52:51 +0000 UTC" firstStartedPulling="2026-03-10 11:52:52.037522358 +0000 UTC m=+7720.793693186" lastFinishedPulling="2026-03-10 11:52:52.524961291 +0000 UTC m=+7721.281132139" observedRunningTime="2026-03-10 11:52:52.997783779 +0000 UTC m=+7721.753954627" watchObservedRunningTime="2026-03-10 11:52:52.999624296 +0000 UTC m=+7721.755795114" Mar 10 11:52:53 crc kubenswrapper[4794]: I0310 11:52:53.408285 4794 scope.go:117] "RemoveContainer" containerID="9741a19c5d507976adb605905d8a2d318512206acb99e58b66f31dad35d38b08" Mar 10 11:52:58 crc kubenswrapper[4794]: I0310 11:52:58.042556 4794 generic.go:334] "Generic (PLEG): container finished" podID="9f97d295-b543-4b60-920f-37840bef42c1" containerID="c58275cbe55d120d2c4a09f9dff243870b226ef63b25290ee5da35a2389388ad" exitCode=0 Mar 10 11:52:58 crc kubenswrapper[4794]: I0310 11:52:58.042675 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" event={"ID":"9f97d295-b543-4b60-920f-37840bef42c1","Type":"ContainerDied","Data":"c58275cbe55d120d2c4a09f9dff243870b226ef63b25290ee5da35a2389388ad"} Mar 10 11:52:58 crc kubenswrapper[4794]: I0310 11:52:58.999299 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:52:58 crc kubenswrapper[4794]: E0310 11:52:58.999923 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:52:59 crc kubenswrapper[4794]: I0310 11:52:59.682200 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:52:59 crc kubenswrapper[4794]: I0310 11:52:59.770460 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-inventory\") pod \"9f97d295-b543-4b60-920f-37840bef42c1\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " Mar 10 11:52:59 crc kubenswrapper[4794]: I0310 11:52:59.770644 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-ssh-key-openstack-cell1\") pod \"9f97d295-b543-4b60-920f-37840bef42c1\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " Mar 10 11:52:59 crc kubenswrapper[4794]: I0310 11:52:59.770765 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-ceph\") pod \"9f97d295-b543-4b60-920f-37840bef42c1\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " Mar 10 11:52:59 crc kubenswrapper[4794]: I0310 11:52:59.770816 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xk4m\" (UniqueName: \"kubernetes.io/projected/9f97d295-b543-4b60-920f-37840bef42c1-kube-api-access-6xk4m\") pod \"9f97d295-b543-4b60-920f-37840bef42c1\" (UID: \"9f97d295-b543-4b60-920f-37840bef42c1\") " Mar 10 11:52:59 crc kubenswrapper[4794]: I0310 11:52:59.778269 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f97d295-b543-4b60-920f-37840bef42c1-kube-api-access-6xk4m" (OuterVolumeSpecName: "kube-api-access-6xk4m") pod "9f97d295-b543-4b60-920f-37840bef42c1" (UID: "9f97d295-b543-4b60-920f-37840bef42c1"). InnerVolumeSpecName "kube-api-access-6xk4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:52:59 crc kubenswrapper[4794]: I0310 11:52:59.786094 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-ceph" (OuterVolumeSpecName: "ceph") pod "9f97d295-b543-4b60-920f-37840bef42c1" (UID: "9f97d295-b543-4b60-920f-37840bef42c1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:59 crc kubenswrapper[4794]: I0310 11:52:59.802958 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "9f97d295-b543-4b60-920f-37840bef42c1" (UID: "9f97d295-b543-4b60-920f-37840bef42c1"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:59 crc kubenswrapper[4794]: I0310 11:52:59.807163 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-inventory" (OuterVolumeSpecName: "inventory") pod "9f97d295-b543-4b60-920f-37840bef42c1" (UID: "9f97d295-b543-4b60-920f-37840bef42c1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:52:59 crc kubenswrapper[4794]: I0310 11:52:59.872785 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:59 crc kubenswrapper[4794]: I0310 11:52:59.872923 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:59 crc kubenswrapper[4794]: I0310 11:52:59.872992 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xk4m\" (UniqueName: \"kubernetes.io/projected/9f97d295-b543-4b60-920f-37840bef42c1-kube-api-access-6xk4m\") on node \"crc\" DevicePath \"\"" Mar 10 11:52:59 crc kubenswrapper[4794]: I0310 11:52:59.873062 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f97d295-b543-4b60-920f-37840bef42c1-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.067635 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" event={"ID":"9f97d295-b543-4b60-920f-37840bef42c1","Type":"ContainerDied","Data":"5eec8731041d724376a32f9781891f3fcaf079f8409ae80fd69506b9281255a0"} Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.067697 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5eec8731041d724376a32f9781891f3fcaf079f8409ae80fd69506b9281255a0" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.067715 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-lfjcb" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.163194 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-nx6x4"] Mar 10 11:53:00 crc kubenswrapper[4794]: E0310 11:53:00.164313 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f97d295-b543-4b60-920f-37840bef42c1" containerName="ceph-client-openstack-openstack-cell1" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.164413 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f97d295-b543-4b60-920f-37840bef42c1" containerName="ceph-client-openstack-openstack-cell1" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.164701 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f97d295-b543-4b60-920f-37840bef42c1" containerName="ceph-client-openstack-openstack-cell1" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.165702 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.169077 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.169227 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.169507 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.171018 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.171172 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.176503 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-nx6x4"] Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.281880 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ceph\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.282148 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.282453 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwk6m\" (UniqueName: \"kubernetes.io/projected/40f28062-bca6-426e-a4b9-fff9e17e5a3d-kube-api-access-wwk6m\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.282600 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-inventory\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.282800 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.283043 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ssh-key-openstack-cell1\") pod 
\"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.385982 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.386139 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.386223 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ceph\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.386537 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.386822 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwk6m\" (UniqueName: \"kubernetes.io/projected/40f28062-bca6-426e-a4b9-fff9e17e5a3d-kube-api-access-wwk6m\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.386903 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-inventory\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.392106 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.394171 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ceph\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.406004 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-inventory\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.411090 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwk6m\" (UniqueName: \"kubernetes.io/projected/40f28062-bca6-426e-a4b9-fff9e17e5a3d-kube-api-access-wwk6m\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.411727 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.422376 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-nx6x4\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:00 crc kubenswrapper[4794]: I0310 11:53:00.491920 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:53:01 crc kubenswrapper[4794]: I0310 11:53:01.032859 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-nx6x4"] Mar 10 11:53:01 crc kubenswrapper[4794]: I0310 11:53:01.081346 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-nx6x4" event={"ID":"40f28062-bca6-426e-a4b9-fff9e17e5a3d","Type":"ContainerStarted","Data":"c3a183d4e03eeb189addfea128f6e8bd0bce514535cfbc8a42073c811aa63c39"} Mar 10 11:53:02 crc kubenswrapper[4794]: I0310 11:53:02.099028 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-nx6x4" event={"ID":"40f28062-bca6-426e-a4b9-fff9e17e5a3d","Type":"ContainerStarted","Data":"8dcc235ea27a89631f74f3c464bd2c97ed4efaf26277183549034968d483ed83"} Mar 10 11:53:02 crc kubenswrapper[4794]: I0310 11:53:02.123076 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-nx6x4" podStartSLOduration=1.67701288 podStartE2EDuration="2.123053427s" podCreationTimestamp="2026-03-10 11:53:00 +0000 UTC" firstStartedPulling="2026-03-10 11:53:01.041902364 +0000 UTC m=+7729.798073202" lastFinishedPulling="2026-03-10 11:53:01.487942921 +0000 UTC m=+7730.244113749" observedRunningTime="2026-03-10 11:53:02.118397113 +0000 UTC m=+7730.874567941" watchObservedRunningTime="2026-03-10 11:53:02.123053427 +0000 UTC m=+7730.879224255" Mar 10 11:53:12 crc kubenswrapper[4794]: I0310 11:53:12.007711 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:53:12 crc kubenswrapper[4794]: E0310 11:53:12.008829 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:53:27 crc kubenswrapper[4794]: I0310 11:53:27.000649 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:53:27 crc kubenswrapper[4794]: E0310 11:53:27.001909 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:53:38 crc kubenswrapper[4794]: I0310 11:53:37.999696 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:53:38 crc kubenswrapper[4794]: E0310 11:53:38.000869 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:53:52 crc kubenswrapper[4794]: I0310 11:53:52.999094 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:53:53 crc kubenswrapper[4794]: E0310 11:53:53.000029 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:54:00 crc kubenswrapper[4794]: I0310 11:54:00.173477 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552394-r9wlb"] Mar 10 11:54:00 crc kubenswrapper[4794]: I0310 11:54:00.176307 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552394-r9wlb" Mar 10 11:54:00 crc kubenswrapper[4794]: I0310 11:54:00.179831 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:54:00 crc kubenswrapper[4794]: I0310 11:54:00.180103 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:54:00 crc kubenswrapper[4794]: I0310 11:54:00.180685 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:54:00 crc kubenswrapper[4794]: I0310 11:54:00.191296 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552394-r9wlb"] Mar 10 11:54:00 crc kubenswrapper[4794]: I0310 11:54:00.270390 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwgl2\" (UniqueName: \"kubernetes.io/projected/0f0fee6c-1902-4b26-b111-e4b439ed3811-kube-api-access-zwgl2\") pod \"auto-csr-approver-29552394-r9wlb\" (UID: \"0f0fee6c-1902-4b26-b111-e4b439ed3811\") " pod="openshift-infra/auto-csr-approver-29552394-r9wlb" Mar 10 11:54:00 crc kubenswrapper[4794]: I0310 11:54:00.372586 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwgl2\" (UniqueName: \"kubernetes.io/projected/0f0fee6c-1902-4b26-b111-e4b439ed3811-kube-api-access-zwgl2\") pod \"auto-csr-approver-29552394-r9wlb\" (UID: \"0f0fee6c-1902-4b26-b111-e4b439ed3811\") " pod="openshift-infra/auto-csr-approver-29552394-r9wlb" Mar 10 11:54:00 crc kubenswrapper[4794]: I0310 11:54:00.407801 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwgl2\" (UniqueName: \"kubernetes.io/projected/0f0fee6c-1902-4b26-b111-e4b439ed3811-kube-api-access-zwgl2\") pod \"auto-csr-approver-29552394-r9wlb\" (UID: \"0f0fee6c-1902-4b26-b111-e4b439ed3811\") " pod="openshift-infra/auto-csr-approver-29552394-r9wlb" Mar 10 11:54:00 crc kubenswrapper[4794]: I0310 11:54:00.523212 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552394-r9wlb" Mar 10 11:54:01 crc kubenswrapper[4794]: I0310 11:54:01.080770 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552394-r9wlb"] Mar 10 11:54:01 crc kubenswrapper[4794]: I0310 11:54:01.955278 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552394-r9wlb" event={"ID":"0f0fee6c-1902-4b26-b111-e4b439ed3811","Type":"ContainerStarted","Data":"d1e771a05b35b1cba377b136ae24fa94983ecd7723f0bcd108e82918bb379fd1"} Mar 10 11:54:02 crc kubenswrapper[4794]: I0310 11:54:02.971519 4794 generic.go:334] "Generic (PLEG): container finished" podID="0f0fee6c-1902-4b26-b111-e4b439ed3811" containerID="a73e125e24c2ece7751723bc02b5eccd4fb36d90b364b58456d0822584d819a5" exitCode=0 Mar 10 11:54:02 crc kubenswrapper[4794]: I0310 11:54:02.971583 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552394-r9wlb" event={"ID":"0f0fee6c-1902-4b26-b111-e4b439ed3811","Type":"ContainerDied","Data":"a73e125e24c2ece7751723bc02b5eccd4fb36d90b364b58456d0822584d819a5"} Mar 10 11:54:04 crc kubenswrapper[4794]: I0310 11:54:04.400703 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552394-r9wlb" Mar 10 11:54:04 crc kubenswrapper[4794]: I0310 11:54:04.575174 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwgl2\" (UniqueName: \"kubernetes.io/projected/0f0fee6c-1902-4b26-b111-e4b439ed3811-kube-api-access-zwgl2\") pod \"0f0fee6c-1902-4b26-b111-e4b439ed3811\" (UID: \"0f0fee6c-1902-4b26-b111-e4b439ed3811\") " Mar 10 11:54:04 crc kubenswrapper[4794]: I0310 11:54:04.582731 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f0fee6c-1902-4b26-b111-e4b439ed3811-kube-api-access-zwgl2" (OuterVolumeSpecName: "kube-api-access-zwgl2") pod "0f0fee6c-1902-4b26-b111-e4b439ed3811" (UID: "0f0fee6c-1902-4b26-b111-e4b439ed3811"). InnerVolumeSpecName "kube-api-access-zwgl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:54:04 crc kubenswrapper[4794]: I0310 11:54:04.678153 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwgl2\" (UniqueName: \"kubernetes.io/projected/0f0fee6c-1902-4b26-b111-e4b439ed3811-kube-api-access-zwgl2\") on node \"crc\" DevicePath \"\"" Mar 10 11:54:04 crc kubenswrapper[4794]: I0310 11:54:04.995783 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552394-r9wlb" event={"ID":"0f0fee6c-1902-4b26-b111-e4b439ed3811","Type":"ContainerDied","Data":"d1e771a05b35b1cba377b136ae24fa94983ecd7723f0bcd108e82918bb379fd1"} Mar 10 11:54:04 crc kubenswrapper[4794]: I0310 11:54:04.996015 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1e771a05b35b1cba377b136ae24fa94983ecd7723f0bcd108e82918bb379fd1" Mar 10 11:54:04 crc kubenswrapper[4794]: I0310 11:54:04.995835 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552394-r9wlb" Mar 10 11:54:05 crc kubenswrapper[4794]: I0310 11:54:05.491502 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552388-6j89z"] Mar 10 11:54:05 crc kubenswrapper[4794]: I0310 11:54:05.504033 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552388-6j89z"] Mar 10 11:54:05 crc kubenswrapper[4794]: I0310 11:54:05.999260 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:54:05 crc kubenswrapper[4794]: E0310 11:54:05.999536 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:54:06 crc kubenswrapper[4794]: I0310 11:54:06.015514 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bea8e430-5c78-4741-87d7-6f24aff9f849" path="/var/lib/kubelet/pods/bea8e430-5c78-4741-87d7-6f24aff9f849/volumes" Mar 10 11:54:11 crc kubenswrapper[4794]: I0310 11:54:11.080977 4794 generic.go:334] "Generic (PLEG): container finished" podID="40f28062-bca6-426e-a4b9-fff9e17e5a3d" containerID="8dcc235ea27a89631f74f3c464bd2c97ed4efaf26277183549034968d483ed83" exitCode=0 Mar 10 11:54:11 crc kubenswrapper[4794]: I0310 11:54:11.081093 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-nx6x4" event={"ID":"40f28062-bca6-426e-a4b9-fff9e17e5a3d","Type":"ContainerDied","Data":"8dcc235ea27a89631f74f3c464bd2c97ed4efaf26277183549034968d483ed83"} Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.618404 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.674643 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-inventory\") pod \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.674758 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwk6m\" (UniqueName: \"kubernetes.io/projected/40f28062-bca6-426e-a4b9-fff9e17e5a3d-kube-api-access-wwk6m\") pod \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.674816 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ceph\") pod \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.674880 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ovn-combined-ca-bundle\") pod \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.674983 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ovncontroller-config-0\") pod \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.675025 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ssh-key-openstack-cell1\") pod \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\" (UID: \"40f28062-bca6-426e-a4b9-fff9e17e5a3d\") " Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.682212 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ceph" (OuterVolumeSpecName: "ceph") pod "40f28062-bca6-426e-a4b9-fff9e17e5a3d" (UID: "40f28062-bca6-426e-a4b9-fff9e17e5a3d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.703064 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40f28062-bca6-426e-a4b9-fff9e17e5a3d-kube-api-access-wwk6m" (OuterVolumeSpecName: "kube-api-access-wwk6m") pod "40f28062-bca6-426e-a4b9-fff9e17e5a3d" (UID: "40f28062-bca6-426e-a4b9-fff9e17e5a3d"). InnerVolumeSpecName "kube-api-access-wwk6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.706554 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "40f28062-bca6-426e-a4b9-fff9e17e5a3d" (UID: "40f28062-bca6-426e-a4b9-fff9e17e5a3d"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.713731 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-inventory" (OuterVolumeSpecName: "inventory") pod "40f28062-bca6-426e-a4b9-fff9e17e5a3d" (UID: "40f28062-bca6-426e-a4b9-fff9e17e5a3d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.714482 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "40f28062-bca6-426e-a4b9-fff9e17e5a3d" (UID: "40f28062-bca6-426e-a4b9-fff9e17e5a3d"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.735556 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "40f28062-bca6-426e-a4b9-fff9e17e5a3d" (UID: "40f28062-bca6-426e-a4b9-fff9e17e5a3d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.778451 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.778541 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwk6m\" (UniqueName: \"kubernetes.io/projected/40f28062-bca6-426e-a4b9-fff9e17e5a3d-kube-api-access-wwk6m\") on node \"crc\" DevicePath \"\"" Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.778554 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.778591 4794 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.778604 4794 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 11:54:12 crc kubenswrapper[4794]: I0310 11:54:12.778614 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/40f28062-bca6-426e-a4b9-fff9e17e5a3d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.107713 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-nx6x4" event={"ID":"40f28062-bca6-426e-a4b9-fff9e17e5a3d","Type":"ContainerDied","Data":"c3a183d4e03eeb189addfea128f6e8bd0bce514535cfbc8a42073c811aa63c39"} Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.107765 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-nx6x4" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.107797 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3a183d4e03eeb189addfea128f6e8bd0bce514535cfbc8a42073c811aa63c39" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.221440 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-l92sc"] Mar 10 11:54:13 crc kubenswrapper[4794]: E0310 11:54:13.221893 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40f28062-bca6-426e-a4b9-fff9e17e5a3d" containerName="ovn-openstack-openstack-cell1" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.221910 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f28062-bca6-426e-a4b9-fff9e17e5a3d" containerName="ovn-openstack-openstack-cell1" Mar 10 11:54:13 crc kubenswrapper[4794]: E0310 11:54:13.221928 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0fee6c-1902-4b26-b111-e4b439ed3811" containerName="oc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.221935 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0fee6c-1902-4b26-b111-e4b439ed3811" containerName="oc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.222119 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="40f28062-bca6-426e-a4b9-fff9e17e5a3d" containerName="ovn-openstack-openstack-cell1" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.222148 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f0fee6c-1902-4b26-b111-e4b439ed3811" containerName="oc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.223266 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.226417 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.226980 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.227179 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.228102 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.229249 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.239155 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.242824 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-l92sc"] Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.290185 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.290304 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.290428 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.290591 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.290632 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: 
\"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.290725 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.290755 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr7q9\" (UniqueName: \"kubernetes.io/projected/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-kube-api-access-hr7q9\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.392542 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.392710 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.392779 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.392883 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.392946 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr7q9\" (UniqueName: \"kubernetes.io/projected/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-kube-api-access-hr7q9\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.393014 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.393088 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.396461 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.396583 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.396889 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.397400 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.397891 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.399449 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.413039 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr7q9\" (UniqueName: 
\"kubernetes.io/projected/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-kube-api-access-hr7q9\") pod \"neutron-metadata-openstack-openstack-cell1-l92sc\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:13 crc kubenswrapper[4794]: I0310 11:54:13.544585 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:54:14 crc kubenswrapper[4794]: I0310 11:54:14.174654 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-l92sc"] Mar 10 11:54:15 crc kubenswrapper[4794]: I0310 11:54:15.132060 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" event={"ID":"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb","Type":"ContainerStarted","Data":"84ba7084939730b6752e293d696e07a42fea6412018c131a7cfa82297b1d7c4c"} Mar 10 11:54:15 crc kubenswrapper[4794]: I0310 11:54:15.132691 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" event={"ID":"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb","Type":"ContainerStarted","Data":"ed8af97c51fe0fbbe282cbf2c220cd3f215b61321d180e4d3f1fdd7072dd62c3"} Mar 10 11:54:15 crc kubenswrapper[4794]: I0310 11:54:15.158345 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" podStartSLOduration=1.5079607560000001 podStartE2EDuration="2.158304326s" podCreationTimestamp="2026-03-10 11:54:13 +0000 UTC" firstStartedPulling="2026-03-10 11:54:14.18789399 +0000 UTC m=+7802.944064818" lastFinishedPulling="2026-03-10 11:54:14.83823757 +0000 UTC m=+7803.594408388" observedRunningTime="2026-03-10 11:54:15.15166888 +0000 UTC m=+7803.907839698" watchObservedRunningTime="2026-03-10 11:54:15.158304326 +0000 UTC m=+7803.914475144" Mar 10 11:54:19 crc kubenswrapper[4794]: I0310 11:54:19.000182 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:54:19 crc kubenswrapper[4794]: E0310 11:54:19.000623 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:54:32 crc kubenswrapper[4794]: I0310 11:54:32.012264 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:54:32 crc kubenswrapper[4794]: E0310 11:54:32.013205 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:54:46 crc kubenswrapper[4794]: I0310 11:54:46.000379 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:54:46 crc kubenswrapper[4794]: E0310 11:54:46.001697 
4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 11:54:53 crc kubenswrapper[4794]: I0310 11:54:53.579378 4794 scope.go:117] "RemoveContainer" containerID="c99d1bc6bc641f94707fa255193406b2dab936bb614ca1ec87bff4069ad3d077" Mar 10 11:54:58 crc kubenswrapper[4794]: I0310 11:54:58.999071 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:54:59 crc kubenswrapper[4794]: I0310 11:54:59.722946 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"2492796e0f1c5f77e1917908f1160073b1132a0dffbbb2c6499afca28aa2a417"} Mar 10 11:55:09 crc kubenswrapper[4794]: I0310 11:55:09.843735 4794 generic.go:334] "Generic (PLEG): container finished" podID="838354a0-fdf3-4dfd-89cc-1f9bdc800fbb" containerID="84ba7084939730b6752e293d696e07a42fea6412018c131a7cfa82297b1d7c4c" exitCode=0 Mar 10 11:55:09 crc kubenswrapper[4794]: I0310 11:55:09.843833 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" event={"ID":"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb","Type":"ContainerDied","Data":"84ba7084939730b6752e293d696e07a42fea6412018c131a7cfa82297b1d7c4c"} Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.408495 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.500317 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-ssh-key-openstack-cell1\") pod \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.500400 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr7q9\" (UniqueName: \"kubernetes.io/projected/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-kube-api-access-hr7q9\") pod \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.500443 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-ceph\") pod \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.500467 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-neutron-metadata-combined-ca-bundle\") pod \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.500510 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.500556 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-inventory\") pod \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.500622 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-nova-metadata-neutron-config-0\") pod \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\" (UID: \"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb\") " Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.511700 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-ceph" (OuterVolumeSpecName: "ceph") pod "838354a0-fdf3-4dfd-89cc-1f9bdc800fbb" (UID: "838354a0-fdf3-4dfd-89cc-1f9bdc800fbb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.513826 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-kube-api-access-hr7q9" (OuterVolumeSpecName: "kube-api-access-hr7q9") pod "838354a0-fdf3-4dfd-89cc-1f9bdc800fbb" (UID: "838354a0-fdf3-4dfd-89cc-1f9bdc800fbb"). InnerVolumeSpecName "kube-api-access-hr7q9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.519245 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "838354a0-fdf3-4dfd-89cc-1f9bdc800fbb" (UID: "838354a0-fdf3-4dfd-89cc-1f9bdc800fbb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.541437 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "838354a0-fdf3-4dfd-89cc-1f9bdc800fbb" (UID: "838354a0-fdf3-4dfd-89cc-1f9bdc800fbb"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.541665 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "838354a0-fdf3-4dfd-89cc-1f9bdc800fbb" (UID: "838354a0-fdf3-4dfd-89cc-1f9bdc800fbb"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.547357 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "838354a0-fdf3-4dfd-89cc-1f9bdc800fbb" (UID: "838354a0-fdf3-4dfd-89cc-1f9bdc800fbb"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.548352 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-inventory" (OuterVolumeSpecName: "inventory") pod "838354a0-fdf3-4dfd-89cc-1f9bdc800fbb" (UID: "838354a0-fdf3-4dfd-89cc-1f9bdc800fbb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.604190 4794 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.604227 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.604238 4794 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.604251 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.604265 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr7q9\" (UniqueName: \"kubernetes.io/projected/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-kube-api-access-hr7q9\") on node \"crc\" DevicePath \"\"" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.604278 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.604288 4794 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838354a0-fdf3-4dfd-89cc-1f9bdc800fbb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.867831 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" event={"ID":"838354a0-fdf3-4dfd-89cc-1f9bdc800fbb","Type":"ContainerDied","Data":"ed8af97c51fe0fbbe282cbf2c220cd3f215b61321d180e4d3f1fdd7072dd62c3"} Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.867870 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed8af97c51fe0fbbe282cbf2c220cd3f215b61321d180e4d3f1fdd7072dd62c3" Mar 10 11:55:11 crc kubenswrapper[4794]: I0310 11:55:11.867889 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-l92sc" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.054549 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-bpxnx"] Mar 10 11:55:12 crc kubenswrapper[4794]: E0310 11:55:12.055083 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="838354a0-fdf3-4dfd-89cc-1f9bdc800fbb" containerName="neutron-metadata-openstack-openstack-cell1" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.055095 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="838354a0-fdf3-4dfd-89cc-1f9bdc800fbb" containerName="neutron-metadata-openstack-openstack-cell1" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.055323 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="838354a0-fdf3-4dfd-89cc-1f9bdc800fbb" containerName="neutron-metadata-openstack-openstack-cell1" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.056044 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-bpxnx"] Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.056132 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.059155 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.059335 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.059532 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.059732 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.060207 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.123010 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56sch\" (UniqueName: \"kubernetes.io/projected/f01b2008-c26d-4f5f-90c6-438cd78f6836-kube-api-access-56sch\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.123457 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-inventory\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.123659 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-ceph\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.123725 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.124244 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.124310 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.226450 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56sch\" (UniqueName: \"kubernetes.io/projected/f01b2008-c26d-4f5f-90c6-438cd78f6836-kube-api-access-56sch\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.226607 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-inventory\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.226650 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-ceph\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.226685 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.226851 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.226912 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.231798 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.232451 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.232590 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-inventory\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.233996 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-ceph\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.241842 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.244560 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56sch\" (UniqueName: \"kubernetes.io/projected/f01b2008-c26d-4f5f-90c6-438cd78f6836-kube-api-access-56sch\") pod \"libvirt-openstack-openstack-cell1-bpxnx\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.389056 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:55:12 crc kubenswrapper[4794]: I0310 11:55:12.971721 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-bpxnx"] Mar 10 11:55:12 crc kubenswrapper[4794]: W0310 11:55:12.978683 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b2008_c26d_4f5f_90c6_438cd78f6836.slice/crio-54a859e1868fb4e5f961cf4f87863ab36aee966df967f10059a00f88fc147d3e WatchSource:0}: Error finding container 54a859e1868fb4e5f961cf4f87863ab36aee966df967f10059a00f88fc147d3e: Status 404 returned error can't find the container with id 54a859e1868fb4e5f961cf4f87863ab36aee966df967f10059a00f88fc147d3e Mar 10 11:55:13 crc kubenswrapper[4794]: I0310 11:55:13.894907 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" event={"ID":"f01b2008-c26d-4f5f-90c6-438cd78f6836","Type":"ContainerStarted","Data":"ad2587fa34f50aa9d342f883afe8031ca6f772795b7a3e12b78ea11643cc3621"} Mar 10 11:55:13 crc kubenswrapper[4794]: I0310 11:55:13.895377 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" event={"ID":"f01b2008-c26d-4f5f-90c6-438cd78f6836","Type":"ContainerStarted","Data":"54a859e1868fb4e5f961cf4f87863ab36aee966df967f10059a00f88fc147d3e"} Mar 10 11:55:13 crc kubenswrapper[4794]: I0310 11:55:13.912373 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" podStartSLOduration=2.329443486 podStartE2EDuration="2.912357192s" podCreationTimestamp="2026-03-10 11:55:11 +0000 UTC" firstStartedPulling="2026-03-10 11:55:12.980926576 +0000 UTC m=+7861.737097394" lastFinishedPulling="2026-03-10 11:55:13.563840282 +0000 UTC m=+7862.320011100" observedRunningTime="2026-03-10 11:55:13.910843874 +0000 UTC m=+7862.667014702" watchObservedRunningTime="2026-03-10 11:55:13.912357192 +0000 UTC m=+7862.668528020" Mar 10 11:56:00 crc kubenswrapper[4794]: I0310 11:56:00.144470 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552396-zrrgw"] Mar 10 11:56:00 crc kubenswrapper[4794]: I0310 11:56:00.146574 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552396-zrrgw" Mar 10 11:56:00 crc kubenswrapper[4794]: I0310 11:56:00.148976 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:56:00 crc kubenswrapper[4794]: I0310 11:56:00.149190 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:56:00 crc kubenswrapper[4794]: I0310 11:56:00.149997 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:56:00 crc kubenswrapper[4794]: I0310 11:56:00.159069 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552396-zrrgw"] Mar 10 11:56:00 crc kubenswrapper[4794]: I0310 11:56:00.243866 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s89xl\" (UniqueName: \"kubernetes.io/projected/f18c6dbb-8161-4ba5-9898-2c61a62952cd-kube-api-access-s89xl\") pod \"auto-csr-approver-29552396-zrrgw\" (UID: \"f18c6dbb-8161-4ba5-9898-2c61a62952cd\") " pod="openshift-infra/auto-csr-approver-29552396-zrrgw" Mar 10 11:56:00 crc kubenswrapper[4794]: I0310 11:56:00.345360 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s89xl\" (UniqueName: \"kubernetes.io/projected/f18c6dbb-8161-4ba5-9898-2c61a62952cd-kube-api-access-s89xl\") pod \"auto-csr-approver-29552396-zrrgw\" (UID: \"f18c6dbb-8161-4ba5-9898-2c61a62952cd\") " pod="openshift-infra/auto-csr-approver-29552396-zrrgw" Mar 10 11:56:00 crc kubenswrapper[4794]: I0310 11:56:00.368355 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s89xl\" (UniqueName: \"kubernetes.io/projected/f18c6dbb-8161-4ba5-9898-2c61a62952cd-kube-api-access-s89xl\") pod \"auto-csr-approver-29552396-zrrgw\" (UID: \"f18c6dbb-8161-4ba5-9898-2c61a62952cd\") " pod="openshift-infra/auto-csr-approver-29552396-zrrgw" Mar 10 11:56:00 crc kubenswrapper[4794]: I0310 11:56:00.466679 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552396-zrrgw" Mar 10 11:56:00 crc kubenswrapper[4794]: I0310 11:56:00.943656 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552396-zrrgw"] Mar 10 11:56:01 crc kubenswrapper[4794]: I0310 11:56:01.428002 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552396-zrrgw" event={"ID":"f18c6dbb-8161-4ba5-9898-2c61a62952cd","Type":"ContainerStarted","Data":"af9543023b72932ec3eed24ea31674ce571a8cfc020affe5a83f8a45888714fd"} Mar 10 11:56:02 crc kubenswrapper[4794]: I0310 11:56:02.439287 4794 generic.go:334] "Generic (PLEG): container finished" podID="f18c6dbb-8161-4ba5-9898-2c61a62952cd" containerID="47a3e892ea4d118d21b9ca372ee29b61a20a02a8167e9c5e666ae0cef3f40bf0" exitCode=0 Mar 10 11:56:02 crc kubenswrapper[4794]: I0310 11:56:02.439358 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552396-zrrgw" event={"ID":"f18c6dbb-8161-4ba5-9898-2c61a62952cd","Type":"ContainerDied","Data":"47a3e892ea4d118d21b9ca372ee29b61a20a02a8167e9c5e666ae0cef3f40bf0"} Mar 10 11:56:03 crc kubenswrapper[4794]: I0310 11:56:03.835979 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552396-zrrgw" Mar 10 11:56:03 crc kubenswrapper[4794]: I0310 11:56:03.963909 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s89xl\" (UniqueName: \"kubernetes.io/projected/f18c6dbb-8161-4ba5-9898-2c61a62952cd-kube-api-access-s89xl\") pod \"f18c6dbb-8161-4ba5-9898-2c61a62952cd\" (UID: \"f18c6dbb-8161-4ba5-9898-2c61a62952cd\") " Mar 10 11:56:03 crc kubenswrapper[4794]: I0310 11:56:03.972578 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f18c6dbb-8161-4ba5-9898-2c61a62952cd-kube-api-access-s89xl" (OuterVolumeSpecName: "kube-api-access-s89xl") pod "f18c6dbb-8161-4ba5-9898-2c61a62952cd" (UID: "f18c6dbb-8161-4ba5-9898-2c61a62952cd"). InnerVolumeSpecName "kube-api-access-s89xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:56:04 crc kubenswrapper[4794]: I0310 11:56:04.067323 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s89xl\" (UniqueName: \"kubernetes.io/projected/f18c6dbb-8161-4ba5-9898-2c61a62952cd-kube-api-access-s89xl\") on node \"crc\" DevicePath \"\"" Mar 10 11:56:04 crc kubenswrapper[4794]: I0310 11:56:04.467289 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552396-zrrgw" event={"ID":"f18c6dbb-8161-4ba5-9898-2c61a62952cd","Type":"ContainerDied","Data":"af9543023b72932ec3eed24ea31674ce571a8cfc020affe5a83f8a45888714fd"} Mar 10 11:56:04 crc kubenswrapper[4794]: I0310 11:56:04.467362 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552396-zrrgw" Mar 10 11:56:04 crc kubenswrapper[4794]: I0310 11:56:04.467392 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af9543023b72932ec3eed24ea31674ce571a8cfc020affe5a83f8a45888714fd" Mar 10 11:56:04 crc kubenswrapper[4794]: I0310 11:56:04.953317 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552390-xk452"] Mar 10 11:56:04 crc kubenswrapper[4794]: I0310 11:56:04.973119 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552390-xk452"] Mar 10 11:56:06 crc kubenswrapper[4794]: I0310 11:56:06.011565 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24ea97d9-e465-4b6c-8c1d-8d63a45f57c2" path="/var/lib/kubelet/pods/24ea97d9-e465-4b6c-8c1d-8d63a45f57c2/volumes" Mar 10 11:56:53 crc kubenswrapper[4794]: I0310 11:56:53.678844 4794 scope.go:117] "RemoveContainer" containerID="0fb5ca443710d4ec6f3b14bc6ed29962121d42088d2e935e1f00897923df1429" Mar 10 11:57:22 crc kubenswrapper[4794]: I0310 11:57:22.967290 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:57:22 crc kubenswrapper[4794]: I0310 11:57:22.967978 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:57:52 crc kubenswrapper[4794]: I0310 11:57:52.968809 4794 patch_prober.go:28] 
interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:57:52 crc kubenswrapper[4794]: I0310 11:57:52.969400 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:58:00 crc kubenswrapper[4794]: I0310 11:58:00.152627 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552398-7gnn2"] Mar 10 11:58:00 crc kubenswrapper[4794]: E0310 11:58:00.153708 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f18c6dbb-8161-4ba5-9898-2c61a62952cd" containerName="oc" Mar 10 11:58:00 crc kubenswrapper[4794]: I0310 11:58:00.153725 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f18c6dbb-8161-4ba5-9898-2c61a62952cd" containerName="oc" Mar 10 11:58:00 crc kubenswrapper[4794]: I0310 11:58:00.153983 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f18c6dbb-8161-4ba5-9898-2c61a62952cd" containerName="oc" Mar 10 11:58:00 crc kubenswrapper[4794]: I0310 11:58:00.154952 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552398-7gnn2" Mar 10 11:58:00 crc kubenswrapper[4794]: I0310 11:58:00.157774 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 11:58:00 crc kubenswrapper[4794]: I0310 11:58:00.158047 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 11:58:00 crc kubenswrapper[4794]: I0310 11:58:00.158194 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 11:58:00 crc kubenswrapper[4794]: I0310 11:58:00.168294 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552398-7gnn2"] Mar 10 11:58:00 crc kubenswrapper[4794]: I0310 11:58:00.281228 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cplv\" (UniqueName: \"kubernetes.io/projected/28e4c8d2-e942-4938-9a74-82c596407de7-kube-api-access-9cplv\") pod \"auto-csr-approver-29552398-7gnn2\" (UID: \"28e4c8d2-e942-4938-9a74-82c596407de7\") " pod="openshift-infra/auto-csr-approver-29552398-7gnn2" Mar 10 11:58:00 crc kubenswrapper[4794]: I0310 11:58:00.383961 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cplv\" (UniqueName: \"kubernetes.io/projected/28e4c8d2-e942-4938-9a74-82c596407de7-kube-api-access-9cplv\") pod \"auto-csr-approver-29552398-7gnn2\" (UID: \"28e4c8d2-e942-4938-9a74-82c596407de7\") " pod="openshift-infra/auto-csr-approver-29552398-7gnn2" Mar 10 11:58:00 crc kubenswrapper[4794]: I0310 11:58:00.407872 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cplv\" (UniqueName: \"kubernetes.io/projected/28e4c8d2-e942-4938-9a74-82c596407de7-kube-api-access-9cplv\") pod \"auto-csr-approver-29552398-7gnn2\" (UID: \"28e4c8d2-e942-4938-9a74-82c596407de7\") " pod="openshift-infra/auto-csr-approver-29552398-7gnn2" Mar 
10 11:58:00 crc kubenswrapper[4794]: I0310 11:58:00.482636 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552398-7gnn2" Mar 10 11:58:00 crc kubenswrapper[4794]: I0310 11:58:00.984117 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552398-7gnn2"] Mar 10 11:58:00 crc kubenswrapper[4794]: W0310 11:58:00.987701 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28e4c8d2_e942_4938_9a74_82c596407de7.slice/crio-b1e8d878e4c791493dc15a3d9de0f867507aeb400834907b281e9b6cf274b84e WatchSource:0}: Error finding container b1e8d878e4c791493dc15a3d9de0f867507aeb400834907b281e9b6cf274b84e: Status 404 returned error can't find the container with id b1e8d878e4c791493dc15a3d9de0f867507aeb400834907b281e9b6cf274b84e Mar 10 11:58:00 crc kubenswrapper[4794]: I0310 11:58:00.990399 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 11:58:01 crc kubenswrapper[4794]: I0310 11:58:01.834135 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552398-7gnn2" event={"ID":"28e4c8d2-e942-4938-9a74-82c596407de7","Type":"ContainerStarted","Data":"b1e8d878e4c791493dc15a3d9de0f867507aeb400834907b281e9b6cf274b84e"} Mar 10 11:58:02 crc kubenswrapper[4794]: I0310 11:58:02.846783 4794 generic.go:334] "Generic (PLEG): container finished" podID="28e4c8d2-e942-4938-9a74-82c596407de7" containerID="a7a363ef99cbcce43d3be032b16e1546540302f601080e6c0632013f00ce38cd" exitCode=0 Mar 10 11:58:02 crc kubenswrapper[4794]: I0310 11:58:02.846860 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552398-7gnn2" event={"ID":"28e4c8d2-e942-4938-9a74-82c596407de7","Type":"ContainerDied","Data":"a7a363ef99cbcce43d3be032b16e1546540302f601080e6c0632013f00ce38cd"} Mar 10 11:58:04 crc kubenswrapper[4794]: I0310 11:58:04.252305 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552398-7gnn2" Mar 10 11:58:04 crc kubenswrapper[4794]: I0310 11:58:04.381774 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cplv\" (UniqueName: \"kubernetes.io/projected/28e4c8d2-e942-4938-9a74-82c596407de7-kube-api-access-9cplv\") pod \"28e4c8d2-e942-4938-9a74-82c596407de7\" (UID: \"28e4c8d2-e942-4938-9a74-82c596407de7\") " Mar 10 11:58:04 crc kubenswrapper[4794]: I0310 11:58:04.388841 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e4c8d2-e942-4938-9a74-82c596407de7-kube-api-access-9cplv" (OuterVolumeSpecName: "kube-api-access-9cplv") pod "28e4c8d2-e942-4938-9a74-82c596407de7" (UID: "28e4c8d2-e942-4938-9a74-82c596407de7"). InnerVolumeSpecName "kube-api-access-9cplv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:58:04 crc kubenswrapper[4794]: I0310 11:58:04.483905 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cplv\" (UniqueName: \"kubernetes.io/projected/28e4c8d2-e942-4938-9a74-82c596407de7-kube-api-access-9cplv\") on node \"crc\" DevicePath \"\"" Mar 10 11:58:04 crc kubenswrapper[4794]: I0310 11:58:04.878980 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552398-7gnn2" event={"ID":"28e4c8d2-e942-4938-9a74-82c596407de7","Type":"ContainerDied","Data":"b1e8d878e4c791493dc15a3d9de0f867507aeb400834907b281e9b6cf274b84e"} Mar 10 11:58:04 crc kubenswrapper[4794]: I0310 11:58:04.879048 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1e8d878e4c791493dc15a3d9de0f867507aeb400834907b281e9b6cf274b84e" Mar 10 11:58:04 crc kubenswrapper[4794]: I0310 11:58:04.879078 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552398-7gnn2" Mar 10 11:58:05 crc kubenswrapper[4794]: I0310 11:58:05.324175 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552392-wkv8z"] Mar 10 11:58:05 crc kubenswrapper[4794]: I0310 11:58:05.333600 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552392-wkv8z"] Mar 10 11:58:06 crc kubenswrapper[4794]: I0310 11:58:06.018650 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784198eb-66e6-4c0e-aa8b-c8a2b9dbde07" path="/var/lib/kubelet/pods/784198eb-66e6-4c0e-aa8b-c8a2b9dbde07/volumes" Mar 10 11:58:22 crc kubenswrapper[4794]: I0310 11:58:22.967991 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 11:58:22 crc kubenswrapper[4794]: I0310 11:58:22.968565 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 11:58:22 crc kubenswrapper[4794]: I0310 11:58:22.968616 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 11:58:22 crc kubenswrapper[4794]: I0310 11:58:22.969489 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2492796e0f1c5f77e1917908f1160073b1132a0dffbbb2c6499afca28aa2a417"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 11:58:22 crc kubenswrapper[4794]: I0310 11:58:22.969559 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://2492796e0f1c5f77e1917908f1160073b1132a0dffbbb2c6499afca28aa2a417" gracePeriod=600 Mar 10 11:58:24 crc kubenswrapper[4794]: I0310 11:58:24.096814 4794 generic.go:334] "Generic 
(PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="2492796e0f1c5f77e1917908f1160073b1132a0dffbbb2c6499afca28aa2a417" exitCode=0 Mar 10 11:58:24 crc kubenswrapper[4794]: I0310 11:58:24.096857 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"2492796e0f1c5f77e1917908f1160073b1132a0dffbbb2c6499afca28aa2a417"} Mar 10 11:58:24 crc kubenswrapper[4794]: I0310 11:58:24.097423 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c"} Mar 10 11:58:24 crc kubenswrapper[4794]: I0310 11:58:24.097452 4794 scope.go:117] "RemoveContainer" containerID="9dc225440d65a260bfaf17b727b2d2bfa824f3fd4b54db4a37ab86f3420b2343" Mar 10 11:58:53 crc kubenswrapper[4794]: I0310 11:58:53.778219 4794 scope.go:117] "RemoveContainer" containerID="18403c1366ff205ad2bdefb355ebf1c08929ba25828291b9ab5a8d36a9c54321" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.035201 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l5rkm"] Mar 10 11:59:09 crc kubenswrapper[4794]: E0310 11:59:09.036317 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e4c8d2-e942-4938-9a74-82c596407de7" containerName="oc" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.036358 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e4c8d2-e942-4938-9a74-82c596407de7" containerName="oc" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.036675 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e4c8d2-e942-4938-9a74-82c596407de7" containerName="oc" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.038550 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.054157 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5rkm"] Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.055558 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e16b8c-53d2-40f7-ac56-13cb09c48695-catalog-content\") pod \"redhat-marketplace-l5rkm\" (UID: \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\") " pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.055667 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnjs5\" (UniqueName: \"kubernetes.io/projected/f0e16b8c-53d2-40f7-ac56-13cb09c48695-kube-api-access-pnjs5\") pod \"redhat-marketplace-l5rkm\" (UID: \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\") " pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.057197 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e16b8c-53d2-40f7-ac56-13cb09c48695-utilities\") pod \"redhat-marketplace-l5rkm\" (UID: \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\") " pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.158727 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e16b8c-53d2-40f7-ac56-13cb09c48695-utilities\") pod \"redhat-marketplace-l5rkm\" (UID: \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\") " pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.158917 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e16b8c-53d2-40f7-ac56-13cb09c48695-catalog-content\") pod \"redhat-marketplace-l5rkm\" (UID: \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\") " pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.158972 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnjs5\" (UniqueName: \"kubernetes.io/projected/f0e16b8c-53d2-40f7-ac56-13cb09c48695-kube-api-access-pnjs5\") pod \"redhat-marketplace-l5rkm\" (UID: \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\") " pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.159202 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e16b8c-53d2-40f7-ac56-13cb09c48695-utilities\") pod \"redhat-marketplace-l5rkm\" (UID: \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\") " pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.159486 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e16b8c-53d2-40f7-ac56-13cb09c48695-catalog-content\") pod \"redhat-marketplace-l5rkm\" (UID: \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\") " pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.184725 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-pnjs5\" (UniqueName: \"kubernetes.io/projected/f0e16b8c-53d2-40f7-ac56-13cb09c48695-kube-api-access-pnjs5\") pod \"redhat-marketplace-l5rkm\" (UID: \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\") " pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.357746 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:09 crc kubenswrapper[4794]: I0310 11:59:09.909839 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5rkm"] Mar 10 11:59:10 crc kubenswrapper[4794]: I0310 11:59:10.613547 4794 generic.go:334] "Generic (PLEG): container finished" podID="f0e16b8c-53d2-40f7-ac56-13cb09c48695" containerID="fc33eaa6a18b27ce3dfcd72e9d49b24cb33a13898adc30075bd248b1eb7a503f" exitCode=0 Mar 10 11:59:10 crc kubenswrapper[4794]: I0310 11:59:10.613631 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5rkm" event={"ID":"f0e16b8c-53d2-40f7-ac56-13cb09c48695","Type":"ContainerDied","Data":"fc33eaa6a18b27ce3dfcd72e9d49b24cb33a13898adc30075bd248b1eb7a503f"} Mar 10 11:59:10 crc kubenswrapper[4794]: I0310 11:59:10.614137 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5rkm" event={"ID":"f0e16b8c-53d2-40f7-ac56-13cb09c48695","Type":"ContainerStarted","Data":"e037db7f9306ff57e67071975e0999348d996a4aae9a3b91c9d3510b5c577231"} Mar 10 11:59:11 crc kubenswrapper[4794]: I0310 11:59:11.630450 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5rkm" event={"ID":"f0e16b8c-53d2-40f7-ac56-13cb09c48695","Type":"ContainerStarted","Data":"39edbe05b50f5c71f2852f7aa18c90cc8f88214570f663c32a0d770ee3919973"} Mar 10 11:59:12 crc kubenswrapper[4794]: I0310 11:59:12.648271 4794 generic.go:334] "Generic (PLEG): container finished" podID="f0e16b8c-53d2-40f7-ac56-13cb09c48695" containerID="39edbe05b50f5c71f2852f7aa18c90cc8f88214570f663c32a0d770ee3919973" exitCode=0 Mar 10 11:59:12 crc kubenswrapper[4794]: I0310 11:59:12.648387 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5rkm" event={"ID":"f0e16b8c-53d2-40f7-ac56-13cb09c48695","Type":"ContainerDied","Data":"39edbe05b50f5c71f2852f7aa18c90cc8f88214570f663c32a0d770ee3919973"} Mar 10 11:59:13 crc kubenswrapper[4794]: I0310 11:59:13.660485 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5rkm" event={"ID":"f0e16b8c-53d2-40f7-ac56-13cb09c48695","Type":"ContainerStarted","Data":"05b6f39a2094043d06c10fca9c2d726465b548663baed293ae3a4174a336c4ba"} Mar 10 11:59:13 crc kubenswrapper[4794]: I0310 11:59:13.697489 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l5rkm" podStartSLOduration=3.212469953 podStartE2EDuration="5.697470789s" podCreationTimestamp="2026-03-10 11:59:08 +0000 UTC" firstStartedPulling="2026-03-10 11:59:10.616979047 +0000 UTC m=+8099.373149865" lastFinishedPulling="2026-03-10 11:59:13.101979833 +0000 UTC m=+8101.858150701" observedRunningTime="2026-03-10 11:59:13.680655207 +0000 UTC m=+8102.436826035" watchObservedRunningTime="2026-03-10 11:59:13.697470789 +0000 UTC m=+8102.453641607" Mar 10 11:59:19 crc kubenswrapper[4794]: I0310 11:59:19.358991 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:19 crc kubenswrapper[4794]: I0310 11:59:19.359596 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:19 crc kubenswrapper[4794]: I0310 11:59:19.436322 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:19 crc kubenswrapper[4794]: I0310 11:59:19.809911 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:19 crc kubenswrapper[4794]: I0310 11:59:19.866701 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5rkm"] Mar 10 11:59:21 crc kubenswrapper[4794]: I0310 11:59:21.756953 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l5rkm" podUID="f0e16b8c-53d2-40f7-ac56-13cb09c48695" containerName="registry-server" containerID="cri-o://05b6f39a2094043d06c10fca9c2d726465b548663baed293ae3a4174a336c4ba" gracePeriod=2 Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.295867 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.466231 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnjs5\" (UniqueName: \"kubernetes.io/projected/f0e16b8c-53d2-40f7-ac56-13cb09c48695-kube-api-access-pnjs5\") pod \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\" (UID: \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\") " Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.466374 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e16b8c-53d2-40f7-ac56-13cb09c48695-utilities\") pod \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\" (UID: \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\") " Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.466423 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e16b8c-53d2-40f7-ac56-13cb09c48695-catalog-content\") pod \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\" (UID: \"f0e16b8c-53d2-40f7-ac56-13cb09c48695\") " Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.467594 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e16b8c-53d2-40f7-ac56-13cb09c48695-utilities" (OuterVolumeSpecName: "utilities") pod "f0e16b8c-53d2-40f7-ac56-13cb09c48695" (UID: "f0e16b8c-53d2-40f7-ac56-13cb09c48695"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.473310 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0e16b8c-53d2-40f7-ac56-13cb09c48695-kube-api-access-pnjs5" (OuterVolumeSpecName: "kube-api-access-pnjs5") pod "f0e16b8c-53d2-40f7-ac56-13cb09c48695" (UID: "f0e16b8c-53d2-40f7-ac56-13cb09c48695"). InnerVolumeSpecName "kube-api-access-pnjs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.514963 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e16b8c-53d2-40f7-ac56-13cb09c48695-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0e16b8c-53d2-40f7-ac56-13cb09c48695" (UID: "f0e16b8c-53d2-40f7-ac56-13cb09c48695"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.569277 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnjs5\" (UniqueName: \"kubernetes.io/projected/f0e16b8c-53d2-40f7-ac56-13cb09c48695-kube-api-access-pnjs5\") on node \"crc\" DevicePath \"\"" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.569316 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e16b8c-53d2-40f7-ac56-13cb09c48695-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.569326 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e16b8c-53d2-40f7-ac56-13cb09c48695-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.772381 4794 generic.go:334] "Generic (PLEG): container finished" podID="f0e16b8c-53d2-40f7-ac56-13cb09c48695" containerID="05b6f39a2094043d06c10fca9c2d726465b548663baed293ae3a4174a336c4ba" exitCode=0 Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.772450 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5rkm" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.772446 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5rkm" event={"ID":"f0e16b8c-53d2-40f7-ac56-13cb09c48695","Type":"ContainerDied","Data":"05b6f39a2094043d06c10fca9c2d726465b548663baed293ae3a4174a336c4ba"} Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.773455 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5rkm" event={"ID":"f0e16b8c-53d2-40f7-ac56-13cb09c48695","Type":"ContainerDied","Data":"e037db7f9306ff57e67071975e0999348d996a4aae9a3b91c9d3510b5c577231"} Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.773477 4794 scope.go:117] "RemoveContainer" containerID="05b6f39a2094043d06c10fca9c2d726465b548663baed293ae3a4174a336c4ba" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.808910 4794 scope.go:117] "RemoveContainer" containerID="39edbe05b50f5c71f2852f7aa18c90cc8f88214570f663c32a0d770ee3919973" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.819055 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5rkm"] Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.828646 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5rkm"] Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.834279 4794 scope.go:117] "RemoveContainer" containerID="fc33eaa6a18b27ce3dfcd72e9d49b24cb33a13898adc30075bd248b1eb7a503f" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.884572 4794 scope.go:117] "RemoveContainer" containerID="05b6f39a2094043d06c10fca9c2d726465b548663baed293ae3a4174a336c4ba" Mar 10 11:59:22 crc kubenswrapper[4794]: E0310 11:59:22.885294 4794 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05b6f39a2094043d06c10fca9c2d726465b548663baed293ae3a4174a336c4ba\": container with ID starting with 05b6f39a2094043d06c10fca9c2d726465b548663baed293ae3a4174a336c4ba not found: ID does not exist" containerID="05b6f39a2094043d06c10fca9c2d726465b548663baed293ae3a4174a336c4ba" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.885326 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05b6f39a2094043d06c10fca9c2d726465b548663baed293ae3a4174a336c4ba"} err="failed to get container status \"05b6f39a2094043d06c10fca9c2d726465b548663baed293ae3a4174a336c4ba\": rpc error: code = NotFound desc = could not find container \"05b6f39a2094043d06c10fca9c2d726465b548663baed293ae3a4174a336c4ba\": container with ID starting with 05b6f39a2094043d06c10fca9c2d726465b548663baed293ae3a4174a336c4ba not found: ID does not exist" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.885636 4794 scope.go:117] "RemoveContainer" containerID="39edbe05b50f5c71f2852f7aa18c90cc8f88214570f663c32a0d770ee3919973" Mar 10 11:59:22 crc kubenswrapper[4794]: E0310 11:59:22.886182 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39edbe05b50f5c71f2852f7aa18c90cc8f88214570f663c32a0d770ee3919973\": container with ID starting with 39edbe05b50f5c71f2852f7aa18c90cc8f88214570f663c32a0d770ee3919973 not found: ID does not exist" containerID="39edbe05b50f5c71f2852f7aa18c90cc8f88214570f663c32a0d770ee3919973" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.886241 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39edbe05b50f5c71f2852f7aa18c90cc8f88214570f663c32a0d770ee3919973"} err="failed to get container status \"39edbe05b50f5c71f2852f7aa18c90cc8f88214570f663c32a0d770ee3919973\": rpc error: code = NotFound desc = could not find container \"39edbe05b50f5c71f2852f7aa18c90cc8f88214570f663c32a0d770ee3919973\": container with ID starting with 39edbe05b50f5c71f2852f7aa18c90cc8f88214570f663c32a0d770ee3919973 not found: ID does not exist" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.886267 4794 scope.go:117] "RemoveContainer" containerID="fc33eaa6a18b27ce3dfcd72e9d49b24cb33a13898adc30075bd248b1eb7a503f" Mar 10 11:59:22 crc kubenswrapper[4794]: E0310 11:59:22.886604 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc33eaa6a18b27ce3dfcd72e9d49b24cb33a13898adc30075bd248b1eb7a503f\": container with ID starting with fc33eaa6a18b27ce3dfcd72e9d49b24cb33a13898adc30075bd248b1eb7a503f not found: ID does not exist" containerID="fc33eaa6a18b27ce3dfcd72e9d49b24cb33a13898adc30075bd248b1eb7a503f" Mar 10 11:59:22 crc kubenswrapper[4794]: I0310 11:59:22.886631 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc33eaa6a18b27ce3dfcd72e9d49b24cb33a13898adc30075bd248b1eb7a503f"} err="failed to get container status \"fc33eaa6a18b27ce3dfcd72e9d49b24cb33a13898adc30075bd248b1eb7a503f\": rpc error: code = NotFound desc = could not find container \"fc33eaa6a18b27ce3dfcd72e9d49b24cb33a13898adc30075bd248b1eb7a503f\": container with ID starting with fc33eaa6a18b27ce3dfcd72e9d49b24cb33a13898adc30075bd248b1eb7a503f not found: ID does not exist" Mar 10 11:59:24 crc kubenswrapper[4794]: I0310 11:59:24.017049 4794 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="f0e16b8c-53d2-40f7-ac56-13cb09c48695" path="/var/lib/kubelet/pods/f0e16b8c-53d2-40f7-ac56-13cb09c48695/volumes" Mar 10 11:59:56 crc kubenswrapper[4794]: I0310 11:59:56.126565 4794 generic.go:334] "Generic (PLEG): container finished" podID="f01b2008-c26d-4f5f-90c6-438cd78f6836" containerID="ad2587fa34f50aa9d342f883afe8031ca6f772795b7a3e12b78ea11643cc3621" exitCode=0 Mar 10 11:59:56 crc kubenswrapper[4794]: I0310 11:59:56.126700 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" event={"ID":"f01b2008-c26d-4f5f-90c6-438cd78f6836","Type":"ContainerDied","Data":"ad2587fa34f50aa9d342f883afe8031ca6f772795b7a3e12b78ea11643cc3621"} Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.697709 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.804968 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-ssh-key-openstack-cell1\") pod \"f01b2008-c26d-4f5f-90c6-438cd78f6836\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.805034 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-libvirt-combined-ca-bundle\") pod \"f01b2008-c26d-4f5f-90c6-438cd78f6836\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.805063 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56sch\" (UniqueName: \"kubernetes.io/projected/f01b2008-c26d-4f5f-90c6-438cd78f6836-kube-api-access-56sch\") pod \"f01b2008-c26d-4f5f-90c6-438cd78f6836\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.805235 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-inventory\") pod \"f01b2008-c26d-4f5f-90c6-438cd78f6836\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.805283 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-libvirt-secret-0\") pod \"f01b2008-c26d-4f5f-90c6-438cd78f6836\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.805309 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-ceph\") pod \"f01b2008-c26d-4f5f-90c6-438cd78f6836\" (UID: \"f01b2008-c26d-4f5f-90c6-438cd78f6836\") " Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.812638 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-ceph" (OuterVolumeSpecName: "ceph") pod "f01b2008-c26d-4f5f-90c6-438cd78f6836" (UID: "f01b2008-c26d-4f5f-90c6-438cd78f6836"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.812677 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f01b2008-c26d-4f5f-90c6-438cd78f6836" (UID: "f01b2008-c26d-4f5f-90c6-438cd78f6836"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.812725 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f01b2008-c26d-4f5f-90c6-438cd78f6836-kube-api-access-56sch" (OuterVolumeSpecName: "kube-api-access-56sch") pod "f01b2008-c26d-4f5f-90c6-438cd78f6836" (UID: "f01b2008-c26d-4f5f-90c6-438cd78f6836"). InnerVolumeSpecName "kube-api-access-56sch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.836207 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "f01b2008-c26d-4f5f-90c6-438cd78f6836" (UID: "f01b2008-c26d-4f5f-90c6-438cd78f6836"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.838612 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "f01b2008-c26d-4f5f-90c6-438cd78f6836" (UID: "f01b2008-c26d-4f5f-90c6-438cd78f6836"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.840978 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-inventory" (OuterVolumeSpecName: "inventory") pod "f01b2008-c26d-4f5f-90c6-438cd78f6836" (UID: "f01b2008-c26d-4f5f-90c6-438cd78f6836"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.907427 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.907723 4794 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.907735 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.907745 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.907754 4794 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01b2008-c26d-4f5f-90c6-438cd78f6836-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 11:59:57 crc kubenswrapper[4794]: I0310 11:59:57.907763 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56sch\" (UniqueName: \"kubernetes.io/projected/f01b2008-c26d-4f5f-90c6-438cd78f6836-kube-api-access-56sch\") on node \"crc\" DevicePath \"\"" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.157547 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" event={"ID":"f01b2008-c26d-4f5f-90c6-438cd78f6836","Type":"ContainerDied","Data":"54a859e1868fb4e5f961cf4f87863ab36aee966df967f10059a00f88fc147d3e"} Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.158011 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54a859e1868fb4e5f961cf4f87863ab36aee966df967f10059a00f88fc147d3e" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.157603 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-bpxnx" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.263243 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-6rfj4"] Mar 10 11:59:58 crc kubenswrapper[4794]: E0310 11:59:58.263813 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e16b8c-53d2-40f7-ac56-13cb09c48695" containerName="extract-utilities" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.263848 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e16b8c-53d2-40f7-ac56-13cb09c48695" containerName="extract-utilities" Mar 10 11:59:58 crc kubenswrapper[4794]: E0310 11:59:58.263872 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01b2008-c26d-4f5f-90c6-438cd78f6836" containerName="libvirt-openstack-openstack-cell1" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.263880 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01b2008-c26d-4f5f-90c6-438cd78f6836" containerName="libvirt-openstack-openstack-cell1" Mar 10 11:59:58 crc kubenswrapper[4794]: E0310 11:59:58.263913 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e16b8c-53d2-40f7-ac56-13cb09c48695" containerName="registry-server" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.263924 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e16b8c-53d2-40f7-ac56-13cb09c48695" containerName="registry-server" Mar 10 11:59:58 crc kubenswrapper[4794]: E0310 11:59:58.263940 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e16b8c-53d2-40f7-ac56-13cb09c48695" containerName="extract-content" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.263947 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e16b8c-53d2-40f7-ac56-13cb09c48695" containerName="extract-content" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.264353 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0e16b8c-53d2-40f7-ac56-13cb09c48695" containerName="registry-server" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.264384 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f01b2008-c26d-4f5f-90c6-438cd78f6836" containerName="libvirt-openstack-openstack-cell1" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.265315 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.267593 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.267818 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.268475 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.268715 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.269737 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.271161 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.271182 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.288862 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-6rfj4"] Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.316629 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.316700 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.316776 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-inventory\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.316828 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-ceph\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.316901 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cells-global-config-0\") pod 
\"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.316927 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.316949 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.316989 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.317014 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jg5k\" (UniqueName: \"kubernetes.io/projected/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-kube-api-access-6jg5k\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.317047 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.317091 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.317134 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.317158 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.419663 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-ceph\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.419751 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.419775 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.419793 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.419828 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.419854 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jg5k\" (UniqueName: \"kubernetes.io/projected/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-kube-api-access-6jg5k\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.419899 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.419954 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: 
\"kubernetes.io/configmap/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.420008 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.420035 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.420079 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.420108 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.420148 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-inventory\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.421075 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.421816 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.425486 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-migration-ssh-key-1\") pod 
\"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.425738 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.426333 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.426444 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.426773 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.427014 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.427447 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.427460 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-inventory\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.427626 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.428773 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-ceph\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.437360 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jg5k\" (UniqueName: \"kubernetes.io/projected/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-kube-api-access-6jg5k\") pod \"nova-cell1-openstack-openstack-cell1-6rfj4\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:58 crc kubenswrapper[4794]: I0310 11:59:58.585302 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" Mar 10 11:59:59 crc kubenswrapper[4794]: I0310 11:59:59.161738 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-6rfj4"] Mar 10 11:59:59 crc kubenswrapper[4794]: W0310 11:59:59.173384 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dbc468e_bf92_4cfa_81f1_a660334c4fd5.slice/crio-47200f510fbd9fd3307f3e3e384c4006941acbb307d978707a395678afb27fbc WatchSource:0}: Error finding container 47200f510fbd9fd3307f3e3e384c4006941acbb307d978707a395678afb27fbc: Status 404 returned error can't find the container with id 47200f510fbd9fd3307f3e3e384c4006941acbb307d978707a395678afb27fbc Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.142311 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552400-g5wjv"] Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.145256 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552400-g5wjv" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.149212 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.149673 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.149674 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.153474 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg"] Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.156026 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.158819 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.165430 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.167096 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552400-g5wjv"] Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.183525 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg"] Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.184754 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" event={"ID":"0dbc468e-bf92-4cfa-81f1-a660334c4fd5","Type":"ContainerStarted","Data":"2af3a11344f9c1779996fc8ea0275e685f24a7a95ec6fc505e9e62571c3df478"} Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.184844 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" event={"ID":"0dbc468e-bf92-4cfa-81f1-a660334c4fd5","Type":"ContainerStarted","Data":"47200f510fbd9fd3307f3e3e384c4006941acbb307d978707a395678afb27fbc"} Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.223152 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" podStartSLOduration=1.755399801 podStartE2EDuration="2.223130021s" podCreationTimestamp="2026-03-10 11:59:58 +0000 UTC" firstStartedPulling="2026-03-10 11:59:59.176357995 +0000 UTC m=+8147.932528813" lastFinishedPulling="2026-03-10 11:59:59.644088205 +0000 UTC m=+8148.400259033" observedRunningTime="2026-03-10 12:00:00.205902456 +0000 UTC m=+8148.962073264" watchObservedRunningTime="2026-03-10 12:00:00.223130021 +0000 UTC m=+8148.979300849" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.263540 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72205154-3d58-4199-baed-860a4360e3d4-config-volume\") pod \"collect-profiles-29552400-8nnhg\" (UID: \"72205154-3d58-4199-baed-860a4360e3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.263615 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72205154-3d58-4199-baed-860a4360e3d4-secret-volume\") pod \"collect-profiles-29552400-8nnhg\" (UID: \"72205154-3d58-4199-baed-860a4360e3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.263651 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wp9h\" (UniqueName: \"kubernetes.io/projected/72205154-3d58-4199-baed-860a4360e3d4-kube-api-access-2wp9h\") pod \"collect-profiles-29552400-8nnhg\" (UID: \"72205154-3d58-4199-baed-860a4360e3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 
12:00:00.263764 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6fmn\" (UniqueName: \"kubernetes.io/projected/fe2b2b20-7e44-4fce-af37-93e674cb3736-kube-api-access-l6fmn\") pod \"auto-csr-approver-29552400-g5wjv\" (UID: \"fe2b2b20-7e44-4fce-af37-93e674cb3736\") " pod="openshift-infra/auto-csr-approver-29552400-g5wjv" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.392398 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6fmn\" (UniqueName: \"kubernetes.io/projected/fe2b2b20-7e44-4fce-af37-93e674cb3736-kube-api-access-l6fmn\") pod \"auto-csr-approver-29552400-g5wjv\" (UID: \"fe2b2b20-7e44-4fce-af37-93e674cb3736\") " pod="openshift-infra/auto-csr-approver-29552400-g5wjv" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.393203 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72205154-3d58-4199-baed-860a4360e3d4-config-volume\") pod \"collect-profiles-29552400-8nnhg\" (UID: \"72205154-3d58-4199-baed-860a4360e3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.393472 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72205154-3d58-4199-baed-860a4360e3d4-secret-volume\") pod \"collect-profiles-29552400-8nnhg\" (UID: \"72205154-3d58-4199-baed-860a4360e3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.393573 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wp9h\" (UniqueName: \"kubernetes.io/projected/72205154-3d58-4199-baed-860a4360e3d4-kube-api-access-2wp9h\") pod \"collect-profiles-29552400-8nnhg\" (UID: \"72205154-3d58-4199-baed-860a4360e3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.396039 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72205154-3d58-4199-baed-860a4360e3d4-config-volume\") pod \"collect-profiles-29552400-8nnhg\" (UID: \"72205154-3d58-4199-baed-860a4360e3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.399529 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72205154-3d58-4199-baed-860a4360e3d4-secret-volume\") pod \"collect-profiles-29552400-8nnhg\" (UID: \"72205154-3d58-4199-baed-860a4360e3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.409603 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6fmn\" (UniqueName: \"kubernetes.io/projected/fe2b2b20-7e44-4fce-af37-93e674cb3736-kube-api-access-l6fmn\") pod \"auto-csr-approver-29552400-g5wjv\" (UID: \"fe2b2b20-7e44-4fce-af37-93e674cb3736\") " pod="openshift-infra/auto-csr-approver-29552400-g5wjv" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.410039 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wp9h\" (UniqueName: 
\"kubernetes.io/projected/72205154-3d58-4199-baed-860a4360e3d4-kube-api-access-2wp9h\") pod \"collect-profiles-29552400-8nnhg\" (UID: \"72205154-3d58-4199-baed-860a4360e3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.475476 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552400-g5wjv" Mar 10 12:00:00 crc kubenswrapper[4794]: I0310 12:00:00.499207 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" Mar 10 12:00:01 crc kubenswrapper[4794]: I0310 12:00:01.014032 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552400-g5wjv"] Mar 10 12:00:01 crc kubenswrapper[4794]: W0310 12:00:01.019008 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72205154_3d58_4199_baed_860a4360e3d4.slice/crio-0da536b4ceb025c3dc11296318d8d10ca3141118374eb214609a29a2055da4fa WatchSource:0}: Error finding container 0da536b4ceb025c3dc11296318d8d10ca3141118374eb214609a29a2055da4fa: Status 404 returned error can't find the container with id 0da536b4ceb025c3dc11296318d8d10ca3141118374eb214609a29a2055da4fa Mar 10 12:00:01 crc kubenswrapper[4794]: I0310 12:00:01.021687 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg"] Mar 10 12:00:01 crc kubenswrapper[4794]: I0310 12:00:01.195656 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552400-g5wjv" event={"ID":"fe2b2b20-7e44-4fce-af37-93e674cb3736","Type":"ContainerStarted","Data":"285722a74bcacf3022a84fda1d7e540a156117e779faa756b67f834165daa53c"} Mar 10 12:00:01 crc kubenswrapper[4794]: I0310 12:00:01.197758 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" event={"ID":"72205154-3d58-4199-baed-860a4360e3d4","Type":"ContainerStarted","Data":"0da536b4ceb025c3dc11296318d8d10ca3141118374eb214609a29a2055da4fa"} Mar 10 12:00:02 crc kubenswrapper[4794]: I0310 12:00:02.208316 4794 generic.go:334] "Generic (PLEG): container finished" podID="72205154-3d58-4199-baed-860a4360e3d4" containerID="6ba822d0c2adea7e8be731f6aee5ae4b0fbb647ea4f44fcc7568042aeae3f6a2" exitCode=0 Mar 10 12:00:02 crc kubenswrapper[4794]: I0310 12:00:02.208395 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" event={"ID":"72205154-3d58-4199-baed-860a4360e3d4","Type":"ContainerDied","Data":"6ba822d0c2adea7e8be731f6aee5ae4b0fbb647ea4f44fcc7568042aeae3f6a2"} Mar 10 12:00:03 crc kubenswrapper[4794]: I0310 12:00:03.591642 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" Mar 10 12:00:03 crc kubenswrapper[4794]: I0310 12:00:03.665122 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72205154-3d58-4199-baed-860a4360e3d4-config-volume\") pod \"72205154-3d58-4199-baed-860a4360e3d4\" (UID: \"72205154-3d58-4199-baed-860a4360e3d4\") " Mar 10 12:00:03 crc kubenswrapper[4794]: I0310 12:00:03.665197 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wp9h\" (UniqueName: \"kubernetes.io/projected/72205154-3d58-4199-baed-860a4360e3d4-kube-api-access-2wp9h\") pod \"72205154-3d58-4199-baed-860a4360e3d4\" (UID: \"72205154-3d58-4199-baed-860a4360e3d4\") " Mar 10 12:00:03 crc kubenswrapper[4794]: I0310 12:00:03.665520 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72205154-3d58-4199-baed-860a4360e3d4-secret-volume\") pod \"72205154-3d58-4199-baed-860a4360e3d4\" (UID: \"72205154-3d58-4199-baed-860a4360e3d4\") " Mar 10 12:00:03 crc kubenswrapper[4794]: I0310 12:00:03.666019 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72205154-3d58-4199-baed-860a4360e3d4-config-volume" (OuterVolumeSpecName: "config-volume") pod "72205154-3d58-4199-baed-860a4360e3d4" (UID: "72205154-3d58-4199-baed-860a4360e3d4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 12:00:03 crc kubenswrapper[4794]: I0310 12:00:03.671527 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72205154-3d58-4199-baed-860a4360e3d4-kube-api-access-2wp9h" (OuterVolumeSpecName: "kube-api-access-2wp9h") pod "72205154-3d58-4199-baed-860a4360e3d4" (UID: "72205154-3d58-4199-baed-860a4360e3d4"). InnerVolumeSpecName "kube-api-access-2wp9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:00:03 crc kubenswrapper[4794]: I0310 12:00:03.671747 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72205154-3d58-4199-baed-860a4360e3d4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "72205154-3d58-4199-baed-860a4360e3d4" (UID: "72205154-3d58-4199-baed-860a4360e3d4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:00:03 crc kubenswrapper[4794]: I0310 12:00:03.768745 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72205154-3d58-4199-baed-860a4360e3d4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 12:00:03 crc kubenswrapper[4794]: I0310 12:00:03.768783 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72205154-3d58-4199-baed-860a4360e3d4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 12:00:03 crc kubenswrapper[4794]: I0310 12:00:03.768819 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wp9h\" (UniqueName: \"kubernetes.io/projected/72205154-3d58-4199-baed-860a4360e3d4-kube-api-access-2wp9h\") on node \"crc\" DevicePath \"\"" Mar 10 12:00:04 crc kubenswrapper[4794]: I0310 12:00:04.233940 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" event={"ID":"72205154-3d58-4199-baed-860a4360e3d4","Type":"ContainerDied","Data":"0da536b4ceb025c3dc11296318d8d10ca3141118374eb214609a29a2055da4fa"} Mar 10 12:00:04 crc kubenswrapper[4794]: I0310 12:00:04.233985 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0da536b4ceb025c3dc11296318d8d10ca3141118374eb214609a29a2055da4fa" Mar 10 12:00:04 crc kubenswrapper[4794]: I0310 12:00:04.234038 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552400-8nnhg" Mar 10 12:00:04 crc kubenswrapper[4794]: I0310 12:00:04.692506 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds"] Mar 10 12:00:04 crc kubenswrapper[4794]: I0310 12:00:04.705187 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552355-4jxds"] Mar 10 12:00:06 crc kubenswrapper[4794]: I0310 12:00:06.012385 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52" path="/var/lib/kubelet/pods/a8fdc89e-e1d3-4819-ab36-a9a1ff3c2a52/volumes" Mar 10 12:00:08 crc kubenswrapper[4794]: I0310 12:00:08.292829 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552400-g5wjv" event={"ID":"fe2b2b20-7e44-4fce-af37-93e674cb3736","Type":"ContainerStarted","Data":"aacd8e156b8ae251684da100d85a625a54beea2b7894e12565aa846a7e063ddd"} Mar 10 12:00:08 crc kubenswrapper[4794]: I0310 12:00:08.317495 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552400-g5wjv" podStartSLOduration=1.345818556 podStartE2EDuration="8.317458976s" podCreationTimestamp="2026-03-10 12:00:00 +0000 UTC" firstStartedPulling="2026-03-10 12:00:01.024602593 +0000 UTC m=+8149.780773411" lastFinishedPulling="2026-03-10 12:00:07.996243013 +0000 UTC m=+8156.752413831" observedRunningTime="2026-03-10 12:00:08.314963798 +0000 UTC m=+8157.071134616" watchObservedRunningTime="2026-03-10 12:00:08.317458976 +0000 UTC m=+8157.073629794" Mar 10 12:00:09 crc kubenswrapper[4794]: I0310 12:00:09.309297 4794 generic.go:334] "Generic (PLEG): container finished" podID="fe2b2b20-7e44-4fce-af37-93e674cb3736" containerID="aacd8e156b8ae251684da100d85a625a54beea2b7894e12565aa846a7e063ddd" exitCode=0 Mar 10 12:00:09 crc kubenswrapper[4794]: I0310 
12:00:09.309640 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552400-g5wjv" event={"ID":"fe2b2b20-7e44-4fce-af37-93e674cb3736","Type":"ContainerDied","Data":"aacd8e156b8ae251684da100d85a625a54beea2b7894e12565aa846a7e063ddd"} Mar 10 12:00:10 crc kubenswrapper[4794]: I0310 12:00:10.697888 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552400-g5wjv" Mar 10 12:00:10 crc kubenswrapper[4794]: I0310 12:00:10.827383 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6fmn\" (UniqueName: \"kubernetes.io/projected/fe2b2b20-7e44-4fce-af37-93e674cb3736-kube-api-access-l6fmn\") pod \"fe2b2b20-7e44-4fce-af37-93e674cb3736\" (UID: \"fe2b2b20-7e44-4fce-af37-93e674cb3736\") " Mar 10 12:00:10 crc kubenswrapper[4794]: I0310 12:00:10.833543 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe2b2b20-7e44-4fce-af37-93e674cb3736-kube-api-access-l6fmn" (OuterVolumeSpecName: "kube-api-access-l6fmn") pod "fe2b2b20-7e44-4fce-af37-93e674cb3736" (UID: "fe2b2b20-7e44-4fce-af37-93e674cb3736"). InnerVolumeSpecName "kube-api-access-l6fmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:00:10 crc kubenswrapper[4794]: I0310 12:00:10.930284 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6fmn\" (UniqueName: \"kubernetes.io/projected/fe2b2b20-7e44-4fce-af37-93e674cb3736-kube-api-access-l6fmn\") on node \"crc\" DevicePath \"\"" Mar 10 12:00:11 crc kubenswrapper[4794]: I0310 12:00:11.336289 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552400-g5wjv" event={"ID":"fe2b2b20-7e44-4fce-af37-93e674cb3736","Type":"ContainerDied","Data":"285722a74bcacf3022a84fda1d7e540a156117e779faa756b67f834165daa53c"} Mar 10 12:00:11 crc kubenswrapper[4794]: I0310 12:00:11.336384 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="285722a74bcacf3022a84fda1d7e540a156117e779faa756b67f834165daa53c" Mar 10 12:00:11 crc kubenswrapper[4794]: I0310 12:00:11.336462 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552400-g5wjv" Mar 10 12:00:11 crc kubenswrapper[4794]: I0310 12:00:11.383468 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552394-r9wlb"] Mar 10 12:00:11 crc kubenswrapper[4794]: I0310 12:00:11.392243 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552394-r9wlb"] Mar 10 12:00:12 crc kubenswrapper[4794]: I0310 12:00:12.020581 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f0fee6c-1902-4b26-b111-e4b439ed3811" path="/var/lib/kubelet/pods/0f0fee6c-1902-4b26-b111-e4b439ed3811/volumes" Mar 10 12:00:52 crc kubenswrapper[4794]: I0310 12:00:52.967831 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:00:52 crc kubenswrapper[4794]: I0310 12:00:52.968642 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:00:53 crc kubenswrapper[4794]: I0310 12:00:53.899239 4794 scope.go:117] "RemoveContainer" containerID="a73e125e24c2ece7751723bc02b5eccd4fb36d90b364b58456d0822584d819a5" Mar 10 12:00:53 crc kubenswrapper[4794]: I0310 12:00:53.949929 4794 scope.go:117] "RemoveContainer" containerID="440ab1bba25534ba1b40e651a5c6c53cc936bc33df805415e947df3381bdfd0e" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.148680 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29552401-pfl4r"] Mar 10 12:01:00 crc kubenswrapper[4794]: E0310 12:01:00.149971 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe2b2b20-7e44-4fce-af37-93e674cb3736" containerName="oc" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.150033 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe2b2b20-7e44-4fce-af37-93e674cb3736" containerName="oc" Mar 10 12:01:00 crc kubenswrapper[4794]: E0310 12:01:00.150075 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72205154-3d58-4199-baed-860a4360e3d4" containerName="collect-profiles" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.150084 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="72205154-3d58-4199-baed-860a4360e3d4" containerName="collect-profiles" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.150382 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="72205154-3d58-4199-baed-860a4360e3d4" containerName="collect-profiles" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.150419 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe2b2b20-7e44-4fce-af37-93e674cb3736" containerName="oc" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.151428 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.161927 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29552401-pfl4r"] Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.245947 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-fernet-keys\") pod \"keystone-cron-29552401-pfl4r\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.246248 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gwpx\" (UniqueName: \"kubernetes.io/projected/477b563e-62bf-4c38-9004-f3a46f574174-kube-api-access-7gwpx\") pod \"keystone-cron-29552401-pfl4r\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.246413 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-config-data\") pod \"keystone-cron-29552401-pfl4r\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.246487 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-combined-ca-bundle\") pod \"keystone-cron-29552401-pfl4r\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.347615 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-combined-ca-bundle\") pod \"keystone-cron-29552401-pfl4r\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.347741 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-fernet-keys\") pod \"keystone-cron-29552401-pfl4r\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.347816 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gwpx\" (UniqueName: \"kubernetes.io/projected/477b563e-62bf-4c38-9004-f3a46f574174-kube-api-access-7gwpx\") pod \"keystone-cron-29552401-pfl4r\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.347967 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-config-data\") pod \"keystone-cron-29552401-pfl4r\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.353431 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-combined-ca-bundle\") pod \"keystone-cron-29552401-pfl4r\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.353477 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-fernet-keys\") pod \"keystone-cron-29552401-pfl4r\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.354277 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-config-data\") pod \"keystone-cron-29552401-pfl4r\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.364832 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gwpx\" (UniqueName: \"kubernetes.io/projected/477b563e-62bf-4c38-9004-f3a46f574174-kube-api-access-7gwpx\") pod \"keystone-cron-29552401-pfl4r\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.471909 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:00 crc kubenswrapper[4794]: I0310 12:01:00.931908 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29552401-pfl4r"] Mar 10 12:01:01 crc kubenswrapper[4794]: I0310 12:01:01.872903 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552401-pfl4r" event={"ID":"477b563e-62bf-4c38-9004-f3a46f574174","Type":"ContainerStarted","Data":"1fffc5ab6aad6a6d6928fb3c5bdea79c8230a516287fdfafee5ee0f0723ce92e"} Mar 10 12:01:01 crc kubenswrapper[4794]: I0310 12:01:01.873780 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552401-pfl4r" event={"ID":"477b563e-62bf-4c38-9004-f3a46f574174","Type":"ContainerStarted","Data":"38a16d7ee3c05455dec8977c573324da9d3561bef926067d8d6a26723dd66f9f"} Mar 10 12:01:01 crc kubenswrapper[4794]: I0310 12:01:01.900238 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29552401-pfl4r" podStartSLOduration=1.900216861 podStartE2EDuration="1.900216861s" podCreationTimestamp="2026-03-10 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 12:01:01.88986793 +0000 UTC m=+8210.646038758" watchObservedRunningTime="2026-03-10 12:01:01.900216861 +0000 UTC m=+8210.656387679" Mar 10 12:01:04 crc kubenswrapper[4794]: I0310 12:01:04.901447 4794 generic.go:334] "Generic (PLEG): container finished" podID="477b563e-62bf-4c38-9004-f3a46f574174" containerID="1fffc5ab6aad6a6d6928fb3c5bdea79c8230a516287fdfafee5ee0f0723ce92e" exitCode=0 Mar 10 12:01:04 crc kubenswrapper[4794]: I0310 12:01:04.901565 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552401-pfl4r" event={"ID":"477b563e-62bf-4c38-9004-f3a46f574174","Type":"ContainerDied","Data":"1fffc5ab6aad6a6d6928fb3c5bdea79c8230a516287fdfafee5ee0f0723ce92e"} Mar 10 12:01:06 crc kubenswrapper[4794]: 
I0310 12:01:06.257062 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.378063 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gwpx\" (UniqueName: \"kubernetes.io/projected/477b563e-62bf-4c38-9004-f3a46f574174-kube-api-access-7gwpx\") pod \"477b563e-62bf-4c38-9004-f3a46f574174\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.378113 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-config-data\") pod \"477b563e-62bf-4c38-9004-f3a46f574174\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.378260 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-fernet-keys\") pod \"477b563e-62bf-4c38-9004-f3a46f574174\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.378439 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-combined-ca-bundle\") pod \"477b563e-62bf-4c38-9004-f3a46f574174\" (UID: \"477b563e-62bf-4c38-9004-f3a46f574174\") " Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.383991 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "477b563e-62bf-4c38-9004-f3a46f574174" (UID: "477b563e-62bf-4c38-9004-f3a46f574174"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.385370 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/477b563e-62bf-4c38-9004-f3a46f574174-kube-api-access-7gwpx" (OuterVolumeSpecName: "kube-api-access-7gwpx") pod "477b563e-62bf-4c38-9004-f3a46f574174" (UID: "477b563e-62bf-4c38-9004-f3a46f574174"). InnerVolumeSpecName "kube-api-access-7gwpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.407616 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "477b563e-62bf-4c38-9004-f3a46f574174" (UID: "477b563e-62bf-4c38-9004-f3a46f574174"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.433937 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-config-data" (OuterVolumeSpecName: "config-data") pod "477b563e-62bf-4c38-9004-f3a46f574174" (UID: "477b563e-62bf-4c38-9004-f3a46f574174"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.481627 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.481672 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.481684 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gwpx\" (UniqueName: \"kubernetes.io/projected/477b563e-62bf-4c38-9004-f3a46f574174-kube-api-access-7gwpx\") on node \"crc\" DevicePath \"\"" Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.481693 4794 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/477b563e-62bf-4c38-9004-f3a46f574174-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.924001 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552401-pfl4r" event={"ID":"477b563e-62bf-4c38-9004-f3a46f574174","Type":"ContainerDied","Data":"38a16d7ee3c05455dec8977c573324da9d3561bef926067d8d6a26723dd66f9f"} Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.924062 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38a16d7ee3c05455dec8977c573324da9d3561bef926067d8d6a26723dd66f9f" Mar 10 12:01:06 crc kubenswrapper[4794]: I0310 12:01:06.924139 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29552401-pfl4r" Mar 10 12:01:22 crc kubenswrapper[4794]: I0310 12:01:22.967946 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:01:22 crc kubenswrapper[4794]: I0310 12:01:22.968594 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:01:52 crc kubenswrapper[4794]: I0310 12:01:52.967706 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:01:52 crc kubenswrapper[4794]: I0310 12:01:52.968242 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:01:52 crc kubenswrapper[4794]: I0310 12:01:52.968291 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 
12:01:52 crc kubenswrapper[4794]: I0310 12:01:52.969105 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 12:01:52 crc kubenswrapper[4794]: I0310 12:01:52.969161 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" gracePeriod=600
Mar 10 12:01:53 crc kubenswrapper[4794]: E0310 12:01:53.105831 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:01:53 crc kubenswrapper[4794]: I0310 12:01:53.449048 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" exitCode=0
Mar 10 12:01:53 crc kubenswrapper[4794]: I0310 12:01:53.449095 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c"}
Mar 10 12:01:53 crc kubenswrapper[4794]: I0310 12:01:53.449129 4794 scope.go:117] "RemoveContainer" containerID="2492796e0f1c5f77e1917908f1160073b1132a0dffbbb2c6499afca28aa2a417"
Mar 10 12:01:53 crc kubenswrapper[4794]: I0310 12:01:53.450023 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c"
Mar 10 12:01:53 crc kubenswrapper[4794]: E0310 12:01:53.450750 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:02:00 crc kubenswrapper[4794]: I0310 12:02:00.148707 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552402-bhfxt"]
Mar 10 12:02:00 crc kubenswrapper[4794]: E0310 12:02:00.150257 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477b563e-62bf-4c38-9004-f3a46f574174" containerName="keystone-cron"
Mar 10 12:02:00 crc kubenswrapper[4794]: I0310 12:02:00.150292 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="477b563e-62bf-4c38-9004-f3a46f574174" containerName="keystone-cron"
Mar 10 12:02:00 crc kubenswrapper[4794]: I0310 12:02:00.150858 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="477b563e-62bf-4c38-9004-f3a46f574174" containerName="keystone-cron"
Mar 10 12:02:00 crc kubenswrapper[4794]: I0310 12:02:00.152230 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552402-bhfxt"
Mar 10 12:02:00 crc kubenswrapper[4794]: I0310 12:02:00.154369 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 12:02:00 crc kubenswrapper[4794]: I0310 12:02:00.156051 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc"
Mar 10 12:02:00 crc kubenswrapper[4794]: I0310 12:02:00.157463 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 12:02:00 crc kubenswrapper[4794]: I0310 12:02:00.169083 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552402-bhfxt"]
Mar 10 12:02:00 crc kubenswrapper[4794]: I0310 12:02:00.275312 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq2lj\" (UniqueName: \"kubernetes.io/projected/de9c58c4-8144-4434-9555-c3617f91a2ee-kube-api-access-mq2lj\") pod \"auto-csr-approver-29552402-bhfxt\" (UID: \"de9c58c4-8144-4434-9555-c3617f91a2ee\") " pod="openshift-infra/auto-csr-approver-29552402-bhfxt"
Mar 10 12:02:00 crc kubenswrapper[4794]: I0310 12:02:00.378017 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq2lj\" (UniqueName: \"kubernetes.io/projected/de9c58c4-8144-4434-9555-c3617f91a2ee-kube-api-access-mq2lj\") pod \"auto-csr-approver-29552402-bhfxt\" (UID: \"de9c58c4-8144-4434-9555-c3617f91a2ee\") " pod="openshift-infra/auto-csr-approver-29552402-bhfxt"
Mar 10 12:02:00 crc kubenswrapper[4794]: I0310 12:02:00.397769 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq2lj\" (UniqueName: \"kubernetes.io/projected/de9c58c4-8144-4434-9555-c3617f91a2ee-kube-api-access-mq2lj\") pod \"auto-csr-approver-29552402-bhfxt\" (UID: \"de9c58c4-8144-4434-9555-c3617f91a2ee\") " pod="openshift-infra/auto-csr-approver-29552402-bhfxt"
Mar 10 12:02:00 crc kubenswrapper[4794]: I0310 12:02:00.472405 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552402-bhfxt"
Mar 10 12:02:00 crc kubenswrapper[4794]: I0310 12:02:00.982781 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552402-bhfxt"]
Mar 10 12:02:01 crc kubenswrapper[4794]: I0310 12:02:01.546773 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552402-bhfxt" event={"ID":"de9c58c4-8144-4434-9555-c3617f91a2ee","Type":"ContainerStarted","Data":"3f3d08cef2d88dac586a97ce38b3a1c267788baf31a4de96ac7dcb32ade34bd4"}
Mar 10 12:02:03 crc kubenswrapper[4794]: I0310 12:02:03.573930 4794 generic.go:334] "Generic (PLEG): container finished" podID="de9c58c4-8144-4434-9555-c3617f91a2ee" containerID="35fbd660d1e03d32da4110a53fa0e34a10c63a4ad841b7639e2e2d6b9e5c85af" exitCode=0
Mar 10 12:02:03 crc kubenswrapper[4794]: I0310 12:02:03.573990 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552402-bhfxt" event={"ID":"de9c58c4-8144-4434-9555-c3617f91a2ee","Type":"ContainerDied","Data":"35fbd660d1e03d32da4110a53fa0e34a10c63a4ad841b7639e2e2d6b9e5c85af"}
Mar 10 12:02:04 crc kubenswrapper[4794]: I0310 12:02:04.982418 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552402-bhfxt"
Mar 10 12:02:05 crc kubenswrapper[4794]: I0310 12:02:05.082324 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq2lj\" (UniqueName: \"kubernetes.io/projected/de9c58c4-8144-4434-9555-c3617f91a2ee-kube-api-access-mq2lj\") pod \"de9c58c4-8144-4434-9555-c3617f91a2ee\" (UID: \"de9c58c4-8144-4434-9555-c3617f91a2ee\") "
Mar 10 12:02:05 crc kubenswrapper[4794]: I0310 12:02:05.090124 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9c58c4-8144-4434-9555-c3617f91a2ee-kube-api-access-mq2lj" (OuterVolumeSpecName: "kube-api-access-mq2lj") pod "de9c58c4-8144-4434-9555-c3617f91a2ee" (UID: "de9c58c4-8144-4434-9555-c3617f91a2ee"). InnerVolumeSpecName "kube-api-access-mq2lj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 12:02:05 crc kubenswrapper[4794]: I0310 12:02:05.185347 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq2lj\" (UniqueName: \"kubernetes.io/projected/de9c58c4-8144-4434-9555-c3617f91a2ee-kube-api-access-mq2lj\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:05 crc kubenswrapper[4794]: I0310 12:02:05.595997 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552402-bhfxt" event={"ID":"de9c58c4-8144-4434-9555-c3617f91a2ee","Type":"ContainerDied","Data":"3f3d08cef2d88dac586a97ce38b3a1c267788baf31a4de96ac7dcb32ade34bd4"}
Mar 10 12:02:05 crc kubenswrapper[4794]: I0310 12:02:05.596044 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f3d08cef2d88dac586a97ce38b3a1c267788baf31a4de96ac7dcb32ade34bd4"
Mar 10 12:02:05 crc kubenswrapper[4794]: I0310 12:02:05.596077 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552402-bhfxt"
Mar 10 12:02:06 crc kubenswrapper[4794]: I0310 12:02:05.999848 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c"
Mar 10 12:02:06 crc kubenswrapper[4794]: E0310 12:02:06.000562 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:02:06 crc kubenswrapper[4794]: I0310 12:02:06.081373 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552396-zrrgw"]
Mar 10 12:02:06 crc kubenswrapper[4794]: I0310 12:02:06.096965 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552396-zrrgw"]
Mar 10 12:02:08 crc kubenswrapper[4794]: I0310 12:02:08.054270 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f18c6dbb-8161-4ba5-9898-2c61a62952cd" path="/var/lib/kubelet/pods/f18c6dbb-8161-4ba5-9898-2c61a62952cd/volumes"
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.268689 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2dxl4"]
Mar 10 12:02:14 crc kubenswrapper[4794]: E0310 12:02:14.269633 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de9c58c4-8144-4434-9555-c3617f91a2ee" containerName="oc"
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.269646 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="de9c58c4-8144-4434-9555-c3617f91a2ee" containerName="oc"
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.269864 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="de9c58c4-8144-4434-9555-c3617f91a2ee" containerName="oc"
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.271302 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.284172 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2dxl4"]
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.379517 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a9d999-4ef8-4222-85ae-d2d779a19819-catalog-content\") pod \"community-operators-2dxl4\" (UID: \"e3a9d999-4ef8-4222-85ae-d2d779a19819\") " pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.379590 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47sws\" (UniqueName: \"kubernetes.io/projected/e3a9d999-4ef8-4222-85ae-d2d779a19819-kube-api-access-47sws\") pod \"community-operators-2dxl4\" (UID: \"e3a9d999-4ef8-4222-85ae-d2d779a19819\") " pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.379617 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a9d999-4ef8-4222-85ae-d2d779a19819-utilities\") pod \"community-operators-2dxl4\" (UID: \"e3a9d999-4ef8-4222-85ae-d2d779a19819\") " pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.481871 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a9d999-4ef8-4222-85ae-d2d779a19819-catalog-content\") pod \"community-operators-2dxl4\" (UID: \"e3a9d999-4ef8-4222-85ae-d2d779a19819\") " pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.482187 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47sws\" (UniqueName: \"kubernetes.io/projected/e3a9d999-4ef8-4222-85ae-d2d779a19819-kube-api-access-47sws\") pod \"community-operators-2dxl4\" (UID: \"e3a9d999-4ef8-4222-85ae-d2d779a19819\") " pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.482237 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a9d999-4ef8-4222-85ae-d2d779a19819-utilities\") pod \"community-operators-2dxl4\" (UID: \"e3a9d999-4ef8-4222-85ae-d2d779a19819\") " pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.482870 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a9d999-4ef8-4222-85ae-d2d779a19819-catalog-content\") pod \"community-operators-2dxl4\" (UID: \"e3a9d999-4ef8-4222-85ae-d2d779a19819\") " pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.483853 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a9d999-4ef8-4222-85ae-d2d779a19819-utilities\") pod \"community-operators-2dxl4\" (UID: \"e3a9d999-4ef8-4222-85ae-d2d779a19819\") " pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.503290 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47sws\" (UniqueName: \"kubernetes.io/projected/e3a9d999-4ef8-4222-85ae-d2d779a19819-kube-api-access-47sws\") pod \"community-operators-2dxl4\" (UID: \"e3a9d999-4ef8-4222-85ae-d2d779a19819\") " pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:14 crc kubenswrapper[4794]: I0310 12:02:14.592655 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:15 crc kubenswrapper[4794]: I0310 12:02:15.151118 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2dxl4"]
Mar 10 12:02:15 crc kubenswrapper[4794]: I0310 12:02:15.693276 4794 generic.go:334] "Generic (PLEG): container finished" podID="e3a9d999-4ef8-4222-85ae-d2d779a19819" containerID="ce42325d5b61bf92603783a9c024d30a4eabc507617bcecd75f172fa77991254" exitCode=0
Mar 10 12:02:15 crc kubenswrapper[4794]: I0310 12:02:15.693361 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dxl4" event={"ID":"e3a9d999-4ef8-4222-85ae-d2d779a19819","Type":"ContainerDied","Data":"ce42325d5b61bf92603783a9c024d30a4eabc507617bcecd75f172fa77991254"}
Mar 10 12:02:15 crc kubenswrapper[4794]: I0310 12:02:15.693599 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dxl4" event={"ID":"e3a9d999-4ef8-4222-85ae-d2d779a19819","Type":"ContainerStarted","Data":"77e672fa55154f27d2a68afc355d347f16fde526a0702f397a5edf7b1e3f03a8"}
Mar 10 12:02:16 crc kubenswrapper[4794]: I0310 12:02:16.730693 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dxl4" event={"ID":"e3a9d999-4ef8-4222-85ae-d2d779a19819","Type":"ContainerStarted","Data":"046ea9713cd41073a5e1845343c7b41266f0acab9ed01e006804b771d9c896bd"}
Mar 10 12:02:16 crc kubenswrapper[4794]: I0310 12:02:16.999828 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c"
Mar 10 12:02:17 crc kubenswrapper[4794]: E0310 12:02:17.000122 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.238521 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ljmft"]
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.242477 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.250819 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ljmft"]
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.372706 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lnmr\" (UniqueName: \"kubernetes.io/projected/472174b5-b716-47c5-bfe3-00c586405a20-kube-api-access-5lnmr\") pod \"certified-operators-ljmft\" (UID: \"472174b5-b716-47c5-bfe3-00c586405a20\") " pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.372914 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472174b5-b716-47c5-bfe3-00c586405a20-catalog-content\") pod \"certified-operators-ljmft\" (UID: \"472174b5-b716-47c5-bfe3-00c586405a20\") " pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.373031 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472174b5-b716-47c5-bfe3-00c586405a20-utilities\") pod \"certified-operators-ljmft\" (UID: \"472174b5-b716-47c5-bfe3-00c586405a20\") " pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.475129 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472174b5-b716-47c5-bfe3-00c586405a20-catalog-content\") pod \"certified-operators-ljmft\" (UID: \"472174b5-b716-47c5-bfe3-00c586405a20\") " pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.475231 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472174b5-b716-47c5-bfe3-00c586405a20-utilities\") pod \"certified-operators-ljmft\" (UID: \"472174b5-b716-47c5-bfe3-00c586405a20\") " pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.475377 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lnmr\" (UniqueName: \"kubernetes.io/projected/472174b5-b716-47c5-bfe3-00c586405a20-kube-api-access-5lnmr\") pod \"certified-operators-ljmft\" (UID: \"472174b5-b716-47c5-bfe3-00c586405a20\") " pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.475624 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472174b5-b716-47c5-bfe3-00c586405a20-catalog-content\") pod \"certified-operators-ljmft\" (UID: \"472174b5-b716-47c5-bfe3-00c586405a20\") " pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.475771 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472174b5-b716-47c5-bfe3-00c586405a20-utilities\") pod \"certified-operators-ljmft\" (UID: \"472174b5-b716-47c5-bfe3-00c586405a20\") " pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.499481 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lnmr\" (UniqueName: \"kubernetes.io/projected/472174b5-b716-47c5-bfe3-00c586405a20-kube-api-access-5lnmr\") pod \"certified-operators-ljmft\" (UID: \"472174b5-b716-47c5-bfe3-00c586405a20\") " pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.573254 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.760030 4794 generic.go:334] "Generic (PLEG): container finished" podID="e3a9d999-4ef8-4222-85ae-d2d779a19819" containerID="046ea9713cd41073a5e1845343c7b41266f0acab9ed01e006804b771d9c896bd" exitCode=0
Mar 10 12:02:18 crc kubenswrapper[4794]: I0310 12:02:18.760198 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dxl4" event={"ID":"e3a9d999-4ef8-4222-85ae-d2d779a19819","Type":"ContainerDied","Data":"046ea9713cd41073a5e1845343c7b41266f0acab9ed01e006804b771d9c896bd"}
Mar 10 12:02:19 crc kubenswrapper[4794]: I0310 12:02:19.109255 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ljmft"]
Mar 10 12:02:19 crc kubenswrapper[4794]: I0310 12:02:19.775669 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dxl4" event={"ID":"e3a9d999-4ef8-4222-85ae-d2d779a19819","Type":"ContainerStarted","Data":"ea835ca0ad5793c444ddca044e09afe439358f0ec0e35145469c2101a94d00b6"}
Mar 10 12:02:19 crc kubenswrapper[4794]: I0310 12:02:19.780803 4794 generic.go:334] "Generic (PLEG): container finished" podID="472174b5-b716-47c5-bfe3-00c586405a20" containerID="29ef7c75adfb80dc9977fd86e645a713660d678628a29945893c30ee30a5425d" exitCode=0
Mar 10 12:02:19 crc kubenswrapper[4794]: I0310 12:02:19.780927 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljmft" event={"ID":"472174b5-b716-47c5-bfe3-00c586405a20","Type":"ContainerDied","Data":"29ef7c75adfb80dc9977fd86e645a713660d678628a29945893c30ee30a5425d"}
Mar 10 12:02:19 crc kubenswrapper[4794]: I0310 12:02:19.781888 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljmft" event={"ID":"472174b5-b716-47c5-bfe3-00c586405a20","Type":"ContainerStarted","Data":"96a0123417eeefff99cbd3c2f3e055f5a292baf82f87da115b050d91051a7d70"}
Mar 10 12:02:19 crc kubenswrapper[4794]: I0310 12:02:19.797129 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2dxl4" podStartSLOduration=2.29312332 podStartE2EDuration="5.797092138s" podCreationTimestamp="2026-03-10 12:02:14 +0000 UTC" firstStartedPulling="2026-03-10 12:02:15.695455406 +0000 UTC m=+8284.451626224" lastFinishedPulling="2026-03-10 12:02:19.199424224 +0000 UTC m=+8287.955595042" observedRunningTime="2026-03-10 12:02:19.793713224 +0000 UTC m=+8288.549884062" watchObservedRunningTime="2026-03-10 12:02:19.797092138 +0000 UTC m=+8288.553262956"
Mar 10 12:02:20 crc kubenswrapper[4794]: I0310 12:02:20.799730 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljmft" event={"ID":"472174b5-b716-47c5-bfe3-00c586405a20","Type":"ContainerStarted","Data":"b043c287008eeeeefe59b5724abd965f95bfa7119b78d0e48672b59f18a03b03"}
Mar 10 12:02:22 crc kubenswrapper[4794]: I0310 12:02:22.822940 4794 generic.go:334] "Generic (PLEG): container finished" podID="472174b5-b716-47c5-bfe3-00c586405a20" containerID="b043c287008eeeeefe59b5724abd965f95bfa7119b78d0e48672b59f18a03b03" exitCode=0
Mar 10 12:02:22 crc kubenswrapper[4794]: I0310 12:02:22.823040 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljmft" event={"ID":"472174b5-b716-47c5-bfe3-00c586405a20","Type":"ContainerDied","Data":"b043c287008eeeeefe59b5724abd965f95bfa7119b78d0e48672b59f18a03b03"}
Mar 10 12:02:23 crc kubenswrapper[4794]: I0310 12:02:23.837775 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljmft" event={"ID":"472174b5-b716-47c5-bfe3-00c586405a20","Type":"ContainerStarted","Data":"42719808377a44e4c83dee4e2ed629b44ccdf5b8ba68cef5ba760a7dd4cdd02e"}
Mar 10 12:02:23 crc kubenswrapper[4794]: I0310 12:02:23.866830 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ljmft" podStartSLOduration=2.310271729 podStartE2EDuration="5.86680991s" podCreationTimestamp="2026-03-10 12:02:18 +0000 UTC" firstStartedPulling="2026-03-10 12:02:19.784099305 +0000 UTC m=+8288.540270123" lastFinishedPulling="2026-03-10 12:02:23.340637486 +0000 UTC m=+8292.096808304" observedRunningTime="2026-03-10 12:02:23.857449569 +0000 UTC m=+8292.613620377" watchObservedRunningTime="2026-03-10 12:02:23.86680991 +0000 UTC m=+8292.622980728"
Mar 10 12:02:24 crc kubenswrapper[4794]: I0310 12:02:24.593584 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:24 crc kubenswrapper[4794]: I0310 12:02:24.593931 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:24 crc kubenswrapper[4794]: I0310 12:02:24.645858 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:24 crc kubenswrapper[4794]: I0310 12:02:24.896521 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:26 crc kubenswrapper[4794]: I0310 12:02:26.832770 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2dxl4"]
Mar 10 12:02:26 crc kubenswrapper[4794]: I0310 12:02:26.864586 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2dxl4" podUID="e3a9d999-4ef8-4222-85ae-d2d779a19819" containerName="registry-server" containerID="cri-o://ea835ca0ad5793c444ddca044e09afe439358f0ec0e35145469c2101a94d00b6" gracePeriod=2
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.373102 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.467323 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47sws\" (UniqueName: \"kubernetes.io/projected/e3a9d999-4ef8-4222-85ae-d2d779a19819-kube-api-access-47sws\") pod \"e3a9d999-4ef8-4222-85ae-d2d779a19819\" (UID: \"e3a9d999-4ef8-4222-85ae-d2d779a19819\") "
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.467410 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a9d999-4ef8-4222-85ae-d2d779a19819-utilities\") pod \"e3a9d999-4ef8-4222-85ae-d2d779a19819\" (UID: \"e3a9d999-4ef8-4222-85ae-d2d779a19819\") "
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.467674 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a9d999-4ef8-4222-85ae-d2d779a19819-catalog-content\") pod \"e3a9d999-4ef8-4222-85ae-d2d779a19819\" (UID: \"e3a9d999-4ef8-4222-85ae-d2d779a19819\") "
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.468378 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3a9d999-4ef8-4222-85ae-d2d779a19819-utilities" (OuterVolumeSpecName: "utilities") pod "e3a9d999-4ef8-4222-85ae-d2d779a19819" (UID: "e3a9d999-4ef8-4222-85ae-d2d779a19819"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.476110 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a9d999-4ef8-4222-85ae-d2d779a19819-kube-api-access-47sws" (OuterVolumeSpecName: "kube-api-access-47sws") pod "e3a9d999-4ef8-4222-85ae-d2d779a19819" (UID: "e3a9d999-4ef8-4222-85ae-d2d779a19819"). InnerVolumeSpecName "kube-api-access-47sws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.522682 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3a9d999-4ef8-4222-85ae-d2d779a19819-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3a9d999-4ef8-4222-85ae-d2d779a19819" (UID: "e3a9d999-4ef8-4222-85ae-d2d779a19819"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.569671 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a9d999-4ef8-4222-85ae-d2d779a19819-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.569708 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47sws\" (UniqueName: \"kubernetes.io/projected/e3a9d999-4ef8-4222-85ae-d2d779a19819-kube-api-access-47sws\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.569717 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a9d999-4ef8-4222-85ae-d2d779a19819-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.879588 4794 generic.go:334] "Generic (PLEG): container finished" podID="e3a9d999-4ef8-4222-85ae-d2d779a19819" containerID="ea835ca0ad5793c444ddca044e09afe439358f0ec0e35145469c2101a94d00b6" exitCode=0
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.879633 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dxl4" event={"ID":"e3a9d999-4ef8-4222-85ae-d2d779a19819","Type":"ContainerDied","Data":"ea835ca0ad5793c444ddca044e09afe439358f0ec0e35145469c2101a94d00b6"}
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.879641 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2dxl4"
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.879661 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dxl4" event={"ID":"e3a9d999-4ef8-4222-85ae-d2d779a19819","Type":"ContainerDied","Data":"77e672fa55154f27d2a68afc355d347f16fde526a0702f397a5edf7b1e3f03a8"}
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.879678 4794 scope.go:117] "RemoveContainer" containerID="ea835ca0ad5793c444ddca044e09afe439358f0ec0e35145469c2101a94d00b6"
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.918589 4794 scope.go:117] "RemoveContainer" containerID="046ea9713cd41073a5e1845343c7b41266f0acab9ed01e006804b771d9c896bd"
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.927253 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2dxl4"]
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.938660 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2dxl4"]
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.947683 4794 scope.go:117] "RemoveContainer" containerID="ce42325d5b61bf92603783a9c024d30a4eabc507617bcecd75f172fa77991254"
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.993961 4794 scope.go:117] "RemoveContainer" containerID="ea835ca0ad5793c444ddca044e09afe439358f0ec0e35145469c2101a94d00b6"
Mar 10 12:02:27 crc kubenswrapper[4794]: E0310 12:02:27.994387 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea835ca0ad5793c444ddca044e09afe439358f0ec0e35145469c2101a94d00b6\": container with ID starting with ea835ca0ad5793c444ddca044e09afe439358f0ec0e35145469c2101a94d00b6 not found: ID does not exist" containerID="ea835ca0ad5793c444ddca044e09afe439358f0ec0e35145469c2101a94d00b6"
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.994453 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea835ca0ad5793c444ddca044e09afe439358f0ec0e35145469c2101a94d00b6"} err="failed to get container status \"ea835ca0ad5793c444ddca044e09afe439358f0ec0e35145469c2101a94d00b6\": rpc error: code = NotFound desc = could not find container \"ea835ca0ad5793c444ddca044e09afe439358f0ec0e35145469c2101a94d00b6\": container with ID starting with ea835ca0ad5793c444ddca044e09afe439358f0ec0e35145469c2101a94d00b6 not found: ID does not exist"
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.994491 4794 scope.go:117] "RemoveContainer" containerID="046ea9713cd41073a5e1845343c7b41266f0acab9ed01e006804b771d9c896bd"
Mar 10 12:02:27 crc kubenswrapper[4794]: E0310 12:02:27.994936 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"046ea9713cd41073a5e1845343c7b41266f0acab9ed01e006804b771d9c896bd\": container with ID starting with 046ea9713cd41073a5e1845343c7b41266f0acab9ed01e006804b771d9c896bd not found: ID does not exist" containerID="046ea9713cd41073a5e1845343c7b41266f0acab9ed01e006804b771d9c896bd"
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.994963 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"046ea9713cd41073a5e1845343c7b41266f0acab9ed01e006804b771d9c896bd"} err="failed to get container status \"046ea9713cd41073a5e1845343c7b41266f0acab9ed01e006804b771d9c896bd\": rpc error: code = NotFound desc = could not find container \"046ea9713cd41073a5e1845343c7b41266f0acab9ed01e006804b771d9c896bd\": container with ID starting with 046ea9713cd41073a5e1845343c7b41266f0acab9ed01e006804b771d9c896bd not found: ID does not exist"
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.994980 4794 scope.go:117] "RemoveContainer" containerID="ce42325d5b61bf92603783a9c024d30a4eabc507617bcecd75f172fa77991254"
Mar 10 12:02:27 crc kubenswrapper[4794]: E0310 12:02:27.995296 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce42325d5b61bf92603783a9c024d30a4eabc507617bcecd75f172fa77991254\": container with ID starting with ce42325d5b61bf92603783a9c024d30a4eabc507617bcecd75f172fa77991254 not found: ID does not exist" containerID="ce42325d5b61bf92603783a9c024d30a4eabc507617bcecd75f172fa77991254"
Mar 10 12:02:27 crc kubenswrapper[4794]: I0310 12:02:27.995351 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce42325d5b61bf92603783a9c024d30a4eabc507617bcecd75f172fa77991254"} err="failed to get container status \"ce42325d5b61bf92603783a9c024d30a4eabc507617bcecd75f172fa77991254\": rpc error: code = NotFound desc = could not find container \"ce42325d5b61bf92603783a9c024d30a4eabc507617bcecd75f172fa77991254\": container with ID starting with ce42325d5b61bf92603783a9c024d30a4eabc507617bcecd75f172fa77991254 not found: ID does not exist"
Mar 10 12:02:28 crc kubenswrapper[4794]: I0310 12:02:28.011921 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a9d999-4ef8-4222-85ae-d2d779a19819" path="/var/lib/kubelet/pods/e3a9d999-4ef8-4222-85ae-d2d779a19819/volumes"
Mar 10 12:02:28 crc kubenswrapper[4794]: I0310 12:02:28.573738 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:28 crc kubenswrapper[4794]: I0310 12:02:28.574109 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:28 crc kubenswrapper[4794]: I0310 12:02:28.649555 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:28 crc kubenswrapper[4794]: I0310 12:02:28.952289 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:30 crc kubenswrapper[4794]: I0310 12:02:30.000966 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c"
Mar 10 12:02:30 crc kubenswrapper[4794]: E0310 12:02:30.002103 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.035251 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ljmft"]
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.035593 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ljmft" podUID="472174b5-b716-47c5-bfe3-00c586405a20" containerName="registry-server" containerID="cri-o://42719808377a44e4c83dee4e2ed629b44ccdf5b8ba68cef5ba760a7dd4cdd02e" gracePeriod=2
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.558101 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.667068 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lnmr\" (UniqueName: \"kubernetes.io/projected/472174b5-b716-47c5-bfe3-00c586405a20-kube-api-access-5lnmr\") pod \"472174b5-b716-47c5-bfe3-00c586405a20\" (UID: \"472174b5-b716-47c5-bfe3-00c586405a20\") "
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.667150 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472174b5-b716-47c5-bfe3-00c586405a20-utilities\") pod \"472174b5-b716-47c5-bfe3-00c586405a20\" (UID: \"472174b5-b716-47c5-bfe3-00c586405a20\") "
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.667173 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472174b5-b716-47c5-bfe3-00c586405a20-catalog-content\") pod \"472174b5-b716-47c5-bfe3-00c586405a20\" (UID: \"472174b5-b716-47c5-bfe3-00c586405a20\") "
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.668194 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/472174b5-b716-47c5-bfe3-00c586405a20-utilities" (OuterVolumeSpecName: "utilities") pod "472174b5-b716-47c5-bfe3-00c586405a20" (UID: "472174b5-b716-47c5-bfe3-00c586405a20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.673299 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472174b5-b716-47c5-bfe3-00c586405a20-kube-api-access-5lnmr" (OuterVolumeSpecName: "kube-api-access-5lnmr") pod "472174b5-b716-47c5-bfe3-00c586405a20" (UID: "472174b5-b716-47c5-bfe3-00c586405a20"). InnerVolumeSpecName "kube-api-access-5lnmr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.729522 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/472174b5-b716-47c5-bfe3-00c586405a20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "472174b5-b716-47c5-bfe3-00c586405a20" (UID: "472174b5-b716-47c5-bfe3-00c586405a20"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.769481 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lnmr\" (UniqueName: \"kubernetes.io/projected/472174b5-b716-47c5-bfe3-00c586405a20-kube-api-access-5lnmr\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.769518 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472174b5-b716-47c5-bfe3-00c586405a20-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.769533 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472174b5-b716-47c5-bfe3-00c586405a20-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.923264 4794 generic.go:334] "Generic (PLEG): container finished" podID="472174b5-b716-47c5-bfe3-00c586405a20" containerID="42719808377a44e4c83dee4e2ed629b44ccdf5b8ba68cef5ba760a7dd4cdd02e" exitCode=0
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.923362 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljmft" event={"ID":"472174b5-b716-47c5-bfe3-00c586405a20","Type":"ContainerDied","Data":"42719808377a44e4c83dee4e2ed629b44ccdf5b8ba68cef5ba760a7dd4cdd02e"}
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.923462 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljmft" event={"ID":"472174b5-b716-47c5-bfe3-00c586405a20","Type":"ContainerDied","Data":"96a0123417eeefff99cbd3c2f3e055f5a292baf82f87da115b050d91051a7d70"}
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.923487 4794 scope.go:117] "RemoveContainer" containerID="42719808377a44e4c83dee4e2ed629b44ccdf5b8ba68cef5ba760a7dd4cdd02e"
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.923955 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljmft"
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.973954 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ljmft"]
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.974594 4794 scope.go:117] "RemoveContainer" containerID="b043c287008eeeeefe59b5724abd965f95bfa7119b78d0e48672b59f18a03b03"
Mar 10 12:02:31 crc kubenswrapper[4794]: I0310 12:02:31.992756 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ljmft"]
Mar 10 12:02:32 crc kubenswrapper[4794]: I0310 12:02:32.009706 4794 scope.go:117] "RemoveContainer" containerID="29ef7c75adfb80dc9977fd86e645a713660d678628a29945893c30ee30a5425d"
Mar 10 12:02:32 crc kubenswrapper[4794]: I0310 12:02:32.038729 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="472174b5-b716-47c5-bfe3-00c586405a20" path="/var/lib/kubelet/pods/472174b5-b716-47c5-bfe3-00c586405a20/volumes"
Mar 10 12:02:32 crc kubenswrapper[4794]: I0310 12:02:32.083814 4794 scope.go:117] "RemoveContainer" containerID="42719808377a44e4c83dee4e2ed629b44ccdf5b8ba68cef5ba760a7dd4cdd02e"
Mar 10 12:02:32 crc kubenswrapper[4794]: E0310 12:02:32.088528 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42719808377a44e4c83dee4e2ed629b44ccdf5b8ba68cef5ba760a7dd4cdd02e\": container with ID starting with 42719808377a44e4c83dee4e2ed629b44ccdf5b8ba68cef5ba760a7dd4cdd02e not found: ID does not exist" containerID="42719808377a44e4c83dee4e2ed629b44ccdf5b8ba68cef5ba760a7dd4cdd02e"
Mar 10 12:02:32 crc kubenswrapper[4794]: I0310 12:02:32.088581 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42719808377a44e4c83dee4e2ed629b44ccdf5b8ba68cef5ba760a7dd4cdd02e"} err="failed to get container status \"42719808377a44e4c83dee4e2ed629b44ccdf5b8ba68cef5ba760a7dd4cdd02e\": rpc error: code = NotFound desc = could not find container \"42719808377a44e4c83dee4e2ed629b44ccdf5b8ba68cef5ba760a7dd4cdd02e\": container with ID starting with 42719808377a44e4c83dee4e2ed629b44ccdf5b8ba68cef5ba760a7dd4cdd02e not found: ID does not exist"
Mar 10 12:02:32 crc kubenswrapper[4794]: I0310 12:02:32.088611 4794 scope.go:117] "RemoveContainer" containerID="b043c287008eeeeefe59b5724abd965f95bfa7119b78d0e48672b59f18a03b03"
Mar 10 12:02:32 crc kubenswrapper[4794]: E0310 12:02:32.090969 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b043c287008eeeeefe59b5724abd965f95bfa7119b78d0e48672b59f18a03b03\": container with ID starting with b043c287008eeeeefe59b5724abd965f95bfa7119b78d0e48672b59f18a03b03 not found: ID does not exist" containerID="b043c287008eeeeefe59b5724abd965f95bfa7119b78d0e48672b59f18a03b03"
Mar 10 12:02:32 crc kubenswrapper[4794]: I0310 12:02:32.091005 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b043c287008eeeeefe59b5724abd965f95bfa7119b78d0e48672b59f18a03b03"} err="failed to get container status \"b043c287008eeeeefe59b5724abd965f95bfa7119b78d0e48672b59f18a03b03\": rpc error: code = NotFound desc = could not find container \"b043c287008eeeeefe59b5724abd965f95bfa7119b78d0e48672b59f18a03b03\": container with ID starting with b043c287008eeeeefe59b5724abd965f95bfa7119b78d0e48672b59f18a03b03 not found: ID does not exist"
Mar 10 12:02:32 crc kubenswrapper[4794]: I0310 12:02:32.091023 4794 scope.go:117] "RemoveContainer" containerID="29ef7c75adfb80dc9977fd86e645a713660d678628a29945893c30ee30a5425d"
Mar 10 12:02:32 crc kubenswrapper[4794]: E0310 12:02:32.094644 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29ef7c75adfb80dc9977fd86e645a713660d678628a29945893c30ee30a5425d\": container with ID starting with 29ef7c75adfb80dc9977fd86e645a713660d678628a29945893c30ee30a5425d not found: ID does not exist" containerID="29ef7c75adfb80dc9977fd86e645a713660d678628a29945893c30ee30a5425d"
Mar 10 12:02:32 crc kubenswrapper[4794]: I0310 12:02:32.094680 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29ef7c75adfb80dc9977fd86e645a713660d678628a29945893c30ee30a5425d"} err="failed to get container status \"29ef7c75adfb80dc9977fd86e645a713660d678628a29945893c30ee30a5425d\": rpc error: code = NotFound desc = could not find container \"29ef7c75adfb80dc9977fd86e645a713660d678628a29945893c30ee30a5425d\": container with ID starting with 29ef7c75adfb80dc9977fd86e645a713660d678628a29945893c30ee30a5425d not found: ID does not exist"
Mar 10 12:02:42 crc kubenswrapper[4794]: I0310 12:02:42.026825 4794 generic.go:334] "Generic (PLEG): container finished" podID="0dbc468e-bf92-4cfa-81f1-a660334c4fd5" containerID="2af3a11344f9c1779996fc8ea0275e685f24a7a95ec6fc505e9e62571c3df478" exitCode=0
Mar 10 12:02:42 crc kubenswrapper[4794]: I0310 12:02:42.026944 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" event={"ID":"0dbc468e-bf92-4cfa-81f1-a660334c4fd5","Type":"ContainerDied","Data":"2af3a11344f9c1779996fc8ea0275e685f24a7a95ec6fc505e9e62571c3df478"}
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.474599 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4"
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.624700 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-ceph\") pod \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") "
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.624856 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-combined-ca-bundle\") pod \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") "
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.624893 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jg5k\" (UniqueName: \"kubernetes.io/projected/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-kube-api-access-6jg5k\") pod \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") "
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.624932 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-3\") pod \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") "
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.624975 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-2\") pod \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") "
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.625011 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-migration-ssh-key-1\") pod \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") "
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.625076 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-inventory\") pod \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") "
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.625106 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cells-global-config-0\") pod \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") "
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.625144 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cells-global-config-1\") pod \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") "
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.625201 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-ssh-key-openstack-cell1\") pod \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") "
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.625233 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-0\") pod \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") "
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.625287 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-migration-ssh-key-0\") pod \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") "
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.625353 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-1\") pod \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\" (UID: \"0dbc468e-bf92-4cfa-81f1-a660334c4fd5\") "
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.631940 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "0dbc468e-bf92-4cfa-81f1-a660334c4fd5" (UID: "0dbc468e-bf92-4cfa-81f1-a660334c4fd5"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.642883 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-ceph" (OuterVolumeSpecName: "ceph") pod "0dbc468e-bf92-4cfa-81f1-a660334c4fd5" (UID: "0dbc468e-bf92-4cfa-81f1-a660334c4fd5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.645233 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-kube-api-access-6jg5k" (OuterVolumeSpecName: "kube-api-access-6jg5k") pod "0dbc468e-bf92-4cfa-81f1-a660334c4fd5" (UID: "0dbc468e-bf92-4cfa-81f1-a660334c4fd5"). InnerVolumeSpecName "kube-api-access-6jg5k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.659403 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "0dbc468e-bf92-4cfa-81f1-a660334c4fd5" (UID: "0dbc468e-bf92-4cfa-81f1-a660334c4fd5"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.661073 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "0dbc468e-bf92-4cfa-81f1-a660334c4fd5" (UID: "0dbc468e-bf92-4cfa-81f1-a660334c4fd5"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.664095 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "0dbc468e-bf92-4cfa-81f1-a660334c4fd5" (UID: "0dbc468e-bf92-4cfa-81f1-a660334c4fd5"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.678156 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0dbc468e-bf92-4cfa-81f1-a660334c4fd5" (UID: "0dbc468e-bf92-4cfa-81f1-a660334c4fd5"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.679949 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-inventory" (OuterVolumeSpecName: "inventory") pod "0dbc468e-bf92-4cfa-81f1-a660334c4fd5" (UID: "0dbc468e-bf92-4cfa-81f1-a660334c4fd5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.682259 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "0dbc468e-bf92-4cfa-81f1-a660334c4fd5" (UID: "0dbc468e-bf92-4cfa-81f1-a660334c4fd5"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.682976 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "0dbc468e-bf92-4cfa-81f1-a660334c4fd5" (UID: "0dbc468e-bf92-4cfa-81f1-a660334c4fd5"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.686385 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "0dbc468e-bf92-4cfa-81f1-a660334c4fd5" (UID: "0dbc468e-bf92-4cfa-81f1-a660334c4fd5"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.689849 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "0dbc468e-bf92-4cfa-81f1-a660334c4fd5" (UID: "0dbc468e-bf92-4cfa-81f1-a660334c4fd5"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.689852 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "0dbc468e-bf92-4cfa-81f1-a660334c4fd5" (UID: "0dbc468e-bf92-4cfa-81f1-a660334c4fd5"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.728196 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.728236 4794 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.728249 4794 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.728259 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.728269 4794 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.728281 4794 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.728292 4794 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.728303 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-ceph\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.728313 4794 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.728342 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jg5k\" (UniqueName: \"kubernetes.io/projected/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-kube-api-access-6jg5k\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.728352 4794 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.728362 4794 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.728372 4794 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0dbc468e-bf92-4cfa-81f1-a660334c4fd5-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Mar 10 12:02:43 crc kubenswrapper[4794]: I0310 12:02:43.999385 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c"
Mar 10 12:02:43 crc kubenswrapper[4794]: E0310 12:02:43.999859 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.048067 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4" event={"ID":"0dbc468e-bf92-4cfa-81f1-a660334c4fd5","Type":"ContainerDied","Data":"47200f510fbd9fd3307f3e3e384c4006941acbb307d978707a395678afb27fbc"}
Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.048298 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47200f510fbd9fd3307f3e3e384c4006941acbb307d978707a395678afb27fbc"
Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.048136 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-6rfj4"
Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.146688 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-z7mv2"]
Mar 10 12:02:44 crc kubenswrapper[4794]: E0310 12:02:44.147718 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472174b5-b716-47c5-bfe3-00c586405a20" containerName="registry-server"
Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.147853 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="472174b5-b716-47c5-bfe3-00c586405a20" containerName="registry-server"
Mar 10 12:02:44 crc kubenswrapper[4794]: E0310 12:02:44.147979 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a9d999-4ef8-4222-85ae-d2d779a19819" containerName="registry-server"
Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.148054 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a9d999-4ef8-4222-85ae-d2d779a19819" containerName="registry-server"
Mar 10 12:02:44 crc kubenswrapper[4794]: E0310 12:02:44.148166 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbc468e-bf92-4cfa-81f1-a660334c4fd5" containerName="nova-cell1-openstack-openstack-cell1"
Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.148280 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbc468e-bf92-4cfa-81f1-a660334c4fd5" containerName="nova-cell1-openstack-openstack-cell1"
Mar 10 12:02:44 crc kubenswrapper[4794]: E0310 12:02:44.148533 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472174b5-b716-47c5-bfe3-00c586405a20" containerName="extract-content"
Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.148626 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="472174b5-b716-47c5-bfe3-00c586405a20" containerName="extract-content"
Mar 10 12:02:44 crc kubenswrapper[4794]: E0310 12:02:44.148715 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472174b5-b716-47c5-bfe3-00c586405a20" containerName="extract-utilities"
Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.148797 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="472174b5-b716-47c5-bfe3-00c586405a20" containerName="extract-utilities"
Mar 10 12:02:44 crc kubenswrapper[4794]: E0310 12:02:44.148890 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a9d999-4ef8-4222-85ae-d2d779a19819" containerName="extract-utilities"
Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.148960 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a9d999-4ef8-4222-85ae-d2d779a19819" containerName="extract-utilities"
Mar 10 12:02:44 crc kubenswrapper[4794]: E0310 12:02:44.149048 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a9d999-4ef8-4222-85ae-d2d779a19819" containerName="extract-content"
Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.149119 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a9d999-4ef8-4222-85ae-d2d779a19819" containerName="extract-content"
Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.149515 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dbc468e-bf92-4cfa-81f1-a660334c4fd5" containerName="nova-cell1-openstack-openstack-cell1"
Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.149658 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="472174b5-b716-47c5-bfe3-00c586405a20" containerName="registry-server"
Mar 10 12:02:44 crc kubenswrapper[4794]:
I0310 12:02:44.149757 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a9d999-4ef8-4222-85ae-d2d779a19819" containerName="registry-server" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.151267 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.153585 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.153841 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.154047 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.154739 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.161919 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-z7mv2"] Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.163740 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.241207 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-inventory\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.241297 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.241357 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.241429 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvcjj\" (UniqueName: \"kubernetes.io/projected/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-kube-api-access-kvcjj\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.241595 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: 
\"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.241636 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.241767 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.241815 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceph\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.344065 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvcjj\" (UniqueName: \"kubernetes.io/projected/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-kube-api-access-kvcjj\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.344180 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.344209 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.344295 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.344323 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceph\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " 
pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.344380 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-inventory\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.344412 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.344432 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.349712 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-inventory\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.349762 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.350010 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.350655 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.351516 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.352552 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceph\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.352973 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.362273 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvcjj\" (UniqueName: \"kubernetes.io/projected/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-kube-api-access-kvcjj\") pod \"telemetry-openstack-openstack-cell1-z7mv2\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:44 crc kubenswrapper[4794]: I0310 12:02:44.483388 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:02:45 crc kubenswrapper[4794]: I0310 12:02:45.024906 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-z7mv2"] Mar 10 12:02:45 crc kubenswrapper[4794]: I0310 12:02:45.058979 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" event={"ID":"55055e58-21d6-4d61-bb6e-ba2ff62acb9a","Type":"ContainerStarted","Data":"c7b6e4a8c54e774780b4e7d5423214be65c7f835faab3c64c5cb3c2435165827"} Mar 10 12:02:46 crc kubenswrapper[4794]: I0310 12:02:46.078282 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" event={"ID":"55055e58-21d6-4d61-bb6e-ba2ff62acb9a","Type":"ContainerStarted","Data":"a4e1b444e36c812cebb602b7027e9e787c28fbdc375babfcecedc3aa03ca680d"} Mar 10 12:02:46 crc kubenswrapper[4794]: I0310 12:02:46.113014 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" podStartSLOduration=1.6248687080000002 podStartE2EDuration="2.11298528s" podCreationTimestamp="2026-03-10 12:02:44 +0000 UTC" firstStartedPulling="2026-03-10 12:02:45.029046061 +0000 UTC m=+8313.785216879" lastFinishedPulling="2026-03-10 12:02:45.517162593 +0000 UTC m=+8314.273333451" observedRunningTime="2026-03-10 12:02:46.099793701 +0000 UTC m=+8314.855964529" watchObservedRunningTime="2026-03-10 12:02:46.11298528 +0000 UTC m=+8314.869156098" Mar 10 12:02:54 crc kubenswrapper[4794]: I0310 12:02:54.047271 4794 scope.go:117] "RemoveContainer" containerID="47a3e892ea4d118d21b9ca372ee29b61a20a02a8167e9c5e666ae0cef3f40bf0" Mar 10 12:02:56 crc kubenswrapper[4794]: I0310 12:02:56.000427 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:02:56 crc kubenswrapper[4794]: E0310 12:02:56.001166 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:03:09 crc kubenswrapper[4794]: I0310 12:03:09.323910 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tkrhq"] Mar 10 12:03:09 crc kubenswrapper[4794]: I0310 12:03:09.328112 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:09 crc kubenswrapper[4794]: I0310 12:03:09.358617 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tkrhq"] Mar 10 12:03:09 crc kubenswrapper[4794]: I0310 12:03:09.410532 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520a233a-9c8b-41fc-abe6-2d71138a0707-catalog-content\") pod \"redhat-operators-tkrhq\" (UID: \"520a233a-9c8b-41fc-abe6-2d71138a0707\") " pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:09 crc kubenswrapper[4794]: I0310 12:03:09.410606 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520a233a-9c8b-41fc-abe6-2d71138a0707-utilities\") pod \"redhat-operators-tkrhq\" (UID: \"520a233a-9c8b-41fc-abe6-2d71138a0707\") " pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:09 crc kubenswrapper[4794]: I0310 12:03:09.410653 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxnmf\" (UniqueName: \"kubernetes.io/projected/520a233a-9c8b-41fc-abe6-2d71138a0707-kube-api-access-dxnmf\") pod \"redhat-operators-tkrhq\" (UID: \"520a233a-9c8b-41fc-abe6-2d71138a0707\") " pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:09 crc kubenswrapper[4794]: I0310 12:03:09.512814 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520a233a-9c8b-41fc-abe6-2d71138a0707-catalog-content\") pod \"redhat-operators-tkrhq\" (UID: \"520a233a-9c8b-41fc-abe6-2d71138a0707\") " pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:09 crc kubenswrapper[4794]: I0310 12:03:09.512930 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520a233a-9c8b-41fc-abe6-2d71138a0707-utilities\") pod \"redhat-operators-tkrhq\" (UID: \"520a233a-9c8b-41fc-abe6-2d71138a0707\") " pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:09 crc kubenswrapper[4794]: I0310 12:03:09.512985 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxnmf\" (UniqueName: \"kubernetes.io/projected/520a233a-9c8b-41fc-abe6-2d71138a0707-kube-api-access-dxnmf\") pod \"redhat-operators-tkrhq\" (UID: \"520a233a-9c8b-41fc-abe6-2d71138a0707\") " pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:09 crc kubenswrapper[4794]: I0310 12:03:09.513302 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520a233a-9c8b-41fc-abe6-2d71138a0707-catalog-content\") pod \"redhat-operators-tkrhq\" (UID: \"520a233a-9c8b-41fc-abe6-2d71138a0707\") " pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:09 crc 
kubenswrapper[4794]: I0310 12:03:09.513722 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520a233a-9c8b-41fc-abe6-2d71138a0707-utilities\") pod \"redhat-operators-tkrhq\" (UID: \"520a233a-9c8b-41fc-abe6-2d71138a0707\") " pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:09 crc kubenswrapper[4794]: I0310 12:03:09.541113 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxnmf\" (UniqueName: \"kubernetes.io/projected/520a233a-9c8b-41fc-abe6-2d71138a0707-kube-api-access-dxnmf\") pod \"redhat-operators-tkrhq\" (UID: \"520a233a-9c8b-41fc-abe6-2d71138a0707\") " pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:09 crc kubenswrapper[4794]: I0310 12:03:09.660015 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:10 crc kubenswrapper[4794]: I0310 12:03:10.226705 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tkrhq"] Mar 10 12:03:10 crc kubenswrapper[4794]: W0310 12:03:10.233694 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod520a233a_9c8b_41fc_abe6_2d71138a0707.slice/crio-00416181a7dcd48fe78e5e975397d39a8a4c49f353d26745991b357c833bb04e WatchSource:0}: Error finding container 00416181a7dcd48fe78e5e975397d39a8a4c49f353d26745991b357c833bb04e: Status 404 returned error can't find the container with id 00416181a7dcd48fe78e5e975397d39a8a4c49f353d26745991b357c833bb04e Mar 10 12:03:10 crc kubenswrapper[4794]: I0310 12:03:10.339858 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tkrhq" event={"ID":"520a233a-9c8b-41fc-abe6-2d71138a0707","Type":"ContainerStarted","Data":"00416181a7dcd48fe78e5e975397d39a8a4c49f353d26745991b357c833bb04e"} Mar 10 12:03:10 crc kubenswrapper[4794]: I0310 12:03:10.999134 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:03:10 crc kubenswrapper[4794]: E0310 12:03:10.999934 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:03:11 crc kubenswrapper[4794]: I0310 12:03:11.352118 4794 generic.go:334] "Generic (PLEG): container finished" podID="520a233a-9c8b-41fc-abe6-2d71138a0707" containerID="14f764dd46ee1a3bad6a7a48745165d8cfd909a18cf39a51ef8a223627cefe00" exitCode=0 Mar 10 12:03:11 crc kubenswrapper[4794]: I0310 12:03:11.352166 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tkrhq" event={"ID":"520a233a-9c8b-41fc-abe6-2d71138a0707","Type":"ContainerDied","Data":"14f764dd46ee1a3bad6a7a48745165d8cfd909a18cf39a51ef8a223627cefe00"} Mar 10 12:03:11 crc kubenswrapper[4794]: I0310 12:03:11.354745 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 12:03:12 crc kubenswrapper[4794]: I0310 12:03:12.365813 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tkrhq" 
event={"ID":"520a233a-9c8b-41fc-abe6-2d71138a0707","Type":"ContainerStarted","Data":"6e7ee39f7468c7438a102793500cb37659899a028a9beaac689827b3a0e2ea6c"} Mar 10 12:03:19 crc kubenswrapper[4794]: I0310 12:03:19.434939 4794 generic.go:334] "Generic (PLEG): container finished" podID="520a233a-9c8b-41fc-abe6-2d71138a0707" containerID="6e7ee39f7468c7438a102793500cb37659899a028a9beaac689827b3a0e2ea6c" exitCode=0 Mar 10 12:03:19 crc kubenswrapper[4794]: I0310 12:03:19.435061 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tkrhq" event={"ID":"520a233a-9c8b-41fc-abe6-2d71138a0707","Type":"ContainerDied","Data":"6e7ee39f7468c7438a102793500cb37659899a028a9beaac689827b3a0e2ea6c"} Mar 10 12:03:20 crc kubenswrapper[4794]: I0310 12:03:20.446264 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tkrhq" event={"ID":"520a233a-9c8b-41fc-abe6-2d71138a0707","Type":"ContainerStarted","Data":"51f4addd3e43feb73cebea6b59e698662236b41d427259ecfeb036df1b82a04c"} Mar 10 12:03:20 crc kubenswrapper[4794]: I0310 12:03:20.470166 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tkrhq" podStartSLOduration=2.901711553 podStartE2EDuration="11.470147096s" podCreationTimestamp="2026-03-10 12:03:09 +0000 UTC" firstStartedPulling="2026-03-10 12:03:11.354430136 +0000 UTC m=+8340.110600954" lastFinishedPulling="2026-03-10 12:03:19.922865679 +0000 UTC m=+8348.679036497" observedRunningTime="2026-03-10 12:03:20.463616366 +0000 UTC m=+8349.219787184" watchObservedRunningTime="2026-03-10 12:03:20.470147096 +0000 UTC m=+8349.226317914" Mar 10 12:03:22 crc kubenswrapper[4794]: I0310 12:03:22.999267 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:03:23 crc kubenswrapper[4794]: E0310 12:03:22.999906 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:03:29 crc kubenswrapper[4794]: I0310 12:03:29.660526 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:29 crc kubenswrapper[4794]: I0310 12:03:29.661268 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:30 crc kubenswrapper[4794]: I0310 12:03:30.707142 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tkrhq" podUID="520a233a-9c8b-41fc-abe6-2d71138a0707" containerName="registry-server" probeResult="failure" output=< Mar 10 12:03:30 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 12:03:30 crc kubenswrapper[4794]: > Mar 10 12:03:35 crc kubenswrapper[4794]: I0310 12:03:34.999687 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:03:35 crc kubenswrapper[4794]: E0310 12:03:35.000440 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:03:40 crc kubenswrapper[4794]: I0310 12:03:40.708765 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tkrhq" podUID="520a233a-9c8b-41fc-abe6-2d71138a0707" containerName="registry-server" probeResult="failure" output=< Mar 10 12:03:40 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 12:03:40 crc kubenswrapper[4794]: > Mar 10 12:03:47 crc kubenswrapper[4794]: I0310 12:03:46.999804 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:03:47 crc kubenswrapper[4794]: E0310 12:03:47.000763 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:03:49 crc kubenswrapper[4794]: I0310 12:03:49.720116 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:49 crc kubenswrapper[4794]: I0310 12:03:49.789713 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:49 crc kubenswrapper[4794]: I0310 12:03:49.961983 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tkrhq"] Mar 10 12:03:51 crc kubenswrapper[4794]: I0310 12:03:51.751415 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tkrhq" podUID="520a233a-9c8b-41fc-abe6-2d71138a0707" containerName="registry-server" containerID="cri-o://51f4addd3e43feb73cebea6b59e698662236b41d427259ecfeb036df1b82a04c" gracePeriod=2 Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.393936 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.483697 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520a233a-9c8b-41fc-abe6-2d71138a0707-utilities\") pod \"520a233a-9c8b-41fc-abe6-2d71138a0707\" (UID: \"520a233a-9c8b-41fc-abe6-2d71138a0707\") " Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.483804 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxnmf\" (UniqueName: \"kubernetes.io/projected/520a233a-9c8b-41fc-abe6-2d71138a0707-kube-api-access-dxnmf\") pod \"520a233a-9c8b-41fc-abe6-2d71138a0707\" (UID: \"520a233a-9c8b-41fc-abe6-2d71138a0707\") " Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.483900 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520a233a-9c8b-41fc-abe6-2d71138a0707-catalog-content\") pod \"520a233a-9c8b-41fc-abe6-2d71138a0707\" (UID: \"520a233a-9c8b-41fc-abe6-2d71138a0707\") " Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.484585 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/520a233a-9c8b-41fc-abe6-2d71138a0707-utilities" (OuterVolumeSpecName: "utilities") pod "520a233a-9c8b-41fc-abe6-2d71138a0707" (UID: "520a233a-9c8b-41fc-abe6-2d71138a0707"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.489536 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/520a233a-9c8b-41fc-abe6-2d71138a0707-kube-api-access-dxnmf" (OuterVolumeSpecName: "kube-api-access-dxnmf") pod "520a233a-9c8b-41fc-abe6-2d71138a0707" (UID: "520a233a-9c8b-41fc-abe6-2d71138a0707"). InnerVolumeSpecName "kube-api-access-dxnmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.586504 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520a233a-9c8b-41fc-abe6-2d71138a0707-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.586543 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxnmf\" (UniqueName: \"kubernetes.io/projected/520a233a-9c8b-41fc-abe6-2d71138a0707-kube-api-access-dxnmf\") on node \"crc\" DevicePath \"\"" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.621257 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/520a233a-9c8b-41fc-abe6-2d71138a0707-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "520a233a-9c8b-41fc-abe6-2d71138a0707" (UID: "520a233a-9c8b-41fc-abe6-2d71138a0707"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.688170 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520a233a-9c8b-41fc-abe6-2d71138a0707-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.760837 4794 generic.go:334] "Generic (PLEG): container finished" podID="520a233a-9c8b-41fc-abe6-2d71138a0707" containerID="51f4addd3e43feb73cebea6b59e698662236b41d427259ecfeb036df1b82a04c" exitCode=0 Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.760898 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tkrhq" event={"ID":"520a233a-9c8b-41fc-abe6-2d71138a0707","Type":"ContainerDied","Data":"51f4addd3e43feb73cebea6b59e698662236b41d427259ecfeb036df1b82a04c"} Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.760933 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tkrhq" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.760971 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tkrhq" event={"ID":"520a233a-9c8b-41fc-abe6-2d71138a0707","Type":"ContainerDied","Data":"00416181a7dcd48fe78e5e975397d39a8a4c49f353d26745991b357c833bb04e"} Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.760996 4794 scope.go:117] "RemoveContainer" containerID="51f4addd3e43feb73cebea6b59e698662236b41d427259ecfeb036df1b82a04c" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.799597 4794 scope.go:117] "RemoveContainer" containerID="6e7ee39f7468c7438a102793500cb37659899a028a9beaac689827b3a0e2ea6c" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.803432 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tkrhq"] Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.813876 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tkrhq"] Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.821552 4794 scope.go:117] "RemoveContainer" containerID="14f764dd46ee1a3bad6a7a48745165d8cfd909a18cf39a51ef8a223627cefe00" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.869131 4794 scope.go:117] "RemoveContainer" containerID="51f4addd3e43feb73cebea6b59e698662236b41d427259ecfeb036df1b82a04c" Mar 10 12:03:52 crc kubenswrapper[4794]: E0310 12:03:52.869689 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51f4addd3e43feb73cebea6b59e698662236b41d427259ecfeb036df1b82a04c\": container with ID starting with 51f4addd3e43feb73cebea6b59e698662236b41d427259ecfeb036df1b82a04c not found: ID does not exist" containerID="51f4addd3e43feb73cebea6b59e698662236b41d427259ecfeb036df1b82a04c" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.869727 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f4addd3e43feb73cebea6b59e698662236b41d427259ecfeb036df1b82a04c"} err="failed to get container status \"51f4addd3e43feb73cebea6b59e698662236b41d427259ecfeb036df1b82a04c\": rpc error: code = NotFound desc = could not find container \"51f4addd3e43feb73cebea6b59e698662236b41d427259ecfeb036df1b82a04c\": container with ID starting with 51f4addd3e43feb73cebea6b59e698662236b41d427259ecfeb036df1b82a04c not found: ID does not exist" Mar 10 12:03:52 crc 
kubenswrapper[4794]: I0310 12:03:52.869752 4794 scope.go:117] "RemoveContainer" containerID="6e7ee39f7468c7438a102793500cb37659899a028a9beaac689827b3a0e2ea6c" Mar 10 12:03:52 crc kubenswrapper[4794]: E0310 12:03:52.870032 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e7ee39f7468c7438a102793500cb37659899a028a9beaac689827b3a0e2ea6c\": container with ID starting with 6e7ee39f7468c7438a102793500cb37659899a028a9beaac689827b3a0e2ea6c not found: ID does not exist" containerID="6e7ee39f7468c7438a102793500cb37659899a028a9beaac689827b3a0e2ea6c" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.870065 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e7ee39f7468c7438a102793500cb37659899a028a9beaac689827b3a0e2ea6c"} err="failed to get container status \"6e7ee39f7468c7438a102793500cb37659899a028a9beaac689827b3a0e2ea6c\": rpc error: code = NotFound desc = could not find container \"6e7ee39f7468c7438a102793500cb37659899a028a9beaac689827b3a0e2ea6c\": container with ID starting with 6e7ee39f7468c7438a102793500cb37659899a028a9beaac689827b3a0e2ea6c not found: ID does not exist" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.870082 4794 scope.go:117] "RemoveContainer" containerID="14f764dd46ee1a3bad6a7a48745165d8cfd909a18cf39a51ef8a223627cefe00" Mar 10 12:03:52 crc kubenswrapper[4794]: E0310 12:03:52.870318 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14f764dd46ee1a3bad6a7a48745165d8cfd909a18cf39a51ef8a223627cefe00\": container with ID starting with 14f764dd46ee1a3bad6a7a48745165d8cfd909a18cf39a51ef8a223627cefe00 not found: ID does not exist" containerID="14f764dd46ee1a3bad6a7a48745165d8cfd909a18cf39a51ef8a223627cefe00" Mar 10 12:03:52 crc kubenswrapper[4794]: I0310 12:03:52.870358 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14f764dd46ee1a3bad6a7a48745165d8cfd909a18cf39a51ef8a223627cefe00"} err="failed to get container status \"14f764dd46ee1a3bad6a7a48745165d8cfd909a18cf39a51ef8a223627cefe00\": rpc error: code = NotFound desc = could not find container \"14f764dd46ee1a3bad6a7a48745165d8cfd909a18cf39a51ef8a223627cefe00\": container with ID starting with 14f764dd46ee1a3bad6a7a48745165d8cfd909a18cf39a51ef8a223627cefe00 not found: ID does not exist" Mar 10 12:03:54 crc kubenswrapper[4794]: I0310 12:03:54.010133 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="520a233a-9c8b-41fc-abe6-2d71138a0707" path="/var/lib/kubelet/pods/520a233a-9c8b-41fc-abe6-2d71138a0707/volumes" Mar 10 12:04:00 crc kubenswrapper[4794]: I0310 12:04:00.172787 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552404-v4pzs"] Mar 10 12:04:00 crc kubenswrapper[4794]: E0310 12:04:00.173910 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520a233a-9c8b-41fc-abe6-2d71138a0707" containerName="registry-server" Mar 10 12:04:00 crc kubenswrapper[4794]: I0310 12:04:00.173927 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="520a233a-9c8b-41fc-abe6-2d71138a0707" containerName="registry-server" Mar 10 12:04:00 crc kubenswrapper[4794]: E0310 12:04:00.173958 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520a233a-9c8b-41fc-abe6-2d71138a0707" containerName="extract-content" Mar 10 12:04:00 crc kubenswrapper[4794]: I0310 12:04:00.173964 4794 
state_mem.go:107] "Deleted CPUSet assignment" podUID="520a233a-9c8b-41fc-abe6-2d71138a0707" containerName="extract-content" Mar 10 12:04:00 crc kubenswrapper[4794]: E0310 12:04:00.173983 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520a233a-9c8b-41fc-abe6-2d71138a0707" containerName="extract-utilities" Mar 10 12:04:00 crc kubenswrapper[4794]: I0310 12:04:00.173991 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="520a233a-9c8b-41fc-abe6-2d71138a0707" containerName="extract-utilities" Mar 10 12:04:00 crc kubenswrapper[4794]: I0310 12:04:00.174226 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="520a233a-9c8b-41fc-abe6-2d71138a0707" containerName="registry-server" Mar 10 12:04:00 crc kubenswrapper[4794]: I0310 12:04:00.175928 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552404-v4pzs" Mar 10 12:04:00 crc kubenswrapper[4794]: I0310 12:04:00.181604 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 12:04:00 crc kubenswrapper[4794]: I0310 12:04:00.182152 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 12:04:00 crc kubenswrapper[4794]: I0310 12:04:00.182585 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 12:04:00 crc kubenswrapper[4794]: I0310 12:04:00.186952 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552404-v4pzs"] Mar 10 12:04:00 crc kubenswrapper[4794]: I0310 12:04:00.260574 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvcj7\" (UniqueName: \"kubernetes.io/projected/556983a9-9184-4252-941c-89607a266dc7-kube-api-access-vvcj7\") pod \"auto-csr-approver-29552404-v4pzs\" (UID: \"556983a9-9184-4252-941c-89607a266dc7\") " pod="openshift-infra/auto-csr-approver-29552404-v4pzs" Mar 10 12:04:00 crc kubenswrapper[4794]: I0310 12:04:00.362960 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvcj7\" (UniqueName: \"kubernetes.io/projected/556983a9-9184-4252-941c-89607a266dc7-kube-api-access-vvcj7\") pod \"auto-csr-approver-29552404-v4pzs\" (UID: \"556983a9-9184-4252-941c-89607a266dc7\") " pod="openshift-infra/auto-csr-approver-29552404-v4pzs" Mar 10 12:04:00 crc kubenswrapper[4794]: I0310 12:04:00.380391 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvcj7\" (UniqueName: \"kubernetes.io/projected/556983a9-9184-4252-941c-89607a266dc7-kube-api-access-vvcj7\") pod \"auto-csr-approver-29552404-v4pzs\" (UID: \"556983a9-9184-4252-941c-89607a266dc7\") " pod="openshift-infra/auto-csr-approver-29552404-v4pzs" Mar 10 12:04:00 crc kubenswrapper[4794]: I0310 12:04:00.502482 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552404-v4pzs" Mar 10 12:04:01 crc kubenswrapper[4794]: I0310 12:04:01.040685 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552404-v4pzs"] Mar 10 12:04:01 crc kubenswrapper[4794]: I0310 12:04:01.850915 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552404-v4pzs" event={"ID":"556983a9-9184-4252-941c-89607a266dc7","Type":"ContainerStarted","Data":"6571b7d40253d472563928a597b9f3b30c25f9386e7b2d9506d6ac57f06383e5"} Mar 10 12:04:02 crc kubenswrapper[4794]: I0310 12:04:02.008849 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:04:02 crc kubenswrapper[4794]: E0310 12:04:02.009100 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:04:02 crc kubenswrapper[4794]: I0310 12:04:02.863659 4794 generic.go:334] "Generic (PLEG): container finished" podID="556983a9-9184-4252-941c-89607a266dc7" containerID="699a80ec15ad44a1f976b9411a99bcaa192658c068e5a535c6b4da3cd3d43b00" exitCode=0 Mar 10 12:04:02 crc kubenswrapper[4794]: I0310 12:04:02.863710 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552404-v4pzs" event={"ID":"556983a9-9184-4252-941c-89607a266dc7","Type":"ContainerDied","Data":"699a80ec15ad44a1f976b9411a99bcaa192658c068e5a535c6b4da3cd3d43b00"} Mar 10 12:04:04 crc kubenswrapper[4794]: I0310 12:04:04.498495 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552404-v4pzs" Mar 10 12:04:04 crc kubenswrapper[4794]: I0310 12:04:04.660784 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvcj7\" (UniqueName: \"kubernetes.io/projected/556983a9-9184-4252-941c-89607a266dc7-kube-api-access-vvcj7\") pod \"556983a9-9184-4252-941c-89607a266dc7\" (UID: \"556983a9-9184-4252-941c-89607a266dc7\") " Mar 10 12:04:04 crc kubenswrapper[4794]: I0310 12:04:04.668825 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/556983a9-9184-4252-941c-89607a266dc7-kube-api-access-vvcj7" (OuterVolumeSpecName: "kube-api-access-vvcj7") pod "556983a9-9184-4252-941c-89607a266dc7" (UID: "556983a9-9184-4252-941c-89607a266dc7"). InnerVolumeSpecName "kube-api-access-vvcj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:04:04 crc kubenswrapper[4794]: I0310 12:04:04.764062 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvcj7\" (UniqueName: \"kubernetes.io/projected/556983a9-9184-4252-941c-89607a266dc7-kube-api-access-vvcj7\") on node \"crc\" DevicePath \"\"" Mar 10 12:04:04 crc kubenswrapper[4794]: I0310 12:04:04.888132 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552404-v4pzs" event={"ID":"556983a9-9184-4252-941c-89607a266dc7","Type":"ContainerDied","Data":"6571b7d40253d472563928a597b9f3b30c25f9386e7b2d9506d6ac57f06383e5"} Mar 10 12:04:04 crc kubenswrapper[4794]: I0310 12:04:04.888177 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6571b7d40253d472563928a597b9f3b30c25f9386e7b2d9506d6ac57f06383e5" Mar 10 12:04:04 crc kubenswrapper[4794]: I0310 12:04:04.888230 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552404-v4pzs" Mar 10 12:04:05 crc kubenswrapper[4794]: I0310 12:04:05.584155 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552398-7gnn2"] Mar 10 12:04:05 crc kubenswrapper[4794]: I0310 12:04:05.596168 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552398-7gnn2"] Mar 10 12:04:06 crc kubenswrapper[4794]: I0310 12:04:06.011810 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e4c8d2-e942-4938-9a74-82c596407de7" path="/var/lib/kubelet/pods/28e4c8d2-e942-4938-9a74-82c596407de7/volumes" Mar 10 12:04:17 crc kubenswrapper[4794]: I0310 12:04:16.999787 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:04:17 crc kubenswrapper[4794]: E0310 12:04:17.000546 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:04:28 crc kubenswrapper[4794]: I0310 12:04:28.999423 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:04:29 crc kubenswrapper[4794]: E0310 12:04:29.000130 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:04:42 crc kubenswrapper[4794]: I0310 12:04:42.999175 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:04:43 crc kubenswrapper[4794]: E0310 12:04:43.000035 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:04:54 crc kubenswrapper[4794]: I0310 12:04:54.227856 4794 scope.go:117] "RemoveContainer" containerID="a7a363ef99cbcce43d3be032b16e1546540302f601080e6c0632013f00ce38cd" Mar 10 12:04:54 crc kubenswrapper[4794]: I0310 12:04:54.999588 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:04:55 crc kubenswrapper[4794]: E0310 12:04:55.000308 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:05:06 crc kubenswrapper[4794]: I0310 12:05:05.999930 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:05:06 crc kubenswrapper[4794]: E0310 12:05:06.000803 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:05:20 crc kubenswrapper[4794]: I0310 12:05:19.999251 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:05:20 crc kubenswrapper[4794]: E0310 12:05:20.000110 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:05:31 crc kubenswrapper[4794]: I0310 12:05:31.000398 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:05:31 crc kubenswrapper[4794]: E0310 12:05:31.002550 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:05:42 crc kubenswrapper[4794]: I0310 12:05:42.006410 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:05:42 crc kubenswrapper[4794]: E0310 12:05:42.007211 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:05:54 crc kubenswrapper[4794]: I0310 12:05:54.999231 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:05:55 crc kubenswrapper[4794]: E0310 12:05:55.001711 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:06:00 crc kubenswrapper[4794]: I0310 12:06:00.149954 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552406-8wnx2"] Mar 10 12:06:00 crc kubenswrapper[4794]: E0310 12:06:00.150803 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556983a9-9184-4252-941c-89607a266dc7" containerName="oc" Mar 10 12:06:00 crc kubenswrapper[4794]: I0310 12:06:00.150821 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="556983a9-9184-4252-941c-89607a266dc7" containerName="oc" Mar 10 12:06:00 crc kubenswrapper[4794]: I0310 12:06:00.151151 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="556983a9-9184-4252-941c-89607a266dc7" containerName="oc" Mar 10 12:06:00 crc kubenswrapper[4794]: I0310 12:06:00.152045 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552406-8wnx2" Mar 10 12:06:00 crc kubenswrapper[4794]: I0310 12:06:00.154753 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 12:06:00 crc kubenswrapper[4794]: I0310 12:06:00.154826 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 12:06:00 crc kubenswrapper[4794]: I0310 12:06:00.154900 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 12:06:00 crc kubenswrapper[4794]: I0310 12:06:00.162011 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552406-8wnx2"] Mar 10 12:06:00 crc kubenswrapper[4794]: I0310 12:06:00.217262 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blvz9\" (UniqueName: \"kubernetes.io/projected/2213e803-3a33-496f-818f-44bf00ae8d8d-kube-api-access-blvz9\") pod \"auto-csr-approver-29552406-8wnx2\" (UID: \"2213e803-3a33-496f-818f-44bf00ae8d8d\") " pod="openshift-infra/auto-csr-approver-29552406-8wnx2" Mar 10 12:06:00 crc kubenswrapper[4794]: I0310 12:06:00.320246 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blvz9\" (UniqueName: \"kubernetes.io/projected/2213e803-3a33-496f-818f-44bf00ae8d8d-kube-api-access-blvz9\") pod \"auto-csr-approver-29552406-8wnx2\" (UID: \"2213e803-3a33-496f-818f-44bf00ae8d8d\") " pod="openshift-infra/auto-csr-approver-29552406-8wnx2" Mar 10 12:06:00 crc kubenswrapper[4794]: I0310 12:06:00.343393 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blvz9\" (UniqueName: 
\"kubernetes.io/projected/2213e803-3a33-496f-818f-44bf00ae8d8d-kube-api-access-blvz9\") pod \"auto-csr-approver-29552406-8wnx2\" (UID: \"2213e803-3a33-496f-818f-44bf00ae8d8d\") " pod="openshift-infra/auto-csr-approver-29552406-8wnx2" Mar 10 12:06:00 crc kubenswrapper[4794]: I0310 12:06:00.505492 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552406-8wnx2" Mar 10 12:06:00 crc kubenswrapper[4794]: I0310 12:06:00.996166 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552406-8wnx2"] Mar 10 12:06:01 crc kubenswrapper[4794]: I0310 12:06:01.178843 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552406-8wnx2" event={"ID":"2213e803-3a33-496f-818f-44bf00ae8d8d","Type":"ContainerStarted","Data":"85ea7f75da26a7b56dea939b09f9539f3c6d95edad0cdab9d477e65212effbb1"} Mar 10 12:06:03 crc kubenswrapper[4794]: I0310 12:06:03.202458 4794 generic.go:334] "Generic (PLEG): container finished" podID="2213e803-3a33-496f-818f-44bf00ae8d8d" containerID="b7eab5beccbff2245f68c647ed4ace195ef5269b9af607a8d61e8e17ed534171" exitCode=0 Mar 10 12:06:03 crc kubenswrapper[4794]: I0310 12:06:03.202551 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552406-8wnx2" event={"ID":"2213e803-3a33-496f-818f-44bf00ae8d8d","Type":"ContainerDied","Data":"b7eab5beccbff2245f68c647ed4ace195ef5269b9af607a8d61e8e17ed534171"} Mar 10 12:06:04 crc kubenswrapper[4794]: I0310 12:06:04.718243 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552406-8wnx2" Mar 10 12:06:04 crc kubenswrapper[4794]: I0310 12:06:04.856646 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blvz9\" (UniqueName: \"kubernetes.io/projected/2213e803-3a33-496f-818f-44bf00ae8d8d-kube-api-access-blvz9\") pod \"2213e803-3a33-496f-818f-44bf00ae8d8d\" (UID: \"2213e803-3a33-496f-818f-44bf00ae8d8d\") " Mar 10 12:06:04 crc kubenswrapper[4794]: I0310 12:06:04.865751 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2213e803-3a33-496f-818f-44bf00ae8d8d-kube-api-access-blvz9" (OuterVolumeSpecName: "kube-api-access-blvz9") pod "2213e803-3a33-496f-818f-44bf00ae8d8d" (UID: "2213e803-3a33-496f-818f-44bf00ae8d8d"). InnerVolumeSpecName "kube-api-access-blvz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:06:04 crc kubenswrapper[4794]: I0310 12:06:04.959263 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blvz9\" (UniqueName: \"kubernetes.io/projected/2213e803-3a33-496f-818f-44bf00ae8d8d-kube-api-access-blvz9\") on node \"crc\" DevicePath \"\"" Mar 10 12:06:05 crc kubenswrapper[4794]: I0310 12:06:05.228249 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552406-8wnx2" event={"ID":"2213e803-3a33-496f-818f-44bf00ae8d8d","Type":"ContainerDied","Data":"85ea7f75da26a7b56dea939b09f9539f3c6d95edad0cdab9d477e65212effbb1"} Mar 10 12:06:05 crc kubenswrapper[4794]: I0310 12:06:05.228558 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85ea7f75da26a7b56dea939b09f9539f3c6d95edad0cdab9d477e65212effbb1" Mar 10 12:06:05 crc kubenswrapper[4794]: I0310 12:06:05.228308 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552406-8wnx2" Mar 10 12:06:05 crc kubenswrapper[4794]: I0310 12:06:05.796221 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552400-g5wjv"] Mar 10 12:06:05 crc kubenswrapper[4794]: I0310 12:06:05.806102 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552400-g5wjv"] Mar 10 12:06:06 crc kubenswrapper[4794]: I0310 12:06:06.021204 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe2b2b20-7e44-4fce-af37-93e674cb3736" path="/var/lib/kubelet/pods/fe2b2b20-7e44-4fce-af37-93e674cb3736/volumes" Mar 10 12:06:10 crc kubenswrapper[4794]: I0310 12:06:10.000087 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:06:10 crc kubenswrapper[4794]: E0310 12:06:10.001142 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:06:24 crc kubenswrapper[4794]: I0310 12:06:24.999107 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:06:25 crc kubenswrapper[4794]: E0310 12:06:24.999897 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:06:30 crc kubenswrapper[4794]: I0310 12:06:30.482607 4794 generic.go:334] "Generic (PLEG): container finished" podID="55055e58-21d6-4d61-bb6e-ba2ff62acb9a" containerID="a4e1b444e36c812cebb602b7027e9e787c28fbdc375babfcecedc3aa03ca680d" exitCode=0 Mar 10 12:06:30 crc kubenswrapper[4794]: I0310 12:06:30.482684 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" event={"ID":"55055e58-21d6-4d61-bb6e-ba2ff62acb9a","Type":"ContainerDied","Data":"a4e1b444e36c812cebb602b7027e9e787c28fbdc375babfcecedc3aa03ca680d"} Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.011264 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.070922 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvcjj\" (UniqueName: \"kubernetes.io/projected/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-kube-api-access-kvcjj\") pod \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.071054 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-telemetry-combined-ca-bundle\") pod \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.071279 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-0\") pod \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.071366 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-1\") pod \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.071431 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ssh-key-openstack-cell1\") pod \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.071519 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceph\") pod \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.071674 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-2\") pod \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.071707 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-inventory\") pod \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\" (UID: \"55055e58-21d6-4d61-bb6e-ba2ff62acb9a\") " Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.077394 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceph" (OuterVolumeSpecName: "ceph") pod "55055e58-21d6-4d61-bb6e-ba2ff62acb9a" (UID: "55055e58-21d6-4d61-bb6e-ba2ff62acb9a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.077890 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-kube-api-access-kvcjj" (OuterVolumeSpecName: "kube-api-access-kvcjj") pod "55055e58-21d6-4d61-bb6e-ba2ff62acb9a" (UID: "55055e58-21d6-4d61-bb6e-ba2ff62acb9a"). InnerVolumeSpecName "kube-api-access-kvcjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.081655 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "55055e58-21d6-4d61-bb6e-ba2ff62acb9a" (UID: "55055e58-21d6-4d61-bb6e-ba2ff62acb9a"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.105961 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-inventory" (OuterVolumeSpecName: "inventory") pod "55055e58-21d6-4d61-bb6e-ba2ff62acb9a" (UID: "55055e58-21d6-4d61-bb6e-ba2ff62acb9a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.111152 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "55055e58-21d6-4d61-bb6e-ba2ff62acb9a" (UID: "55055e58-21d6-4d61-bb6e-ba2ff62acb9a"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.115350 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "55055e58-21d6-4d61-bb6e-ba2ff62acb9a" (UID: "55055e58-21d6-4d61-bb6e-ba2ff62acb9a"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.115561 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "55055e58-21d6-4d61-bb6e-ba2ff62acb9a" (UID: "55055e58-21d6-4d61-bb6e-ba2ff62acb9a"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.118980 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "55055e58-21d6-4d61-bb6e-ba2ff62acb9a" (UID: "55055e58-21d6-4d61-bb6e-ba2ff62acb9a"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.175129 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvcjj\" (UniqueName: \"kubernetes.io/projected/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-kube-api-access-kvcjj\") on node \"crc\" DevicePath \"\"" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.175416 4794 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.175483 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.175548 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.175603 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.175686 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.175741 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.175806 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55055e58-21d6-4d61-bb6e-ba2ff62acb9a-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.504713 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" event={"ID":"55055e58-21d6-4d61-bb6e-ba2ff62acb9a","Type":"ContainerDied","Data":"c7b6e4a8c54e774780b4e7d5423214be65c7f835faab3c64c5cb3c2435165827"} Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.504771 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7b6e4a8c54e774780b4e7d5423214be65c7f835faab3c64c5cb3c2435165827" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.505236 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-z7mv2" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.611968 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-jpmfp"] Mar 10 12:06:32 crc kubenswrapper[4794]: E0310 12:06:32.612438 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2213e803-3a33-496f-818f-44bf00ae8d8d" containerName="oc" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.612456 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2213e803-3a33-496f-818f-44bf00ae8d8d" containerName="oc" Mar 10 12:06:32 crc kubenswrapper[4794]: E0310 12:06:32.612498 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55055e58-21d6-4d61-bb6e-ba2ff62acb9a" containerName="telemetry-openstack-openstack-cell1" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.612504 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="55055e58-21d6-4d61-bb6e-ba2ff62acb9a" containerName="telemetry-openstack-openstack-cell1" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.612694 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2213e803-3a33-496f-818f-44bf00ae8d8d" containerName="oc" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.612720 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="55055e58-21d6-4d61-bb6e-ba2ff62acb9a" containerName="telemetry-openstack-openstack-cell1" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.613468 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.617208 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.617488 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.617570 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.617842 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.618266 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.628317 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-jpmfp"] Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.685652 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.685790 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: 
\"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.685822 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsrwq\" (UniqueName: \"kubernetes.io/projected/7903d1c4-a98b-4c62-85e4-54df0047589a-kube-api-access-jsrwq\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.685886 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.685941 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.685970 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.787955 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.788292 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsrwq\" (UniqueName: \"kubernetes.io/projected/7903d1c4-a98b-4c62-85e4-54df0047589a-kube-api-access-jsrwq\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.788382 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.788441 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: 
\"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.788469 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.788544 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.792575 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.793110 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.793207 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.798124 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.798660 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.813168 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsrwq\" (UniqueName: \"kubernetes.io/projected/7903d1c4-a98b-4c62-85e4-54df0047589a-kube-api-access-jsrwq\") pod \"neutron-sriov-openstack-openstack-cell1-jpmfp\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:32 crc kubenswrapper[4794]: I0310 12:06:32.948184 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:06:33 crc kubenswrapper[4794]: I0310 12:06:33.555309 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-jpmfp"] Mar 10 12:06:33 crc kubenswrapper[4794]: W0310 12:06:33.568116 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7903d1c4_a98b_4c62_85e4_54df0047589a.slice/crio-5b3d6353e427490f7a9ca104ffa6e81ba54f29c9b51da4d6574b7cbe344463ae WatchSource:0}: Error finding container 5b3d6353e427490f7a9ca104ffa6e81ba54f29c9b51da4d6574b7cbe344463ae: Status 404 returned error can't find the container with id 5b3d6353e427490f7a9ca104ffa6e81ba54f29c9b51da4d6574b7cbe344463ae Mar 10 12:06:34 crc kubenswrapper[4794]: I0310 12:06:34.526716 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" event={"ID":"7903d1c4-a98b-4c62-85e4-54df0047589a","Type":"ContainerStarted","Data":"414825ed541a463230306eba3f705adcc4b58674daf8d34e7aebe26ed3e3415e"} Mar 10 12:06:34 crc kubenswrapper[4794]: I0310 12:06:34.527294 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" event={"ID":"7903d1c4-a98b-4c62-85e4-54df0047589a","Type":"ContainerStarted","Data":"5b3d6353e427490f7a9ca104ffa6e81ba54f29c9b51da4d6574b7cbe344463ae"} Mar 10 12:06:34 crc kubenswrapper[4794]: I0310 12:06:34.548535 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" podStartSLOduration=2.007667022 podStartE2EDuration="2.548513411s" podCreationTimestamp="2026-03-10 12:06:32 +0000 UTC" firstStartedPulling="2026-03-10 12:06:33.571626377 +0000 UTC m=+8542.327797195" lastFinishedPulling="2026-03-10 12:06:34.112472766 +0000 UTC m=+8542.868643584" observedRunningTime="2026-03-10 12:06:34.544234979 +0000 UTC m=+8543.300405797" watchObservedRunningTime="2026-03-10 12:06:34.548513411 +0000 UTC m=+8543.304684229" Mar 10 12:06:40 crc kubenswrapper[4794]: I0310 12:06:39.999768 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:06:40 crc kubenswrapper[4794]: E0310 12:06:40.000988 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:06:54 crc kubenswrapper[4794]: I0310 12:06:54.000610 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:06:54 crc kubenswrapper[4794]: I0310 12:06:54.357781 4794 scope.go:117] "RemoveContainer" containerID="aacd8e156b8ae251684da100d85a625a54beea2b7894e12565aa846a7e063ddd" Mar 10 12:06:54 crc kubenswrapper[4794]: I0310 12:06:54.734357 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" 
event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"f3eaf645c92d5074cb0924e573d1878a0115051de261d931bccf56e275fcfefa"} Mar 10 12:07:41 crc kubenswrapper[4794]: I0310 12:07:41.205904 4794 generic.go:334] "Generic (PLEG): container finished" podID="7903d1c4-a98b-4c62-85e4-54df0047589a" containerID="414825ed541a463230306eba3f705adcc4b58674daf8d34e7aebe26ed3e3415e" exitCode=0 Mar 10 12:07:41 crc kubenswrapper[4794]: I0310 12:07:41.206391 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" event={"ID":"7903d1c4-a98b-4c62-85e4-54df0047589a","Type":"ContainerDied","Data":"414825ed541a463230306eba3f705adcc4b58674daf8d34e7aebe26ed3e3415e"} Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.683168 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.746432 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsrwq\" (UniqueName: \"kubernetes.io/projected/7903d1c4-a98b-4c62-85e4-54df0047589a-kube-api-access-jsrwq\") pod \"7903d1c4-a98b-4c62-85e4-54df0047589a\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.746475 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-inventory\") pod \"7903d1c4-a98b-4c62-85e4-54df0047589a\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.746497 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-ceph\") pod \"7903d1c4-a98b-4c62-85e4-54df0047589a\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.746587 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-neutron-sriov-combined-ca-bundle\") pod \"7903d1c4-a98b-4c62-85e4-54df0047589a\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.746653 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-neutron-sriov-agent-neutron-config-0\") pod \"7903d1c4-a98b-4c62-85e4-54df0047589a\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.746687 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-ssh-key-openstack-cell1\") pod \"7903d1c4-a98b-4c62-85e4-54df0047589a\" (UID: \"7903d1c4-a98b-4c62-85e4-54df0047589a\") " Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.753648 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-ceph" (OuterVolumeSpecName: "ceph") pod "7903d1c4-a98b-4c62-85e4-54df0047589a" (UID: "7903d1c4-a98b-4c62-85e4-54df0047589a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.754177 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7903d1c4-a98b-4c62-85e4-54df0047589a-kube-api-access-jsrwq" (OuterVolumeSpecName: "kube-api-access-jsrwq") pod "7903d1c4-a98b-4c62-85e4-54df0047589a" (UID: "7903d1c4-a98b-4c62-85e4-54df0047589a"). InnerVolumeSpecName "kube-api-access-jsrwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.754756 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "7903d1c4-a98b-4c62-85e4-54df0047589a" (UID: "7903d1c4-a98b-4c62-85e4-54df0047589a"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.777462 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7903d1c4-a98b-4c62-85e4-54df0047589a" (UID: "7903d1c4-a98b-4c62-85e4-54df0047589a"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.779021 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-inventory" (OuterVolumeSpecName: "inventory") pod "7903d1c4-a98b-4c62-85e4-54df0047589a" (UID: "7903d1c4-a98b-4c62-85e4-54df0047589a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.780009 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "7903d1c4-a98b-4c62-85e4-54df0047589a" (UID: "7903d1c4-a98b-4c62-85e4-54df0047589a"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.849627 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.849667 4794 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.849680 4794 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.849691 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.849702 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsrwq\" (UniqueName: \"kubernetes.io/projected/7903d1c4-a98b-4c62-85e4-54df0047589a-kube-api-access-jsrwq\") on node \"crc\" DevicePath \"\"" Mar 10 12:07:42 crc kubenswrapper[4794]: I0310 12:07:42.849714 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7903d1c4-a98b-4c62-85e4-54df0047589a-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.227119 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" event={"ID":"7903d1c4-a98b-4c62-85e4-54df0047589a","Type":"ContainerDied","Data":"5b3d6353e427490f7a9ca104ffa6e81ba54f29c9b51da4d6574b7cbe344463ae"} Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.227517 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b3d6353e427490f7a9ca104ffa6e81ba54f29c9b51da4d6574b7cbe344463ae" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.227195 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-jpmfp" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.337426 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr"] Mar 10 12:07:43 crc kubenswrapper[4794]: E0310 12:07:43.338023 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7903d1c4-a98b-4c62-85e4-54df0047589a" containerName="neutron-sriov-openstack-openstack-cell1" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.338048 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7903d1c4-a98b-4c62-85e4-54df0047589a" containerName="neutron-sriov-openstack-openstack-cell1" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.338293 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="7903d1c4-a98b-4c62-85e4-54df0047589a" containerName="neutron-sriov-openstack-openstack-cell1" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.339224 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.343870 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.344365 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.344705 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.344964 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.346966 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.347667 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr"] Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.363258 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.363436 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.363475 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.363859 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.363951 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qtkb\" (UniqueName: \"kubernetes.io/projected/5b31cf7c-573c-4588-8cc9-948d8616af9d-kube-api-access-7qtkb\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.364010 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.465141 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.465208 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.465324 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.465405 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qtkb\" (UniqueName: \"kubernetes.io/projected/5b31cf7c-573c-4588-8cc9-948d8616af9d-kube-api-access-7qtkb\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.465442 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.465488 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.470112 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.470271 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.471775 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.472089 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.472617 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.490885 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qtkb\" (UniqueName: \"kubernetes.io/projected/5b31cf7c-573c-4588-8cc9-948d8616af9d-kube-api-access-7qtkb\") pod \"neutron-dhcp-openstack-openstack-cell1-gqrmr\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:43 crc kubenswrapper[4794]: I0310 12:07:43.666832 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:07:44 crc kubenswrapper[4794]: I0310 12:07:44.250433 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr"] Mar 10 12:07:45 crc kubenswrapper[4794]: I0310 12:07:45.245661 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" event={"ID":"5b31cf7c-573c-4588-8cc9-948d8616af9d","Type":"ContainerStarted","Data":"7265a71169e5707df5727e87de153cb295b0e4d8dc18fa09371f95164fde6e37"} Mar 10 12:07:45 crc kubenswrapper[4794]: I0310 12:07:45.245975 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" event={"ID":"5b31cf7c-573c-4588-8cc9-948d8616af9d","Type":"ContainerStarted","Data":"660f7cd7aabbc312cf2d2b8a51f4d768359c6d397316ea3727e1b8a25f2657db"} Mar 10 12:07:45 crc kubenswrapper[4794]: I0310 12:07:45.268595 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" podStartSLOduration=1.713174001 podStartE2EDuration="2.268573538s" podCreationTimestamp="2026-03-10 12:07:43 +0000 UTC" firstStartedPulling="2026-03-10 12:07:44.262067682 +0000 UTC m=+8613.018238500" lastFinishedPulling="2026-03-10 12:07:44.817467219 +0000 UTC m=+8613.573638037" observedRunningTime="2026-03-10 12:07:45.262700767 +0000 UTC m=+8614.018871605" watchObservedRunningTime="2026-03-10 12:07:45.268573538 +0000 UTC m=+8614.024744356" Mar 10 12:08:00 crc kubenswrapper[4794]: I0310 12:08:00.152623 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552408-7zzg5"] Mar 10 12:08:00 crc kubenswrapper[4794]: I0310 12:08:00.154651 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552408-7zzg5" Mar 10 12:08:00 crc kubenswrapper[4794]: I0310 12:08:00.163127 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 12:08:00 crc kubenswrapper[4794]: I0310 12:08:00.163949 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 12:08:00 crc kubenswrapper[4794]: I0310 12:08:00.164816 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 12:08:00 crc kubenswrapper[4794]: I0310 12:08:00.168121 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552408-7zzg5"] Mar 10 12:08:00 crc kubenswrapper[4794]: I0310 12:08:00.240545 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b62x6\" (UniqueName: \"kubernetes.io/projected/305ae941-1e89-4b65-ab6d-483a43528714-kube-api-access-b62x6\") pod \"auto-csr-approver-29552408-7zzg5\" (UID: \"305ae941-1e89-4b65-ab6d-483a43528714\") " pod="openshift-infra/auto-csr-approver-29552408-7zzg5" Mar 10 12:08:00 crc kubenswrapper[4794]: I0310 12:08:00.343363 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b62x6\" (UniqueName: \"kubernetes.io/projected/305ae941-1e89-4b65-ab6d-483a43528714-kube-api-access-b62x6\") pod \"auto-csr-approver-29552408-7zzg5\" (UID: \"305ae941-1e89-4b65-ab6d-483a43528714\") " pod="openshift-infra/auto-csr-approver-29552408-7zzg5" Mar 10 12:08:00 crc kubenswrapper[4794]: I0310 12:08:00.375745 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b62x6\" (UniqueName: \"kubernetes.io/projected/305ae941-1e89-4b65-ab6d-483a43528714-kube-api-access-b62x6\") pod \"auto-csr-approver-29552408-7zzg5\" (UID: \"305ae941-1e89-4b65-ab6d-483a43528714\") " pod="openshift-infra/auto-csr-approver-29552408-7zzg5" Mar 10 12:08:00 crc kubenswrapper[4794]: I0310 12:08:00.490570 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552408-7zzg5" Mar 10 12:08:01 crc kubenswrapper[4794]: W0310 12:08:01.020263 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod305ae941_1e89_4b65_ab6d_483a43528714.slice/crio-ffdc5027f7d6ee74e22fa8a7bbf16ac1592ca0964e7d7ed40c1f3cd69168f94c WatchSource:0}: Error finding container ffdc5027f7d6ee74e22fa8a7bbf16ac1592ca0964e7d7ed40c1f3cd69168f94c: Status 404 returned error can't find the container with id ffdc5027f7d6ee74e22fa8a7bbf16ac1592ca0964e7d7ed40c1f3cd69168f94c Mar 10 12:08:01 crc kubenswrapper[4794]: I0310 12:08:01.022159 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552408-7zzg5"] Mar 10 12:08:01 crc kubenswrapper[4794]: I0310 12:08:01.416693 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552408-7zzg5" event={"ID":"305ae941-1e89-4b65-ab6d-483a43528714","Type":"ContainerStarted","Data":"ffdc5027f7d6ee74e22fa8a7bbf16ac1592ca0964e7d7ed40c1f3cd69168f94c"} Mar 10 12:08:03 crc kubenswrapper[4794]: I0310 12:08:03.447607 4794 generic.go:334] "Generic (PLEG): container finished" podID="305ae941-1e89-4b65-ab6d-483a43528714" containerID="761e0a72d89de4469b7f52862e87ab6241292aedfa9c2cec54c5d67da4e5e929" exitCode=0 Mar 10 12:08:03 crc kubenswrapper[4794]: I0310 12:08:03.447715 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552408-7zzg5" event={"ID":"305ae941-1e89-4b65-ab6d-483a43528714","Type":"ContainerDied","Data":"761e0a72d89de4469b7f52862e87ab6241292aedfa9c2cec54c5d67da4e5e929"} Mar 10 12:08:04 crc kubenswrapper[4794]: I0310 12:08:04.837162 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552408-7zzg5" Mar 10 12:08:04 crc kubenswrapper[4794]: I0310 12:08:04.945913 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b62x6\" (UniqueName: \"kubernetes.io/projected/305ae941-1e89-4b65-ab6d-483a43528714-kube-api-access-b62x6\") pod \"305ae941-1e89-4b65-ab6d-483a43528714\" (UID: \"305ae941-1e89-4b65-ab6d-483a43528714\") " Mar 10 12:08:04 crc kubenswrapper[4794]: I0310 12:08:04.955622 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305ae941-1e89-4b65-ab6d-483a43528714-kube-api-access-b62x6" (OuterVolumeSpecName: "kube-api-access-b62x6") pod "305ae941-1e89-4b65-ab6d-483a43528714" (UID: "305ae941-1e89-4b65-ab6d-483a43528714"). InnerVolumeSpecName "kube-api-access-b62x6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:08:05 crc kubenswrapper[4794]: I0310 12:08:05.048349 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b62x6\" (UniqueName: \"kubernetes.io/projected/305ae941-1e89-4b65-ab6d-483a43528714-kube-api-access-b62x6\") on node \"crc\" DevicePath \"\"" Mar 10 12:08:05 crc kubenswrapper[4794]: I0310 12:08:05.470700 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552408-7zzg5" event={"ID":"305ae941-1e89-4b65-ab6d-483a43528714","Type":"ContainerDied","Data":"ffdc5027f7d6ee74e22fa8a7bbf16ac1592ca0964e7d7ed40c1f3cd69168f94c"} Mar 10 12:08:05 crc kubenswrapper[4794]: I0310 12:08:05.470903 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffdc5027f7d6ee74e22fa8a7bbf16ac1592ca0964e7d7ed40c1f3cd69168f94c" Mar 10 12:08:05 crc kubenswrapper[4794]: I0310 12:08:05.470754 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552408-7zzg5" Mar 10 12:08:05 crc kubenswrapper[4794]: E0310 12:08:05.550182 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod305ae941_1e89_4b65_ab6d_483a43528714.slice\": RecentStats: unable to find data in memory cache]" Mar 10 12:08:05 crc kubenswrapper[4794]: I0310 12:08:05.914814 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552402-bhfxt"] Mar 10 12:08:05 crc kubenswrapper[4794]: I0310 12:08:05.927006 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552402-bhfxt"] Mar 10 12:08:06 crc kubenswrapper[4794]: I0310 12:08:06.012103 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de9c58c4-8144-4434-9555-c3617f91a2ee" path="/var/lib/kubelet/pods/de9c58c4-8144-4434-9555-c3617f91a2ee/volumes" Mar 10 12:08:54 crc kubenswrapper[4794]: I0310 12:08:54.494093 4794 scope.go:117] "RemoveContainer" containerID="35fbd660d1e03d32da4110a53fa0e34a10c63a4ad841b7639e2e2d6b9e5c85af" Mar 10 12:09:00 crc kubenswrapper[4794]: I0310 12:09:00.129968 4794 generic.go:334] "Generic (PLEG): container finished" podID="5b31cf7c-573c-4588-8cc9-948d8616af9d" containerID="7265a71169e5707df5727e87de153cb295b0e4d8dc18fa09371f95164fde6e37" exitCode=0 Mar 10 12:09:00 crc kubenswrapper[4794]: I0310 12:09:00.130085 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" event={"ID":"5b31cf7c-573c-4588-8cc9-948d8616af9d","Type":"ContainerDied","Data":"7265a71169e5707df5727e87de153cb295b0e4d8dc18fa09371f95164fde6e37"} Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.647027 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.804288 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-inventory\") pod \"5b31cf7c-573c-4588-8cc9-948d8616af9d\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.804488 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-neutron-dhcp-combined-ca-bundle\") pod \"5b31cf7c-573c-4588-8cc9-948d8616af9d\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.804598 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-neutron-dhcp-agent-neutron-config-0\") pod \"5b31cf7c-573c-4588-8cc9-948d8616af9d\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.804741 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-ceph\") pod \"5b31cf7c-573c-4588-8cc9-948d8616af9d\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.804809 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qtkb\" (UniqueName: \"kubernetes.io/projected/5b31cf7c-573c-4588-8cc9-948d8616af9d-kube-api-access-7qtkb\") pod \"5b31cf7c-573c-4588-8cc9-948d8616af9d\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.804847 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-ssh-key-openstack-cell1\") pod \"5b31cf7c-573c-4588-8cc9-948d8616af9d\" (UID: \"5b31cf7c-573c-4588-8cc9-948d8616af9d\") " Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.811857 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b31cf7c-573c-4588-8cc9-948d8616af9d-kube-api-access-7qtkb" (OuterVolumeSpecName: "kube-api-access-7qtkb") pod "5b31cf7c-573c-4588-8cc9-948d8616af9d" (UID: "5b31cf7c-573c-4588-8cc9-948d8616af9d"). InnerVolumeSpecName "kube-api-access-7qtkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.812055 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "5b31cf7c-573c-4588-8cc9-948d8616af9d" (UID: "5b31cf7c-573c-4588-8cc9-948d8616af9d"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.827208 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-ceph" (OuterVolumeSpecName: "ceph") pod "5b31cf7c-573c-4588-8cc9-948d8616af9d" (UID: "5b31cf7c-573c-4588-8cc9-948d8616af9d"). 
InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.835423 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-inventory" (OuterVolumeSpecName: "inventory") pod "5b31cf7c-573c-4588-8cc9-948d8616af9d" (UID: "5b31cf7c-573c-4588-8cc9-948d8616af9d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.846747 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5b31cf7c-573c-4588-8cc9-948d8616af9d" (UID: "5b31cf7c-573c-4588-8cc9-948d8616af9d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.858316 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "5b31cf7c-573c-4588-8cc9-948d8616af9d" (UID: "5b31cf7c-573c-4588-8cc9-948d8616af9d"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.908847 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.908901 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.908921 4794 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.908942 4794 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.908961 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5b31cf7c-573c-4588-8cc9-948d8616af9d-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:01 crc kubenswrapper[4794]: I0310 12:09:01.908979 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qtkb\" (UniqueName: \"kubernetes.io/projected/5b31cf7c-573c-4588-8cc9-948d8616af9d-kube-api-access-7qtkb\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:02 crc kubenswrapper[4794]: I0310 12:09:02.155153 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" event={"ID":"5b31cf7c-573c-4588-8cc9-948d8616af9d","Type":"ContainerDied","Data":"660f7cd7aabbc312cf2d2b8a51f4d768359c6d397316ea3727e1b8a25f2657db"} Mar 10 12:09:02 crc kubenswrapper[4794]: I0310 12:09:02.155193 4794 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="660f7cd7aabbc312cf2d2b8a51f4d768359c6d397316ea3727e1b8a25f2657db" Mar 10 12:09:02 crc kubenswrapper[4794]: I0310 12:09:02.155239 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-gqrmr" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.361275 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mjxlk"] Mar 10 12:09:13 crc kubenswrapper[4794]: E0310 12:09:13.362657 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b31cf7c-573c-4588-8cc9-948d8616af9d" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.362683 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b31cf7c-573c-4588-8cc9-948d8616af9d" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 10 12:09:13 crc kubenswrapper[4794]: E0310 12:09:13.362742 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305ae941-1e89-4b65-ab6d-483a43528714" containerName="oc" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.362754 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="305ae941-1e89-4b65-ab6d-483a43528714" containerName="oc" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.363162 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="305ae941-1e89-4b65-ab6d-483a43528714" containerName="oc" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.363205 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b31cf7c-573c-4588-8cc9-948d8616af9d" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.366207 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.376270 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cgmg\" (UniqueName: \"kubernetes.io/projected/bce55540-6f2b-405f-83ed-5cc71fff0e97-kube-api-access-5cgmg\") pod \"redhat-marketplace-mjxlk\" (UID: \"bce55540-6f2b-405f-83ed-5cc71fff0e97\") " pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.376348 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bce55540-6f2b-405f-83ed-5cc71fff0e97-catalog-content\") pod \"redhat-marketplace-mjxlk\" (UID: \"bce55540-6f2b-405f-83ed-5cc71fff0e97\") " pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.376374 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bce55540-6f2b-405f-83ed-5cc71fff0e97-utilities\") pod \"redhat-marketplace-mjxlk\" (UID: \"bce55540-6f2b-405f-83ed-5cc71fff0e97\") " pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.378758 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjxlk"] Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.478098 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cgmg\" (UniqueName: \"kubernetes.io/projected/bce55540-6f2b-405f-83ed-5cc71fff0e97-kube-api-access-5cgmg\") pod \"redhat-marketplace-mjxlk\" (UID: \"bce55540-6f2b-405f-83ed-5cc71fff0e97\") " pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.478502 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bce55540-6f2b-405f-83ed-5cc71fff0e97-catalog-content\") pod \"redhat-marketplace-mjxlk\" (UID: \"bce55540-6f2b-405f-83ed-5cc71fff0e97\") " pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.478527 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bce55540-6f2b-405f-83ed-5cc71fff0e97-utilities\") pod \"redhat-marketplace-mjxlk\" (UID: \"bce55540-6f2b-405f-83ed-5cc71fff0e97\") " pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.479087 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bce55540-6f2b-405f-83ed-5cc71fff0e97-catalog-content\") pod \"redhat-marketplace-mjxlk\" (UID: \"bce55540-6f2b-405f-83ed-5cc71fff0e97\") " pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.479123 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bce55540-6f2b-405f-83ed-5cc71fff0e97-utilities\") pod \"redhat-marketplace-mjxlk\" (UID: \"bce55540-6f2b-405f-83ed-5cc71fff0e97\") " pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.500581 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5cgmg\" (UniqueName: \"kubernetes.io/projected/bce55540-6f2b-405f-83ed-5cc71fff0e97-kube-api-access-5cgmg\") pod \"redhat-marketplace-mjxlk\" (UID: \"bce55540-6f2b-405f-83ed-5cc71fff0e97\") " pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:13 crc kubenswrapper[4794]: I0310 12:09:13.694272 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:14 crc kubenswrapper[4794]: I0310 12:09:14.283800 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjxlk"] Mar 10 12:09:15 crc kubenswrapper[4794]: I0310 12:09:15.312468 4794 generic.go:334] "Generic (PLEG): container finished" podID="bce55540-6f2b-405f-83ed-5cc71fff0e97" containerID="2119edb497201a23399465fb31dccf0369d2177968fb59e67cb38f02943a70f8" exitCode=0 Mar 10 12:09:15 crc kubenswrapper[4794]: I0310 12:09:15.312563 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjxlk" event={"ID":"bce55540-6f2b-405f-83ed-5cc71fff0e97","Type":"ContainerDied","Data":"2119edb497201a23399465fb31dccf0369d2177968fb59e67cb38f02943a70f8"} Mar 10 12:09:15 crc kubenswrapper[4794]: I0310 12:09:15.312732 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjxlk" event={"ID":"bce55540-6f2b-405f-83ed-5cc71fff0e97","Type":"ContainerStarted","Data":"3eade770b3cd016df2f4a181bfe1e1d2363e2004a9620ac3108d3519203507f4"} Mar 10 12:09:15 crc kubenswrapper[4794]: I0310 12:09:15.317304 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 12:09:16 crc kubenswrapper[4794]: I0310 12:09:16.323137 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjxlk" event={"ID":"bce55540-6f2b-405f-83ed-5cc71fff0e97","Type":"ContainerStarted","Data":"91f9c4e0c7fc3dd9d9c9a7069bd78a544cd8a01029fed2bb7254f828b79fdc73"} Mar 10 12:09:17 crc kubenswrapper[4794]: I0310 12:09:17.308670 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 12:09:17 crc kubenswrapper[4794]: I0310 12:09:17.309221 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="00faf545-0a0c-474d-9288-169d52a10e12" containerName="nova-cell0-conductor-conductor" containerID="cri-o://1e8e6a6f4333691a39f04023fe1e4516264545d01b2c123a1d46f75f22f60eab" gracePeriod=30 Mar 10 12:09:17 crc kubenswrapper[4794]: I0310 12:09:17.366372 4794 generic.go:334] "Generic (PLEG): container finished" podID="bce55540-6f2b-405f-83ed-5cc71fff0e97" containerID="91f9c4e0c7fc3dd9d9c9a7069bd78a544cd8a01029fed2bb7254f828b79fdc73" exitCode=0 Mar 10 12:09:17 crc kubenswrapper[4794]: I0310 12:09:17.366454 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjxlk" event={"ID":"bce55540-6f2b-405f-83ed-5cc71fff0e97","Type":"ContainerDied","Data":"91f9c4e0c7fc3dd9d9c9a7069bd78a544cd8a01029fed2bb7254f828b79fdc73"} Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.056365 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.056786 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="73f45dee-40e0-4370-9ab8-de6d2998fa6b" 
containerName="nova-cell1-conductor-conductor" containerID="cri-o://674bd38098e7a6e2fa404f4cfd40b1373d5737d8e1b2ddb4f428c5bbd611fb3a" gracePeriod=30 Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.151608 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.151990 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9f2758f6-ec0b-40db-a619-65e823f98cc9" containerName="nova-scheduler-scheduler" containerID="cri-o://848e79df9401a9915b1f9c828f03f2c156d90bb24e2fb89f65dee4ecaa3a387d" gracePeriod=30 Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.161884 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.162110 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="abb3b031-7a99-4128-ae47-c28a091f5ee3" containerName="nova-api-log" containerID="cri-o://a51a2e470e0b29c1a262060fb3b27c824b3d6976fe8f641d698c8c5ba765a22f" gracePeriod=30 Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.162555 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="abb3b031-7a99-4128-ae47-c28a091f5ee3" containerName="nova-api-api" containerID="cri-o://e17f2db38884b006e2dc81fc9c71a9946033f8c2efb20884c96af86be4d89c73" gracePeriod=30 Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.231079 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.231315 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="832f4f9e-beca-4825-b367-2efa49512dd8" containerName="nova-metadata-log" containerID="cri-o://5674e7f057fe6006193e20972cb4ee463a2d2dfa5116786d28931c2e355a90b0" gracePeriod=30 Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.231462 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="832f4f9e-beca-4825-b367-2efa49512dd8" containerName="nova-metadata-metadata" containerID="cri-o://57a909fe0d6b139cf0f65ae6c67a2896d2054a7bf59f1fd5b5e1cc0c0b0edb86" gracePeriod=30 Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.395293 4794 generic.go:334] "Generic (PLEG): container finished" podID="832f4f9e-beca-4825-b367-2efa49512dd8" containerID="5674e7f057fe6006193e20972cb4ee463a2d2dfa5116786d28931c2e355a90b0" exitCode=143 Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.395391 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"832f4f9e-beca-4825-b367-2efa49512dd8","Type":"ContainerDied","Data":"5674e7f057fe6006193e20972cb4ee463a2d2dfa5116786d28931c2e355a90b0"} Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.398403 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjxlk" event={"ID":"bce55540-6f2b-405f-83ed-5cc71fff0e97","Type":"ContainerStarted","Data":"2655de5b925744176a7b4689adb863daf37c746d5b633c4e08b6e48ff85cdb73"} Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.402286 4794 generic.go:334] "Generic (PLEG): container finished" podID="abb3b031-7a99-4128-ae47-c28a091f5ee3" containerID="a51a2e470e0b29c1a262060fb3b27c824b3d6976fe8f641d698c8c5ba765a22f" exitCode=143 Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.402319 
4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"abb3b031-7a99-4128-ae47-c28a091f5ee3","Type":"ContainerDied","Data":"a51a2e470e0b29c1a262060fb3b27c824b3d6976fe8f641d698c8c5ba765a22f"} Mar 10 12:09:18 crc kubenswrapper[4794]: I0310 12:09:18.425235 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mjxlk" podStartSLOduration=2.971104735 podStartE2EDuration="5.425214675s" podCreationTimestamp="2026-03-10 12:09:13 +0000 UTC" firstStartedPulling="2026-03-10 12:09:15.316851187 +0000 UTC m=+8704.073022045" lastFinishedPulling="2026-03-10 12:09:17.770961167 +0000 UTC m=+8706.527131985" observedRunningTime="2026-03-10 12:09:18.41694781 +0000 UTC m=+8707.173118638" watchObservedRunningTime="2026-03-10 12:09:18.425214675 +0000 UTC m=+8707.181385493" Mar 10 12:09:19 crc kubenswrapper[4794]: E0310 12:09:19.100687 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="674bd38098e7a6e2fa404f4cfd40b1373d5737d8e1b2ddb4f428c5bbd611fb3a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 12:09:19 crc kubenswrapper[4794]: E0310 12:09:19.104895 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="674bd38098e7a6e2fa404f4cfd40b1373d5737d8e1b2ddb4f428c5bbd611fb3a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 12:09:19 crc kubenswrapper[4794]: E0310 12:09:19.106453 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="674bd38098e7a6e2fa404f4cfd40b1373d5737d8e1b2ddb4f428c5bbd611fb3a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 12:09:19 crc kubenswrapper[4794]: E0310 12:09:19.106494 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="73f45dee-40e0-4370-9ab8-de6d2998fa6b" containerName="nova-cell1-conductor-conductor" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.431798 4794 generic.go:334] "Generic (PLEG): container finished" podID="73f45dee-40e0-4370-9ab8-de6d2998fa6b" containerID="674bd38098e7a6e2fa404f4cfd40b1373d5737d8e1b2ddb4f428c5bbd611fb3a" exitCode=0 Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.431933 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"73f45dee-40e0-4370-9ab8-de6d2998fa6b","Type":"ContainerDied","Data":"674bd38098e7a6e2fa404f4cfd40b1373d5737d8e1b2ddb4f428c5bbd611fb3a"} Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.436700 4794 generic.go:334] "Generic (PLEG): container finished" podID="00faf545-0a0c-474d-9288-169d52a10e12" containerID="1e8e6a6f4333691a39f04023fe1e4516264545d01b2c123a1d46f75f22f60eab" exitCode=0 Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.436744 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"00faf545-0a0c-474d-9288-169d52a10e12","Type":"ContainerDied","Data":"1e8e6a6f4333691a39f04023fe1e4516264545d01b2c123a1d46f75f22f60eab"} Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.436772 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"00faf545-0a0c-474d-9288-169d52a10e12","Type":"ContainerDied","Data":"0880a405e190d356b0fc717784e9f23ac3cfc2486d675dca08319753b31d50f3"} Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.436784 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0880a405e190d356b0fc717784e9f23ac3cfc2486d675dca08319753b31d50f3" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.437306 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.609063 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.628735 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00faf545-0a0c-474d-9288-169d52a10e12-config-data\") pod \"00faf545-0a0c-474d-9288-169d52a10e12\" (UID: \"00faf545-0a0c-474d-9288-169d52a10e12\") " Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.628832 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00faf545-0a0c-474d-9288-169d52a10e12-combined-ca-bundle\") pod \"00faf545-0a0c-474d-9288-169d52a10e12\" (UID: \"00faf545-0a0c-474d-9288-169d52a10e12\") " Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.629021 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgvd8\" (UniqueName: \"kubernetes.io/projected/00faf545-0a0c-474d-9288-169d52a10e12-kube-api-access-wgvd8\") pod \"00faf545-0a0c-474d-9288-169d52a10e12\" (UID: \"00faf545-0a0c-474d-9288-169d52a10e12\") " Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.684004 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00faf545-0a0c-474d-9288-169d52a10e12-kube-api-access-wgvd8" (OuterVolumeSpecName: "kube-api-access-wgvd8") pod "00faf545-0a0c-474d-9288-169d52a10e12" (UID: "00faf545-0a0c-474d-9288-169d52a10e12"). InnerVolumeSpecName "kube-api-access-wgvd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.687241 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00faf545-0a0c-474d-9288-169d52a10e12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00faf545-0a0c-474d-9288-169d52a10e12" (UID: "00faf545-0a0c-474d-9288-169d52a10e12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.688434 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00faf545-0a0c-474d-9288-169d52a10e12-config-data" (OuterVolumeSpecName: "config-data") pod "00faf545-0a0c-474d-9288-169d52a10e12" (UID: "00faf545-0a0c-474d-9288-169d52a10e12"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.732186 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f45dee-40e0-4370-9ab8-de6d2998fa6b-combined-ca-bundle\") pod \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\" (UID: \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\") " Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.732414 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f45dee-40e0-4370-9ab8-de6d2998fa6b-config-data\") pod \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\" (UID: \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\") " Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.732457 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrvvd\" (UniqueName: \"kubernetes.io/projected/73f45dee-40e0-4370-9ab8-de6d2998fa6b-kube-api-access-nrvvd\") pod \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\" (UID: \"73f45dee-40e0-4370-9ab8-de6d2998fa6b\") " Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.732884 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgvd8\" (UniqueName: \"kubernetes.io/projected/00faf545-0a0c-474d-9288-169d52a10e12-kube-api-access-wgvd8\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.732897 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00faf545-0a0c-474d-9288-169d52a10e12-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.732906 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00faf545-0a0c-474d-9288-169d52a10e12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.735311 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f45dee-40e0-4370-9ab8-de6d2998fa6b-kube-api-access-nrvvd" (OuterVolumeSpecName: "kube-api-access-nrvvd") pod "73f45dee-40e0-4370-9ab8-de6d2998fa6b" (UID: "73f45dee-40e0-4370-9ab8-de6d2998fa6b"). InnerVolumeSpecName "kube-api-access-nrvvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.761661 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f45dee-40e0-4370-9ab8-de6d2998fa6b-config-data" (OuterVolumeSpecName: "config-data") pod "73f45dee-40e0-4370-9ab8-de6d2998fa6b" (UID: "73f45dee-40e0-4370-9ab8-de6d2998fa6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.765130 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f45dee-40e0-4370-9ab8-de6d2998fa6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73f45dee-40e0-4370-9ab8-de6d2998fa6b" (UID: "73f45dee-40e0-4370-9ab8-de6d2998fa6b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.834930 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f45dee-40e0-4370-9ab8-de6d2998fa6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.834967 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f45dee-40e0-4370-9ab8-de6d2998fa6b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.834980 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrvvd\" (UniqueName: \"kubernetes.io/projected/73f45dee-40e0-4370-9ab8-de6d2998fa6b-kube-api-access-nrvvd\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:20 crc kubenswrapper[4794]: I0310 12:09:20.996786 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.140859 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f2758f6-ec0b-40db-a619-65e823f98cc9-combined-ca-bundle\") pod \"9f2758f6-ec0b-40db-a619-65e823f98cc9\" (UID: \"9f2758f6-ec0b-40db-a619-65e823f98cc9\") " Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.141185 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwlkk\" (UniqueName: \"kubernetes.io/projected/9f2758f6-ec0b-40db-a619-65e823f98cc9-kube-api-access-vwlkk\") pod \"9f2758f6-ec0b-40db-a619-65e823f98cc9\" (UID: \"9f2758f6-ec0b-40db-a619-65e823f98cc9\") " Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.141682 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f2758f6-ec0b-40db-a619-65e823f98cc9-config-data\") pod \"9f2758f6-ec0b-40db-a619-65e823f98cc9\" (UID: \"9f2758f6-ec0b-40db-a619-65e823f98cc9\") " Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.153250 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f2758f6-ec0b-40db-a619-65e823f98cc9-kube-api-access-vwlkk" (OuterVolumeSpecName: "kube-api-access-vwlkk") pod "9f2758f6-ec0b-40db-a619-65e823f98cc9" (UID: "9f2758f6-ec0b-40db-a619-65e823f98cc9"). InnerVolumeSpecName "kube-api-access-vwlkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.177202 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f2758f6-ec0b-40db-a619-65e823f98cc9-config-data" (OuterVolumeSpecName: "config-data") pod "9f2758f6-ec0b-40db-a619-65e823f98cc9" (UID: "9f2758f6-ec0b-40db-a619-65e823f98cc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.202178 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f2758f6-ec0b-40db-a619-65e823f98cc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f2758f6-ec0b-40db-a619-65e823f98cc9" (UID: "9f2758f6-ec0b-40db-a619-65e823f98cc9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.274857 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwlkk\" (UniqueName: \"kubernetes.io/projected/9f2758f6-ec0b-40db-a619-65e823f98cc9-kube-api-access-vwlkk\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.275171 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f2758f6-ec0b-40db-a619-65e823f98cc9-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.275185 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f2758f6-ec0b-40db-a619-65e823f98cc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.451671 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"73f45dee-40e0-4370-9ab8-de6d2998fa6b","Type":"ContainerDied","Data":"c45fc6c8bd143d5dcdb58df63e77e2b0bb63c118016aa00e3758a63edd126e2b"} Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.451737 4794 scope.go:117] "RemoveContainer" containerID="674bd38098e7a6e2fa404f4cfd40b1373d5737d8e1b2ddb4f428c5bbd611fb3a" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.451861 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.455405 4794 generic.go:334] "Generic (PLEG): container finished" podID="9f2758f6-ec0b-40db-a619-65e823f98cc9" containerID="848e79df9401a9915b1f9c828f03f2c156d90bb24e2fb89f65dee4ecaa3a387d" exitCode=0 Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.455444 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f2758f6-ec0b-40db-a619-65e823f98cc9","Type":"ContainerDied","Data":"848e79df9401a9915b1f9c828f03f2c156d90bb24e2fb89f65dee4ecaa3a387d"} Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.455489 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f2758f6-ec0b-40db-a619-65e823f98cc9","Type":"ContainerDied","Data":"80385cd692e52efb8e8dda8eacbf34179f4e8c2208f51fce442ca5fdaed2b3cd"} Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.455509 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.455532 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.488129 4794 scope.go:117] "RemoveContainer" containerID="848e79df9401a9915b1f9c828f03f2c156d90bb24e2fb89f65dee4ecaa3a387d" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.505217 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.526176 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.537214 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.548314 4794 scope.go:117] "RemoveContainer" containerID="848e79df9401a9915b1f9c828f03f2c156d90bb24e2fb89f65dee4ecaa3a387d" Mar 10 12:09:21 crc kubenswrapper[4794]: E0310 12:09:21.565137 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"848e79df9401a9915b1f9c828f03f2c156d90bb24e2fb89f65dee4ecaa3a387d\": container with ID starting with 848e79df9401a9915b1f9c828f03f2c156d90bb24e2fb89f65dee4ecaa3a387d not found: ID does not exist" containerID="848e79df9401a9915b1f9c828f03f2c156d90bb24e2fb89f65dee4ecaa3a387d" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.565197 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"848e79df9401a9915b1f9c828f03f2c156d90bb24e2fb89f65dee4ecaa3a387d"} err="failed to get container status \"848e79df9401a9915b1f9c828f03f2c156d90bb24e2fb89f65dee4ecaa3a387d\": rpc error: code = NotFound desc = could not find container \"848e79df9401a9915b1f9c828f03f2c156d90bb24e2fb89f65dee4ecaa3a387d\": container with ID starting with 848e79df9401a9915b1f9c828f03f2c156d90bb24e2fb89f65dee4ecaa3a387d not found: ID does not exist" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.578080 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.589858 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 12:09:21 crc kubenswrapper[4794]: E0310 12:09:21.590374 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00faf545-0a0c-474d-9288-169d52a10e12" containerName="nova-cell0-conductor-conductor" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.590392 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="00faf545-0a0c-474d-9288-169d52a10e12" containerName="nova-cell0-conductor-conductor" Mar 10 12:09:21 crc kubenswrapper[4794]: E0310 12:09:21.590427 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f2758f6-ec0b-40db-a619-65e823f98cc9" containerName="nova-scheduler-scheduler" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.590433 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f2758f6-ec0b-40db-a619-65e823f98cc9" containerName="nova-scheduler-scheduler" Mar 10 12:09:21 crc kubenswrapper[4794]: E0310 12:09:21.590447 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f45dee-40e0-4370-9ab8-de6d2998fa6b" containerName="nova-cell1-conductor-conductor" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.590452 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f45dee-40e0-4370-9ab8-de6d2998fa6b" containerName="nova-cell1-conductor-conductor" Mar 10 12:09:21 crc 
kubenswrapper[4794]: I0310 12:09:21.590686 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="00faf545-0a0c-474d-9288-169d52a10e12" containerName="nova-cell0-conductor-conductor" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.590712 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f2758f6-ec0b-40db-a619-65e823f98cc9" containerName="nova-scheduler-scheduler" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.590724 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="73f45dee-40e0-4370-9ab8-de6d2998fa6b" containerName="nova-cell1-conductor-conductor" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.591891 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.595680 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.602134 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.616197 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.618085 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.623998 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.625811 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.634555 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="832f4f9e-beca-4825-b367-2efa49512dd8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.123:8775/\": read tcp 10.217.0.2:56876->10.217.1.123:8775: read: connection reset by peer" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.634602 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="832f4f9e-beca-4825-b367-2efa49512dd8" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.123:8775/\": read tcp 10.217.0.2:56874->10.217.1.123:8775: read: connection reset by peer" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.640033 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.654571 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.663376 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.664743 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.669587 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.683379 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95009e92-9d1f-4334-9322-90ff0a8784bd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"95009e92-9d1f-4334-9322-90ff0a8784bd\") " pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.683424 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnx5t\" (UniqueName: \"kubernetes.io/projected/95009e92-9d1f-4334-9322-90ff0a8784bd-kube-api-access-tnx5t\") pod \"nova-cell0-conductor-0\" (UID: \"95009e92-9d1f-4334-9322-90ff0a8784bd\") " pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.683502 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30125730-8a03-45b3-b2ab-78206ddcde9e-config-data\") pod \"nova-scheduler-0\" (UID: \"30125730-8a03-45b3-b2ab-78206ddcde9e\") " pod="openstack/nova-scheduler-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.683683 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzr9g\" (UniqueName: \"kubernetes.io/projected/30125730-8a03-45b3-b2ab-78206ddcde9e-kube-api-access-zzr9g\") pod \"nova-scheduler-0\" (UID: \"30125730-8a03-45b3-b2ab-78206ddcde9e\") " pod="openstack/nova-scheduler-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.683713 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95009e92-9d1f-4334-9322-90ff0a8784bd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"95009e92-9d1f-4334-9322-90ff0a8784bd\") " pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.683758 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30125730-8a03-45b3-b2ab-78206ddcde9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30125730-8a03-45b3-b2ab-78206ddcde9e\") " pod="openstack/nova-scheduler-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.693388 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.787707 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bf5612b-a7fc-44b3-87cd-6ee5d892221a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9bf5612b-a7fc-44b3-87cd-6ee5d892221a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.787798 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bf5612b-a7fc-44b3-87cd-6ee5d892221a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9bf5612b-a7fc-44b3-87cd-6ee5d892221a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 
12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.787839 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzr9g\" (UniqueName: \"kubernetes.io/projected/30125730-8a03-45b3-b2ab-78206ddcde9e-kube-api-access-zzr9g\") pod \"nova-scheduler-0\" (UID: \"30125730-8a03-45b3-b2ab-78206ddcde9e\") " pod="openstack/nova-scheduler-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.787864 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95009e92-9d1f-4334-9322-90ff0a8784bd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"95009e92-9d1f-4334-9322-90ff0a8784bd\") " pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.787913 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30125730-8a03-45b3-b2ab-78206ddcde9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30125730-8a03-45b3-b2ab-78206ddcde9e\") " pod="openstack/nova-scheduler-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.787999 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95009e92-9d1f-4334-9322-90ff0a8784bd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"95009e92-9d1f-4334-9322-90ff0a8784bd\") " pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.788017 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnx5t\" (UniqueName: \"kubernetes.io/projected/95009e92-9d1f-4334-9322-90ff0a8784bd-kube-api-access-tnx5t\") pod \"nova-cell0-conductor-0\" (UID: \"95009e92-9d1f-4334-9322-90ff0a8784bd\") " pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.788038 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd85k\" (UniqueName: \"kubernetes.io/projected/9bf5612b-a7fc-44b3-87cd-6ee5d892221a-kube-api-access-fd85k\") pod \"nova-cell1-conductor-0\" (UID: \"9bf5612b-a7fc-44b3-87cd-6ee5d892221a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.788100 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30125730-8a03-45b3-b2ab-78206ddcde9e-config-data\") pod \"nova-scheduler-0\" (UID: \"30125730-8a03-45b3-b2ab-78206ddcde9e\") " pod="openstack/nova-scheduler-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.795319 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95009e92-9d1f-4334-9322-90ff0a8784bd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"95009e92-9d1f-4334-9322-90ff0a8784bd\") " pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.802984 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95009e92-9d1f-4334-9322-90ff0a8784bd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"95009e92-9d1f-4334-9322-90ff0a8784bd\") " pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.803253 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/30125730-8a03-45b3-b2ab-78206ddcde9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30125730-8a03-45b3-b2ab-78206ddcde9e\") " pod="openstack/nova-scheduler-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.803513 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30125730-8a03-45b3-b2ab-78206ddcde9e-config-data\") pod \"nova-scheduler-0\" (UID: \"30125730-8a03-45b3-b2ab-78206ddcde9e\") " pod="openstack/nova-scheduler-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.820951 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnx5t\" (UniqueName: \"kubernetes.io/projected/95009e92-9d1f-4334-9322-90ff0a8784bd-kube-api-access-tnx5t\") pod \"nova-cell0-conductor-0\" (UID: \"95009e92-9d1f-4334-9322-90ff0a8784bd\") " pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.828641 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzr9g\" (UniqueName: \"kubernetes.io/projected/30125730-8a03-45b3-b2ab-78206ddcde9e-kube-api-access-zzr9g\") pod \"nova-scheduler-0\" (UID: \"30125730-8a03-45b3-b2ab-78206ddcde9e\") " pod="openstack/nova-scheduler-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.897555 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd85k\" (UniqueName: \"kubernetes.io/projected/9bf5612b-a7fc-44b3-87cd-6ee5d892221a-kube-api-access-fd85k\") pod \"nova-cell1-conductor-0\" (UID: \"9bf5612b-a7fc-44b3-87cd-6ee5d892221a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.897679 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bf5612b-a7fc-44b3-87cd-6ee5d892221a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9bf5612b-a7fc-44b3-87cd-6ee5d892221a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.897716 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bf5612b-a7fc-44b3-87cd-6ee5d892221a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9bf5612b-a7fc-44b3-87cd-6ee5d892221a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.904223 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bf5612b-a7fc-44b3-87cd-6ee5d892221a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9bf5612b-a7fc-44b3-87cd-6ee5d892221a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.906140 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bf5612b-a7fc-44b3-87cd-6ee5d892221a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9bf5612b-a7fc-44b3-87cd-6ee5d892221a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.917121 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.922956 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd85k\" (UniqueName: \"kubernetes.io/projected/9bf5612b-a7fc-44b3-87cd-6ee5d892221a-kube-api-access-fd85k\") pod \"nova-cell1-conductor-0\" (UID: \"9bf5612b-a7fc-44b3-87cd-6ee5d892221a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 12:09:21 crc kubenswrapper[4794]: I0310 12:09:21.936153 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.041917 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00faf545-0a0c-474d-9288-169d52a10e12" path="/var/lib/kubelet/pods/00faf545-0a0c-474d-9288-169d52a10e12/volumes" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.042466 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73f45dee-40e0-4370-9ab8-de6d2998fa6b" path="/var/lib/kubelet/pods/73f45dee-40e0-4370-9ab8-de6d2998fa6b/volumes" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.043424 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f2758f6-ec0b-40db-a619-65e823f98cc9" path="/var/lib/kubelet/pods/9f2758f6-ec0b-40db-a619-65e823f98cc9/volumes" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.076883 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.472050 4794 generic.go:334] "Generic (PLEG): container finished" podID="abb3b031-7a99-4128-ae47-c28a091f5ee3" containerID="e17f2db38884b006e2dc81fc9c71a9946033f8c2efb20884c96af86be4d89c73" exitCode=0 Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.472137 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"abb3b031-7a99-4128-ae47-c28a091f5ee3","Type":"ContainerDied","Data":"e17f2db38884b006e2dc81fc9c71a9946033f8c2efb20884c96af86be4d89c73"} Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.476979 4794 generic.go:334] "Generic (PLEG): container finished" podID="832f4f9e-beca-4825-b367-2efa49512dd8" containerID="57a909fe0d6b139cf0f65ae6c67a2896d2054a7bf59f1fd5b5e1cc0c0b0edb86" exitCode=0 Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.477016 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"832f4f9e-beca-4825-b367-2efa49512dd8","Type":"ContainerDied","Data":"57a909fe0d6b139cf0f65ae6c67a2896d2054a7bf59f1fd5b5e1cc0c0b0edb86"} Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.483868 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.526983 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832f4f9e-beca-4825-b367-2efa49512dd8-config-data\") pod \"832f4f9e-beca-4825-b367-2efa49512dd8\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.527142 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/832f4f9e-beca-4825-b367-2efa49512dd8-logs\") pod \"832f4f9e-beca-4825-b367-2efa49512dd8\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.527276 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832f4f9e-beca-4825-b367-2efa49512dd8-combined-ca-bundle\") pod \"832f4f9e-beca-4825-b367-2efa49512dd8\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.527298 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snms8\" (UniqueName: \"kubernetes.io/projected/832f4f9e-beca-4825-b367-2efa49512dd8-kube-api-access-snms8\") pod \"832f4f9e-beca-4825-b367-2efa49512dd8\" (UID: \"832f4f9e-beca-4825-b367-2efa49512dd8\") " Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.528782 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/832f4f9e-beca-4825-b367-2efa49512dd8-logs" (OuterVolumeSpecName: "logs") pod "832f4f9e-beca-4825-b367-2efa49512dd8" (UID: "832f4f9e-beca-4825-b367-2efa49512dd8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.534257 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/832f4f9e-beca-4825-b367-2efa49512dd8-kube-api-access-snms8" (OuterVolumeSpecName: "kube-api-access-snms8") pod "832f4f9e-beca-4825-b367-2efa49512dd8" (UID: "832f4f9e-beca-4825-b367-2efa49512dd8"). InnerVolumeSpecName "kube-api-access-snms8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.564793 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/832f4f9e-beca-4825-b367-2efa49512dd8-config-data" (OuterVolumeSpecName: "config-data") pod "832f4f9e-beca-4825-b367-2efa49512dd8" (UID: "832f4f9e-beca-4825-b367-2efa49512dd8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.569751 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/832f4f9e-beca-4825-b367-2efa49512dd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "832f4f9e-beca-4825-b367-2efa49512dd8" (UID: "832f4f9e-beca-4825-b367-2efa49512dd8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.607182 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.633097 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/832f4f9e-beca-4825-b367-2efa49512dd8-logs\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.633391 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snms8\" (UniqueName: \"kubernetes.io/projected/832f4f9e-beca-4825-b367-2efa49512dd8-kube-api-access-snms8\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.633461 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832f4f9e-beca-4825-b367-2efa49512dd8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.633535 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832f4f9e-beca-4825-b367-2efa49512dd8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.735267 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq926\" (UniqueName: \"kubernetes.io/projected/abb3b031-7a99-4128-ae47-c28a091f5ee3-kube-api-access-sq926\") pod \"abb3b031-7a99-4128-ae47-c28a091f5ee3\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.735576 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb3b031-7a99-4128-ae47-c28a091f5ee3-combined-ca-bundle\") pod \"abb3b031-7a99-4128-ae47-c28a091f5ee3\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.735607 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abb3b031-7a99-4128-ae47-c28a091f5ee3-logs\") pod \"abb3b031-7a99-4128-ae47-c28a091f5ee3\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.735645 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abb3b031-7a99-4128-ae47-c28a091f5ee3-config-data\") pod \"abb3b031-7a99-4128-ae47-c28a091f5ee3\" (UID: \"abb3b031-7a99-4128-ae47-c28a091f5ee3\") " Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.736492 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abb3b031-7a99-4128-ae47-c28a091f5ee3-logs" (OuterVolumeSpecName: "logs") pod "abb3b031-7a99-4128-ae47-c28a091f5ee3" (UID: "abb3b031-7a99-4128-ae47-c28a091f5ee3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.738867 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abb3b031-7a99-4128-ae47-c28a091f5ee3-kube-api-access-sq926" (OuterVolumeSpecName: "kube-api-access-sq926") pod "abb3b031-7a99-4128-ae47-c28a091f5ee3" (UID: "abb3b031-7a99-4128-ae47-c28a091f5ee3"). InnerVolumeSpecName "kube-api-access-sq926". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:09:22 crc kubenswrapper[4794]: W0310 12:09:22.762234 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30125730_8a03_45b3_b2ab_78206ddcde9e.slice/crio-31d003f36a5a3038128be45282d4bdf8a9e3b88f895f98bb8f167a1d3f1b9ccf WatchSource:0}: Error finding container 31d003f36a5a3038128be45282d4bdf8a9e3b88f895f98bb8f167a1d3f1b9ccf: Status 404 returned error can't find the container with id 31d003f36a5a3038128be45282d4bdf8a9e3b88f895f98bb8f167a1d3f1b9ccf Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.778053 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.780893 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb3b031-7a99-4128-ae47-c28a091f5ee3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abb3b031-7a99-4128-ae47-c28a091f5ee3" (UID: "abb3b031-7a99-4128-ae47-c28a091f5ee3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.792507 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb3b031-7a99-4128-ae47-c28a091f5ee3-config-data" (OuterVolumeSpecName: "config-data") pod "abb3b031-7a99-4128-ae47-c28a091f5ee3" (UID: "abb3b031-7a99-4128-ae47-c28a091f5ee3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.837724 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq926\" (UniqueName: \"kubernetes.io/projected/abb3b031-7a99-4128-ae47-c28a091f5ee3-kube-api-access-sq926\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.837756 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb3b031-7a99-4128-ae47-c28a091f5ee3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.837765 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abb3b031-7a99-4128-ae47-c28a091f5ee3-logs\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.837774 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abb3b031-7a99-4128-ae47-c28a091f5ee3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.901541 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.918987 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 12:09:22 crc kubenswrapper[4794]: W0310 12:09:22.933192 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bf5612b_a7fc_44b3_87cd_6ee5d892221a.slice/crio-906d378b58e5b53c0502a8449f97c06c2af1903c127820dbdd70ab09d2c13a47 WatchSource:0}: Error finding container 906d378b58e5b53c0502a8449f97c06c2af1903c127820dbdd70ab09d2c13a47: Status 404 returned error can't find the container with id 906d378b58e5b53c0502a8449f97c06c2af1903c127820dbdd70ab09d2c13a47 Mar 10 
12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.967531 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:09:22 crc kubenswrapper[4794]: I0310 12:09:22.967598 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.491124 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9bf5612b-a7fc-44b3-87cd-6ee5d892221a","Type":"ContainerStarted","Data":"1781b31e55a96fcb22a0ca8acddabce46268e14738d6b17ae5f5545e090f350a"} Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.491544 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.491566 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9bf5612b-a7fc-44b3-87cd-6ee5d892221a","Type":"ContainerStarted","Data":"906d378b58e5b53c0502a8449f97c06c2af1903c127820dbdd70ab09d2c13a47"} Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.499712 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"95009e92-9d1f-4334-9322-90ff0a8784bd","Type":"ContainerStarted","Data":"9f7c19687d7c514f3ac9f46cba326c2247015cf0183219cd99e76a5de94edbed"} Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.499770 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"95009e92-9d1f-4334-9322-90ff0a8784bd","Type":"ContainerStarted","Data":"4511f7a002d123e576cd414ddad9aa9abab1f5e90c9d4fb1ae7ea7e4c033863a"} Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.500835 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.509848 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"832f4f9e-beca-4825-b367-2efa49512dd8","Type":"ContainerDied","Data":"aadb6a1fdfeafa2115209c6473d0ccae06a2cebf82d15173b704e8be12f48d2e"} Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.509917 4794 scope.go:117] "RemoveContainer" containerID="57a909fe0d6b139cf0f65ae6c67a2896d2054a7bf59f1fd5b5e1cc0c0b0edb86" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.509943 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.516477 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"abb3b031-7a99-4128-ae47-c28a091f5ee3","Type":"ContainerDied","Data":"66f6a0e04045968208a3b56fd362695cf6d437aa4491a2c63d6a1f31dc0c1066"} Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.516613 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.530509 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30125730-8a03-45b3-b2ab-78206ddcde9e","Type":"ContainerStarted","Data":"6e5d0d1e7ab08d6f44585d9df44a1c2fe3dde8ae18dcc703424b2e489c3a11f1"} Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.530559 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30125730-8a03-45b3-b2ab-78206ddcde9e","Type":"ContainerStarted","Data":"31d003f36a5a3038128be45282d4bdf8a9e3b88f895f98bb8f167a1d3f1b9ccf"} Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.542867 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.542841508 podStartE2EDuration="2.542841508s" podCreationTimestamp="2026-03-10 12:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 12:09:23.52177855 +0000 UTC m=+8712.277949378" watchObservedRunningTime="2026-03-10 12:09:23.542841508 +0000 UTC m=+8712.299012326" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.564492 4794 scope.go:117] "RemoveContainer" containerID="5674e7f057fe6006193e20972cb4ee463a2d2dfa5116786d28931c2e355a90b0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.574028 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.574004907 podStartE2EDuration="2.574004907s" podCreationTimestamp="2026-03-10 12:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 12:09:23.54223686 +0000 UTC m=+8712.298407678" watchObservedRunningTime="2026-03-10 12:09:23.574004907 +0000 UTC m=+8712.330175715" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.595484 4794 scope.go:117] "RemoveContainer" containerID="e17f2db38884b006e2dc81fc9c71a9946033f8c2efb20884c96af86be4d89c73" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.605632 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.605607499 podStartE2EDuration="2.605607499s" podCreationTimestamp="2026-03-10 12:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 12:09:23.567151036 +0000 UTC m=+8712.323321864" watchObservedRunningTime="2026-03-10 12:09:23.605607499 +0000 UTC m=+8712.361778337" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.637892 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.653481 4794 scope.go:117] "RemoveContainer" containerID="a51a2e470e0b29c1a262060fb3b27c824b3d6976fe8f641d698c8c5ba765a22f" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.661590 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.687926 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.695088 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:23 crc kubenswrapper[4794]: 
I0310 12:09:23.696491 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.714106 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.730134 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 12:09:23 crc kubenswrapper[4794]: E0310 12:09:23.731109 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abb3b031-7a99-4128-ae47-c28a091f5ee3" containerName="nova-api-api" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.731272 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb3b031-7a99-4128-ae47-c28a091f5ee3" containerName="nova-api-api" Mar 10 12:09:23 crc kubenswrapper[4794]: E0310 12:09:23.731315 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832f4f9e-beca-4825-b367-2efa49512dd8" containerName="nova-metadata-log" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.731325 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="832f4f9e-beca-4825-b367-2efa49512dd8" containerName="nova-metadata-log" Mar 10 12:09:23 crc kubenswrapper[4794]: E0310 12:09:23.731369 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832f4f9e-beca-4825-b367-2efa49512dd8" containerName="nova-metadata-metadata" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.731379 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="832f4f9e-beca-4825-b367-2efa49512dd8" containerName="nova-metadata-metadata" Mar 10 12:09:23 crc kubenswrapper[4794]: E0310 12:09:23.731402 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abb3b031-7a99-4128-ae47-c28a091f5ee3" containerName="nova-api-log" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.731413 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb3b031-7a99-4128-ae47-c28a091f5ee3" containerName="nova-api-log" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.731771 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="832f4f9e-beca-4825-b367-2efa49512dd8" containerName="nova-metadata-log" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.731784 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="832f4f9e-beca-4825-b367-2efa49512dd8" containerName="nova-metadata-metadata" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.731811 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="abb3b031-7a99-4128-ae47-c28a091f5ee3" containerName="nova-api-api" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.731940 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="abb3b031-7a99-4128-ae47-c28a091f5ee3" containerName="nova-api-log" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.733829 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.737868 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.747272 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.760433 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.763869 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.765201 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.766179 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.776899 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.867493 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4vmq\" (UniqueName: \"kubernetes.io/projected/2a47c198-d1ae-45d0-a160-f74c8d8a04f4-kube-api-access-t4vmq\") pod \"nova-api-0\" (UID: \"2a47c198-d1ae-45d0-a160-f74c8d8a04f4\") " pod="openstack/nova-api-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.867577 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b78cfc-5b55-45a5-9417-e6d4e602995e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"02b78cfc-5b55-45a5-9417-e6d4e602995e\") " pod="openstack/nova-metadata-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.867638 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a47c198-d1ae-45d0-a160-f74c8d8a04f4-config-data\") pod \"nova-api-0\" (UID: \"2a47c198-d1ae-45d0-a160-f74c8d8a04f4\") " pod="openstack/nova-api-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.867684 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpbdc\" (UniqueName: \"kubernetes.io/projected/02b78cfc-5b55-45a5-9417-e6d4e602995e-kube-api-access-mpbdc\") pod \"nova-metadata-0\" (UID: \"02b78cfc-5b55-45a5-9417-e6d4e602995e\") " pod="openstack/nova-metadata-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.867734 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02b78cfc-5b55-45a5-9417-e6d4e602995e-logs\") pod \"nova-metadata-0\" (UID: \"02b78cfc-5b55-45a5-9417-e6d4e602995e\") " pod="openstack/nova-metadata-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.867748 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a47c198-d1ae-45d0-a160-f74c8d8a04f4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a47c198-d1ae-45d0-a160-f74c8d8a04f4\") " pod="openstack/nova-api-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.867812 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b78cfc-5b55-45a5-9417-e6d4e602995e-config-data\") pod \"nova-metadata-0\" (UID: \"02b78cfc-5b55-45a5-9417-e6d4e602995e\") " pod="openstack/nova-metadata-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.867850 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a47c198-d1ae-45d0-a160-f74c8d8a04f4-logs\") pod \"nova-api-0\" (UID: \"2a47c198-d1ae-45d0-a160-f74c8d8a04f4\") " pod="openstack/nova-api-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.969991 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b78cfc-5b55-45a5-9417-e6d4e602995e-config-data\") pod \"nova-metadata-0\" (UID: \"02b78cfc-5b55-45a5-9417-e6d4e602995e\") " pod="openstack/nova-metadata-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.970370 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a47c198-d1ae-45d0-a160-f74c8d8a04f4-logs\") pod \"nova-api-0\" (UID: \"2a47c198-d1ae-45d0-a160-f74c8d8a04f4\") " pod="openstack/nova-api-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.970584 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4vmq\" (UniqueName: \"kubernetes.io/projected/2a47c198-d1ae-45d0-a160-f74c8d8a04f4-kube-api-access-t4vmq\") pod \"nova-api-0\" (UID: \"2a47c198-d1ae-45d0-a160-f74c8d8a04f4\") " pod="openstack/nova-api-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.970724 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b78cfc-5b55-45a5-9417-e6d4e602995e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"02b78cfc-5b55-45a5-9417-e6d4e602995e\") " pod="openstack/nova-metadata-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.970885 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a47c198-d1ae-45d0-a160-f74c8d8a04f4-config-data\") pod \"nova-api-0\" (UID: \"2a47c198-d1ae-45d0-a160-f74c8d8a04f4\") " pod="openstack/nova-api-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.971034 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpbdc\" (UniqueName: \"kubernetes.io/projected/02b78cfc-5b55-45a5-9417-e6d4e602995e-kube-api-access-mpbdc\") pod \"nova-metadata-0\" (UID: \"02b78cfc-5b55-45a5-9417-e6d4e602995e\") " pod="openstack/nova-metadata-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.971187 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a47c198-d1ae-45d0-a160-f74c8d8a04f4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a47c198-d1ae-45d0-a160-f74c8d8a04f4\") " pod="openstack/nova-api-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.971289 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02b78cfc-5b55-45a5-9417-e6d4e602995e-logs\") pod \"nova-metadata-0\" (UID: \"02b78cfc-5b55-45a5-9417-e6d4e602995e\") " pod="openstack/nova-metadata-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.971852 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02b78cfc-5b55-45a5-9417-e6d4e602995e-logs\") pod \"nova-metadata-0\" (UID: \"02b78cfc-5b55-45a5-9417-e6d4e602995e\") " pod="openstack/nova-metadata-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.970890 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a47c198-d1ae-45d0-a160-f74c8d8a04f4-logs\") pod \"nova-api-0\" (UID: \"2a47c198-d1ae-45d0-a160-f74c8d8a04f4\") " pod="openstack/nova-api-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.976323 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a47c198-d1ae-45d0-a160-f74c8d8a04f4-config-data\") pod \"nova-api-0\" (UID: \"2a47c198-d1ae-45d0-a160-f74c8d8a04f4\") " pod="openstack/nova-api-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.983149 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b78cfc-5b55-45a5-9417-e6d4e602995e-config-data\") pod \"nova-metadata-0\" (UID: \"02b78cfc-5b55-45a5-9417-e6d4e602995e\") " pod="openstack/nova-metadata-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.990831 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4vmq\" (UniqueName: \"kubernetes.io/projected/2a47c198-d1ae-45d0-a160-f74c8d8a04f4-kube-api-access-t4vmq\") pod \"nova-api-0\" (UID: \"2a47c198-d1ae-45d0-a160-f74c8d8a04f4\") " pod="openstack/nova-api-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.991481 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a47c198-d1ae-45d0-a160-f74c8d8a04f4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a47c198-d1ae-45d0-a160-f74c8d8a04f4\") " pod="openstack/nova-api-0" Mar 10 12:09:23 crc kubenswrapper[4794]: I0310 12:09:23.996406 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpbdc\" (UniqueName: \"kubernetes.io/projected/02b78cfc-5b55-45a5-9417-e6d4e602995e-kube-api-access-mpbdc\") pod \"nova-metadata-0\" (UID: \"02b78cfc-5b55-45a5-9417-e6d4e602995e\") " pod="openstack/nova-metadata-0" Mar 10 12:09:24 crc kubenswrapper[4794]: I0310 12:09:24.012886 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b78cfc-5b55-45a5-9417-e6d4e602995e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"02b78cfc-5b55-45a5-9417-e6d4e602995e\") " pod="openstack/nova-metadata-0" Mar 10 12:09:24 crc kubenswrapper[4794]: I0310 12:09:24.029537 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="832f4f9e-beca-4825-b367-2efa49512dd8" path="/var/lib/kubelet/pods/832f4f9e-beca-4825-b367-2efa49512dd8/volumes" Mar 10 12:09:24 crc kubenswrapper[4794]: I0310 12:09:24.031170 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abb3b031-7a99-4128-ae47-c28a091f5ee3" path="/var/lib/kubelet/pods/abb3b031-7a99-4128-ae47-c28a091f5ee3/volumes" Mar 10 12:09:24 crc kubenswrapper[4794]: I0310 12:09:24.058577 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 12:09:24 crc kubenswrapper[4794]: I0310 12:09:24.087358 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 12:09:24 crc kubenswrapper[4794]: I0310 12:09:24.580566 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 12:09:24 crc kubenswrapper[4794]: I0310 12:09:24.636726 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:24 crc kubenswrapper[4794]: I0310 12:09:24.806048 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 12:09:25 crc kubenswrapper[4794]: I0310 12:09:25.572944 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a47c198-d1ae-45d0-a160-f74c8d8a04f4","Type":"ContainerStarted","Data":"156b786f9a156f1c93b7336a876cde81bdfff0fbb50c0e5466b1dca1e1bee578"} Mar 10 12:09:25 crc kubenswrapper[4794]: I0310 12:09:25.573631 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a47c198-d1ae-45d0-a160-f74c8d8a04f4","Type":"ContainerStarted","Data":"941825e2000993787acc6ef426e36ea1016ab4d7712b4d796ad56035f2a17d3d"} Mar 10 12:09:25 crc kubenswrapper[4794]: I0310 12:09:25.573701 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a47c198-d1ae-45d0-a160-f74c8d8a04f4","Type":"ContainerStarted","Data":"5765baffb9980833221e9ade66fdff09a5515a13d355c8a487076a3cc3e411b4"} Mar 10 12:09:25 crc kubenswrapper[4794]: I0310 12:09:25.587564 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"02b78cfc-5b55-45a5-9417-e6d4e602995e","Type":"ContainerStarted","Data":"a0d458fcc6be59c24da3bf6f24ee20720f5e49c152575db8f13a83587e8ab007"} Mar 10 12:09:25 crc kubenswrapper[4794]: I0310 12:09:25.592008 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"02b78cfc-5b55-45a5-9417-e6d4e602995e","Type":"ContainerStarted","Data":"cc840de89d68641a5fe811c39563e1b5c3b05bc40b4a1dd46f060f77b74cfbe3"} Mar 10 12:09:25 crc kubenswrapper[4794]: I0310 12:09:25.592441 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"02b78cfc-5b55-45a5-9417-e6d4e602995e","Type":"ContainerStarted","Data":"68dd47e4f592cb45e638075cc4ab33291b581b82db7c3eeb4e5a522d4e6418c5"} Mar 10 12:09:25 crc kubenswrapper[4794]: I0310 12:09:25.601029 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.601006247 podStartE2EDuration="2.601006247s" podCreationTimestamp="2026-03-10 12:09:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 12:09:25.594502888 +0000 UTC m=+8714.350673726" watchObservedRunningTime="2026-03-10 12:09:25.601006247 +0000 UTC m=+8714.357177065" Mar 10 12:09:25 crc kubenswrapper[4794]: I0310 12:09:25.629698 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.62967965 podStartE2EDuration="2.62967965s" podCreationTimestamp="2026-03-10 12:09:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 12:09:25.618267879 +0000 UTC m=+8714.374438707" watchObservedRunningTime="2026-03-10 12:09:25.62967965 +0000 UTC m=+8714.385850468" Mar 10 12:09:26 crc kubenswrapper[4794]: I0310 12:09:26.938223 4794 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 12:09:27 crc kubenswrapper[4794]: I0310 12:09:27.536826 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjxlk"] Mar 10 12:09:27 crc kubenswrapper[4794]: I0310 12:09:27.606160 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mjxlk" podUID="bce55540-6f2b-405f-83ed-5cc71fff0e97" containerName="registry-server" containerID="cri-o://2655de5b925744176a7b4689adb863daf37c746d5b633c4e08b6e48ff85cdb73" gracePeriod=2 Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.040954 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.168811 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cgmg\" (UniqueName: \"kubernetes.io/projected/bce55540-6f2b-405f-83ed-5cc71fff0e97-kube-api-access-5cgmg\") pod \"bce55540-6f2b-405f-83ed-5cc71fff0e97\" (UID: \"bce55540-6f2b-405f-83ed-5cc71fff0e97\") " Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.168967 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bce55540-6f2b-405f-83ed-5cc71fff0e97-utilities\") pod \"bce55540-6f2b-405f-83ed-5cc71fff0e97\" (UID: \"bce55540-6f2b-405f-83ed-5cc71fff0e97\") " Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.169041 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bce55540-6f2b-405f-83ed-5cc71fff0e97-catalog-content\") pod \"bce55540-6f2b-405f-83ed-5cc71fff0e97\" (UID: \"bce55540-6f2b-405f-83ed-5cc71fff0e97\") " Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.169802 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bce55540-6f2b-405f-83ed-5cc71fff0e97-utilities" (OuterVolumeSpecName: "utilities") pod "bce55540-6f2b-405f-83ed-5cc71fff0e97" (UID: "bce55540-6f2b-405f-83ed-5cc71fff0e97"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.179070 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce55540-6f2b-405f-83ed-5cc71fff0e97-kube-api-access-5cgmg" (OuterVolumeSpecName: "kube-api-access-5cgmg") pod "bce55540-6f2b-405f-83ed-5cc71fff0e97" (UID: "bce55540-6f2b-405f-83ed-5cc71fff0e97"). InnerVolumeSpecName "kube-api-access-5cgmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.195381 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bce55540-6f2b-405f-83ed-5cc71fff0e97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bce55540-6f2b-405f-83ed-5cc71fff0e97" (UID: "bce55540-6f2b-405f-83ed-5cc71fff0e97"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.271257 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cgmg\" (UniqueName: \"kubernetes.io/projected/bce55540-6f2b-405f-83ed-5cc71fff0e97-kube-api-access-5cgmg\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.271293 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bce55540-6f2b-405f-83ed-5cc71fff0e97-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.271302 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bce55540-6f2b-405f-83ed-5cc71fff0e97-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.624958 4794 generic.go:334] "Generic (PLEG): container finished" podID="bce55540-6f2b-405f-83ed-5cc71fff0e97" containerID="2655de5b925744176a7b4689adb863daf37c746d5b633c4e08b6e48ff85cdb73" exitCode=0 Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.625046 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjxlk" event={"ID":"bce55540-6f2b-405f-83ed-5cc71fff0e97","Type":"ContainerDied","Data":"2655de5b925744176a7b4689adb863daf37c746d5b633c4e08b6e48ff85cdb73"} Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.625107 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjxlk" event={"ID":"bce55540-6f2b-405f-83ed-5cc71fff0e97","Type":"ContainerDied","Data":"3eade770b3cd016df2f4a181bfe1e1d2363e2004a9620ac3108d3519203507f4"} Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.625177 4794 scope.go:117] "RemoveContainer" containerID="2655de5b925744176a7b4689adb863daf37c746d5b633c4e08b6e48ff85cdb73" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.625491 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjxlk" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.656873 4794 scope.go:117] "RemoveContainer" containerID="91f9c4e0c7fc3dd9d9c9a7069bd78a544cd8a01029fed2bb7254f828b79fdc73" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.674746 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjxlk"] Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.683167 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjxlk"] Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.703232 4794 scope.go:117] "RemoveContainer" containerID="2119edb497201a23399465fb31dccf0369d2177968fb59e67cb38f02943a70f8" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.757668 4794 scope.go:117] "RemoveContainer" containerID="2655de5b925744176a7b4689adb863daf37c746d5b633c4e08b6e48ff85cdb73" Mar 10 12:09:28 crc kubenswrapper[4794]: E0310 12:09:28.758104 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2655de5b925744176a7b4689adb863daf37c746d5b633c4e08b6e48ff85cdb73\": container with ID starting with 2655de5b925744176a7b4689adb863daf37c746d5b633c4e08b6e48ff85cdb73 not found: ID does not exist" containerID="2655de5b925744176a7b4689adb863daf37c746d5b633c4e08b6e48ff85cdb73" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.758144 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2655de5b925744176a7b4689adb863daf37c746d5b633c4e08b6e48ff85cdb73"} err="failed to get container status \"2655de5b925744176a7b4689adb863daf37c746d5b633c4e08b6e48ff85cdb73\": rpc error: code = NotFound desc = could not find container \"2655de5b925744176a7b4689adb863daf37c746d5b633c4e08b6e48ff85cdb73\": container with ID starting with 2655de5b925744176a7b4689adb863daf37c746d5b633c4e08b6e48ff85cdb73 not found: ID does not exist" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.758168 4794 scope.go:117] "RemoveContainer" containerID="91f9c4e0c7fc3dd9d9c9a7069bd78a544cd8a01029fed2bb7254f828b79fdc73" Mar 10 12:09:28 crc kubenswrapper[4794]: E0310 12:09:28.758567 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91f9c4e0c7fc3dd9d9c9a7069bd78a544cd8a01029fed2bb7254f828b79fdc73\": container with ID starting with 91f9c4e0c7fc3dd9d9c9a7069bd78a544cd8a01029fed2bb7254f828b79fdc73 not found: ID does not exist" containerID="91f9c4e0c7fc3dd9d9c9a7069bd78a544cd8a01029fed2bb7254f828b79fdc73" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.758616 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91f9c4e0c7fc3dd9d9c9a7069bd78a544cd8a01029fed2bb7254f828b79fdc73"} err="failed to get container status \"91f9c4e0c7fc3dd9d9c9a7069bd78a544cd8a01029fed2bb7254f828b79fdc73\": rpc error: code = NotFound desc = could not find container \"91f9c4e0c7fc3dd9d9c9a7069bd78a544cd8a01029fed2bb7254f828b79fdc73\": container with ID starting with 91f9c4e0c7fc3dd9d9c9a7069bd78a544cd8a01029fed2bb7254f828b79fdc73 not found: ID does not exist" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.758648 4794 scope.go:117] "RemoveContainer" containerID="2119edb497201a23399465fb31dccf0369d2177968fb59e67cb38f02943a70f8" Mar 10 12:09:28 crc kubenswrapper[4794]: E0310 12:09:28.759195 4794 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2119edb497201a23399465fb31dccf0369d2177968fb59e67cb38f02943a70f8\": container with ID starting with 2119edb497201a23399465fb31dccf0369d2177968fb59e67cb38f02943a70f8 not found: ID does not exist" containerID="2119edb497201a23399465fb31dccf0369d2177968fb59e67cb38f02943a70f8" Mar 10 12:09:28 crc kubenswrapper[4794]: I0310 12:09:28.759238 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2119edb497201a23399465fb31dccf0369d2177968fb59e67cb38f02943a70f8"} err="failed to get container status \"2119edb497201a23399465fb31dccf0369d2177968fb59e67cb38f02943a70f8\": rpc error: code = NotFound desc = could not find container \"2119edb497201a23399465fb31dccf0369d2177968fb59e67cb38f02943a70f8\": container with ID starting with 2119edb497201a23399465fb31dccf0369d2177968fb59e67cb38f02943a70f8 not found: ID does not exist" Mar 10 12:09:29 crc kubenswrapper[4794]: I0310 12:09:29.059556 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 12:09:29 crc kubenswrapper[4794]: I0310 12:09:29.060052 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 12:09:30 crc kubenswrapper[4794]: I0310 12:09:30.012927 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bce55540-6f2b-405f-83ed-5cc71fff0e97" path="/var/lib/kubelet/pods/bce55540-6f2b-405f-83ed-5cc71fff0e97/volumes" Mar 10 12:09:31 crc kubenswrapper[4794]: I0310 12:09:31.948749 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 12:09:32 crc kubenswrapper[4794]: I0310 12:09:32.018058 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 12:09:32 crc kubenswrapper[4794]: I0310 12:09:32.023994 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 10 12:09:32 crc kubenswrapper[4794]: I0310 12:09:32.115495 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 10 12:09:32 crc kubenswrapper[4794]: I0310 12:09:32.700737 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 12:09:34 crc kubenswrapper[4794]: I0310 12:09:34.059326 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 12:09:34 crc kubenswrapper[4794]: I0310 12:09:34.059650 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 12:09:34 crc kubenswrapper[4794]: I0310 12:09:34.088392 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 12:09:34 crc kubenswrapper[4794]: I0310 12:09:34.088456 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 12:09:35 crc kubenswrapper[4794]: I0310 12:09:35.225595 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="02b78cfc-5b55-45a5-9417-e6d4e602995e" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.254:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 12:09:35 crc kubenswrapper[4794]: I0310 12:09:35.225633 4794 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="2a47c198-d1ae-45d0-a160-f74c8d8a04f4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.5:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 12:09:35 crc kubenswrapper[4794]: I0310 12:09:35.225775 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="02b78cfc-5b55-45a5-9417-e6d4e602995e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.254:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 12:09:35 crc kubenswrapper[4794]: I0310 12:09:35.225805 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2a47c198-d1ae-45d0-a160-f74c8d8a04f4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.5:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 12:09:44 crc kubenswrapper[4794]: I0310 12:09:44.061379 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 12:09:44 crc kubenswrapper[4794]: I0310 12:09:44.061991 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 12:09:44 crc kubenswrapper[4794]: I0310 12:09:44.064247 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 12:09:44 crc kubenswrapper[4794]: I0310 12:09:44.065646 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 12:09:44 crc kubenswrapper[4794]: I0310 12:09:44.097854 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 12:09:44 crc kubenswrapper[4794]: I0310 12:09:44.098277 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 12:09:44 crc kubenswrapper[4794]: I0310 12:09:44.098928 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 12:09:44 crc kubenswrapper[4794]: I0310 12:09:44.101154 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 12:09:44 crc kubenswrapper[4794]: I0310 12:09:44.802498 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 12:09:44 crc kubenswrapper[4794]: I0310 12:09:44.806214 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.068687 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh"] Mar 10 12:09:46 crc kubenswrapper[4794]: E0310 12:09:46.069462 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce55540-6f2b-405f-83ed-5cc71fff0e97" containerName="extract-content" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.069477 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce55540-6f2b-405f-83ed-5cc71fff0e97" containerName="extract-content" Mar 10 12:09:46 crc kubenswrapper[4794]: E0310 12:09:46.069504 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce55540-6f2b-405f-83ed-5cc71fff0e97" containerName="extract-utilities" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.069510 4794 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bce55540-6f2b-405f-83ed-5cc71fff0e97" containerName="extract-utilities" Mar 10 12:09:46 crc kubenswrapper[4794]: E0310 12:09:46.069548 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce55540-6f2b-405f-83ed-5cc71fff0e97" containerName="registry-server" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.069554 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce55540-6f2b-405f-83ed-5cc71fff0e97" containerName="registry-server" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.070661 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce55540-6f2b-405f-83ed-5cc71fff0e97" containerName="registry-server" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.071477 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.073739 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.074015 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.075049 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.075503 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.075604 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-jdm8g" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.075639 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.076025 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.080217 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh"] Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.172825 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.172874 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.172904 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.172927 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.172956 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.173133 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.173245 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.173300 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.173379 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.173421 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.173449 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.173522 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.173756 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h5qj\" (UniqueName: \"kubernetes.io/projected/6e3a6dda-7237-47e6-b254-4f0b835f83c6-kube-api-access-7h5qj\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.276430 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.276708 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.276851 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.277003 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.277161 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h5qj\" (UniqueName: \"kubernetes.io/projected/6e3a6dda-7237-47e6-b254-4f0b835f83c6-kube-api-access-7h5qj\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.277327 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.277465 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.277576 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.277680 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.277763 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.277866 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.277978 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.278095 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.278267 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.278103 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.283910 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.284098 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.284254 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.284622 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.287353 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.287595 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.288039 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.291076 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.296002 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.297039 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.299687 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h5qj\" (UniqueName: \"kubernetes.io/projected/6e3a6dda-7237-47e6-b254-4f0b835f83c6-kube-api-access-7h5qj\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.394180 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:09:46 crc kubenswrapper[4794]: I0310 12:09:46.951917 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh"] Mar 10 12:09:46 crc kubenswrapper[4794]: W0310 12:09:46.953592 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e3a6dda_7237_47e6_b254_4f0b835f83c6.slice/crio-3d9e0d8925bd96d1c2e5b534312fd00dc734c63f5528475cb6ffacb6ebcb2fa8 WatchSource:0}: Error finding container 3d9e0d8925bd96d1c2e5b534312fd00dc734c63f5528475cb6ffacb6ebcb2fa8: Status 404 returned error can't find the container with id 3d9e0d8925bd96d1c2e5b534312fd00dc734c63f5528475cb6ffacb6ebcb2fa8 Mar 10 12:09:47 crc kubenswrapper[4794]: I0310 12:09:47.836998 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" event={"ID":"6e3a6dda-7237-47e6-b254-4f0b835f83c6","Type":"ContainerStarted","Data":"00a71d57f3b69e418f1b48ead0c5253571a6930f0a8af858f9661efa062075e0"} Mar 10 12:09:47 crc kubenswrapper[4794]: I0310 12:09:47.837464 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" event={"ID":"6e3a6dda-7237-47e6-b254-4f0b835f83c6","Type":"ContainerStarted","Data":"3d9e0d8925bd96d1c2e5b534312fd00dc734c63f5528475cb6ffacb6ebcb2fa8"} Mar 10 12:09:47 crc kubenswrapper[4794]: I0310 12:09:47.858999 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" podStartSLOduration=1.3047229599999999 podStartE2EDuration="1.858980322s" podCreationTimestamp="2026-03-10 12:09:46 +0000 UTC" firstStartedPulling="2026-03-10 12:09:46.956522928 +0000 UTC m=+8735.712693746" lastFinishedPulling="2026-03-10 12:09:47.51078029 +0000 UTC m=+8736.266951108" observedRunningTime="2026-03-10 12:09:47.858700534 +0000 UTC m=+8736.614871352" watchObservedRunningTime="2026-03-10 12:09:47.858980322 +0000 UTC m=+8736.615151140" Mar 10 12:09:52 crc kubenswrapper[4794]: I0310 12:09:52.967556 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:09:52 crc kubenswrapper[4794]: I0310 12:09:52.968014 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:09:54 crc kubenswrapper[4794]: I0310 12:09:54.620878 4794 scope.go:117] "RemoveContainer" containerID="1e8e6a6f4333691a39f04023fe1e4516264545d01b2c123a1d46f75f22f60eab" Mar 10 12:10:00 crc kubenswrapper[4794]: I0310 12:10:00.136871 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552410-l6df7"] Mar 10 12:10:00 crc kubenswrapper[4794]: I0310 12:10:00.139300 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552410-l6df7" Mar 10 12:10:00 crc kubenswrapper[4794]: I0310 12:10:00.141690 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 12:10:00 crc kubenswrapper[4794]: I0310 12:10:00.142043 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 12:10:00 crc kubenswrapper[4794]: I0310 12:10:00.142123 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 12:10:00 crc kubenswrapper[4794]: I0310 12:10:00.149811 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552410-l6df7"] Mar 10 12:10:00 crc kubenswrapper[4794]: I0310 12:10:00.191252 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkmms\" (UniqueName: \"kubernetes.io/projected/20a5c8d7-14df-4b04-b241-125c2309de21-kube-api-access-vkmms\") pod \"auto-csr-approver-29552410-l6df7\" (UID: \"20a5c8d7-14df-4b04-b241-125c2309de21\") " pod="openshift-infra/auto-csr-approver-29552410-l6df7" Mar 10 12:10:00 crc kubenswrapper[4794]: I0310 12:10:00.293713 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkmms\" (UniqueName: \"kubernetes.io/projected/20a5c8d7-14df-4b04-b241-125c2309de21-kube-api-access-vkmms\") pod \"auto-csr-approver-29552410-l6df7\" (UID: \"20a5c8d7-14df-4b04-b241-125c2309de21\") " pod="openshift-infra/auto-csr-approver-29552410-l6df7" Mar 10 12:10:00 crc kubenswrapper[4794]: I0310 12:10:00.315437 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkmms\" (UniqueName: \"kubernetes.io/projected/20a5c8d7-14df-4b04-b241-125c2309de21-kube-api-access-vkmms\") pod \"auto-csr-approver-29552410-l6df7\" (UID: \"20a5c8d7-14df-4b04-b241-125c2309de21\") " pod="openshift-infra/auto-csr-approver-29552410-l6df7" Mar 10 12:10:00 crc kubenswrapper[4794]: I0310 12:10:00.462511 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552410-l6df7" Mar 10 12:10:00 crc kubenswrapper[4794]: I0310 12:10:00.963268 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552410-l6df7"] Mar 10 12:10:01 crc kubenswrapper[4794]: I0310 12:10:01.991940 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552410-l6df7" event={"ID":"20a5c8d7-14df-4b04-b241-125c2309de21","Type":"ContainerStarted","Data":"c29442e5ed06a2a16c8646aed1fe81801f2d08e125f65972d4f79eac7489fc2f"} Mar 10 12:10:04 crc kubenswrapper[4794]: I0310 12:10:04.013733 4794 generic.go:334] "Generic (PLEG): container finished" podID="20a5c8d7-14df-4b04-b241-125c2309de21" containerID="264e550c3f959054090123dcfeeecea0a35e4bd40c6289c0488b4b9f634b6331" exitCode=0 Mar 10 12:10:04 crc kubenswrapper[4794]: I0310 12:10:04.013813 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552410-l6df7" event={"ID":"20a5c8d7-14df-4b04-b241-125c2309de21","Type":"ContainerDied","Data":"264e550c3f959054090123dcfeeecea0a35e4bd40c6289c0488b4b9f634b6331"} Mar 10 12:10:05 crc kubenswrapper[4794]: I0310 12:10:05.377378 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552410-l6df7" Mar 10 12:10:05 crc kubenswrapper[4794]: I0310 12:10:05.460660 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkmms\" (UniqueName: \"kubernetes.io/projected/20a5c8d7-14df-4b04-b241-125c2309de21-kube-api-access-vkmms\") pod \"20a5c8d7-14df-4b04-b241-125c2309de21\" (UID: \"20a5c8d7-14df-4b04-b241-125c2309de21\") " Mar 10 12:10:05 crc kubenswrapper[4794]: I0310 12:10:05.467369 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a5c8d7-14df-4b04-b241-125c2309de21-kube-api-access-vkmms" (OuterVolumeSpecName: "kube-api-access-vkmms") pod "20a5c8d7-14df-4b04-b241-125c2309de21" (UID: "20a5c8d7-14df-4b04-b241-125c2309de21"). InnerVolumeSpecName "kube-api-access-vkmms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:10:05 crc kubenswrapper[4794]: I0310 12:10:05.563809 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkmms\" (UniqueName: \"kubernetes.io/projected/20a5c8d7-14df-4b04-b241-125c2309de21-kube-api-access-vkmms\") on node \"crc\" DevicePath \"\"" Mar 10 12:10:06 crc kubenswrapper[4794]: I0310 12:10:06.033200 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552410-l6df7" event={"ID":"20a5c8d7-14df-4b04-b241-125c2309de21","Type":"ContainerDied","Data":"c29442e5ed06a2a16c8646aed1fe81801f2d08e125f65972d4f79eac7489fc2f"} Mar 10 12:10:06 crc kubenswrapper[4794]: I0310 12:10:06.034481 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c29442e5ed06a2a16c8646aed1fe81801f2d08e125f65972d4f79eac7489fc2f" Mar 10 12:10:06 crc kubenswrapper[4794]: I0310 12:10:06.033263 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552410-l6df7" Mar 10 12:10:06 crc kubenswrapper[4794]: I0310 12:10:06.471586 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552404-v4pzs"] Mar 10 12:10:06 crc kubenswrapper[4794]: I0310 12:10:06.485643 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552404-v4pzs"] Mar 10 12:10:08 crc kubenswrapper[4794]: I0310 12:10:08.015033 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="556983a9-9184-4252-941c-89607a266dc7" path="/var/lib/kubelet/pods/556983a9-9184-4252-941c-89607a266dc7/volumes" Mar 10 12:10:22 crc kubenswrapper[4794]: I0310 12:10:22.968120 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:10:22 crc kubenswrapper[4794]: I0310 12:10:22.968676 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:10:22 crc kubenswrapper[4794]: I0310 12:10:22.968721 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 12:10:22 crc kubenswrapper[4794]: I0310 12:10:22.969522 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3eaf645c92d5074cb0924e573d1878a0115051de261d931bccf56e275fcfefa"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 12:10:22 crc kubenswrapper[4794]: I0310 12:10:22.969575 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://f3eaf645c92d5074cb0924e573d1878a0115051de261d931bccf56e275fcfefa" gracePeriod=600 Mar 10 12:10:23 crc kubenswrapper[4794]: I0310 12:10:23.248424 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="f3eaf645c92d5074cb0924e573d1878a0115051de261d931bccf56e275fcfefa" exitCode=0 Mar 10 12:10:23 crc kubenswrapper[4794]: I0310 12:10:23.248516 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"f3eaf645c92d5074cb0924e573d1878a0115051de261d931bccf56e275fcfefa"} Mar 10 12:10:23 crc kubenswrapper[4794]: I0310 12:10:23.248689 4794 scope.go:117] "RemoveContainer" containerID="8fbcec467432a07654b1a077feacb7efdec4f288161f785ef9cd2a36f505037c" Mar 10 12:10:24 crc kubenswrapper[4794]: I0310 12:10:24.263849 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" 
event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972"} Mar 10 12:10:54 crc kubenswrapper[4794]: I0310 12:10:54.758758 4794 scope.go:117] "RemoveContainer" containerID="699a80ec15ad44a1f976b9411a99bcaa192658c068e5a535c6b4da3cd3d43b00" Mar 10 12:12:00 crc kubenswrapper[4794]: I0310 12:12:00.149174 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552412-9dkkj"] Mar 10 12:12:00 crc kubenswrapper[4794]: E0310 12:12:00.151683 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a5c8d7-14df-4b04-b241-125c2309de21" containerName="oc" Mar 10 12:12:00 crc kubenswrapper[4794]: I0310 12:12:00.151714 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a5c8d7-14df-4b04-b241-125c2309de21" containerName="oc" Mar 10 12:12:00 crc kubenswrapper[4794]: I0310 12:12:00.152143 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a5c8d7-14df-4b04-b241-125c2309de21" containerName="oc" Mar 10 12:12:00 crc kubenswrapper[4794]: I0310 12:12:00.154058 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552412-9dkkj" Mar 10 12:12:00 crc kubenswrapper[4794]: I0310 12:12:00.156184 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 12:12:00 crc kubenswrapper[4794]: I0310 12:12:00.156178 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 12:12:00 crc kubenswrapper[4794]: I0310 12:12:00.156603 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 12:12:00 crc kubenswrapper[4794]: I0310 12:12:00.159257 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552412-9dkkj"] Mar 10 12:12:00 crc kubenswrapper[4794]: I0310 12:12:00.280530 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpxpq\" (UniqueName: \"kubernetes.io/projected/7084aa56-cb02-408b-aede-13cd9e1e4d21-kube-api-access-dpxpq\") pod \"auto-csr-approver-29552412-9dkkj\" (UID: \"7084aa56-cb02-408b-aede-13cd9e1e4d21\") " pod="openshift-infra/auto-csr-approver-29552412-9dkkj" Mar 10 12:12:00 crc kubenswrapper[4794]: I0310 12:12:00.382573 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpxpq\" (UniqueName: \"kubernetes.io/projected/7084aa56-cb02-408b-aede-13cd9e1e4d21-kube-api-access-dpxpq\") pod \"auto-csr-approver-29552412-9dkkj\" (UID: \"7084aa56-cb02-408b-aede-13cd9e1e4d21\") " pod="openshift-infra/auto-csr-approver-29552412-9dkkj" Mar 10 12:12:00 crc kubenswrapper[4794]: I0310 12:12:00.878956 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpxpq\" (UniqueName: \"kubernetes.io/projected/7084aa56-cb02-408b-aede-13cd9e1e4d21-kube-api-access-dpxpq\") pod \"auto-csr-approver-29552412-9dkkj\" (UID: \"7084aa56-cb02-408b-aede-13cd9e1e4d21\") " pod="openshift-infra/auto-csr-approver-29552412-9dkkj" Mar 10 12:12:01 crc kubenswrapper[4794]: I0310 12:12:01.099421 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552412-9dkkj" Mar 10 12:12:01 crc kubenswrapper[4794]: I0310 12:12:01.582585 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552412-9dkkj"] Mar 10 12:12:02 crc kubenswrapper[4794]: I0310 12:12:02.380632 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552412-9dkkj" event={"ID":"7084aa56-cb02-408b-aede-13cd9e1e4d21","Type":"ContainerStarted","Data":"e58deabc6e6dada9b4ebf8eedaa5a1c8836fe83300e6c515fd08943e26f2ef61"} Mar 10 12:12:04 crc kubenswrapper[4794]: I0310 12:12:04.438995 4794 generic.go:334] "Generic (PLEG): container finished" podID="7084aa56-cb02-408b-aede-13cd9e1e4d21" containerID="638ca93f58f845a54b0798afdb6c08ceae03d6154423f41add7931021f356e98" exitCode=0 Mar 10 12:12:04 crc kubenswrapper[4794]: I0310 12:12:04.439211 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552412-9dkkj" event={"ID":"7084aa56-cb02-408b-aede-13cd9e1e4d21","Type":"ContainerDied","Data":"638ca93f58f845a54b0798afdb6c08ceae03d6154423f41add7931021f356e98"} Mar 10 12:12:05 crc kubenswrapper[4794]: I0310 12:12:05.875908 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552412-9dkkj" Mar 10 12:12:05 crc kubenswrapper[4794]: I0310 12:12:05.977073 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpxpq\" (UniqueName: \"kubernetes.io/projected/7084aa56-cb02-408b-aede-13cd9e1e4d21-kube-api-access-dpxpq\") pod \"7084aa56-cb02-408b-aede-13cd9e1e4d21\" (UID: \"7084aa56-cb02-408b-aede-13cd9e1e4d21\") " Mar 10 12:12:05 crc kubenswrapper[4794]: I0310 12:12:05.984303 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7084aa56-cb02-408b-aede-13cd9e1e4d21-kube-api-access-dpxpq" (OuterVolumeSpecName: "kube-api-access-dpxpq") pod "7084aa56-cb02-408b-aede-13cd9e1e4d21" (UID: "7084aa56-cb02-408b-aede-13cd9e1e4d21"). InnerVolumeSpecName "kube-api-access-dpxpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:12:06 crc kubenswrapper[4794]: I0310 12:12:06.081534 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpxpq\" (UniqueName: \"kubernetes.io/projected/7084aa56-cb02-408b-aede-13cd9e1e4d21-kube-api-access-dpxpq\") on node \"crc\" DevicePath \"\"" Mar 10 12:12:06 crc kubenswrapper[4794]: I0310 12:12:06.481967 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552412-9dkkj" event={"ID":"7084aa56-cb02-408b-aede-13cd9e1e4d21","Type":"ContainerDied","Data":"e58deabc6e6dada9b4ebf8eedaa5a1c8836fe83300e6c515fd08943e26f2ef61"} Mar 10 12:12:06 crc kubenswrapper[4794]: I0310 12:12:06.482020 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e58deabc6e6dada9b4ebf8eedaa5a1c8836fe83300e6c515fd08943e26f2ef61" Mar 10 12:12:06 crc kubenswrapper[4794]: I0310 12:12:06.482136 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552412-9dkkj" Mar 10 12:12:06 crc kubenswrapper[4794]: I0310 12:12:06.962554 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552406-8wnx2"] Mar 10 12:12:06 crc kubenswrapper[4794]: I0310 12:12:06.982597 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552406-8wnx2"] Mar 10 12:12:08 crc kubenswrapper[4794]: I0310 12:12:08.010190 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2213e803-3a33-496f-818f-44bf00ae8d8d" path="/var/lib/kubelet/pods/2213e803-3a33-496f-818f-44bf00ae8d8d/volumes" Mar 10 12:12:22 crc kubenswrapper[4794]: I0310 12:12:22.891945 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xk7pw"] Mar 10 12:12:22 crc kubenswrapper[4794]: E0310 12:12:22.895950 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7084aa56-cb02-408b-aede-13cd9e1e4d21" containerName="oc" Mar 10 12:12:22 crc kubenswrapper[4794]: I0310 12:12:22.895986 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7084aa56-cb02-408b-aede-13cd9e1e4d21" containerName="oc" Mar 10 12:12:22 crc kubenswrapper[4794]: I0310 12:12:22.898083 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="7084aa56-cb02-408b-aede-13cd9e1e4d21" containerName="oc" Mar 10 12:12:22 crc kubenswrapper[4794]: I0310 12:12:22.903460 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:22 crc kubenswrapper[4794]: I0310 12:12:22.915789 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xk7pw"] Mar 10 12:12:23 crc kubenswrapper[4794]: I0310 12:12:23.032557 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-utilities\") pod \"community-operators-xk7pw\" (UID: \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\") " pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:23 crc kubenswrapper[4794]: I0310 12:12:23.032597 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-catalog-content\") pod \"community-operators-xk7pw\" (UID: \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\") " pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:23 crc kubenswrapper[4794]: I0310 12:12:23.032714 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqxpw\" (UniqueName: \"kubernetes.io/projected/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-kube-api-access-bqxpw\") pod \"community-operators-xk7pw\" (UID: \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\") " pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:23 crc kubenswrapper[4794]: I0310 12:12:23.137738 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-utilities\") pod \"community-operators-xk7pw\" (UID: \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\") " pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:23 crc kubenswrapper[4794]: I0310 12:12:23.137779 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-catalog-content\") pod \"community-operators-xk7pw\" (UID: \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\") " pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:23 crc kubenswrapper[4794]: I0310 12:12:23.137905 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqxpw\" (UniqueName: \"kubernetes.io/projected/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-kube-api-access-bqxpw\") pod \"community-operators-xk7pw\" (UID: \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\") " pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:23 crc kubenswrapper[4794]: I0310 12:12:23.138765 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-utilities\") pod \"community-operators-xk7pw\" (UID: \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\") " pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:23 crc kubenswrapper[4794]: I0310 12:12:23.138969 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-catalog-content\") pod \"community-operators-xk7pw\" (UID: \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\") " pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:23 crc kubenswrapper[4794]: I0310 12:12:23.167222 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqxpw\" (UniqueName: \"kubernetes.io/projected/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-kube-api-access-bqxpw\") pod \"community-operators-xk7pw\" (UID: \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\") " pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:23 crc kubenswrapper[4794]: I0310 12:12:23.256061 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:23 crc kubenswrapper[4794]: I0310 12:12:23.788664 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xk7pw"] Mar 10 12:12:24 crc kubenswrapper[4794]: I0310 12:12:24.667077 4794 generic.go:334] "Generic (PLEG): container finished" podID="2c4e4eeb-e0c2-497d-a679-2715240d4a2b" containerID="bc3ab26e7ea88fa5b18c81f8191e0bfeb091a9a56ffbec02180ebd8001416749" exitCode=0 Mar 10 12:12:24 crc kubenswrapper[4794]: I0310 12:12:24.667155 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk7pw" event={"ID":"2c4e4eeb-e0c2-497d-a679-2715240d4a2b","Type":"ContainerDied","Data":"bc3ab26e7ea88fa5b18c81f8191e0bfeb091a9a56ffbec02180ebd8001416749"} Mar 10 12:12:24 crc kubenswrapper[4794]: I0310 12:12:24.667382 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk7pw" event={"ID":"2c4e4eeb-e0c2-497d-a679-2715240d4a2b","Type":"ContainerStarted","Data":"8a9a3004591a4673eb94cf0f4a24433f7e11fc35fde6ff479a4fa65f31930ac9"} Mar 10 12:12:26 crc kubenswrapper[4794]: I0310 12:12:26.710869 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk7pw" event={"ID":"2c4e4eeb-e0c2-497d-a679-2715240d4a2b","Type":"ContainerStarted","Data":"6b3809c6b93a836f5168eda7d4962c1713688a6d19963289032e9e0c4fdbede2"} Mar 10 12:12:28 crc kubenswrapper[4794]: I0310 12:12:28.731438 4794 generic.go:334] "Generic (PLEG): container finished" podID="2c4e4eeb-e0c2-497d-a679-2715240d4a2b" containerID="6b3809c6b93a836f5168eda7d4962c1713688a6d19963289032e9e0c4fdbede2" exitCode=0 Mar 10 12:12:28 crc kubenswrapper[4794]: I0310 12:12:28.731534 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk7pw" event={"ID":"2c4e4eeb-e0c2-497d-a679-2715240d4a2b","Type":"ContainerDied","Data":"6b3809c6b93a836f5168eda7d4962c1713688a6d19963289032e9e0c4fdbede2"} Mar 10 12:12:29 crc kubenswrapper[4794]: I0310 12:12:29.748366 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk7pw" event={"ID":"2c4e4eeb-e0c2-497d-a679-2715240d4a2b","Type":"ContainerStarted","Data":"ac3708d3082c28c7d0f211985e7a49b04c477c60578451236ff21b30148ba993"} Mar 10 12:12:29 crc kubenswrapper[4794]: I0310 12:12:29.788282 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xk7pw" podStartSLOduration=3.321112769 podStartE2EDuration="7.788263901s" podCreationTimestamp="2026-03-10 12:12:22 +0000 UTC" firstStartedPulling="2026-03-10 12:12:24.66941441 +0000 UTC m=+8893.425585228" lastFinishedPulling="2026-03-10 12:12:29.136565542 +0000 UTC m=+8897.892736360" observedRunningTime="2026-03-10 12:12:29.775423237 +0000 UTC m=+8898.531594125" watchObservedRunningTime="2026-03-10 12:12:29.788263901 +0000 UTC m=+8898.544434709" Mar 10 12:12:33 crc kubenswrapper[4794]: I0310 12:12:33.256568 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:33 crc kubenswrapper[4794]: I0310 12:12:33.257165 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:34 crc kubenswrapper[4794]: I0310 12:12:34.320727 4794 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/community-operators-xk7pw" podUID="2c4e4eeb-e0c2-497d-a679-2715240d4a2b" containerName="registry-server" probeResult="failure" output=< Mar 10 12:12:34 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 12:12:34 crc kubenswrapper[4794]: > Mar 10 12:12:43 crc kubenswrapper[4794]: I0310 12:12:43.315226 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:43 crc kubenswrapper[4794]: I0310 12:12:43.364428 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:43 crc kubenswrapper[4794]: I0310 12:12:43.561522 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xk7pw"] Mar 10 12:12:44 crc kubenswrapper[4794]: I0310 12:12:44.907436 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xk7pw" podUID="2c4e4eeb-e0c2-497d-a679-2715240d4a2b" containerName="registry-server" containerID="cri-o://ac3708d3082c28c7d0f211985e7a49b04c477c60578451236ff21b30148ba993" gracePeriod=2 Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.458636 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.558473 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-catalog-content\") pod \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\" (UID: \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\") " Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.558787 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-utilities\") pod \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\" (UID: \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\") " Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.558936 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqxpw\" (UniqueName: \"kubernetes.io/projected/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-kube-api-access-bqxpw\") pod \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\" (UID: \"2c4e4eeb-e0c2-497d-a679-2715240d4a2b\") " Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.560059 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-utilities" (OuterVolumeSpecName: "utilities") pod "2c4e4eeb-e0c2-497d-a679-2715240d4a2b" (UID: "2c4e4eeb-e0c2-497d-a679-2715240d4a2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.571626 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-kube-api-access-bqxpw" (OuterVolumeSpecName: "kube-api-access-bqxpw") pod "2c4e4eeb-e0c2-497d-a679-2715240d4a2b" (UID: "2c4e4eeb-e0c2-497d-a679-2715240d4a2b"). InnerVolumeSpecName "kube-api-access-bqxpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.635351 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c4e4eeb-e0c2-497d-a679-2715240d4a2b" (UID: "2c4e4eeb-e0c2-497d-a679-2715240d4a2b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.661481 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqxpw\" (UniqueName: \"kubernetes.io/projected/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-kube-api-access-bqxpw\") on node \"crc\" DevicePath \"\"" Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.661514 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.661524 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c4e4eeb-e0c2-497d-a679-2715240d4a2b-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.928546 4794 generic.go:334] "Generic (PLEG): container finished" podID="2c4e4eeb-e0c2-497d-a679-2715240d4a2b" containerID="ac3708d3082c28c7d0f211985e7a49b04c477c60578451236ff21b30148ba993" exitCode=0 Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.928615 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk7pw" event={"ID":"2c4e4eeb-e0c2-497d-a679-2715240d4a2b","Type":"ContainerDied","Data":"ac3708d3082c28c7d0f211985e7a49b04c477c60578451236ff21b30148ba993"} Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.928687 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xk7pw" Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.928765 4794 scope.go:117] "RemoveContainer" containerID="ac3708d3082c28c7d0f211985e7a49b04c477c60578451236ff21b30148ba993" Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.928688 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk7pw" event={"ID":"2c4e4eeb-e0c2-497d-a679-2715240d4a2b","Type":"ContainerDied","Data":"8a9a3004591a4673eb94cf0f4a24433f7e11fc35fde6ff479a4fa65f31930ac9"} Mar 10 12:12:45 crc kubenswrapper[4794]: I0310 12:12:45.960572 4794 scope.go:117] "RemoveContainer" containerID="6b3809c6b93a836f5168eda7d4962c1713688a6d19963289032e9e0c4fdbede2" Mar 10 12:12:46 crc kubenswrapper[4794]: I0310 12:12:46.014951 4794 scope.go:117] "RemoveContainer" containerID="bc3ab26e7ea88fa5b18c81f8191e0bfeb091a9a56ffbec02180ebd8001416749" Mar 10 12:12:46 crc kubenswrapper[4794]: I0310 12:12:46.026070 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xk7pw"] Mar 10 12:12:46 crc kubenswrapper[4794]: I0310 12:12:46.035485 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xk7pw"] Mar 10 12:12:46 crc kubenswrapper[4794]: I0310 12:12:46.063544 4794 scope.go:117] "RemoveContainer" containerID="ac3708d3082c28c7d0f211985e7a49b04c477c60578451236ff21b30148ba993" Mar 10 12:12:46 crc kubenswrapper[4794]: E0310 12:12:46.063981 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac3708d3082c28c7d0f211985e7a49b04c477c60578451236ff21b30148ba993\": container with ID starting with ac3708d3082c28c7d0f211985e7a49b04c477c60578451236ff21b30148ba993 not found: ID does not exist" containerID="ac3708d3082c28c7d0f211985e7a49b04c477c60578451236ff21b30148ba993" Mar 10 12:12:46 crc kubenswrapper[4794]: I0310 12:12:46.064030 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac3708d3082c28c7d0f211985e7a49b04c477c60578451236ff21b30148ba993"} err="failed to get container status \"ac3708d3082c28c7d0f211985e7a49b04c477c60578451236ff21b30148ba993\": rpc error: code = NotFound desc = could not find container \"ac3708d3082c28c7d0f211985e7a49b04c477c60578451236ff21b30148ba993\": container with ID starting with ac3708d3082c28c7d0f211985e7a49b04c477c60578451236ff21b30148ba993 not found: ID does not exist" Mar 10 12:12:46 crc kubenswrapper[4794]: I0310 12:12:46.064058 4794 scope.go:117] "RemoveContainer" containerID="6b3809c6b93a836f5168eda7d4962c1713688a6d19963289032e9e0c4fdbede2" Mar 10 12:12:46 crc kubenswrapper[4794]: E0310 12:12:46.064681 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b3809c6b93a836f5168eda7d4962c1713688a6d19963289032e9e0c4fdbede2\": container with ID starting with 6b3809c6b93a836f5168eda7d4962c1713688a6d19963289032e9e0c4fdbede2 not found: ID does not exist" containerID="6b3809c6b93a836f5168eda7d4962c1713688a6d19963289032e9e0c4fdbede2" Mar 10 12:12:46 crc kubenswrapper[4794]: I0310 12:12:46.064719 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3809c6b93a836f5168eda7d4962c1713688a6d19963289032e9e0c4fdbede2"} err="failed to get container status \"6b3809c6b93a836f5168eda7d4962c1713688a6d19963289032e9e0c4fdbede2\": rpc error: code = NotFound desc = could not find 
container \"6b3809c6b93a836f5168eda7d4962c1713688a6d19963289032e9e0c4fdbede2\": container with ID starting with 6b3809c6b93a836f5168eda7d4962c1713688a6d19963289032e9e0c4fdbede2 not found: ID does not exist" Mar 10 12:12:46 crc kubenswrapper[4794]: I0310 12:12:46.064746 4794 scope.go:117] "RemoveContainer" containerID="bc3ab26e7ea88fa5b18c81f8191e0bfeb091a9a56ffbec02180ebd8001416749" Mar 10 12:12:46 crc kubenswrapper[4794]: E0310 12:12:46.065045 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3ab26e7ea88fa5b18c81f8191e0bfeb091a9a56ffbec02180ebd8001416749\": container with ID starting with bc3ab26e7ea88fa5b18c81f8191e0bfeb091a9a56ffbec02180ebd8001416749 not found: ID does not exist" containerID="bc3ab26e7ea88fa5b18c81f8191e0bfeb091a9a56ffbec02180ebd8001416749" Mar 10 12:12:46 crc kubenswrapper[4794]: I0310 12:12:46.065081 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3ab26e7ea88fa5b18c81f8191e0bfeb091a9a56ffbec02180ebd8001416749"} err="failed to get container status \"bc3ab26e7ea88fa5b18c81f8191e0bfeb091a9a56ffbec02180ebd8001416749\": rpc error: code = NotFound desc = could not find container \"bc3ab26e7ea88fa5b18c81f8191e0bfeb091a9a56ffbec02180ebd8001416749\": container with ID starting with bc3ab26e7ea88fa5b18c81f8191e0bfeb091a9a56ffbec02180ebd8001416749 not found: ID does not exist" Mar 10 12:12:48 crc kubenswrapper[4794]: I0310 12:12:48.012741 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c4e4eeb-e0c2-497d-a679-2715240d4a2b" path="/var/lib/kubelet/pods/2c4e4eeb-e0c2-497d-a679-2715240d4a2b/volumes" Mar 10 12:12:52 crc kubenswrapper[4794]: I0310 12:12:52.967594 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:12:52 crc kubenswrapper[4794]: I0310 12:12:52.968149 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:12:54 crc kubenswrapper[4794]: I0310 12:12:54.874031 4794 scope.go:117] "RemoveContainer" containerID="b7eab5beccbff2245f68c647ed4ace195ef5269b9af607a8d61e8e17ed534171" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.008016 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hz9x4"] Mar 10 12:13:07 crc kubenswrapper[4794]: E0310 12:13:07.009013 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4e4eeb-e0c2-497d-a679-2715240d4a2b" containerName="extract-utilities" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.009031 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4e4eeb-e0c2-497d-a679-2715240d4a2b" containerName="extract-utilities" Mar 10 12:13:07 crc kubenswrapper[4794]: E0310 12:13:07.009052 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4e4eeb-e0c2-497d-a679-2715240d4a2b" containerName="registry-server" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.009060 4794 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2c4e4eeb-e0c2-497d-a679-2715240d4a2b" containerName="registry-server" Mar 10 12:13:07 crc kubenswrapper[4794]: E0310 12:13:07.009085 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4e4eeb-e0c2-497d-a679-2715240d4a2b" containerName="extract-content" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.009093 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4e4eeb-e0c2-497d-a679-2715240d4a2b" containerName="extract-content" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.009393 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4e4eeb-e0c2-497d-a679-2715240d4a2b" containerName="registry-server" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.011162 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.022179 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hz9x4"] Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.119029 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/393edda0-75a5-4315-994b-457edffb3617-catalog-content\") pod \"certified-operators-hz9x4\" (UID: \"393edda0-75a5-4315-994b-457edffb3617\") " pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.119114 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrkm9\" (UniqueName: \"kubernetes.io/projected/393edda0-75a5-4315-994b-457edffb3617-kube-api-access-jrkm9\") pod \"certified-operators-hz9x4\" (UID: \"393edda0-75a5-4315-994b-457edffb3617\") " pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.119654 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/393edda0-75a5-4315-994b-457edffb3617-utilities\") pod \"certified-operators-hz9x4\" (UID: \"393edda0-75a5-4315-994b-457edffb3617\") " pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.222714 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrkm9\" (UniqueName: \"kubernetes.io/projected/393edda0-75a5-4315-994b-457edffb3617-kube-api-access-jrkm9\") pod \"certified-operators-hz9x4\" (UID: \"393edda0-75a5-4315-994b-457edffb3617\") " pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.223142 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/393edda0-75a5-4315-994b-457edffb3617-utilities\") pod \"certified-operators-hz9x4\" (UID: \"393edda0-75a5-4315-994b-457edffb3617\") " pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.223667 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/393edda0-75a5-4315-994b-457edffb3617-catalog-content\") pod \"certified-operators-hz9x4\" (UID: \"393edda0-75a5-4315-994b-457edffb3617\") " pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.223823 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/393edda0-75a5-4315-994b-457edffb3617-utilities\") pod \"certified-operators-hz9x4\" (UID: \"393edda0-75a5-4315-994b-457edffb3617\") " pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.224172 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/393edda0-75a5-4315-994b-457edffb3617-catalog-content\") pod \"certified-operators-hz9x4\" (UID: \"393edda0-75a5-4315-994b-457edffb3617\") " pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.246721 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrkm9\" (UniqueName: \"kubernetes.io/projected/393edda0-75a5-4315-994b-457edffb3617-kube-api-access-jrkm9\") pod \"certified-operators-hz9x4\" (UID: \"393edda0-75a5-4315-994b-457edffb3617\") " pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.349213 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:07 crc kubenswrapper[4794]: I0310 12:13:07.852718 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hz9x4"] Mar 10 12:13:08 crc kubenswrapper[4794]: I0310 12:13:08.201489 4794 generic.go:334] "Generic (PLEG): container finished" podID="393edda0-75a5-4315-994b-457edffb3617" containerID="432814da10930ec8868a3d9ad28a348562419c00f4759bd47ba2f672306bdf88" exitCode=0 Mar 10 12:13:08 crc kubenswrapper[4794]: I0310 12:13:08.201570 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hz9x4" event={"ID":"393edda0-75a5-4315-994b-457edffb3617","Type":"ContainerDied","Data":"432814da10930ec8868a3d9ad28a348562419c00f4759bd47ba2f672306bdf88"} Mar 10 12:13:08 crc kubenswrapper[4794]: I0310 12:13:08.201628 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hz9x4" event={"ID":"393edda0-75a5-4315-994b-457edffb3617","Type":"ContainerStarted","Data":"869bbbf2417072021380f177b1e44b3dc4cdf7e47e22e4e57e5fd643ea947b08"} Mar 10 12:13:10 crc kubenswrapper[4794]: I0310 12:13:10.227650 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hz9x4" event={"ID":"393edda0-75a5-4315-994b-457edffb3617","Type":"ContainerStarted","Data":"0ea29f6405ccf18260753190357cbd0770611251a65fd21d56d2f3b7bceb45e7"} Mar 10 12:13:11 crc kubenswrapper[4794]: I0310 12:13:11.242238 4794 generic.go:334] "Generic (PLEG): container finished" podID="393edda0-75a5-4315-994b-457edffb3617" containerID="0ea29f6405ccf18260753190357cbd0770611251a65fd21d56d2f3b7bceb45e7" exitCode=0 Mar 10 12:13:11 crc kubenswrapper[4794]: I0310 12:13:11.242348 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hz9x4" event={"ID":"393edda0-75a5-4315-994b-457edffb3617","Type":"ContainerDied","Data":"0ea29f6405ccf18260753190357cbd0770611251a65fd21d56d2f3b7bceb45e7"} Mar 10 12:13:14 crc kubenswrapper[4794]: I0310 12:13:14.278938 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hz9x4" 
event={"ID":"393edda0-75a5-4315-994b-457edffb3617","Type":"ContainerStarted","Data":"8caf165f0790f936736c9a49cf6d24449a66420a55296dc82b6fcad9f0c76444"} Mar 10 12:13:14 crc kubenswrapper[4794]: I0310 12:13:14.306874 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hz9x4" podStartSLOduration=3.426332824 podStartE2EDuration="8.306855082s" podCreationTimestamp="2026-03-10 12:13:06 +0000 UTC" firstStartedPulling="2026-03-10 12:13:08.20502528 +0000 UTC m=+8936.961196128" lastFinishedPulling="2026-03-10 12:13:13.085547558 +0000 UTC m=+8941.841718386" observedRunningTime="2026-03-10 12:13:14.294991407 +0000 UTC m=+8943.051162235" watchObservedRunningTime="2026-03-10 12:13:14.306855082 +0000 UTC m=+8943.063025890" Mar 10 12:13:17 crc kubenswrapper[4794]: I0310 12:13:17.349840 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:17 crc kubenswrapper[4794]: I0310 12:13:17.350117 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:18 crc kubenswrapper[4794]: I0310 12:13:18.411882 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hz9x4" podUID="393edda0-75a5-4315-994b-457edffb3617" containerName="registry-server" probeResult="failure" output=< Mar 10 12:13:18 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 12:13:18 crc kubenswrapper[4794]: > Mar 10 12:13:22 crc kubenswrapper[4794]: I0310 12:13:22.968240 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:13:22 crc kubenswrapper[4794]: I0310 12:13:22.969031 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:13:27 crc kubenswrapper[4794]: I0310 12:13:27.401193 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:27 crc kubenswrapper[4794]: I0310 12:13:27.458385 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:27 crc kubenswrapper[4794]: I0310 12:13:27.667293 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hz9x4"] Mar 10 12:13:28 crc kubenswrapper[4794]: I0310 12:13:28.437896 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hz9x4" podUID="393edda0-75a5-4315-994b-457edffb3617" containerName="registry-server" containerID="cri-o://8caf165f0790f936736c9a49cf6d24449a66420a55296dc82b6fcad9f0c76444" gracePeriod=2 Mar 10 12:13:28 crc kubenswrapper[4794]: I0310 12:13:28.945129 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:28 crc kubenswrapper[4794]: I0310 12:13:28.980432 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/393edda0-75a5-4315-994b-457edffb3617-utilities\") pod \"393edda0-75a5-4315-994b-457edffb3617\" (UID: \"393edda0-75a5-4315-994b-457edffb3617\") " Mar 10 12:13:28 crc kubenswrapper[4794]: I0310 12:13:28.980545 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrkm9\" (UniqueName: \"kubernetes.io/projected/393edda0-75a5-4315-994b-457edffb3617-kube-api-access-jrkm9\") pod \"393edda0-75a5-4315-994b-457edffb3617\" (UID: \"393edda0-75a5-4315-994b-457edffb3617\") " Mar 10 12:13:28 crc kubenswrapper[4794]: I0310 12:13:28.980705 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/393edda0-75a5-4315-994b-457edffb3617-catalog-content\") pod \"393edda0-75a5-4315-994b-457edffb3617\" (UID: \"393edda0-75a5-4315-994b-457edffb3617\") " Mar 10 12:13:28 crc kubenswrapper[4794]: I0310 12:13:28.981802 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/393edda0-75a5-4315-994b-457edffb3617-utilities" (OuterVolumeSpecName: "utilities") pod "393edda0-75a5-4315-994b-457edffb3617" (UID: "393edda0-75a5-4315-994b-457edffb3617"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:13:28 crc kubenswrapper[4794]: I0310 12:13:28.990533 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/393edda0-75a5-4315-994b-457edffb3617-kube-api-access-jrkm9" (OuterVolumeSpecName: "kube-api-access-jrkm9") pod "393edda0-75a5-4315-994b-457edffb3617" (UID: "393edda0-75a5-4315-994b-457edffb3617"). InnerVolumeSpecName "kube-api-access-jrkm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.048516 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/393edda0-75a5-4315-994b-457edffb3617-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "393edda0-75a5-4315-994b-457edffb3617" (UID: "393edda0-75a5-4315-994b-457edffb3617"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.083752 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/393edda0-75a5-4315-994b-457edffb3617-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.083931 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/393edda0-75a5-4315-994b-457edffb3617-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.083942 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrkm9\" (UniqueName: \"kubernetes.io/projected/393edda0-75a5-4315-994b-457edffb3617-kube-api-access-jrkm9\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.448297 4794 generic.go:334] "Generic (PLEG): container finished" podID="393edda0-75a5-4315-994b-457edffb3617" containerID="8caf165f0790f936736c9a49cf6d24449a66420a55296dc82b6fcad9f0c76444" exitCode=0 Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.448355 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hz9x4" event={"ID":"393edda0-75a5-4315-994b-457edffb3617","Type":"ContainerDied","Data":"8caf165f0790f936736c9a49cf6d24449a66420a55296dc82b6fcad9f0c76444"} Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.448381 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hz9x4" event={"ID":"393edda0-75a5-4315-994b-457edffb3617","Type":"ContainerDied","Data":"869bbbf2417072021380f177b1e44b3dc4cdf7e47e22e4e57e5fd643ea947b08"} Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.448397 4794 scope.go:117] "RemoveContainer" containerID="8caf165f0790f936736c9a49cf6d24449a66420a55296dc82b6fcad9f0c76444" Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.448514 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hz9x4" Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.476950 4794 scope.go:117] "RemoveContainer" containerID="0ea29f6405ccf18260753190357cbd0770611251a65fd21d56d2f3b7bceb45e7" Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.481089 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hz9x4"] Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.491932 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hz9x4"] Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.512964 4794 scope.go:117] "RemoveContainer" containerID="432814da10930ec8868a3d9ad28a348562419c00f4759bd47ba2f672306bdf88" Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.556279 4794 scope.go:117] "RemoveContainer" containerID="8caf165f0790f936736c9a49cf6d24449a66420a55296dc82b6fcad9f0c76444" Mar 10 12:13:29 crc kubenswrapper[4794]: E0310 12:13:29.556784 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8caf165f0790f936736c9a49cf6d24449a66420a55296dc82b6fcad9f0c76444\": container with ID starting with 8caf165f0790f936736c9a49cf6d24449a66420a55296dc82b6fcad9f0c76444 not found: ID does not exist" containerID="8caf165f0790f936736c9a49cf6d24449a66420a55296dc82b6fcad9f0c76444" Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.556885 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8caf165f0790f936736c9a49cf6d24449a66420a55296dc82b6fcad9f0c76444"} err="failed to get container status \"8caf165f0790f936736c9a49cf6d24449a66420a55296dc82b6fcad9f0c76444\": rpc error: code = NotFound desc = could not find container \"8caf165f0790f936736c9a49cf6d24449a66420a55296dc82b6fcad9f0c76444\": container with ID starting with 8caf165f0790f936736c9a49cf6d24449a66420a55296dc82b6fcad9f0c76444 not found: ID does not exist" Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.556959 4794 scope.go:117] "RemoveContainer" containerID="0ea29f6405ccf18260753190357cbd0770611251a65fd21d56d2f3b7bceb45e7" Mar 10 12:13:29 crc kubenswrapper[4794]: E0310 12:13:29.557318 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ea29f6405ccf18260753190357cbd0770611251a65fd21d56d2f3b7bceb45e7\": container with ID starting with 0ea29f6405ccf18260753190357cbd0770611251a65fd21d56d2f3b7bceb45e7 not found: ID does not exist" containerID="0ea29f6405ccf18260753190357cbd0770611251a65fd21d56d2f3b7bceb45e7" Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.557417 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea29f6405ccf18260753190357cbd0770611251a65fd21d56d2f3b7bceb45e7"} err="failed to get container status \"0ea29f6405ccf18260753190357cbd0770611251a65fd21d56d2f3b7bceb45e7\": rpc error: code = NotFound desc = could not find container \"0ea29f6405ccf18260753190357cbd0770611251a65fd21d56d2f3b7bceb45e7\": container with ID starting with 0ea29f6405ccf18260753190357cbd0770611251a65fd21d56d2f3b7bceb45e7 not found: ID does not exist" Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.557487 4794 scope.go:117] "RemoveContainer" containerID="432814da10930ec8868a3d9ad28a348562419c00f4759bd47ba2f672306bdf88" Mar 10 12:13:29 crc kubenswrapper[4794]: E0310 12:13:29.557988 4794 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"432814da10930ec8868a3d9ad28a348562419c00f4759bd47ba2f672306bdf88\": container with ID starting with 432814da10930ec8868a3d9ad28a348562419c00f4759bd47ba2f672306bdf88 not found: ID does not exist" containerID="432814da10930ec8868a3d9ad28a348562419c00f4759bd47ba2f672306bdf88" Mar 10 12:13:29 crc kubenswrapper[4794]: I0310 12:13:29.558074 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"432814da10930ec8868a3d9ad28a348562419c00f4759bd47ba2f672306bdf88"} err="failed to get container status \"432814da10930ec8868a3d9ad28a348562419c00f4759bd47ba2f672306bdf88\": rpc error: code = NotFound desc = could not find container \"432814da10930ec8868a3d9ad28a348562419c00f4759bd47ba2f672306bdf88\": container with ID starting with 432814da10930ec8868a3d9ad28a348562419c00f4759bd47ba2f672306bdf88 not found: ID does not exist" Mar 10 12:13:30 crc kubenswrapper[4794]: I0310 12:13:30.027532 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="393edda0-75a5-4315-994b-457edffb3617" path="/var/lib/kubelet/pods/393edda0-75a5-4315-994b-457edffb3617/volumes" Mar 10 12:13:31 crc kubenswrapper[4794]: I0310 12:13:31.473591 4794 generic.go:334] "Generic (PLEG): container finished" podID="6e3a6dda-7237-47e6-b254-4f0b835f83c6" containerID="00a71d57f3b69e418f1b48ead0c5253571a6930f0a8af858f9661efa062075e0" exitCode=0 Mar 10 12:13:31 crc kubenswrapper[4794]: I0310 12:13:31.473717 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" event={"ID":"6e3a6dda-7237-47e6-b254-4f0b835f83c6","Type":"ContainerDied","Data":"00a71d57f3b69e418f1b48ead0c5253571a6930f0a8af858f9661efa062075e0"} Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.046304 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.184510 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-1\") pod \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.184623 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-migration-ssh-key-0\") pod \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.184648 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h5qj\" (UniqueName: \"kubernetes.io/projected/6e3a6dda-7237-47e6-b254-4f0b835f83c6-kube-api-access-7h5qj\") pod \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.184700 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-ceph\") pod \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.184758 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cells-global-config-0\") pod \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.184782 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cells-global-config-1\") pod \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.184802 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-2\") pod \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.184835 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-3\") pod \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.184873 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-migration-ssh-key-1\") pod \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.184918 4794 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-inventory\") pod \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.184948 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-combined-ca-bundle\") pod \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.184984 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-ssh-key-openstack-cell1\") pod \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.185024 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-0\") pod \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\" (UID: \"6e3a6dda-7237-47e6-b254-4f0b835f83c6\") " Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.190907 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e3a6dda-7237-47e6-b254-4f0b835f83c6-kube-api-access-7h5qj" (OuterVolumeSpecName: "kube-api-access-7h5qj") pod "6e3a6dda-7237-47e6-b254-4f0b835f83c6" (UID: "6e3a6dda-7237-47e6-b254-4f0b835f83c6"). InnerVolumeSpecName "kube-api-access-7h5qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.190980 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "6e3a6dda-7237-47e6-b254-4f0b835f83c6" (UID: "6e3a6dda-7237-47e6-b254-4f0b835f83c6"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.191990 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-ceph" (OuterVolumeSpecName: "ceph") pod "6e3a6dda-7237-47e6-b254-4f0b835f83c6" (UID: "6e3a6dda-7237-47e6-b254-4f0b835f83c6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.220126 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "6e3a6dda-7237-47e6-b254-4f0b835f83c6" (UID: "6e3a6dda-7237-47e6-b254-4f0b835f83c6"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.220510 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "6e3a6dda-7237-47e6-b254-4f0b835f83c6" (UID: "6e3a6dda-7237-47e6-b254-4f0b835f83c6"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.220489 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "6e3a6dda-7237-47e6-b254-4f0b835f83c6" (UID: "6e3a6dda-7237-47e6-b254-4f0b835f83c6"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.226085 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-inventory" (OuterVolumeSpecName: "inventory") pod "6e3a6dda-7237-47e6-b254-4f0b835f83c6" (UID: "6e3a6dda-7237-47e6-b254-4f0b835f83c6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.226650 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "6e3a6dda-7237-47e6-b254-4f0b835f83c6" (UID: "6e3a6dda-7237-47e6-b254-4f0b835f83c6"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.227800 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "6e3a6dda-7237-47e6-b254-4f0b835f83c6" (UID: "6e3a6dda-7237-47e6-b254-4f0b835f83c6"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.236869 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "6e3a6dda-7237-47e6-b254-4f0b835f83c6" (UID: "6e3a6dda-7237-47e6-b254-4f0b835f83c6"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.239233 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "6e3a6dda-7237-47e6-b254-4f0b835f83c6" (UID: "6e3a6dda-7237-47e6-b254-4f0b835f83c6"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.244044 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "6e3a6dda-7237-47e6-b254-4f0b835f83c6" (UID: "6e3a6dda-7237-47e6-b254-4f0b835f83c6"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.248774 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "6e3a6dda-7237-47e6-b254-4f0b835f83c6" (UID: "6e3a6dda-7237-47e6-b254-4f0b835f83c6"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.288755 4794 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.288850 4794 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.288871 4794 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.288880 4794 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.288889 4794 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.288899 4794 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.288907 4794 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.288916 4794 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.288926 4794 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:33 crc 
kubenswrapper[4794]: I0310 12:13:33.288936 4794 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.288944 4794 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.288992 4794 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6e3a6dda-7237-47e6-b254-4f0b835f83c6-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.289001 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h5qj\" (UniqueName: \"kubernetes.io/projected/6e3a6dda-7237-47e6-b254-4f0b835f83c6-kube-api-access-7h5qj\") on node \"crc\" DevicePath \"\"" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.503093 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" event={"ID":"6e3a6dda-7237-47e6-b254-4f0b835f83c6","Type":"ContainerDied","Data":"3d9e0d8925bd96d1c2e5b534312fd00dc734c63f5528475cb6ffacb6ebcb2fa8"} Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.503577 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d9e0d8925bd96d1c2e5b534312fd00dc734c63f5528475cb6ffacb6ebcb2fa8" Mar 10 12:13:33 crc kubenswrapper[4794]: I0310 12:13:33.503204 4794 util.go:48] "No ready sandbox for pod can be found. 
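Annotation: the teardown of the nova-cell1 pod above walks each of its volumes through the same three log stages: reconciler "operationExecutor.UnmountVolume started", operation generator "UnmountVolume.TearDown succeeded", then reconciler "Volume detached ... DevicePath \"\"". When auditing a dump like this, comparing stage counts is a quick completeness check (a sketch; the stage strings are copied from the log):

```python
from collections import Counter

STAGES = (
    'operationExecutor.UnmountVolume started',  # reconciler queues the unmount
    'UnmountVolume.TearDown succeeded',         # operation executor finished
    'Volume detached for volume',               # reconciler state cleared
)

def audit(lines):
    # Count how many lines reached each teardown stage; the three counts
    # match when every queued unmount ran to completion.
    counts = Counter()
    for line in lines:
        for stage in STAGES:
            if stage in line:
                counts[stage] += 1
    return counts

# For the nova-cell1 teardown above, each stage counts 13 volumes.
```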
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh" Mar 10 12:13:52 crc kubenswrapper[4794]: I0310 12:13:52.967738 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:13:52 crc kubenswrapper[4794]: I0310 12:13:52.968361 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:13:52 crc kubenswrapper[4794]: I0310 12:13:52.968414 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 12:13:52 crc kubenswrapper[4794]: I0310 12:13:52.969372 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 12:13:52 crc kubenswrapper[4794]: I0310 12:13:52.969443 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" gracePeriod=600 Mar 10 12:13:53 crc kubenswrapper[4794]: E0310 12:13:53.113761 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:13:53 crc kubenswrapper[4794]: I0310 12:13:53.754457 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" exitCode=0 Mar 10 12:13:53 crc kubenswrapper[4794]: I0310 12:13:53.754532 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972"} Mar 10 12:13:53 crc kubenswrapper[4794]: I0310 12:13:53.754891 4794 scope.go:117] "RemoveContainer" containerID="f3eaf645c92d5074cb0924e573d1878a0115051de261d931bccf56e275fcfefa" Mar 10 12:13:53 crc kubenswrapper[4794]: I0310 12:13:53.755690 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:13:53 crc kubenswrapper[4794]: E0310 12:13:53.756134 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.165519 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552414-dczzv"] Mar 10 12:14:00 crc kubenswrapper[4794]: E0310 12:14:00.167141 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3a6dda-7237-47e6-b254-4f0b835f83c6" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.167174 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3a6dda-7237-47e6-b254-4f0b835f83c6" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 10 12:14:00 crc kubenswrapper[4794]: E0310 12:14:00.167254 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393edda0-75a5-4315-994b-457edffb3617" containerName="extract-utilities" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.167271 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="393edda0-75a5-4315-994b-457edffb3617" containerName="extract-utilities" Mar 10 12:14:00 crc kubenswrapper[4794]: E0310 12:14:00.167363 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393edda0-75a5-4315-994b-457edffb3617" containerName="extract-content" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.167381 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="393edda0-75a5-4315-994b-457edffb3617" containerName="extract-content" Mar 10 12:14:00 crc kubenswrapper[4794]: E0310 12:14:00.167412 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393edda0-75a5-4315-994b-457edffb3617" containerName="registry-server" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.167428 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="393edda0-75a5-4315-994b-457edffb3617" containerName="registry-server" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.167902 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="393edda0-75a5-4315-994b-457edffb3617" containerName="registry-server" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.167955 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3a6dda-7237-47e6-b254-4f0b835f83c6" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.169812 4794 util.go:30] "No sandbox for pod can be found. 
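Annotation: the machine-config-daemon entries above show CrashLoopBackOff at its ceiling, "back-off 5m0s restarting failed container". The kubelet's restart backoff starts small and doubles per crash up to a five-minute cap, which is why the identical message repeats at 12:14:09, 12:14:20, and 12:14:32 while the sync loop keeps re-evaluating the pod. A sketch of that schedule (the 10s initial delay and 2x factor are the kubelet defaults to the best of my knowledge; treat them as assumptions):

```python
def crashloop_backoff(initial=10.0, factor=2.0, cap=300.0):
    """Yield successive restart delays in seconds, capped at the 5m ceiling."""
    delay = initial
    while True:
        yield min(delay, cap)
        delay *= factor

gen = crashloop_backoff()
print([int(next(gen)) for _ in range(7)])  # [10, 20, 40, 80, 160, 300, 300]
```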
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552414-dczzv" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.176081 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.176189 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.176478 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.185152 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552414-dczzv"] Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.237288 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r7jz\" (UniqueName: \"kubernetes.io/projected/64f7981e-696b-4a23-9194-d2a0326bbd1f-kube-api-access-2r7jz\") pod \"auto-csr-approver-29552414-dczzv\" (UID: \"64f7981e-696b-4a23-9194-d2a0326bbd1f\") " pod="openshift-infra/auto-csr-approver-29552414-dczzv" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.340045 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r7jz\" (UniqueName: \"kubernetes.io/projected/64f7981e-696b-4a23-9194-d2a0326bbd1f-kube-api-access-2r7jz\") pod \"auto-csr-approver-29552414-dczzv\" (UID: \"64f7981e-696b-4a23-9194-d2a0326bbd1f\") " pod="openshift-infra/auto-csr-approver-29552414-dczzv" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.380985 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r7jz\" (UniqueName: \"kubernetes.io/projected/64f7981e-696b-4a23-9194-d2a0326bbd1f-kube-api-access-2r7jz\") pod \"auto-csr-approver-29552414-dczzv\" (UID: \"64f7981e-696b-4a23-9194-d2a0326bbd1f\") " pod="openshift-infra/auto-csr-approver-29552414-dczzv" Mar 10 12:14:00 crc kubenswrapper[4794]: I0310 12:14:00.514155 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552414-dczzv" Mar 10 12:14:01 crc kubenswrapper[4794]: I0310 12:14:01.021955 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552414-dczzv"] Mar 10 12:14:01 crc kubenswrapper[4794]: I0310 12:14:01.866883 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552414-dczzv" event={"ID":"64f7981e-696b-4a23-9194-d2a0326bbd1f","Type":"ContainerStarted","Data":"7f3a7913502e810122a2e0dd9f2e294be24cb32c46fae2503893baad0714d3d4"} Mar 10 12:14:02 crc kubenswrapper[4794]: I0310 12:14:02.877325 4794 generic.go:334] "Generic (PLEG): container finished" podID="64f7981e-696b-4a23-9194-d2a0326bbd1f" containerID="dd5facc5566471df5a07e3b7825f8e7c5582834d603619508d1f096d74323f98" exitCode=0 Mar 10 12:14:02 crc kubenswrapper[4794]: I0310 12:14:02.877424 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552414-dczzv" event={"ID":"64f7981e-696b-4a23-9194-d2a0326bbd1f","Type":"ContainerDied","Data":"dd5facc5566471df5a07e3b7825f8e7c5582834d603619508d1f096d74323f98"} Mar 10 12:14:04 crc kubenswrapper[4794]: I0310 12:14:04.302305 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552414-dczzv" Mar 10 12:14:04 crc kubenswrapper[4794]: I0310 12:14:04.440259 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r7jz\" (UniqueName: \"kubernetes.io/projected/64f7981e-696b-4a23-9194-d2a0326bbd1f-kube-api-access-2r7jz\") pod \"64f7981e-696b-4a23-9194-d2a0326bbd1f\" (UID: \"64f7981e-696b-4a23-9194-d2a0326bbd1f\") " Mar 10 12:14:04 crc kubenswrapper[4794]: I0310 12:14:04.446133 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f7981e-696b-4a23-9194-d2a0326bbd1f-kube-api-access-2r7jz" (OuterVolumeSpecName: "kube-api-access-2r7jz") pod "64f7981e-696b-4a23-9194-d2a0326bbd1f" (UID: "64f7981e-696b-4a23-9194-d2a0326bbd1f"). InnerVolumeSpecName "kube-api-access-2r7jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:14:04 crc kubenswrapper[4794]: I0310 12:14:04.542721 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r7jz\" (UniqueName: \"kubernetes.io/projected/64f7981e-696b-4a23-9194-d2a0326bbd1f-kube-api-access-2r7jz\") on node \"crc\" DevicePath \"\"" Mar 10 12:14:04 crc kubenswrapper[4794]: I0310 12:14:04.903303 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552414-dczzv" event={"ID":"64f7981e-696b-4a23-9194-d2a0326bbd1f","Type":"ContainerDied","Data":"7f3a7913502e810122a2e0dd9f2e294be24cb32c46fae2503893baad0714d3d4"} Mar 10 12:14:04 crc kubenswrapper[4794]: I0310 12:14:04.903353 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f3a7913502e810122a2e0dd9f2e294be24cb32c46fae2503893baad0714d3d4" Mar 10 12:14:04 crc kubenswrapper[4794]: I0310 12:14:04.903374 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552414-dczzv" Mar 10 12:14:05 crc kubenswrapper[4794]: I0310 12:14:05.389820 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552408-7zzg5"] Mar 10 12:14:05 crc kubenswrapper[4794]: I0310 12:14:05.407670 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552408-7zzg5"] Mar 10 12:14:06 crc kubenswrapper[4794]: I0310 12:14:06.020800 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="305ae941-1e89-4b65-ab6d-483a43528714" path="/var/lib/kubelet/pods/305ae941-1e89-4b65-ab6d-483a43528714/volumes" Mar 10 12:14:09 crc kubenswrapper[4794]: I0310 12:14:08.999744 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:14:09 crc kubenswrapper[4794]: E0310 12:14:09.000383 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:14:20 crc kubenswrapper[4794]: I0310 12:14:20.004870 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:14:20 crc kubenswrapper[4794]: E0310 12:14:20.005706 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:14:32 crc kubenswrapper[4794]: I0310 12:14:32.005815 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:14:32 crc kubenswrapper[4794]: E0310 12:14:32.006766 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:14:38 crc kubenswrapper[4794]: I0310 12:14:38.977944 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qp65v"] Mar 10 12:14:38 crc kubenswrapper[4794]: E0310 12:14:38.979613 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f7981e-696b-4a23-9194-d2a0326bbd1f" containerName="oc" Mar 10 12:14:38 crc kubenswrapper[4794]: I0310 12:14:38.979637 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f7981e-696b-4a23-9194-d2a0326bbd1f" containerName="oc" Mar 10 12:14:38 crc kubenswrapper[4794]: I0310 12:14:38.980379 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="64f7981e-696b-4a23-9194-d2a0326bbd1f" containerName="oc" Mar 10 12:14:38 crc kubenswrapper[4794]: I0310 12:14:38.982324 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:14:38 crc kubenswrapper[4794]: I0310 12:14:38.995304 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qp65v"] Mar 10 12:14:39 crc kubenswrapper[4794]: I0310 12:14:39.089027 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-catalog-content\") pod \"redhat-operators-qp65v\" (UID: \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\") " pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:14:39 crc kubenswrapper[4794]: I0310 12:14:39.089469 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-utilities\") pod \"redhat-operators-qp65v\" (UID: \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\") " pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:14:39 crc kubenswrapper[4794]: I0310 12:14:39.089497 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fknrz\" (UniqueName: \"kubernetes.io/projected/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-kube-api-access-fknrz\") pod \"redhat-operators-qp65v\" (UID: \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\") " pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:14:39 crc kubenswrapper[4794]: I0310 12:14:39.191584 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-catalog-content\") pod \"redhat-operators-qp65v\" (UID: \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\") " pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:14:39 crc kubenswrapper[4794]: I0310 12:14:39.191718 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-utilities\") pod \"redhat-operators-qp65v\" (UID: \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\") " pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:14:39 crc kubenswrapper[4794]: I0310 12:14:39.191740 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fknrz\" (UniqueName: \"kubernetes.io/projected/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-kube-api-access-fknrz\") pod \"redhat-operators-qp65v\" (UID: \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\") " pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:14:39 crc kubenswrapper[4794]: I0310 12:14:39.192402 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-utilities\") pod \"redhat-operators-qp65v\" (UID: \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\") " pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:14:39 crc kubenswrapper[4794]: I0310 12:14:39.192415 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-catalog-content\") pod \"redhat-operators-qp65v\" (UID: \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\") " pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:14:39 crc kubenswrapper[4794]: I0310 12:14:39.209556 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fknrz\" (UniqueName: \"kubernetes.io/projected/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-kube-api-access-fknrz\") pod \"redhat-operators-qp65v\" (UID: \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\") " pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:14:39 crc kubenswrapper[4794]: I0310 12:14:39.316034 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:14:39 crc kubenswrapper[4794]: I0310 12:14:39.842067 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qp65v"] Mar 10 12:14:40 crc kubenswrapper[4794]: I0310 12:14:40.284366 4794 generic.go:334] "Generic (PLEG): container finished" podID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" containerID="cc7920bf8bc72189c85e82808f04126a56d2c796019fc2af2bcefe08d3582480" exitCode=0 Mar 10 12:14:40 crc kubenswrapper[4794]: I0310 12:14:40.284645 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qp65v" event={"ID":"c6ce4f59-4d7c-4415-ae83-f4661569b8ec","Type":"ContainerDied","Data":"cc7920bf8bc72189c85e82808f04126a56d2c796019fc2af2bcefe08d3582480"} Mar 10 12:14:40 crc kubenswrapper[4794]: I0310 12:14:40.284673 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qp65v" event={"ID":"c6ce4f59-4d7c-4415-ae83-f4661569b8ec","Type":"ContainerStarted","Data":"80fcc140c8afdf14eaf1524acda2b1de442b15f9a294b8b6b8f4da775e357b83"} Mar 10 12:14:40 crc kubenswrapper[4794]: I0310 12:14:40.286514 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 12:14:42 crc kubenswrapper[4794]: I0310 12:14:42.308695 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qp65v" event={"ID":"c6ce4f59-4d7c-4415-ae83-f4661569b8ec","Type":"ContainerStarted","Data":"c29804ddfe230550b6afc4541c8a2a68d739727a8aea369e5e81aa63102a64b7"} Mar 10 12:14:44 crc kubenswrapper[4794]: I0310 12:14:44.999079 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:14:45 crc kubenswrapper[4794]: E0310 12:14:44.999863 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:14:46 crc kubenswrapper[4794]: I0310 12:14:46.353461 4794 generic.go:334] "Generic (PLEG): container finished" podID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" containerID="c29804ddfe230550b6afc4541c8a2a68d739727a8aea369e5e81aa63102a64b7" exitCode=0 Mar 10 12:14:46 crc kubenswrapper[4794]: I0310 12:14:46.353766 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qp65v" event={"ID":"c6ce4f59-4d7c-4415-ae83-f4661569b8ec","Type":"ContainerDied","Data":"c29804ddfe230550b6afc4541c8a2a68d739727a8aea369e5e81aa63102a64b7"} Mar 10 12:14:47 crc kubenswrapper[4794]: I0310 12:14:47.363664 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qp65v" 
event={"ID":"c6ce4f59-4d7c-4415-ae83-f4661569b8ec","Type":"ContainerStarted","Data":"3441005ac13170075b08269983e00bdf78ef2b13611a9facc221119bac705234"} Mar 10 12:14:47 crc kubenswrapper[4794]: I0310 12:14:47.388958 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qp65v" podStartSLOduration=2.817823584 podStartE2EDuration="9.388938804s" podCreationTimestamp="2026-03-10 12:14:38 +0000 UTC" firstStartedPulling="2026-03-10 12:14:40.286306552 +0000 UTC m=+9029.042477370" lastFinishedPulling="2026-03-10 12:14:46.857421772 +0000 UTC m=+9035.613592590" observedRunningTime="2026-03-10 12:14:47.385641502 +0000 UTC m=+9036.141812341" watchObservedRunningTime="2026-03-10 12:14:47.388938804 +0000 UTC m=+9036.145109622" Mar 10 12:14:49 crc kubenswrapper[4794]: I0310 12:14:49.316615 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:14:49 crc kubenswrapper[4794]: I0310 12:14:49.317022 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:14:50 crc kubenswrapper[4794]: I0310 12:14:50.367674 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qp65v" podUID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" containerName="registry-server" probeResult="failure" output=< Mar 10 12:14:50 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 12:14:50 crc kubenswrapper[4794]: > Mar 10 12:14:54 crc kubenswrapper[4794]: I0310 12:14:54.993165 4794 scope.go:117] "RemoveContainer" containerID="761e0a72d89de4469b7f52862e87ab6241292aedfa9c2cec54c5d67da4e5e929" Mar 10 12:14:55 crc kubenswrapper[4794]: I0310 12:14:55.999603 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:14:56 crc kubenswrapper[4794]: E0310 12:14:56.000017 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.143053 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh"] Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.145890 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.148284 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.148455 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.165280 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh"] Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.256918 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfzq7\" (UniqueName: \"kubernetes.io/projected/5057ce22-2e46-41e5-9980-6fda0f35ede9-kube-api-access-nfzq7\") pod \"collect-profiles-29552415-klgfh\" (UID: \"5057ce22-2e46-41e5-9980-6fda0f35ede9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.257040 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5057ce22-2e46-41e5-9980-6fda0f35ede9-config-volume\") pod \"collect-profiles-29552415-klgfh\" (UID: \"5057ce22-2e46-41e5-9980-6fda0f35ede9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.257121 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5057ce22-2e46-41e5-9980-6fda0f35ede9-secret-volume\") pod \"collect-profiles-29552415-klgfh\" (UID: \"5057ce22-2e46-41e5-9980-6fda0f35ede9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.359585 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfzq7\" (UniqueName: \"kubernetes.io/projected/5057ce22-2e46-41e5-9980-6fda0f35ede9-kube-api-access-nfzq7\") pod \"collect-profiles-29552415-klgfh\" (UID: \"5057ce22-2e46-41e5-9980-6fda0f35ede9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.359706 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5057ce22-2e46-41e5-9980-6fda0f35ede9-config-volume\") pod \"collect-profiles-29552415-klgfh\" (UID: \"5057ce22-2e46-41e5-9980-6fda0f35ede9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.359793 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5057ce22-2e46-41e5-9980-6fda0f35ede9-secret-volume\") pod \"collect-profiles-29552415-klgfh\" (UID: \"5057ce22-2e46-41e5-9980-6fda0f35ede9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.360859 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5057ce22-2e46-41e5-9980-6fda0f35ede9-config-volume\") pod 
\"collect-profiles-29552415-klgfh\" (UID: \"5057ce22-2e46-41e5-9980-6fda0f35ede9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.362129 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qp65v" podUID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" containerName="registry-server" probeResult="failure" output=< Mar 10 12:15:00 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 12:15:00 crc kubenswrapper[4794]: > Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.379069 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5057ce22-2e46-41e5-9980-6fda0f35ede9-secret-volume\") pod \"collect-profiles-29552415-klgfh\" (UID: \"5057ce22-2e46-41e5-9980-6fda0f35ede9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.382517 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfzq7\" (UniqueName: \"kubernetes.io/projected/5057ce22-2e46-41e5-9980-6fda0f35ede9-kube-api-access-nfzq7\") pod \"collect-profiles-29552415-klgfh\" (UID: \"5057ce22-2e46-41e5-9980-6fda0f35ede9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.467556 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" Mar 10 12:15:00 crc kubenswrapper[4794]: I0310 12:15:00.919582 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh"] Mar 10 12:15:00 crc kubenswrapper[4794]: W0310 12:15:00.920729 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5057ce22_2e46_41e5_9980_6fda0f35ede9.slice/crio-3af43a3e878a7e5f1e82936a19ac4c6d7270c6bd65bc183b8305685bb3faeb11 WatchSource:0}: Error finding container 3af43a3e878a7e5f1e82936a19ac4c6d7270c6bd65bc183b8305685bb3faeb11: Status 404 returned error can't find the container with id 3af43a3e878a7e5f1e82936a19ac4c6d7270c6bd65bc183b8305685bb3faeb11 Mar 10 12:15:01 crc kubenswrapper[4794]: I0310 12:15:01.502540 4794 generic.go:334] "Generic (PLEG): container finished" podID="5057ce22-2e46-41e5-9980-6fda0f35ede9" containerID="c87c52226b4d19445115149a2ddbc54a499fce057e927dbde9568ceee8df609d" exitCode=0 Mar 10 12:15:01 crc kubenswrapper[4794]: I0310 12:15:01.502630 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" event={"ID":"5057ce22-2e46-41e5-9980-6fda0f35ede9","Type":"ContainerDied","Data":"c87c52226b4d19445115149a2ddbc54a499fce057e927dbde9568ceee8df609d"} Mar 10 12:15:01 crc kubenswrapper[4794]: I0310 12:15:01.502807 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" event={"ID":"5057ce22-2e46-41e5-9980-6fda0f35ede9","Type":"ContainerStarted","Data":"3af43a3e878a7e5f1e82936a19ac4c6d7270c6bd65bc183b8305685bb3faeb11"} Mar 10 12:15:02 crc kubenswrapper[4794]: I0310 12:15:02.933827 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" Mar 10 12:15:03 crc kubenswrapper[4794]: I0310 12:15:03.014302 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfzq7\" (UniqueName: \"kubernetes.io/projected/5057ce22-2e46-41e5-9980-6fda0f35ede9-kube-api-access-nfzq7\") pod \"5057ce22-2e46-41e5-9980-6fda0f35ede9\" (UID: \"5057ce22-2e46-41e5-9980-6fda0f35ede9\") " Mar 10 12:15:03 crc kubenswrapper[4794]: I0310 12:15:03.014694 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5057ce22-2e46-41e5-9980-6fda0f35ede9-secret-volume\") pod \"5057ce22-2e46-41e5-9980-6fda0f35ede9\" (UID: \"5057ce22-2e46-41e5-9980-6fda0f35ede9\") " Mar 10 12:15:03 crc kubenswrapper[4794]: I0310 12:15:03.014739 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5057ce22-2e46-41e5-9980-6fda0f35ede9-config-volume\") pod \"5057ce22-2e46-41e5-9980-6fda0f35ede9\" (UID: \"5057ce22-2e46-41e5-9980-6fda0f35ede9\") " Mar 10 12:15:03 crc kubenswrapper[4794]: I0310 12:15:03.015672 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5057ce22-2e46-41e5-9980-6fda0f35ede9-config-volume" (OuterVolumeSpecName: "config-volume") pod "5057ce22-2e46-41e5-9980-6fda0f35ede9" (UID: "5057ce22-2e46-41e5-9980-6fda0f35ede9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 12:15:03 crc kubenswrapper[4794]: I0310 12:15:03.021175 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5057ce22-2e46-41e5-9980-6fda0f35ede9-kube-api-access-nfzq7" (OuterVolumeSpecName: "kube-api-access-nfzq7") pod "5057ce22-2e46-41e5-9980-6fda0f35ede9" (UID: "5057ce22-2e46-41e5-9980-6fda0f35ede9"). InnerVolumeSpecName "kube-api-access-nfzq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:15:03 crc kubenswrapper[4794]: I0310 12:15:03.034901 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5057ce22-2e46-41e5-9980-6fda0f35ede9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5057ce22-2e46-41e5-9980-6fda0f35ede9" (UID: "5057ce22-2e46-41e5-9980-6fda0f35ede9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:15:03 crc kubenswrapper[4794]: I0310 12:15:03.117622 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5057ce22-2e46-41e5-9980-6fda0f35ede9-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 12:15:03 crc kubenswrapper[4794]: I0310 12:15:03.117677 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfzq7\" (UniqueName: \"kubernetes.io/projected/5057ce22-2e46-41e5-9980-6fda0f35ede9-kube-api-access-nfzq7\") on node \"crc\" DevicePath \"\"" Mar 10 12:15:03 crc kubenswrapper[4794]: I0310 12:15:03.117693 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5057ce22-2e46-41e5-9980-6fda0f35ede9-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 12:15:03 crc kubenswrapper[4794]: I0310 12:15:03.525394 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" event={"ID":"5057ce22-2e46-41e5-9980-6fda0f35ede9","Type":"ContainerDied","Data":"3af43a3e878a7e5f1e82936a19ac4c6d7270c6bd65bc183b8305685bb3faeb11"} Mar 10 12:15:03 crc kubenswrapper[4794]: I0310 12:15:03.525670 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3af43a3e878a7e5f1e82936a19ac4c6d7270c6bd65bc183b8305685bb3faeb11" Mar 10 12:15:03 crc kubenswrapper[4794]: I0310 12:15:03.525486 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552415-klgfh" Mar 10 12:15:04 crc kubenswrapper[4794]: I0310 12:15:04.033117 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k"] Mar 10 12:15:04 crc kubenswrapper[4794]: I0310 12:15:04.043042 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552370-6qw5k"] Mar 10 12:15:06 crc kubenswrapper[4794]: I0310 12:15:06.023809 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24048515-cbfd-4a1a-a438-285f5e399cdf" path="/var/lib/kubelet/pods/24048515-cbfd-4a1a-a438-285f5e399cdf/volumes" Mar 10 12:15:10 crc kubenswrapper[4794]: I0310 12:15:10.375505 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qp65v" podUID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" containerName="registry-server" probeResult="failure" output=< Mar 10 12:15:10 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 12:15:10 crc kubenswrapper[4794]: > Mar 10 12:15:10 crc kubenswrapper[4794]: I0310 12:15:10.999709 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:15:10 crc kubenswrapper[4794]: E0310 12:15:10.999967 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:15:19 crc kubenswrapper[4794]: I0310 12:15:19.364059 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:15:19 crc kubenswrapper[4794]: I0310 12:15:19.413298 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:15:19 crc kubenswrapper[4794]: I0310 12:15:19.611692 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qp65v"] Mar 10 12:15:20 crc kubenswrapper[4794]: I0310 12:15:20.722701 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qp65v" podUID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" containerName="registry-server" containerID="cri-o://3441005ac13170075b08269983e00bdf78ef2b13611a9facc221119bac705234" gracePeriod=2 Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.363135 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.543065 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-utilities\") pod \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\" (UID: \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\") " Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.543574 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fknrz\" (UniqueName: \"kubernetes.io/projected/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-kube-api-access-fknrz\") pod \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\" (UID: \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\") " Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.543818 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-catalog-content\") pod \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\" (UID: \"c6ce4f59-4d7c-4415-ae83-f4661569b8ec\") " Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.544091 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-utilities" (OuterVolumeSpecName: "utilities") pod "c6ce4f59-4d7c-4415-ae83-f4661569b8ec" (UID: "c6ce4f59-4d7c-4415-ae83-f4661569b8ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.544720 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.559519 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-kube-api-access-fknrz" (OuterVolumeSpecName: "kube-api-access-fknrz") pod "c6ce4f59-4d7c-4415-ae83-f4661569b8ec" (UID: "c6ce4f59-4d7c-4415-ae83-f4661569b8ec"). InnerVolumeSpecName "kube-api-access-fknrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.646431 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fknrz\" (UniqueName: \"kubernetes.io/projected/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-kube-api-access-fknrz\") on node \"crc\" DevicePath \"\"" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.669805 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6ce4f59-4d7c-4415-ae83-f4661569b8ec" (UID: "c6ce4f59-4d7c-4415-ae83-f4661569b8ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.739749 4794 generic.go:334] "Generic (PLEG): container finished" podID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" containerID="3441005ac13170075b08269983e00bdf78ef2b13611a9facc221119bac705234" exitCode=0 Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.739806 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qp65v" event={"ID":"c6ce4f59-4d7c-4415-ae83-f4661569b8ec","Type":"ContainerDied","Data":"3441005ac13170075b08269983e00bdf78ef2b13611a9facc221119bac705234"} Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.739833 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qp65v" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.739857 4794 scope.go:117] "RemoveContainer" containerID="3441005ac13170075b08269983e00bdf78ef2b13611a9facc221119bac705234" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.739844 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qp65v" event={"ID":"c6ce4f59-4d7c-4415-ae83-f4661569b8ec","Type":"ContainerDied","Data":"80fcc140c8afdf14eaf1524acda2b1de442b15f9a294b8b6b8f4da775e357b83"} Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.748383 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ce4f59-4d7c-4415-ae83-f4661569b8ec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.771857 4794 scope.go:117] "RemoveContainer" containerID="c29804ddfe230550b6afc4541c8a2a68d739727a8aea369e5e81aa63102a64b7" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.783811 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qp65v"] Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.794226 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qp65v"] Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.806227 4794 scope.go:117] "RemoveContainer" containerID="cc7920bf8bc72189c85e82808f04126a56d2c796019fc2af2bcefe08d3582480" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.846310 4794 scope.go:117] "RemoveContainer" containerID="3441005ac13170075b08269983e00bdf78ef2b13611a9facc221119bac705234" Mar 10 12:15:21 crc kubenswrapper[4794]: E0310 12:15:21.846753 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3441005ac13170075b08269983e00bdf78ef2b13611a9facc221119bac705234\": container with ID starting with 3441005ac13170075b08269983e00bdf78ef2b13611a9facc221119bac705234 
not found: ID does not exist" containerID="3441005ac13170075b08269983e00bdf78ef2b13611a9facc221119bac705234" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.846791 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3441005ac13170075b08269983e00bdf78ef2b13611a9facc221119bac705234"} err="failed to get container status \"3441005ac13170075b08269983e00bdf78ef2b13611a9facc221119bac705234\": rpc error: code = NotFound desc = could not find container \"3441005ac13170075b08269983e00bdf78ef2b13611a9facc221119bac705234\": container with ID starting with 3441005ac13170075b08269983e00bdf78ef2b13611a9facc221119bac705234 not found: ID does not exist" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.846824 4794 scope.go:117] "RemoveContainer" containerID="c29804ddfe230550b6afc4541c8a2a68d739727a8aea369e5e81aa63102a64b7" Mar 10 12:15:21 crc kubenswrapper[4794]: E0310 12:15:21.847192 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c29804ddfe230550b6afc4541c8a2a68d739727a8aea369e5e81aa63102a64b7\": container with ID starting with c29804ddfe230550b6afc4541c8a2a68d739727a8aea369e5e81aa63102a64b7 not found: ID does not exist" containerID="c29804ddfe230550b6afc4541c8a2a68d739727a8aea369e5e81aa63102a64b7" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.847218 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c29804ddfe230550b6afc4541c8a2a68d739727a8aea369e5e81aa63102a64b7"} err="failed to get container status \"c29804ddfe230550b6afc4541c8a2a68d739727a8aea369e5e81aa63102a64b7\": rpc error: code = NotFound desc = could not find container \"c29804ddfe230550b6afc4541c8a2a68d739727a8aea369e5e81aa63102a64b7\": container with ID starting with c29804ddfe230550b6afc4541c8a2a68d739727a8aea369e5e81aa63102a64b7 not found: ID does not exist" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.847235 4794 scope.go:117] "RemoveContainer" containerID="cc7920bf8bc72189c85e82808f04126a56d2c796019fc2af2bcefe08d3582480" Mar 10 12:15:21 crc kubenswrapper[4794]: E0310 12:15:21.847494 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc7920bf8bc72189c85e82808f04126a56d2c796019fc2af2bcefe08d3582480\": container with ID starting with cc7920bf8bc72189c85e82808f04126a56d2c796019fc2af2bcefe08d3582480 not found: ID does not exist" containerID="cc7920bf8bc72189c85e82808f04126a56d2c796019fc2af2bcefe08d3582480" Mar 10 12:15:21 crc kubenswrapper[4794]: I0310 12:15:21.847516 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc7920bf8bc72189c85e82808f04126a56d2c796019fc2af2bcefe08d3582480"} err="failed to get container status \"cc7920bf8bc72189c85e82808f04126a56d2c796019fc2af2bcefe08d3582480\": rpc error: code = NotFound desc = could not find container \"cc7920bf8bc72189c85e82808f04126a56d2c796019fc2af2bcefe08d3582480\": container with ID starting with cc7920bf8bc72189c85e82808f04126a56d2c796019fc2af2bcefe08d3582480 not found: ID does not exist" Mar 10 12:15:22 crc kubenswrapper[4794]: I0310 12:15:22.014135 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" path="/var/lib/kubelet/pods/c6ce4f59-4d7c-4415-ae83-f4661569b8ec/volumes" Mar 10 12:15:25 crc kubenswrapper[4794]: I0310 12:15:25.998749 4794 scope.go:117] "RemoveContainer" 
containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:15:26 crc kubenswrapper[4794]: E0310 12:15:25.999592 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:15:40 crc kubenswrapper[4794]: I0310 12:15:40.001157 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:15:40 crc kubenswrapper[4794]: E0310 12:15:40.002030 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:15:55 crc kubenswrapper[4794]: I0310 12:15:55.000259 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:15:55 crc kubenswrapper[4794]: E0310 12:15:55.001023 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:15:55 crc kubenswrapper[4794]: I0310 12:15:55.087093 4794 scope.go:117] "RemoveContainer" containerID="0798d116e17d3efc93ee1b26ff856e25907d110be12a521a46e06544a669bbb8" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.210182 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552416-hzd82"] Mar 10 12:16:00 crc kubenswrapper[4794]: E0310 12:16:00.211502 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5057ce22-2e46-41e5-9980-6fda0f35ede9" containerName="collect-profiles" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.211526 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5057ce22-2e46-41e5-9980-6fda0f35ede9" containerName="collect-profiles" Mar 10 12:16:00 crc kubenswrapper[4794]: E0310 12:16:00.211565 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" containerName="registry-server" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.211577 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" containerName="registry-server" Mar 10 12:16:00 crc kubenswrapper[4794]: E0310 12:16:00.211613 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" containerName="extract-utilities" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.211626 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" containerName="extract-utilities" Mar 10 12:16:00 crc kubenswrapper[4794]: E0310 12:16:00.211678 4794 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" containerName="extract-content" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.211690 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" containerName="extract-content" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.212242 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ce4f59-4d7c-4415-ae83-f4661569b8ec" containerName="registry-server" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.212266 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5057ce22-2e46-41e5-9980-6fda0f35ede9" containerName="collect-profiles" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.213510 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552416-hzd82" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.216805 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.216856 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.218112 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.224600 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552416-hzd82"] Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.297322 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9n22\" (UniqueName: \"kubernetes.io/projected/642736f4-3278-41ad-b9b1-b587da80d429-kube-api-access-f9n22\") pod \"auto-csr-approver-29552416-hzd82\" (UID: \"642736f4-3278-41ad-b9b1-b587da80d429\") " pod="openshift-infra/auto-csr-approver-29552416-hzd82" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.399157 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9n22\" (UniqueName: \"kubernetes.io/projected/642736f4-3278-41ad-b9b1-b587da80d429-kube-api-access-f9n22\") pod \"auto-csr-approver-29552416-hzd82\" (UID: \"642736f4-3278-41ad-b9b1-b587da80d429\") " pod="openshift-infra/auto-csr-approver-29552416-hzd82" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.420406 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9n22\" (UniqueName: \"kubernetes.io/projected/642736f4-3278-41ad-b9b1-b587da80d429-kube-api-access-f9n22\") pod \"auto-csr-approver-29552416-hzd82\" (UID: \"642736f4-3278-41ad-b9b1-b587da80d429\") " pod="openshift-infra/auto-csr-approver-29552416-hzd82" Mar 10 12:16:00 crc kubenswrapper[4794]: I0310 12:16:00.538537 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552416-hzd82" Mar 10 12:16:01 crc kubenswrapper[4794]: I0310 12:16:01.027668 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552416-hzd82"] Mar 10 12:16:01 crc kubenswrapper[4794]: I0310 12:16:01.146409 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552416-hzd82" event={"ID":"642736f4-3278-41ad-b9b1-b587da80d429","Type":"ContainerStarted","Data":"bdb18f07a1ab884b857fdad84b13378b8ac24c18ee46953efecd4dc045afdfe2"} Mar 10 12:16:03 crc kubenswrapper[4794]: I0310 12:16:03.168211 4794 generic.go:334] "Generic (PLEG): container finished" podID="642736f4-3278-41ad-b9b1-b587da80d429" containerID="b6f42a0eadbd3bb5efaec956113ba41f2e6bac55979749547deb8502d20f2370" exitCode=0 Mar 10 12:16:03 crc kubenswrapper[4794]: I0310 12:16:03.168522 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552416-hzd82" event={"ID":"642736f4-3278-41ad-b9b1-b587da80d429","Type":"ContainerDied","Data":"b6f42a0eadbd3bb5efaec956113ba41f2e6bac55979749547deb8502d20f2370"} Mar 10 12:16:04 crc kubenswrapper[4794]: I0310 12:16:04.615519 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552416-hzd82" Mar 10 12:16:04 crc kubenswrapper[4794]: I0310 12:16:04.791859 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9n22\" (UniqueName: \"kubernetes.io/projected/642736f4-3278-41ad-b9b1-b587da80d429-kube-api-access-f9n22\") pod \"642736f4-3278-41ad-b9b1-b587da80d429\" (UID: \"642736f4-3278-41ad-b9b1-b587da80d429\") " Mar 10 12:16:04 crc kubenswrapper[4794]: I0310 12:16:04.801221 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/642736f4-3278-41ad-b9b1-b587da80d429-kube-api-access-f9n22" (OuterVolumeSpecName: "kube-api-access-f9n22") pod "642736f4-3278-41ad-b9b1-b587da80d429" (UID: "642736f4-3278-41ad-b9b1-b587da80d429"). InnerVolumeSpecName "kube-api-access-f9n22". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:16:04 crc kubenswrapper[4794]: I0310 12:16:04.896379 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9n22\" (UniqueName: \"kubernetes.io/projected/642736f4-3278-41ad-b9b1-b587da80d429-kube-api-access-f9n22\") on node \"crc\" DevicePath \"\"" Mar 10 12:16:05 crc kubenswrapper[4794]: I0310 12:16:05.188316 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552416-hzd82" event={"ID":"642736f4-3278-41ad-b9b1-b587da80d429","Type":"ContainerDied","Data":"bdb18f07a1ab884b857fdad84b13378b8ac24c18ee46953efecd4dc045afdfe2"} Mar 10 12:16:05 crc kubenswrapper[4794]: I0310 12:16:05.188375 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdb18f07a1ab884b857fdad84b13378b8ac24c18ee46953efecd4dc045afdfe2" Mar 10 12:16:05 crc kubenswrapper[4794]: I0310 12:16:05.188547 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552416-hzd82" Mar 10 12:16:05 crc kubenswrapper[4794]: I0310 12:16:05.714118 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552410-l6df7"] Mar 10 12:16:05 crc kubenswrapper[4794]: I0310 12:16:05.729203 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552410-l6df7"] Mar 10 12:16:06 crc kubenswrapper[4794]: I0310 12:16:06.012959 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a5c8d7-14df-4b04-b241-125c2309de21" path="/var/lib/kubelet/pods/20a5c8d7-14df-4b04-b241-125c2309de21/volumes" Mar 10 12:16:09 crc kubenswrapper[4794]: I0310 12:16:09.000923 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:16:09 crc kubenswrapper[4794]: E0310 12:16:09.001674 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:16:20 crc kubenswrapper[4794]: I0310 12:16:20.999067 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:16:21 crc kubenswrapper[4794]: E0310 12:16:20.999851 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:16:35 crc kubenswrapper[4794]: I0310 12:16:34.999823 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:16:35 crc kubenswrapper[4794]: E0310 12:16:35.000876 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.329824 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9zvls/must-gather-v29r8"] Mar 10 12:16:47 crc kubenswrapper[4794]: E0310 12:16:47.330738 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="642736f4-3278-41ad-b9b1-b587da80d429" containerName="oc" Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.330751 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="642736f4-3278-41ad-b9b1-b587da80d429" containerName="oc" Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.330988 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="642736f4-3278-41ad-b9b1-b587da80d429" containerName="oc" Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.334242 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9zvls/must-gather-v29r8" Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.346779 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9zvls"/"openshift-service-ca.crt" Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.350362 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9zvls"/"default-dockercfg-mwzc5" Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.350963 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9zvls"/"kube-root-ca.crt" Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.352772 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9zvls/must-gather-v29r8"] Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.438102 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb3870f7-5f55-4bc7-b9d2-89e55a39db3c-must-gather-output\") pod \"must-gather-v29r8\" (UID: \"bb3870f7-5f55-4bc7-b9d2-89e55a39db3c\") " pod="openshift-must-gather-9zvls/must-gather-v29r8" Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.438751 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5blf6\" (UniqueName: \"kubernetes.io/projected/bb3870f7-5f55-4bc7-b9d2-89e55a39db3c-kube-api-access-5blf6\") pod \"must-gather-v29r8\" (UID: \"bb3870f7-5f55-4bc7-b9d2-89e55a39db3c\") " pod="openshift-must-gather-9zvls/must-gather-v29r8" Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.540619 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb3870f7-5f55-4bc7-b9d2-89e55a39db3c-must-gather-output\") pod \"must-gather-v29r8\" (UID: \"bb3870f7-5f55-4bc7-b9d2-89e55a39db3c\") " pod="openshift-must-gather-9zvls/must-gather-v29r8" Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.540759 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5blf6\" (UniqueName: \"kubernetes.io/projected/bb3870f7-5f55-4bc7-b9d2-89e55a39db3c-kube-api-access-5blf6\") pod \"must-gather-v29r8\" (UID: \"bb3870f7-5f55-4bc7-b9d2-89e55a39db3c\") " pod="openshift-must-gather-9zvls/must-gather-v29r8" Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.541160 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb3870f7-5f55-4bc7-b9d2-89e55a39db3c-must-gather-output\") pod \"must-gather-v29r8\" (UID: \"bb3870f7-5f55-4bc7-b9d2-89e55a39db3c\") " pod="openshift-must-gather-9zvls/must-gather-v29r8" Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.561244 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5blf6\" (UniqueName: \"kubernetes.io/projected/bb3870f7-5f55-4bc7-b9d2-89e55a39db3c-kube-api-access-5blf6\") pod \"must-gather-v29r8\" (UID: \"bb3870f7-5f55-4bc7-b9d2-89e55a39db3c\") " pod="openshift-must-gather-9zvls/must-gather-v29r8" Mar 10 12:16:47 crc kubenswrapper[4794]: I0310 12:16:47.667466 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9zvls/must-gather-v29r8" Mar 10 12:16:48 crc kubenswrapper[4794]: I0310 12:16:48.179080 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9zvls/must-gather-v29r8"] Mar 10 12:16:48 crc kubenswrapper[4794]: I0310 12:16:48.686438 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zvls/must-gather-v29r8" event={"ID":"bb3870f7-5f55-4bc7-b9d2-89e55a39db3c","Type":"ContainerStarted","Data":"888c214dc0c8766559ec7b4d82ef9b85a91ef5a270e67e4d670036ee2c8c313b"} Mar 10 12:16:48 crc kubenswrapper[4794]: I0310 12:16:48.999707 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:16:49 crc kubenswrapper[4794]: E0310 12:16:49.000007 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:16:55 crc kubenswrapper[4794]: I0310 12:16:55.174058 4794 scope.go:117] "RemoveContainer" containerID="264e550c3f959054090123dcfeeecea0a35e4bd40c6289c0488b4b9f634b6331" Mar 10 12:16:55 crc kubenswrapper[4794]: I0310 12:16:55.802585 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zvls/must-gather-v29r8" event={"ID":"bb3870f7-5f55-4bc7-b9d2-89e55a39db3c","Type":"ContainerStarted","Data":"98ec632d28d4582ae735bb4be261bb11501984e06604d07ae68dc771e311e2f9"} Mar 10 12:16:56 crc kubenswrapper[4794]: I0310 12:16:56.816108 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zvls/must-gather-v29r8" event={"ID":"bb3870f7-5f55-4bc7-b9d2-89e55a39db3c","Type":"ContainerStarted","Data":"4e3d76e5154715389dfb45be595648f007d0bfea18b3afab2b1b01d3e1746a68"} Mar 10 12:16:56 crc kubenswrapper[4794]: I0310 12:16:56.832252 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9zvls/must-gather-v29r8" podStartSLOduration=2.593240957 podStartE2EDuration="9.832233524s" podCreationTimestamp="2026-03-10 12:16:47 +0000 UTC" firstStartedPulling="2026-03-10 12:16:48.178002308 +0000 UTC m=+9156.934173146" lastFinishedPulling="2026-03-10 12:16:55.416994895 +0000 UTC m=+9164.173165713" observedRunningTime="2026-03-10 12:16:56.829413667 +0000 UTC m=+9165.585584485" watchObservedRunningTime="2026-03-10 12:16:56.832233524 +0000 UTC m=+9165.588404342" Mar 10 12:16:59 crc kubenswrapper[4794]: I0310 12:16:59.869263 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9zvls/crc-debug-kp9js"] Mar 10 12:16:59 crc kubenswrapper[4794]: I0310 12:16:59.871666 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9zvls/crc-debug-kp9js" Mar 10 12:16:59 crc kubenswrapper[4794]: I0310 12:16:59.965070 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpr6t\" (UniqueName: \"kubernetes.io/projected/d6503f22-a6c5-4934-bf3c-443d55351b85-kube-api-access-wpr6t\") pod \"crc-debug-kp9js\" (UID: \"d6503f22-a6c5-4934-bf3c-443d55351b85\") " pod="openshift-must-gather-9zvls/crc-debug-kp9js" Mar 10 12:16:59 crc kubenswrapper[4794]: I0310 12:16:59.965278 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6503f22-a6c5-4934-bf3c-443d55351b85-host\") pod \"crc-debug-kp9js\" (UID: \"d6503f22-a6c5-4934-bf3c-443d55351b85\") " pod="openshift-must-gather-9zvls/crc-debug-kp9js" Mar 10 12:17:00 crc kubenswrapper[4794]: I0310 12:17:00.066806 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpr6t\" (UniqueName: \"kubernetes.io/projected/d6503f22-a6c5-4934-bf3c-443d55351b85-kube-api-access-wpr6t\") pod \"crc-debug-kp9js\" (UID: \"d6503f22-a6c5-4934-bf3c-443d55351b85\") " pod="openshift-must-gather-9zvls/crc-debug-kp9js" Mar 10 12:17:00 crc kubenswrapper[4794]: I0310 12:17:00.066943 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6503f22-a6c5-4934-bf3c-443d55351b85-host\") pod \"crc-debug-kp9js\" (UID: \"d6503f22-a6c5-4934-bf3c-443d55351b85\") " pod="openshift-must-gather-9zvls/crc-debug-kp9js" Mar 10 12:17:00 crc kubenswrapper[4794]: I0310 12:17:00.067043 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6503f22-a6c5-4934-bf3c-443d55351b85-host\") pod \"crc-debug-kp9js\" (UID: \"d6503f22-a6c5-4934-bf3c-443d55351b85\") " pod="openshift-must-gather-9zvls/crc-debug-kp9js" Mar 10 12:17:00 crc kubenswrapper[4794]: I0310 12:17:00.090013 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpr6t\" (UniqueName: \"kubernetes.io/projected/d6503f22-a6c5-4934-bf3c-443d55351b85-kube-api-access-wpr6t\") pod \"crc-debug-kp9js\" (UID: \"d6503f22-a6c5-4934-bf3c-443d55351b85\") " pod="openshift-must-gather-9zvls/crc-debug-kp9js" Mar 10 12:17:00 crc kubenswrapper[4794]: I0310 12:17:00.196272 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9zvls/crc-debug-kp9js" Mar 10 12:17:00 crc kubenswrapper[4794]: I0310 12:17:00.864752 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zvls/crc-debug-kp9js" event={"ID":"d6503f22-a6c5-4934-bf3c-443d55351b85","Type":"ContainerStarted","Data":"c7c5a8aebaa195f903d6222ec91838b0ab830d550783630f9cbe9c09c17558ea"} Mar 10 12:17:00 crc kubenswrapper[4794]: I0310 12:17:00.999351 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:17:01 crc kubenswrapper[4794]: E0310 12:17:01.000642 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:17:14 crc kubenswrapper[4794]: I0310 12:17:14.018145 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zvls/crc-debug-kp9js" event={"ID":"d6503f22-a6c5-4934-bf3c-443d55351b85","Type":"ContainerStarted","Data":"dca975b4fd67ffd4491fd0877665299fefc6bf3a44eb8a8e08869aa4b568d956"} Mar 10 12:17:14 crc kubenswrapper[4794]: I0310 12:17:14.029914 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9zvls/crc-debug-kp9js" podStartSLOduration=2.291890846 podStartE2EDuration="15.029890529s" podCreationTimestamp="2026-03-10 12:16:59 +0000 UTC" firstStartedPulling="2026-03-10 12:17:00.257210804 +0000 UTC m=+9169.013381622" lastFinishedPulling="2026-03-10 12:17:12.995210487 +0000 UTC m=+9181.751381305" observedRunningTime="2026-03-10 12:17:14.019450698 +0000 UTC m=+9182.775621516" watchObservedRunningTime="2026-03-10 12:17:14.029890529 +0000 UTC m=+9182.786061347" Mar 10 12:17:16 crc kubenswrapper[4794]: I0310 12:17:16.000424 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:17:16 crc kubenswrapper[4794]: E0310 12:17:16.001464 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:17:28 crc kubenswrapper[4794]: I0310 12:17:28.998981 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972" Mar 10 12:17:29 crc kubenswrapper[4794]: E0310 12:17:28.999788 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:17:36 crc kubenswrapper[4794]: I0310 12:17:36.221175 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6503f22-a6c5-4934-bf3c-443d55351b85" 
containerID="dca975b4fd67ffd4491fd0877665299fefc6bf3a44eb8a8e08869aa4b568d956" exitCode=0 Mar 10 12:17:36 crc kubenswrapper[4794]: I0310 12:17:36.221270 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zvls/crc-debug-kp9js" event={"ID":"d6503f22-a6c5-4934-bf3c-443d55351b85","Type":"ContainerDied","Data":"dca975b4fd67ffd4491fd0877665299fefc6bf3a44eb8a8e08869aa4b568d956"} Mar 10 12:17:37 crc kubenswrapper[4794]: I0310 12:17:37.395321 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zvls/crc-debug-kp9js" Mar 10 12:17:37 crc kubenswrapper[4794]: I0310 12:17:37.431586 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9zvls/crc-debug-kp9js"] Mar 10 12:17:37 crc kubenswrapper[4794]: I0310 12:17:37.444689 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9zvls/crc-debug-kp9js"] Mar 10 12:17:37 crc kubenswrapper[4794]: I0310 12:17:37.539054 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpr6t\" (UniqueName: \"kubernetes.io/projected/d6503f22-a6c5-4934-bf3c-443d55351b85-kube-api-access-wpr6t\") pod \"d6503f22-a6c5-4934-bf3c-443d55351b85\" (UID: \"d6503f22-a6c5-4934-bf3c-443d55351b85\") " Mar 10 12:17:37 crc kubenswrapper[4794]: I0310 12:17:37.539150 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6503f22-a6c5-4934-bf3c-443d55351b85-host\") pod \"d6503f22-a6c5-4934-bf3c-443d55351b85\" (UID: \"d6503f22-a6c5-4934-bf3c-443d55351b85\") " Mar 10 12:17:37 crc kubenswrapper[4794]: I0310 12:17:37.539787 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6503f22-a6c5-4934-bf3c-443d55351b85-host" (OuterVolumeSpecName: "host") pod "d6503f22-a6c5-4934-bf3c-443d55351b85" (UID: "d6503f22-a6c5-4934-bf3c-443d55351b85"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 12:17:37 crc kubenswrapper[4794]: I0310 12:17:37.545658 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6503f22-a6c5-4934-bf3c-443d55351b85-kube-api-access-wpr6t" (OuterVolumeSpecName: "kube-api-access-wpr6t") pod "d6503f22-a6c5-4934-bf3c-443d55351b85" (UID: "d6503f22-a6c5-4934-bf3c-443d55351b85"). InnerVolumeSpecName "kube-api-access-wpr6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:17:37 crc kubenswrapper[4794]: I0310 12:17:37.641914 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpr6t\" (UniqueName: \"kubernetes.io/projected/d6503f22-a6c5-4934-bf3c-443d55351b85-kube-api-access-wpr6t\") on node \"crc\" DevicePath \"\"" Mar 10 12:17:37 crc kubenswrapper[4794]: I0310 12:17:37.641967 4794 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6503f22-a6c5-4934-bf3c-443d55351b85-host\") on node \"crc\" DevicePath \"\"" Mar 10 12:17:38 crc kubenswrapper[4794]: I0310 12:17:38.010428 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6503f22-a6c5-4934-bf3c-443d55351b85" path="/var/lib/kubelet/pods/d6503f22-a6c5-4934-bf3c-443d55351b85/volumes" Mar 10 12:17:38 crc kubenswrapper[4794]: I0310 12:17:38.242305 4794 scope.go:117] "RemoveContainer" containerID="dca975b4fd67ffd4491fd0877665299fefc6bf3a44eb8a8e08869aa4b568d956" Mar 10 12:17:38 crc kubenswrapper[4794]: I0310 12:17:38.242415 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zvls/crc-debug-kp9js" Mar 10 12:17:38 crc kubenswrapper[4794]: I0310 12:17:38.683369 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9zvls/crc-debug-rx8mh"] Mar 10 12:17:38 crc kubenswrapper[4794]: E0310 12:17:38.683759 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6503f22-a6c5-4934-bf3c-443d55351b85" containerName="container-00" Mar 10 12:17:38 crc kubenswrapper[4794]: I0310 12:17:38.683771 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6503f22-a6c5-4934-bf3c-443d55351b85" containerName="container-00" Mar 10 12:17:38 crc kubenswrapper[4794]: I0310 12:17:38.683964 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6503f22-a6c5-4934-bf3c-443d55351b85" containerName="container-00" Mar 10 12:17:38 crc kubenswrapper[4794]: I0310 12:17:38.684641 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9zvls/crc-debug-rx8mh" Mar 10 12:17:38 crc kubenswrapper[4794]: I0310 12:17:38.866847 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s79x7\" (UniqueName: \"kubernetes.io/projected/c229d47c-7c10-4386-9212-1f85eb3fc676-kube-api-access-s79x7\") pod \"crc-debug-rx8mh\" (UID: \"c229d47c-7c10-4386-9212-1f85eb3fc676\") " pod="openshift-must-gather-9zvls/crc-debug-rx8mh" Mar 10 12:17:38 crc kubenswrapper[4794]: I0310 12:17:38.866910 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c229d47c-7c10-4386-9212-1f85eb3fc676-host\") pod \"crc-debug-rx8mh\" (UID: \"c229d47c-7c10-4386-9212-1f85eb3fc676\") " pod="openshift-must-gather-9zvls/crc-debug-rx8mh" Mar 10 12:17:38 crc kubenswrapper[4794]: I0310 12:17:38.969391 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s79x7\" (UniqueName: \"kubernetes.io/projected/c229d47c-7c10-4386-9212-1f85eb3fc676-kube-api-access-s79x7\") pod \"crc-debug-rx8mh\" (UID: \"c229d47c-7c10-4386-9212-1f85eb3fc676\") " pod="openshift-must-gather-9zvls/crc-debug-rx8mh" Mar 10 12:17:38 crc kubenswrapper[4794]: I0310 12:17:38.969694 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c229d47c-7c10-4386-9212-1f85eb3fc676-host\") pod \"crc-debug-rx8mh\" (UID: \"c229d47c-7c10-4386-9212-1f85eb3fc676\") " pod="openshift-must-gather-9zvls/crc-debug-rx8mh" Mar 10 12:17:38 crc kubenswrapper[4794]: I0310 12:17:38.969861 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c229d47c-7c10-4386-9212-1f85eb3fc676-host\") pod \"crc-debug-rx8mh\" (UID: \"c229d47c-7c10-4386-9212-1f85eb3fc676\") " pod="openshift-must-gather-9zvls/crc-debug-rx8mh" Mar 10 12:17:38 crc kubenswrapper[4794]: I0310 12:17:38.992944 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s79x7\" (UniqueName: \"kubernetes.io/projected/c229d47c-7c10-4386-9212-1f85eb3fc676-kube-api-access-s79x7\") pod \"crc-debug-rx8mh\" (UID: \"c229d47c-7c10-4386-9212-1f85eb3fc676\") " pod="openshift-must-gather-9zvls/crc-debug-rx8mh" Mar 10 12:17:39 crc kubenswrapper[4794]: I0310 12:17:39.013696 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9zvls/crc-debug-rx8mh"
Mar 10 12:17:39 crc kubenswrapper[4794]: I0310 12:17:39.252820 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zvls/crc-debug-rx8mh" event={"ID":"c229d47c-7c10-4386-9212-1f85eb3fc676","Type":"ContainerStarted","Data":"63bca0d074658fe04ac420c9f3efb2bd3a3b607a825f4c5fb6e42afc4d75e386"}
Mar 10 12:17:39 crc kubenswrapper[4794]: I0310 12:17:39.999301 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972"
Mar 10 12:17:40 crc kubenswrapper[4794]: E0310 12:17:39.999971 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:17:40 crc kubenswrapper[4794]: I0310 12:17:40.264388 4794 generic.go:334] "Generic (PLEG): container finished" podID="c229d47c-7c10-4386-9212-1f85eb3fc676" containerID="b31ca7ebab447ff626e3b0653bfdd4337f9d014819fa13900087addbdec3cf42" exitCode=0
Mar 10 12:17:40 crc kubenswrapper[4794]: I0310 12:17:40.264432 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zvls/crc-debug-rx8mh" event={"ID":"c229d47c-7c10-4386-9212-1f85eb3fc676","Type":"ContainerDied","Data":"b31ca7ebab447ff626e3b0653bfdd4337f9d014819fa13900087addbdec3cf42"}
Mar 10 12:17:40 crc kubenswrapper[4794]: I0310 12:17:40.391892 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9zvls/crc-debug-rx8mh"]
Mar 10 12:17:40 crc kubenswrapper[4794]: I0310 12:17:40.406852 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9zvls/crc-debug-rx8mh"]
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.423814 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zvls/crc-debug-rx8mh"
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.523996 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c229d47c-7c10-4386-9212-1f85eb3fc676-host\") pod \"c229d47c-7c10-4386-9212-1f85eb3fc676\" (UID: \"c229d47c-7c10-4386-9212-1f85eb3fc676\") "
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.524116 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s79x7\" (UniqueName: \"kubernetes.io/projected/c229d47c-7c10-4386-9212-1f85eb3fc676-kube-api-access-s79x7\") pod \"c229d47c-7c10-4386-9212-1f85eb3fc676\" (UID: \"c229d47c-7c10-4386-9212-1f85eb3fc676\") "
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.524153 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c229d47c-7c10-4386-9212-1f85eb3fc676-host" (OuterVolumeSpecName: "host") pod "c229d47c-7c10-4386-9212-1f85eb3fc676" (UID: "c229d47c-7c10-4386-9212-1f85eb3fc676"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.524807 4794 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c229d47c-7c10-4386-9212-1f85eb3fc676-host\") on node \"crc\" DevicePath \"\""
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.529869 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c229d47c-7c10-4386-9212-1f85eb3fc676-kube-api-access-s79x7" (OuterVolumeSpecName: "kube-api-access-s79x7") pod "c229d47c-7c10-4386-9212-1f85eb3fc676" (UID: "c229d47c-7c10-4386-9212-1f85eb3fc676"). InnerVolumeSpecName "kube-api-access-s79x7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.627052 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s79x7\" (UniqueName: \"kubernetes.io/projected/c229d47c-7c10-4386-9212-1f85eb3fc676-kube-api-access-s79x7\") on node \"crc\" DevicePath \"\""
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.636396 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9zvls/crc-debug-hd4dw"]
Mar 10 12:17:41 crc kubenswrapper[4794]: E0310 12:17:41.636921 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c229d47c-7c10-4386-9212-1f85eb3fc676" containerName="container-00"
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.636945 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c229d47c-7c10-4386-9212-1f85eb3fc676" containerName="container-00"
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.637269 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c229d47c-7c10-4386-9212-1f85eb3fc676" containerName="container-00"
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.639860 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zvls/crc-debug-hd4dw"
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.729239 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38014118-b52f-4ff0-ad11-bdb820362797-host\") pod \"crc-debug-hd4dw\" (UID: \"38014118-b52f-4ff0-ad11-bdb820362797\") " pod="openshift-must-gather-9zvls/crc-debug-hd4dw"
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.729780 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcrn5\" (UniqueName: \"kubernetes.io/projected/38014118-b52f-4ff0-ad11-bdb820362797-kube-api-access-wcrn5\") pod \"crc-debug-hd4dw\" (UID: \"38014118-b52f-4ff0-ad11-bdb820362797\") " pod="openshift-must-gather-9zvls/crc-debug-hd4dw"
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.832565 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38014118-b52f-4ff0-ad11-bdb820362797-host\") pod \"crc-debug-hd4dw\" (UID: \"38014118-b52f-4ff0-ad11-bdb820362797\") " pod="openshift-must-gather-9zvls/crc-debug-hd4dw"
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.832706 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38014118-b52f-4ff0-ad11-bdb820362797-host\") pod \"crc-debug-hd4dw\" (UID: \"38014118-b52f-4ff0-ad11-bdb820362797\") " pod="openshift-must-gather-9zvls/crc-debug-hd4dw"
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.832742 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcrn5\" (UniqueName: \"kubernetes.io/projected/38014118-b52f-4ff0-ad11-bdb820362797-kube-api-access-wcrn5\") pod \"crc-debug-hd4dw\" (UID: \"38014118-b52f-4ff0-ad11-bdb820362797\") " pod="openshift-must-gather-9zvls/crc-debug-hd4dw"
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.851083 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcrn5\" (UniqueName: \"kubernetes.io/projected/38014118-b52f-4ff0-ad11-bdb820362797-kube-api-access-wcrn5\") pod \"crc-debug-hd4dw\" (UID: \"38014118-b52f-4ff0-ad11-bdb820362797\") " pod="openshift-must-gather-9zvls/crc-debug-hd4dw"
Mar 10 12:17:41 crc kubenswrapper[4794]: I0310 12:17:41.961718 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zvls/crc-debug-hd4dw"
Mar 10 12:17:41 crc kubenswrapper[4794]: W0310 12:17:41.999615 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38014118_b52f_4ff0_ad11_bdb820362797.slice/crio-415a81ab112767b087e879ee719947a330f617c24417ed55664f7a6dee6ddd6f WatchSource:0}: Error finding container 415a81ab112767b087e879ee719947a330f617c24417ed55664f7a6dee6ddd6f: Status 404 returned error can't find the container with id 415a81ab112767b087e879ee719947a330f617c24417ed55664f7a6dee6ddd6f
Mar 10 12:17:42 crc kubenswrapper[4794]: I0310 12:17:42.012986 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c229d47c-7c10-4386-9212-1f85eb3fc676" path="/var/lib/kubelet/pods/c229d47c-7c10-4386-9212-1f85eb3fc676/volumes"
Mar 10 12:17:42 crc kubenswrapper[4794]: I0310 12:17:42.307387 4794 generic.go:334] "Generic (PLEG): container finished" podID="38014118-b52f-4ff0-ad11-bdb820362797" containerID="e7a1a66ee747f264a4fb2aa9e51943cf7104aae149375ce1b3afe3728f64d066" exitCode=0
Mar 10 12:17:42 crc kubenswrapper[4794]: I0310 12:17:42.307584 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zvls/crc-debug-hd4dw" event={"ID":"38014118-b52f-4ff0-ad11-bdb820362797","Type":"ContainerDied","Data":"e7a1a66ee747f264a4fb2aa9e51943cf7104aae149375ce1b3afe3728f64d066"}
Mar 10 12:17:42 crc kubenswrapper[4794]: I0310 12:17:42.307745 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zvls/crc-debug-hd4dw" event={"ID":"38014118-b52f-4ff0-ad11-bdb820362797","Type":"ContainerStarted","Data":"415a81ab112767b087e879ee719947a330f617c24417ed55664f7a6dee6ddd6f"}
Mar 10 12:17:42 crc kubenswrapper[4794]: I0310 12:17:42.310635 4794 scope.go:117] "RemoveContainer" containerID="b31ca7ebab447ff626e3b0653bfdd4337f9d014819fa13900087addbdec3cf42"
Mar 10 12:17:42 crc kubenswrapper[4794]: I0310 12:17:42.310674 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zvls/crc-debug-rx8mh"
Mar 10 12:17:42 crc kubenswrapper[4794]: I0310 12:17:42.347789 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9zvls/crc-debug-hd4dw"]
Mar 10 12:17:42 crc kubenswrapper[4794]: I0310 12:17:42.365637 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9zvls/crc-debug-hd4dw"]
Mar 10 12:17:43 crc kubenswrapper[4794]: I0310 12:17:43.440361 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zvls/crc-debug-hd4dw"
Mar 10 12:17:43 crc kubenswrapper[4794]: I0310 12:17:43.570047 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcrn5\" (UniqueName: \"kubernetes.io/projected/38014118-b52f-4ff0-ad11-bdb820362797-kube-api-access-wcrn5\") pod \"38014118-b52f-4ff0-ad11-bdb820362797\" (UID: \"38014118-b52f-4ff0-ad11-bdb820362797\") "
Mar 10 12:17:43 crc kubenswrapper[4794]: I0310 12:17:43.570349 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38014118-b52f-4ff0-ad11-bdb820362797-host\") pod \"38014118-b52f-4ff0-ad11-bdb820362797\" (UID: \"38014118-b52f-4ff0-ad11-bdb820362797\") "
Mar 10 12:17:43 crc kubenswrapper[4794]: I0310 12:17:43.570513 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38014118-b52f-4ff0-ad11-bdb820362797-host" (OuterVolumeSpecName: "host") pod "38014118-b52f-4ff0-ad11-bdb820362797" (UID: "38014118-b52f-4ff0-ad11-bdb820362797"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 12:17:43 crc kubenswrapper[4794]: I0310 12:17:43.570961 4794 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38014118-b52f-4ff0-ad11-bdb820362797-host\") on node \"crc\" DevicePath \"\""
Mar 10 12:17:43 crc kubenswrapper[4794]: I0310 12:17:43.577282 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38014118-b52f-4ff0-ad11-bdb820362797-kube-api-access-wcrn5" (OuterVolumeSpecName: "kube-api-access-wcrn5") pod "38014118-b52f-4ff0-ad11-bdb820362797" (UID: "38014118-b52f-4ff0-ad11-bdb820362797"). InnerVolumeSpecName "kube-api-access-wcrn5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 12:17:43 crc kubenswrapper[4794]: I0310 12:17:43.672783 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcrn5\" (UniqueName: \"kubernetes.io/projected/38014118-b52f-4ff0-ad11-bdb820362797-kube-api-access-wcrn5\") on node \"crc\" DevicePath \"\""
Mar 10 12:17:44 crc kubenswrapper[4794]: I0310 12:17:44.012427 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38014118-b52f-4ff0-ad11-bdb820362797" path="/var/lib/kubelet/pods/38014118-b52f-4ff0-ad11-bdb820362797/volumes"
Mar 10 12:17:44 crc kubenswrapper[4794]: I0310 12:17:44.337300 4794 scope.go:117] "RemoveContainer" containerID="e7a1a66ee747f264a4fb2aa9e51943cf7104aae149375ce1b3afe3728f64d066"
Mar 10 12:17:44 crc kubenswrapper[4794]: I0310 12:17:44.337381 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zvls/crc-debug-hd4dw"
Mar 10 12:17:52 crc kubenswrapper[4794]: I0310 12:17:52.008512 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972"
Mar 10 12:17:52 crc kubenswrapper[4794]: E0310 12:17:52.009371 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:18:00 crc kubenswrapper[4794]: I0310 12:18:00.153303 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552418-4pq5p"]
Mar 10 12:18:00 crc kubenswrapper[4794]: E0310 12:18:00.154582 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38014118-b52f-4ff0-ad11-bdb820362797" containerName="container-00"
Mar 10 12:18:00 crc kubenswrapper[4794]: I0310 12:18:00.154601 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="38014118-b52f-4ff0-ad11-bdb820362797" containerName="container-00"
Mar 10 12:18:00 crc kubenswrapper[4794]: I0310 12:18:00.154890 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="38014118-b52f-4ff0-ad11-bdb820362797" containerName="container-00"
Mar 10 12:18:00 crc kubenswrapper[4794]: I0310 12:18:00.155995 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552418-4pq5p"
Mar 10 12:18:00 crc kubenswrapper[4794]: I0310 12:18:00.159008 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 12:18:00 crc kubenswrapper[4794]: I0310 12:18:00.159306 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc"
Mar 10 12:18:00 crc kubenswrapper[4794]: I0310 12:18:00.159395 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 12:18:00 crc kubenswrapper[4794]: I0310 12:18:00.176730 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552418-4pq5p"]
Mar 10 12:18:00 crc kubenswrapper[4794]: I0310 12:18:00.258209 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt4t8\" (UniqueName: \"kubernetes.io/projected/23668ea6-27b5-43f7-a940-f1b3fb3a5baf-kube-api-access-tt4t8\") pod \"auto-csr-approver-29552418-4pq5p\" (UID: \"23668ea6-27b5-43f7-a940-f1b3fb3a5baf\") " pod="openshift-infra/auto-csr-approver-29552418-4pq5p"
Mar 10 12:18:00 crc kubenswrapper[4794]: I0310 12:18:00.360447 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt4t8\" (UniqueName: \"kubernetes.io/projected/23668ea6-27b5-43f7-a940-f1b3fb3a5baf-kube-api-access-tt4t8\") pod \"auto-csr-approver-29552418-4pq5p\" (UID: \"23668ea6-27b5-43f7-a940-f1b3fb3a5baf\") " pod="openshift-infra/auto-csr-approver-29552418-4pq5p"
Mar 10 12:18:00 crc kubenswrapper[4794]: I0310 12:18:00.379122 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt4t8\" (UniqueName: \"kubernetes.io/projected/23668ea6-27b5-43f7-a940-f1b3fb3a5baf-kube-api-access-tt4t8\") pod \"auto-csr-approver-29552418-4pq5p\" (UID: \"23668ea6-27b5-43f7-a940-f1b3fb3a5baf\") " pod="openshift-infra/auto-csr-approver-29552418-4pq5p"
Mar 10 12:18:00 crc kubenswrapper[4794]: I0310 12:18:00.495106 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552418-4pq5p"
Mar 10 12:18:01 crc kubenswrapper[4794]: I0310 12:18:01.056851 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552418-4pq5p"]
Mar 10 12:18:01 crc kubenswrapper[4794]: I0310 12:18:01.510945 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552418-4pq5p" event={"ID":"23668ea6-27b5-43f7-a940-f1b3fb3a5baf","Type":"ContainerStarted","Data":"1268a024df1c5bf582b409a9e17ad8ba1a52a1c58af99eb446eb5a226c753171"}
Mar 10 12:18:03 crc kubenswrapper[4794]: I0310 12:18:03.543393 4794 generic.go:334] "Generic (PLEG): container finished" podID="23668ea6-27b5-43f7-a940-f1b3fb3a5baf" containerID="86208cc799bf4557ab40c9a4d8ca0d4131d78726809978b479c1702ee1b74b4c" exitCode=0
Mar 10 12:18:03 crc kubenswrapper[4794]: I0310 12:18:03.543881 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552418-4pq5p" event={"ID":"23668ea6-27b5-43f7-a940-f1b3fb3a5baf","Type":"ContainerDied","Data":"86208cc799bf4557ab40c9a4d8ca0d4131d78726809978b479c1702ee1b74b4c"}
Mar 10 12:18:04 crc kubenswrapper[4794]: I0310 12:18:04.928618 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552418-4pq5p"
Mar 10 12:18:04 crc kubenswrapper[4794]: I0310 12:18:04.994213 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt4t8\" (UniqueName: \"kubernetes.io/projected/23668ea6-27b5-43f7-a940-f1b3fb3a5baf-kube-api-access-tt4t8\") pod \"23668ea6-27b5-43f7-a940-f1b3fb3a5baf\" (UID: \"23668ea6-27b5-43f7-a940-f1b3fb3a5baf\") "
Mar 10 12:18:05 crc kubenswrapper[4794]: I0310 12:18:05.000083 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972"
Mar 10 12:18:05 crc kubenswrapper[4794]: E0310 12:18:05.000633 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:18:05 crc kubenswrapper[4794]: I0310 12:18:05.001324 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23668ea6-27b5-43f7-a940-f1b3fb3a5baf-kube-api-access-tt4t8" (OuterVolumeSpecName: "kube-api-access-tt4t8") pod "23668ea6-27b5-43f7-a940-f1b3fb3a5baf" (UID: "23668ea6-27b5-43f7-a940-f1b3fb3a5baf"). InnerVolumeSpecName "kube-api-access-tt4t8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 12:18:05 crc kubenswrapper[4794]: I0310 12:18:05.099002 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt4t8\" (UniqueName: \"kubernetes.io/projected/23668ea6-27b5-43f7-a940-f1b3fb3a5baf-kube-api-access-tt4t8\") on node \"crc\" DevicePath \"\""
Mar 10 12:18:05 crc kubenswrapper[4794]: I0310 12:18:05.563281 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552418-4pq5p" event={"ID":"23668ea6-27b5-43f7-a940-f1b3fb3a5baf","Type":"ContainerDied","Data":"1268a024df1c5bf582b409a9e17ad8ba1a52a1c58af99eb446eb5a226c753171"}
Mar 10 12:18:05 crc kubenswrapper[4794]: I0310 12:18:05.563323 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1268a024df1c5bf582b409a9e17ad8ba1a52a1c58af99eb446eb5a226c753171"
Mar 10 12:18:05 crc kubenswrapper[4794]: I0310 12:18:05.563394 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552418-4pq5p"
Mar 10 12:18:06 crc kubenswrapper[4794]: I0310 12:18:06.011714 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552412-9dkkj"]
Mar 10 12:18:06 crc kubenswrapper[4794]: I0310 12:18:06.013029 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552412-9dkkj"]
Mar 10 12:18:08 crc kubenswrapper[4794]: I0310 12:18:08.013748 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7084aa56-cb02-408b-aede-13cd9e1e4d21" path="/var/lib/kubelet/pods/7084aa56-cb02-408b-aede-13cd9e1e4d21/volumes"
Mar 10 12:18:15 crc kubenswrapper[4794]: I0310 12:18:15.999134 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972"
Mar 10 12:18:16 crc kubenswrapper[4794]: E0310 12:18:16.000039 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:18:30 crc kubenswrapper[4794]: I0310 12:18:30.000016 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972"
Mar 10 12:18:30 crc kubenswrapper[4794]: E0310 12:18:30.001589 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:18:42 crc kubenswrapper[4794]: I0310 12:18:42.007949 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972"
Mar 10 12:18:42 crc kubenswrapper[4794]: E0310 12:18:42.008823 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:18:55 crc kubenswrapper[4794]: I0310 12:18:55.421073 4794 scope.go:117] "RemoveContainer" containerID="638ca93f58f845a54b0798afdb6c08ceae03d6154423f41add7931021f356e98"
Mar 10 12:18:55 crc kubenswrapper[4794]: I0310 12:18:55.999146 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972"
Mar 10 12:18:57 crc kubenswrapper[4794]: I0310 12:18:57.124243 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"067da652f299f1725aeb1dffe145df44e864aa55b2260448aafac13d3c5a1934"}
Mar 10 12:19:51 crc kubenswrapper[4794]: I0310 12:19:51.808397 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pnl7v"]
Mar 10 12:19:51 crc kubenswrapper[4794]: E0310 12:19:51.809574 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23668ea6-27b5-43f7-a940-f1b3fb3a5baf" containerName="oc"
Mar 10 12:19:51 crc kubenswrapper[4794]: I0310 12:19:51.809590 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="23668ea6-27b5-43f7-a940-f1b3fb3a5baf" containerName="oc"
Mar 10 12:19:51 crc kubenswrapper[4794]: I0310 12:19:51.809873 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="23668ea6-27b5-43f7-a940-f1b3fb3a5baf" containerName="oc"
Mar 10 12:19:51 crc kubenswrapper[4794]: I0310 12:19:51.811834 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:19:51 crc kubenswrapper[4794]: I0310 12:19:51.817641 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pnl7v"]
Mar 10 12:19:51 crc kubenswrapper[4794]: I0310 12:19:51.923736 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fe897b1-ac26-4aa2-acce-fa65a511f49a-utilities\") pod \"redhat-marketplace-pnl7v\" (UID: \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\") " pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:19:51 crc kubenswrapper[4794]: I0310 12:19:51.923812 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fe897b1-ac26-4aa2-acce-fa65a511f49a-catalog-content\") pod \"redhat-marketplace-pnl7v\" (UID: \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\") " pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:19:51 crc kubenswrapper[4794]: I0310 12:19:51.923850 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wpz9\" (UniqueName: \"kubernetes.io/projected/1fe897b1-ac26-4aa2-acce-fa65a511f49a-kube-api-access-8wpz9\") pod \"redhat-marketplace-pnl7v\" (UID: \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\") " pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:19:52 crc kubenswrapper[4794]: I0310 12:19:52.025880 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fe897b1-ac26-4aa2-acce-fa65a511f49a-utilities\") pod \"redhat-marketplace-pnl7v\" (UID: \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\") " pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:19:52 crc kubenswrapper[4794]: I0310 12:19:52.025954 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fe897b1-ac26-4aa2-acce-fa65a511f49a-catalog-content\") pod \"redhat-marketplace-pnl7v\" (UID: \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\") " pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:19:52 crc kubenswrapper[4794]: I0310 12:19:52.025981 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wpz9\" (UniqueName: \"kubernetes.io/projected/1fe897b1-ac26-4aa2-acce-fa65a511f49a-kube-api-access-8wpz9\") pod \"redhat-marketplace-pnl7v\" (UID: \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\") " pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:19:52 crc kubenswrapper[4794]: I0310 12:19:52.026535 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fe897b1-ac26-4aa2-acce-fa65a511f49a-utilities\") pod \"redhat-marketplace-pnl7v\" (UID: \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\") " pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:19:52 crc kubenswrapper[4794]: I0310 12:19:52.026544 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fe897b1-ac26-4aa2-acce-fa65a511f49a-catalog-content\") pod \"redhat-marketplace-pnl7v\" (UID: \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\") " pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:19:52 crc kubenswrapper[4794]: I0310 12:19:52.044132 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wpz9\" (UniqueName: \"kubernetes.io/projected/1fe897b1-ac26-4aa2-acce-fa65a511f49a-kube-api-access-8wpz9\") pod \"redhat-marketplace-pnl7v\" (UID: \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\") " pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:19:52 crc kubenswrapper[4794]: I0310 12:19:52.142551 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:19:52 crc kubenswrapper[4794]: I0310 12:19:52.631188 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pnl7v"]
Mar 10 12:19:52 crc kubenswrapper[4794]: I0310 12:19:52.810912 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pnl7v" event={"ID":"1fe897b1-ac26-4aa2-acce-fa65a511f49a","Type":"ContainerStarted","Data":"2dec439812ebc6fdc983523a7ec8f88154bbc8e9dd28ff73cdf9ccce0ab00d78"}
Mar 10 12:19:53 crc kubenswrapper[4794]: I0310 12:19:53.826140 4794 generic.go:334] "Generic (PLEG): container finished" podID="1fe897b1-ac26-4aa2-acce-fa65a511f49a" containerID="533841c25c780bcf4501fbdf007e2e9101ba58563e529059a261a31c88a2feee" exitCode=0
Mar 10 12:19:53 crc kubenswrapper[4794]: I0310 12:19:53.826250 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pnl7v" event={"ID":"1fe897b1-ac26-4aa2-acce-fa65a511f49a","Type":"ContainerDied","Data":"533841c25c780bcf4501fbdf007e2e9101ba58563e529059a261a31c88a2feee"}
Mar 10 12:19:53 crc kubenswrapper[4794]: I0310 12:19:53.831591 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 12:19:54 crc kubenswrapper[4794]: I0310 12:19:54.841659 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pnl7v" event={"ID":"1fe897b1-ac26-4aa2-acce-fa65a511f49a","Type":"ContainerStarted","Data":"02d8194730b178ba97ae16d54bcaac9573706ec423128250421535c4b3016958"}
Mar 10 12:19:56 crc kubenswrapper[4794]: I0310 12:19:56.862181 4794 generic.go:334] "Generic (PLEG): container finished" podID="1fe897b1-ac26-4aa2-acce-fa65a511f49a" containerID="02d8194730b178ba97ae16d54bcaac9573706ec423128250421535c4b3016958" exitCode=0
Mar 10 12:19:56 crc kubenswrapper[4794]: I0310 12:19:56.862273 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pnl7v" event={"ID":"1fe897b1-ac26-4aa2-acce-fa65a511f49a","Type":"ContainerDied","Data":"02d8194730b178ba97ae16d54bcaac9573706ec423128250421535c4b3016958"}
Mar 10 12:19:58 crc kubenswrapper[4794]: I0310 12:19:58.885881 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pnl7v" event={"ID":"1fe897b1-ac26-4aa2-acce-fa65a511f49a","Type":"ContainerStarted","Data":"85fd88ed51ff0f75102a52393b89d428a7af074056f014cf00ee1f79e8f7205e"}
Mar 10 12:19:58 crc kubenswrapper[4794]: I0310 12:19:58.915222 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pnl7v" podStartSLOduration=4.355580324 podStartE2EDuration="7.915205766s" podCreationTimestamp="2026-03-10 12:19:51 +0000 UTC" firstStartedPulling="2026-03-10 12:19:53.831352237 +0000 UTC m=+9342.587523055" lastFinishedPulling="2026-03-10 12:19:57.390977679 +0000 UTC m=+9346.147148497" observedRunningTime="2026-03-10 12:19:58.907384333 +0000 UTC m=+9347.663555201" watchObservedRunningTime="2026-03-10 12:19:58.915205766 +0000 UTC m=+9347.671376584"
Mar 10 12:20:00 crc kubenswrapper[4794]: I0310 12:20:00.157387 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552420-f6knn"]
Mar 10 12:20:00 crc kubenswrapper[4794]: I0310 12:20:00.159894 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552420-f6knn"
Mar 10 12:20:00 crc kubenswrapper[4794]: I0310 12:20:00.164857 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 12:20:00 crc kubenswrapper[4794]: I0310 12:20:00.165182 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 12:20:00 crc kubenswrapper[4794]: I0310 12:20:00.165419 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc"
Mar 10 12:20:00 crc kubenswrapper[4794]: I0310 12:20:00.174924 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552420-f6knn"]
Mar 10 12:20:00 crc kubenswrapper[4794]: I0310 12:20:00.252186 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjkf7\" (UniqueName: \"kubernetes.io/projected/08df24c0-8c53-4f01-8edb-470609fbe0c1-kube-api-access-hjkf7\") pod \"auto-csr-approver-29552420-f6knn\" (UID: \"08df24c0-8c53-4f01-8edb-470609fbe0c1\") " pod="openshift-infra/auto-csr-approver-29552420-f6knn"
Mar 10 12:20:00 crc kubenswrapper[4794]: I0310 12:20:00.354541 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjkf7\" (UniqueName: \"kubernetes.io/projected/08df24c0-8c53-4f01-8edb-470609fbe0c1-kube-api-access-hjkf7\") pod \"auto-csr-approver-29552420-f6knn\" (UID: \"08df24c0-8c53-4f01-8edb-470609fbe0c1\") " pod="openshift-infra/auto-csr-approver-29552420-f6knn"
Mar 10 12:20:00 crc kubenswrapper[4794]: I0310 12:20:00.381885 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjkf7\" (UniqueName: \"kubernetes.io/projected/08df24c0-8c53-4f01-8edb-470609fbe0c1-kube-api-access-hjkf7\") pod \"auto-csr-approver-29552420-f6knn\" (UID: \"08df24c0-8c53-4f01-8edb-470609fbe0c1\") " pod="openshift-infra/auto-csr-approver-29552420-f6knn"
Mar 10 12:20:00 crc kubenswrapper[4794]: I0310 12:20:00.530748 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552420-f6knn"
Mar 10 12:20:01 crc kubenswrapper[4794]: I0310 12:20:01.044834 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552420-f6knn"]
Mar 10 12:20:01 crc kubenswrapper[4794]: I0310 12:20:01.930267 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552420-f6knn" event={"ID":"08df24c0-8c53-4f01-8edb-470609fbe0c1","Type":"ContainerStarted","Data":"40f73307fb3fbf5ce4e85d7694cb6eaa97d84a0958573a1306f0ad1f9b9ad59e"}
Mar 10 12:20:02 crc kubenswrapper[4794]: I0310 12:20:02.143827 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:20:02 crc kubenswrapper[4794]: I0310 12:20:02.144104 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:20:02 crc kubenswrapper[4794]: I0310 12:20:02.211295 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:20:02 crc kubenswrapper[4794]: I0310 12:20:02.943711 4794 generic.go:334] "Generic (PLEG): container finished" podID="08df24c0-8c53-4f01-8edb-470609fbe0c1" containerID="9aacf0fe110cebb3b6671285b4d7559bdcbb073b5368b865aab15c591a242cef" exitCode=0
Mar 10 12:20:02 crc kubenswrapper[4794]: I0310 12:20:02.943835 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552420-f6knn" event={"ID":"08df24c0-8c53-4f01-8edb-470609fbe0c1","Type":"ContainerDied","Data":"9aacf0fe110cebb3b6671285b4d7559bdcbb073b5368b865aab15c591a242cef"}
Mar 10 12:20:04 crc kubenswrapper[4794]: I0310 12:20:04.433154 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552420-f6knn"
Mar 10 12:20:04 crc kubenswrapper[4794]: I0310 12:20:04.554117 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjkf7\" (UniqueName: \"kubernetes.io/projected/08df24c0-8c53-4f01-8edb-470609fbe0c1-kube-api-access-hjkf7\") pod \"08df24c0-8c53-4f01-8edb-470609fbe0c1\" (UID: \"08df24c0-8c53-4f01-8edb-470609fbe0c1\") "
Mar 10 12:20:04 crc kubenswrapper[4794]: I0310 12:20:04.560448 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08df24c0-8c53-4f01-8edb-470609fbe0c1-kube-api-access-hjkf7" (OuterVolumeSpecName: "kube-api-access-hjkf7") pod "08df24c0-8c53-4f01-8edb-470609fbe0c1" (UID: "08df24c0-8c53-4f01-8edb-470609fbe0c1"). InnerVolumeSpecName "kube-api-access-hjkf7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 12:20:04 crc kubenswrapper[4794]: I0310 12:20:04.659459 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjkf7\" (UniqueName: \"kubernetes.io/projected/08df24c0-8c53-4f01-8edb-470609fbe0c1-kube-api-access-hjkf7\") on node \"crc\" DevicePath \"\""
Mar 10 12:20:04 crc kubenswrapper[4794]: I0310 12:20:04.967535 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552420-f6knn" event={"ID":"08df24c0-8c53-4f01-8edb-470609fbe0c1","Type":"ContainerDied","Data":"40f73307fb3fbf5ce4e85d7694cb6eaa97d84a0958573a1306f0ad1f9b9ad59e"}
Mar 10 12:20:04 crc kubenswrapper[4794]: I0310 12:20:04.967581 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40f73307fb3fbf5ce4e85d7694cb6eaa97d84a0958573a1306f0ad1f9b9ad59e"
Mar 10 12:20:04 crc kubenswrapper[4794]: I0310 12:20:04.967583 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552420-f6knn"
Mar 10 12:20:05 crc kubenswrapper[4794]: I0310 12:20:05.528671 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552414-dczzv"]
Mar 10 12:20:05 crc kubenswrapper[4794]: I0310 12:20:05.548192 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552414-dczzv"]
Mar 10 12:20:06 crc kubenswrapper[4794]: I0310 12:20:06.019957 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64f7981e-696b-4a23-9194-d2a0326bbd1f" path="/var/lib/kubelet/pods/64f7981e-696b-4a23-9194-d2a0326bbd1f/volumes"
Mar 10 12:20:12 crc kubenswrapper[4794]: I0310 12:20:12.207208 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:20:12 crc kubenswrapper[4794]: I0310 12:20:12.266268 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pnl7v"]
Mar 10 12:20:13 crc kubenswrapper[4794]: I0310 12:20:13.067028 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pnl7v" podUID="1fe897b1-ac26-4aa2-acce-fa65a511f49a" containerName="registry-server" containerID="cri-o://85fd88ed51ff0f75102a52393b89d428a7af074056f014cf00ee1f79e8f7205e" gracePeriod=2
Mar 10 12:20:13 crc kubenswrapper[4794]: I0310 12:20:13.625187 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:20:13 crc kubenswrapper[4794]: I0310 12:20:13.675677 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fe897b1-ac26-4aa2-acce-fa65a511f49a-catalog-content\") pod \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\" (UID: \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\") "
Mar 10 12:20:13 crc kubenswrapper[4794]: I0310 12:20:13.676048 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wpz9\" (UniqueName: \"kubernetes.io/projected/1fe897b1-ac26-4aa2-acce-fa65a511f49a-kube-api-access-8wpz9\") pod \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\" (UID: \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\") "
Mar 10 12:20:13 crc kubenswrapper[4794]: I0310 12:20:13.676153 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fe897b1-ac26-4aa2-acce-fa65a511f49a-utilities\") pod \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\" (UID: \"1fe897b1-ac26-4aa2-acce-fa65a511f49a\") "
Mar 10 12:20:13 crc kubenswrapper[4794]: I0310 12:20:13.677235 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe897b1-ac26-4aa2-acce-fa65a511f49a-utilities" (OuterVolumeSpecName: "utilities") pod "1fe897b1-ac26-4aa2-acce-fa65a511f49a" (UID: "1fe897b1-ac26-4aa2-acce-fa65a511f49a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 12:20:13 crc kubenswrapper[4794]: I0310 12:20:13.686837 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe897b1-ac26-4aa2-acce-fa65a511f49a-kube-api-access-8wpz9" (OuterVolumeSpecName: "kube-api-access-8wpz9") pod "1fe897b1-ac26-4aa2-acce-fa65a511f49a" (UID: "1fe897b1-ac26-4aa2-acce-fa65a511f49a"). InnerVolumeSpecName "kube-api-access-8wpz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 12:20:13 crc kubenswrapper[4794]: I0310 12:20:13.719990 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe897b1-ac26-4aa2-acce-fa65a511f49a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fe897b1-ac26-4aa2-acce-fa65a511f49a" (UID: "1fe897b1-ac26-4aa2-acce-fa65a511f49a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 12:20:13 crc kubenswrapper[4794]: I0310 12:20:13.778786 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fe897b1-ac26-4aa2-acce-fa65a511f49a-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 12:20:13 crc kubenswrapper[4794]: I0310 12:20:13.778821 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fe897b1-ac26-4aa2-acce-fa65a511f49a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 12:20:13 crc kubenswrapper[4794]: I0310 12:20:13.778832 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wpz9\" (UniqueName: \"kubernetes.io/projected/1fe897b1-ac26-4aa2-acce-fa65a511f49a-kube-api-access-8wpz9\") on node \"crc\" DevicePath \"\""
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.078390 4794 generic.go:334] "Generic (PLEG): container finished" podID="1fe897b1-ac26-4aa2-acce-fa65a511f49a" containerID="85fd88ed51ff0f75102a52393b89d428a7af074056f014cf00ee1f79e8f7205e" exitCode=0
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.078453 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pnl7v" event={"ID":"1fe897b1-ac26-4aa2-acce-fa65a511f49a","Type":"ContainerDied","Data":"85fd88ed51ff0f75102a52393b89d428a7af074056f014cf00ee1f79e8f7205e"}
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.078641 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pnl7v" event={"ID":"1fe897b1-ac26-4aa2-acce-fa65a511f49a","Type":"ContainerDied","Data":"2dec439812ebc6fdc983523a7ec8f88154bbc8e9dd28ff73cdf9ccce0ab00d78"}
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.078667 4794 scope.go:117] "RemoveContainer" containerID="85fd88ed51ff0f75102a52393b89d428a7af074056f014cf00ee1f79e8f7205e"
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.078473 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pnl7v"
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.099860 4794 scope.go:117] "RemoveContainer" containerID="02d8194730b178ba97ae16d54bcaac9573706ec423128250421535c4b3016958"
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.106214 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pnl7v"]
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.114979 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pnl7v"]
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.120649 4794 scope.go:117] "RemoveContainer" containerID="533841c25c780bcf4501fbdf007e2e9101ba58563e529059a261a31c88a2feee"
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.170579 4794 scope.go:117] "RemoveContainer" containerID="85fd88ed51ff0f75102a52393b89d428a7af074056f014cf00ee1f79e8f7205e"
Mar 10 12:20:14 crc kubenswrapper[4794]: E0310 12:20:14.171098 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85fd88ed51ff0f75102a52393b89d428a7af074056f014cf00ee1f79e8f7205e\": container with ID starting with 85fd88ed51ff0f75102a52393b89d428a7af074056f014cf00ee1f79e8f7205e not found: ID does not exist" containerID="85fd88ed51ff0f75102a52393b89d428a7af074056f014cf00ee1f79e8f7205e"
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.171137 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85fd88ed51ff0f75102a52393b89d428a7af074056f014cf00ee1f79e8f7205e"} err="failed to get container status \"85fd88ed51ff0f75102a52393b89d428a7af074056f014cf00ee1f79e8f7205e\": rpc error: code = NotFound desc = could not find container \"85fd88ed51ff0f75102a52393b89d428a7af074056f014cf00ee1f79e8f7205e\": container with ID starting with 85fd88ed51ff0f75102a52393b89d428a7af074056f014cf00ee1f79e8f7205e not found: ID does not exist"
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.171160 4794 scope.go:117] "RemoveContainer" containerID="02d8194730b178ba97ae16d54bcaac9573706ec423128250421535c4b3016958"
Mar 10 12:20:14 crc kubenswrapper[4794]: E0310 12:20:14.171441 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02d8194730b178ba97ae16d54bcaac9573706ec423128250421535c4b3016958\": container with ID starting with 02d8194730b178ba97ae16d54bcaac9573706ec423128250421535c4b3016958 not found: ID does not exist" containerID="02d8194730b178ba97ae16d54bcaac9573706ec423128250421535c4b3016958"
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.171463 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d8194730b178ba97ae16d54bcaac9573706ec423128250421535c4b3016958"} err="failed to get container status \"02d8194730b178ba97ae16d54bcaac9573706ec423128250421535c4b3016958\": rpc error: code = NotFound desc = could not find container \"02d8194730b178ba97ae16d54bcaac9573706ec423128250421535c4b3016958\": container with ID starting with 02d8194730b178ba97ae16d54bcaac9573706ec423128250421535c4b3016958 not found: ID does not exist"
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.171476 4794 scope.go:117] "RemoveContainer" containerID="533841c25c780bcf4501fbdf007e2e9101ba58563e529059a261a31c88a2feee"
Mar 10 12:20:14 crc kubenswrapper[4794]: E0310 12:20:14.171717 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"533841c25c780bcf4501fbdf007e2e9101ba58563e529059a261a31c88a2feee\": container with ID starting with 533841c25c780bcf4501fbdf007e2e9101ba58563e529059a261a31c88a2feee not found: ID does not exist" containerID="533841c25c780bcf4501fbdf007e2e9101ba58563e529059a261a31c88a2feee"
Mar 10 12:20:14 crc kubenswrapper[4794]: I0310 12:20:14.171750 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533841c25c780bcf4501fbdf007e2e9101ba58563e529059a261a31c88a2feee"} err="failed to get container status \"533841c25c780bcf4501fbdf007e2e9101ba58563e529059a261a31c88a2feee\": rpc error: code = NotFound desc = could not find container \"533841c25c780bcf4501fbdf007e2e9101ba58563e529059a261a31c88a2feee\": container with ID starting with 533841c25c780bcf4501fbdf007e2e9101ba58563e529059a261a31c88a2feee not found: ID does not exist"
Mar 10 12:20:16 crc kubenswrapper[4794]: I0310 12:20:16.029224 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe897b1-ac26-4aa2-acce-fa65a511f49a" path="/var/lib/kubelet/pods/1fe897b1-ac26-4aa2-acce-fa65a511f49a/volumes"
Mar 10 12:20:55 crc kubenswrapper[4794]: I0310 12:20:55.545272 4794 scope.go:117] "RemoveContainer" containerID="dd5facc5566471df5a07e3b7825f8e7c5582834d603619508d1f096d74323f98"
Mar 10 12:21:22 crc kubenswrapper[4794]: I0310 12:21:22.967779 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 12:21:22 crc kubenswrapper[4794]: I0310 12:21:22.968237 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 12:21:52 crc kubenswrapper[4794]: I0310 12:21:52.967352 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 12:21:52 crc kubenswrapper[4794]: I0310 12:21:52.967877 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.152245 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552422-kkftv"]
Mar 10 12:22:00 crc kubenswrapper[4794]: E0310 12:22:00.153614 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08df24c0-8c53-4f01-8edb-470609fbe0c1" containerName="oc"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.153640 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="08df24c0-8c53-4f01-8edb-470609fbe0c1" containerName="oc"
Mar 10 12:22:00 crc kubenswrapper[4794]: E0310 12:22:00.153675 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe897b1-ac26-4aa2-acce-fa65a511f49a" containerName="registry-server"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.153688 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe897b1-ac26-4aa2-acce-fa65a511f49a" containerName="registry-server"
Mar 10 12:22:00 crc kubenswrapper[4794]: E0310 12:22:00.153742 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe897b1-ac26-4aa2-acce-fa65a511f49a" containerName="extract-utilities"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.153756 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe897b1-ac26-4aa2-acce-fa65a511f49a" containerName="extract-utilities"
Mar 10 12:22:00 crc kubenswrapper[4794]: E0310 12:22:00.153791 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe897b1-ac26-4aa2-acce-fa65a511f49a" containerName="extract-content"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.153804 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe897b1-ac26-4aa2-acce-fa65a511f49a" containerName="extract-content"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.154476 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="08df24c0-8c53-4f01-8edb-470609fbe0c1" containerName="oc"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.154537 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe897b1-ac26-4aa2-acce-fa65a511f49a" containerName="registry-server"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.155911 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552422-kkftv"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.158231 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.158678 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.159046 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.167115 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552422-kkftv"]
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.267160 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk587\" (UniqueName: \"kubernetes.io/projected/0bc1c151-74da-457b-ba7b-68ec353b9b97-kube-api-access-mk587\") pod \"auto-csr-approver-29552422-kkftv\" (UID: \"0bc1c151-74da-457b-ba7b-68ec353b9b97\") " pod="openshift-infra/auto-csr-approver-29552422-kkftv"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.369990 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk587\" (UniqueName: \"kubernetes.io/projected/0bc1c151-74da-457b-ba7b-68ec353b9b97-kube-api-access-mk587\") pod \"auto-csr-approver-29552422-kkftv\" (UID: \"0bc1c151-74da-457b-ba7b-68ec353b9b97\") " pod="openshift-infra/auto-csr-approver-29552422-kkftv"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.393926 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk587\" (UniqueName: \"kubernetes.io/projected/0bc1c151-74da-457b-ba7b-68ec353b9b97-kube-api-access-mk587\") pod \"auto-csr-approver-29552422-kkftv\" (UID: \"0bc1c151-74da-457b-ba7b-68ec353b9b97\") " pod="openshift-infra/auto-csr-approver-29552422-kkftv"
Mar 10 12:22:00 crc kubenswrapper[4794]: I0310 12:22:00.498669 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552422-kkftv"
Mar 10 12:22:01 crc kubenswrapper[4794]: I0310 12:22:01.066417 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552422-kkftv"]
Mar 10 12:22:01 crc kubenswrapper[4794]: I0310 12:22:01.220374 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552422-kkftv" event={"ID":"0bc1c151-74da-457b-ba7b-68ec353b9b97","Type":"ContainerStarted","Data":"2826f0b6e67f0e4ecc8083d7517e4aca7406af58cdc80ad711133f1958b765c6"}
Mar 10 12:22:03 crc kubenswrapper[4794]: I0310 12:22:03.239282 4794 generic.go:334] "Generic (PLEG): container finished" podID="0bc1c151-74da-457b-ba7b-68ec353b9b97" containerID="fcd8ceb0add1e96d91b1bab5214dde92b41e24d41802e67f633bba12e61751ee" exitCode=0
Mar 10 12:22:03 crc kubenswrapper[4794]: I0310 12:22:03.239712 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552422-kkftv" event={"ID":"0bc1c151-74da-457b-ba7b-68ec353b9b97","Type":"ContainerDied","Data":"fcd8ceb0add1e96d91b1bab5214dde92b41e24d41802e67f633bba12e61751ee"}
Mar 10 12:22:05 crc kubenswrapper[4794]: I0310 12:22:05.298116 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552422-kkftv" event={"ID":"0bc1c151-74da-457b-ba7b-68ec353b9b97","Type":"ContainerDied","Data":"2826f0b6e67f0e4ecc8083d7517e4aca7406af58cdc80ad711133f1958b765c6"}
Mar 10 12:22:05 crc kubenswrapper[4794]: I0310 12:22:05.298657 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2826f0b6e67f0e4ecc8083d7517e4aca7406af58cdc80ad711133f1958b765c6"
Mar 10 12:22:05 crc kubenswrapper[4794]: I0310 12:22:05.428485 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552422-kkftv"
Mar 10 12:22:05 crc kubenswrapper[4794]: I0310 12:22:05.608672 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk587\" (UniqueName: \"kubernetes.io/projected/0bc1c151-74da-457b-ba7b-68ec353b9b97-kube-api-access-mk587\") pod \"0bc1c151-74da-457b-ba7b-68ec353b9b97\" (UID: \"0bc1c151-74da-457b-ba7b-68ec353b9b97\") "
Mar 10 12:22:05 crc kubenswrapper[4794]: I0310 12:22:05.616003 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bc1c151-74da-457b-ba7b-68ec353b9b97-kube-api-access-mk587" (OuterVolumeSpecName: "kube-api-access-mk587") pod "0bc1c151-74da-457b-ba7b-68ec353b9b97" (UID: "0bc1c151-74da-457b-ba7b-68ec353b9b97"). InnerVolumeSpecName "kube-api-access-mk587". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 12:22:05 crc kubenswrapper[4794]: I0310 12:22:05.718496 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk587\" (UniqueName: \"kubernetes.io/projected/0bc1c151-74da-457b-ba7b-68ec353b9b97-kube-api-access-mk587\") on node \"crc\" DevicePath \"\""
Mar 10 12:22:06 crc kubenswrapper[4794]: I0310 12:22:06.312282 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552422-kkftv"
Mar 10 12:22:06 crc kubenswrapper[4794]: I0310 12:22:06.517765 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552416-hzd82"]
Mar 10 12:22:06 crc kubenswrapper[4794]: I0310 12:22:06.526282 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552416-hzd82"]
Mar 10 12:22:08 crc kubenswrapper[4794]: I0310 12:22:08.012761 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="642736f4-3278-41ad-b9b1-b587da80d429" path="/var/lib/kubelet/pods/642736f4-3278-41ad-b9b1-b587da80d429/volumes"
Mar 10 12:22:22 crc kubenswrapper[4794]: I0310 12:22:22.967180 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 12:22:22 crc kubenswrapper[4794]: I0310 12:22:22.967726 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 12:22:22 crc kubenswrapper[4794]: I0310 12:22:22.967772 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278"
Mar 10 12:22:22 crc kubenswrapper[4794]: I0310 12:22:22.968577 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"067da652f299f1725aeb1dffe145df44e864aa55b2260448aafac13d3c5a1934"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 12:22:22 crc kubenswrapper[4794]: I0310 12:22:22.968630 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://067da652f299f1725aeb1dffe145df44e864aa55b2260448aafac13d3c5a1934" gracePeriod=600
Mar 10 12:22:23 crc kubenswrapper[4794]: I0310 12:22:23.501498 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="067da652f299f1725aeb1dffe145df44e864aa55b2260448aafac13d3c5a1934" exitCode=0
Mar 10 12:22:23 crc kubenswrapper[4794]: I0310 12:22:23.501582 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"067da652f299f1725aeb1dffe145df44e864aa55b2260448aafac13d3c5a1934"}
Mar 10 12:22:23 crc kubenswrapper[4794]: I0310 12:22:23.502414 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"}
Mar 10 12:22:23 crc kubenswrapper[4794]: I0310 12:22:23.502587 4794 scope.go:117] "RemoveContainer" containerID="8a548dece9f2dc538d899cf84b2f6e94eab0179fe2b6707ac5975fb13bfb3972"
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.259961 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cj8h6"]
Mar 10 12:22:52 crc kubenswrapper[4794]: E0310 12:22:52.261031 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bc1c151-74da-457b-ba7b-68ec353b9b97" containerName="oc"
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.261049 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc1c151-74da-457b-ba7b-68ec353b9b97" containerName="oc"
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.261323 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bc1c151-74da-457b-ba7b-68ec353b9b97" containerName="oc"
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.263401 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.282717 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cj8h6"]
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.394407 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c49790-5d87-4bc6-a985-eb50d7f9535c-catalog-content\") pod \"community-operators-cj8h6\" (UID: \"36c49790-5d87-4bc6-a985-eb50d7f9535c\") " pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.394502 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfh52\" (UniqueName: \"kubernetes.io/projected/36c49790-5d87-4bc6-a985-eb50d7f9535c-kube-api-access-vfh52\") pod \"community-operators-cj8h6\" (UID: \"36c49790-5d87-4bc6-a985-eb50d7f9535c\") " pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.394754 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c49790-5d87-4bc6-a985-eb50d7f9535c-utilities\") pod \"community-operators-cj8h6\" (UID: \"36c49790-5d87-4bc6-a985-eb50d7f9535c\") " pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.496645 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c49790-5d87-4bc6-a985-eb50d7f9535c-catalog-content\") pod \"community-operators-cj8h6\" (UID: \"36c49790-5d87-4bc6-a985-eb50d7f9535c\") " pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.496727 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfh52\" (UniqueName: \"kubernetes.io/projected/36c49790-5d87-4bc6-a985-eb50d7f9535c-kube-api-access-vfh52\") pod \"community-operators-cj8h6\" (UID: \"36c49790-5d87-4bc6-a985-eb50d7f9535c\") " pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.496808 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c49790-5d87-4bc6-a985-eb50d7f9535c-utilities\") pod \"community-operators-cj8h6\" (UID: \"36c49790-5d87-4bc6-a985-eb50d7f9535c\") " pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.497232 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c49790-5d87-4bc6-a985-eb50d7f9535c-catalog-content\") pod \"community-operators-cj8h6\" (UID: \"36c49790-5d87-4bc6-a985-eb50d7f9535c\") " pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.497307 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c49790-5d87-4bc6-a985-eb50d7f9535c-utilities\") pod \"community-operators-cj8h6\" (UID: \"36c49790-5d87-4bc6-a985-eb50d7f9535c\") " pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.555449 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfh52\" (UniqueName: \"kubernetes.io/projected/36c49790-5d87-4bc6-a985-eb50d7f9535c-kube-api-access-vfh52\") pod \"community-operators-cj8h6\" (UID: \"36c49790-5d87-4bc6-a985-eb50d7f9535c\") " pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:22:52 crc kubenswrapper[4794]: I0310 12:22:52.588190 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:22:53 crc kubenswrapper[4794]: I0310 12:22:53.149194 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cj8h6"]
Mar 10 12:22:53 crc kubenswrapper[4794]: I0310 12:22:53.901820 4794 generic.go:334] "Generic (PLEG): container finished" podID="36c49790-5d87-4bc6-a985-eb50d7f9535c" containerID="fdf8f40f22167c4aed43460da30cb5b6e11700981cd7b689db9a9120a6ab04b3" exitCode=0
Mar 10 12:22:53 crc kubenswrapper[4794]: I0310 12:22:53.901921 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cj8h6" event={"ID":"36c49790-5d87-4bc6-a985-eb50d7f9535c","Type":"ContainerDied","Data":"fdf8f40f22167c4aed43460da30cb5b6e11700981cd7b689db9a9120a6ab04b3"}
Mar 10 12:22:53 crc kubenswrapper[4794]: I0310 12:22:53.902104 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cj8h6" event={"ID":"36c49790-5d87-4bc6-a985-eb50d7f9535c","Type":"ContainerStarted","Data":"fedf44ce8d22a87d956705cd9fb838d048fd4ad7e0186cc6731651fb44c37843"}
Mar 10 12:22:54 crc kubenswrapper[4794]: I0310 12:22:54.919169 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cj8h6" event={"ID":"36c49790-5d87-4bc6-a985-eb50d7f9535c","Type":"ContainerStarted","Data":"4c790a7492866cbcbdbab3f3cea7c877acbf4b659d3fcf8987852364da32f5c3"}
Mar 10 12:22:55 crc kubenswrapper[4794]: I0310 12:22:55.658493 4794 scope.go:117] "RemoveContainer" containerID="b6f42a0eadbd3bb5efaec956113ba41f2e6bac55979749547deb8502d20f2370"
Mar 10 12:22:56 crc kubenswrapper[4794]: I0310 12:22:56.948739 4794 generic.go:334] "Generic (PLEG): container finished" podID="36c49790-5d87-4bc6-a985-eb50d7f9535c" containerID="4c790a7492866cbcbdbab3f3cea7c877acbf4b659d3fcf8987852364da32f5c3" exitCode=0
Mar 10 12:22:56 crc kubenswrapper[4794]: I0310 12:22:56.948800 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cj8h6" event={"ID":"36c49790-5d87-4bc6-a985-eb50d7f9535c","Type":"ContainerDied","Data":"4c790a7492866cbcbdbab3f3cea7c877acbf4b659d3fcf8987852364da32f5c3"}
Mar 10 12:22:57 crc kubenswrapper[4794]: I0310 12:22:57.960986 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cj8h6" event={"ID":"36c49790-5d87-4bc6-a985-eb50d7f9535c","Type":"ContainerStarted","Data":"a675fc20a64d5c922ff4fb11c71ea17bfae8e83dcdb5dc3d9fd31653fee0f26c"}
Mar 10 12:22:57 crc kubenswrapper[4794]: I0310 12:22:57.984120 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cj8h6" podStartSLOduration=2.404992195 podStartE2EDuration="5.984095823s" podCreationTimestamp="2026-03-10 12:22:52 +0000 UTC" firstStartedPulling="2026-03-10 12:22:53.904797805 +0000 UTC m=+9522.660968623" lastFinishedPulling="2026-03-10 12:22:57.483901433 +0000 UTC m=+9526.240072251" observedRunningTime="2026-03-10 12:22:57.980169461 +0000 UTC m=+9526.736340289" watchObservedRunningTime="2026-03-10 12:22:57.984095823 +0000 UTC m=+9526.740266651"
Mar 10 12:23:02 crc kubenswrapper[4794]: I0310 12:23:02.588725 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:23:02 crc kubenswrapper[4794]: I0310 12:23:02.590950 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:23:02 crc kubenswrapper[4794]: I0310 12:23:02.675735 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:23:03 crc kubenswrapper[4794]: I0310 12:23:03.057076 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cj8h6"
Mar 10 12:23:03 crc kubenswrapper[4794]: I0310 12:23:03.124561 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cj8h6"]
Mar 10 12:23:05 crc kubenswrapper[4794]: I0310 12:23:05.035024 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cj8h6" podUID="36c49790-5d87-4bc6-a985-eb50d7f9535c" containerName="registry-server" containerID="cri-o://a675fc20a64d5c922ff4fb11c71ea17bfae8e83dcdb5dc3d9fd31653fee0f26c" gracePeriod=2
Mar 10 12:23:05 crc kubenswrapper[4794]: I0310 12:23:05.563428 4794 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-cj8h6" Mar 10 12:23:05 crc kubenswrapper[4794]: I0310 12:23:05.709676 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c49790-5d87-4bc6-a985-eb50d7f9535c-catalog-content\") pod \"36c49790-5d87-4bc6-a985-eb50d7f9535c\" (UID: \"36c49790-5d87-4bc6-a985-eb50d7f9535c\") " Mar 10 12:23:05 crc kubenswrapper[4794]: I0310 12:23:05.710066 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfh52\" (UniqueName: \"kubernetes.io/projected/36c49790-5d87-4bc6-a985-eb50d7f9535c-kube-api-access-vfh52\") pod \"36c49790-5d87-4bc6-a985-eb50d7f9535c\" (UID: \"36c49790-5d87-4bc6-a985-eb50d7f9535c\") " Mar 10 12:23:05 crc kubenswrapper[4794]: I0310 12:23:05.710322 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c49790-5d87-4bc6-a985-eb50d7f9535c-utilities\") pod \"36c49790-5d87-4bc6-a985-eb50d7f9535c\" (UID: \"36c49790-5d87-4bc6-a985-eb50d7f9535c\") " Mar 10 12:23:05 crc kubenswrapper[4794]: I0310 12:23:05.711853 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c49790-5d87-4bc6-a985-eb50d7f9535c-utilities" (OuterVolumeSpecName: "utilities") pod "36c49790-5d87-4bc6-a985-eb50d7f9535c" (UID: "36c49790-5d87-4bc6-a985-eb50d7f9535c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:23:05 crc kubenswrapper[4794]: I0310 12:23:05.718412 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c49790-5d87-4bc6-a985-eb50d7f9535c-kube-api-access-vfh52" (OuterVolumeSpecName: "kube-api-access-vfh52") pod "36c49790-5d87-4bc6-a985-eb50d7f9535c" (UID: "36c49790-5d87-4bc6-a985-eb50d7f9535c"). InnerVolumeSpecName "kube-api-access-vfh52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:23:05 crc kubenswrapper[4794]: I0310 12:23:05.767243 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c49790-5d87-4bc6-a985-eb50d7f9535c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36c49790-5d87-4bc6-a985-eb50d7f9535c" (UID: "36c49790-5d87-4bc6-a985-eb50d7f9535c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:23:05 crc kubenswrapper[4794]: I0310 12:23:05.812769 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c49790-5d87-4bc6-a985-eb50d7f9535c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 12:23:05 crc kubenswrapper[4794]: I0310 12:23:05.812807 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c49790-5d87-4bc6-a985-eb50d7f9535c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 12:23:05 crc kubenswrapper[4794]: I0310 12:23:05.812820 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfh52\" (UniqueName: \"kubernetes.io/projected/36c49790-5d87-4bc6-a985-eb50d7f9535c-kube-api-access-vfh52\") on node \"crc\" DevicePath \"\"" Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.045323 4794 generic.go:334] "Generic (PLEG): container finished" podID="36c49790-5d87-4bc6-a985-eb50d7f9535c" containerID="a675fc20a64d5c922ff4fb11c71ea17bfae8e83dcdb5dc3d9fd31653fee0f26c" exitCode=0 Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.045376 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cj8h6" event={"ID":"36c49790-5d87-4bc6-a985-eb50d7f9535c","Type":"ContainerDied","Data":"a675fc20a64d5c922ff4fb11c71ea17bfae8e83dcdb5dc3d9fd31653fee0f26c"} Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.045403 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cj8h6" event={"ID":"36c49790-5d87-4bc6-a985-eb50d7f9535c","Type":"ContainerDied","Data":"fedf44ce8d22a87d956705cd9fb838d048fd4ad7e0186cc6731651fb44c37843"} Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.045400 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cj8h6" Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.045417 4794 scope.go:117] "RemoveContainer" containerID="a675fc20a64d5c922ff4fb11c71ea17bfae8e83dcdb5dc3d9fd31653fee0f26c" Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.084012 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cj8h6"] Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.084740 4794 scope.go:117] "RemoveContainer" containerID="4c790a7492866cbcbdbab3f3cea7c877acbf4b659d3fcf8987852364da32f5c3" Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.097105 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cj8h6"] Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.121304 4794 scope.go:117] "RemoveContainer" containerID="fdf8f40f22167c4aed43460da30cb5b6e11700981cd7b689db9a9120a6ab04b3" Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.169162 4794 scope.go:117] "RemoveContainer" containerID="a675fc20a64d5c922ff4fb11c71ea17bfae8e83dcdb5dc3d9fd31653fee0f26c" Mar 10 12:23:06 crc kubenswrapper[4794]: E0310 12:23:06.169701 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a675fc20a64d5c922ff4fb11c71ea17bfae8e83dcdb5dc3d9fd31653fee0f26c\": container with ID starting with a675fc20a64d5c922ff4fb11c71ea17bfae8e83dcdb5dc3d9fd31653fee0f26c not found: ID does not exist" containerID="a675fc20a64d5c922ff4fb11c71ea17bfae8e83dcdb5dc3d9fd31653fee0f26c" Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.169754 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a675fc20a64d5c922ff4fb11c71ea17bfae8e83dcdb5dc3d9fd31653fee0f26c"} err="failed to get container status \"a675fc20a64d5c922ff4fb11c71ea17bfae8e83dcdb5dc3d9fd31653fee0f26c\": rpc error: code = NotFound desc = could not find container \"a675fc20a64d5c922ff4fb11c71ea17bfae8e83dcdb5dc3d9fd31653fee0f26c\": container with ID starting with a675fc20a64d5c922ff4fb11c71ea17bfae8e83dcdb5dc3d9fd31653fee0f26c not found: ID does not exist" Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.169788 4794 scope.go:117] "RemoveContainer" containerID="4c790a7492866cbcbdbab3f3cea7c877acbf4b659d3fcf8987852364da32f5c3" Mar 10 12:23:06 crc kubenswrapper[4794]: E0310 12:23:06.170358 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c790a7492866cbcbdbab3f3cea7c877acbf4b659d3fcf8987852364da32f5c3\": container with ID starting with 4c790a7492866cbcbdbab3f3cea7c877acbf4b659d3fcf8987852364da32f5c3 not found: ID does not exist" containerID="4c790a7492866cbcbdbab3f3cea7c877acbf4b659d3fcf8987852364da32f5c3" Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.170512 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c790a7492866cbcbdbab3f3cea7c877acbf4b659d3fcf8987852364da32f5c3"} err="failed to get container status \"4c790a7492866cbcbdbab3f3cea7c877acbf4b659d3fcf8987852364da32f5c3\": rpc error: code = NotFound desc = could not find container \"4c790a7492866cbcbdbab3f3cea7c877acbf4b659d3fcf8987852364da32f5c3\": container with ID starting with 4c790a7492866cbcbdbab3f3cea7c877acbf4b659d3fcf8987852364da32f5c3 not found: ID does not exist" Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.170623 4794 scope.go:117] "RemoveContainer" 
containerID="fdf8f40f22167c4aed43460da30cb5b6e11700981cd7b689db9a9120a6ab04b3" Mar 10 12:23:06 crc kubenswrapper[4794]: E0310 12:23:06.171059 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf8f40f22167c4aed43460da30cb5b6e11700981cd7b689db9a9120a6ab04b3\": container with ID starting with fdf8f40f22167c4aed43460da30cb5b6e11700981cd7b689db9a9120a6ab04b3 not found: ID does not exist" containerID="fdf8f40f22167c4aed43460da30cb5b6e11700981cd7b689db9a9120a6ab04b3" Mar 10 12:23:06 crc kubenswrapper[4794]: I0310 12:23:06.171096 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf8f40f22167c4aed43460da30cb5b6e11700981cd7b689db9a9120a6ab04b3"} err="failed to get container status \"fdf8f40f22167c4aed43460da30cb5b6e11700981cd7b689db9a9120a6ab04b3\": rpc error: code = NotFound desc = could not find container \"fdf8f40f22167c4aed43460da30cb5b6e11700981cd7b689db9a9120a6ab04b3\": container with ID starting with fdf8f40f22167c4aed43460da30cb5b6e11700981cd7b689db9a9120a6ab04b3 not found: ID does not exist" Mar 10 12:23:08 crc kubenswrapper[4794]: I0310 12:23:08.014795 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c49790-5d87-4bc6-a985-eb50d7f9535c" path="/var/lib/kubelet/pods/36c49790-5d87-4bc6-a985-eb50d7f9535c/volumes" Mar 10 12:24:00 crc kubenswrapper[4794]: I0310 12:24:00.159393 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552424-5d5m7"] Mar 10 12:24:00 crc kubenswrapper[4794]: E0310 12:24:00.160758 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c49790-5d87-4bc6-a985-eb50d7f9535c" containerName="extract-utilities" Mar 10 12:24:00 crc kubenswrapper[4794]: I0310 12:24:00.160773 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c49790-5d87-4bc6-a985-eb50d7f9535c" containerName="extract-utilities" Mar 10 12:24:00 crc kubenswrapper[4794]: E0310 12:24:00.160795 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c49790-5d87-4bc6-a985-eb50d7f9535c" containerName="registry-server" Mar 10 12:24:00 crc kubenswrapper[4794]: I0310 12:24:00.160802 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c49790-5d87-4bc6-a985-eb50d7f9535c" containerName="registry-server" Mar 10 12:24:00 crc kubenswrapper[4794]: E0310 12:24:00.160818 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c49790-5d87-4bc6-a985-eb50d7f9535c" containerName="extract-content" Mar 10 12:24:00 crc kubenswrapper[4794]: I0310 12:24:00.160824 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c49790-5d87-4bc6-a985-eb50d7f9535c" containerName="extract-content" Mar 10 12:24:00 crc kubenswrapper[4794]: I0310 12:24:00.161059 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c49790-5d87-4bc6-a985-eb50d7f9535c" containerName="registry-server" Mar 10 12:24:00 crc kubenswrapper[4794]: I0310 12:24:00.162146 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552424-5d5m7" Mar 10 12:24:00 crc kubenswrapper[4794]: I0310 12:24:00.164788 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 12:24:00 crc kubenswrapper[4794]: I0310 12:24:00.165224 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 12:24:00 crc kubenswrapper[4794]: I0310 12:24:00.167642 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 12:24:00 crc kubenswrapper[4794]: I0310 12:24:00.173896 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552424-5d5m7"] Mar 10 12:24:00 crc kubenswrapper[4794]: I0310 12:24:00.303621 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5l8v\" (UniqueName: \"kubernetes.io/projected/f4490ab4-756f-4af3-a930-f8f22539a7e2-kube-api-access-m5l8v\") pod \"auto-csr-approver-29552424-5d5m7\" (UID: \"f4490ab4-756f-4af3-a930-f8f22539a7e2\") " pod="openshift-infra/auto-csr-approver-29552424-5d5m7" Mar 10 12:24:00 crc kubenswrapper[4794]: I0310 12:24:00.406163 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5l8v\" (UniqueName: \"kubernetes.io/projected/f4490ab4-756f-4af3-a930-f8f22539a7e2-kube-api-access-m5l8v\") pod \"auto-csr-approver-29552424-5d5m7\" (UID: \"f4490ab4-756f-4af3-a930-f8f22539a7e2\") " pod="openshift-infra/auto-csr-approver-29552424-5d5m7" Mar 10 12:24:00 crc kubenswrapper[4794]: I0310 12:24:00.880809 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5l8v\" (UniqueName: \"kubernetes.io/projected/f4490ab4-756f-4af3-a930-f8f22539a7e2-kube-api-access-m5l8v\") pod \"auto-csr-approver-29552424-5d5m7\" (UID: \"f4490ab4-756f-4af3-a930-f8f22539a7e2\") " pod="openshift-infra/auto-csr-approver-29552424-5d5m7" Mar 10 12:24:01 crc kubenswrapper[4794]: I0310 12:24:01.101071 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552424-5d5m7" Mar 10 12:24:01 crc kubenswrapper[4794]: I0310 12:24:01.581007 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552424-5d5m7"] Mar 10 12:24:01 crc kubenswrapper[4794]: I0310 12:24:01.698663 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552424-5d5m7" event={"ID":"f4490ab4-756f-4af3-a930-f8f22539a7e2","Type":"ContainerStarted","Data":"6be5a599007fafbc0a4f21495cd9c100e4573c24c5a992592f540de70db12f2e"} Mar 10 12:24:03 crc kubenswrapper[4794]: I0310 12:24:03.723445 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552424-5d5m7" event={"ID":"f4490ab4-756f-4af3-a930-f8f22539a7e2","Type":"ContainerStarted","Data":"eaadd8e1a09a72a2eb996b5ea5f96ea0cc33ca5705489fa3340a5d4ba216f979"} Mar 10 12:24:04 crc kubenswrapper[4794]: I0310 12:24:04.737408 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4490ab4-756f-4af3-a930-f8f22539a7e2" containerID="eaadd8e1a09a72a2eb996b5ea5f96ea0cc33ca5705489fa3340a5d4ba216f979" exitCode=0 Mar 10 12:24:04 crc kubenswrapper[4794]: I0310 12:24:04.737484 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552424-5d5m7" event={"ID":"f4490ab4-756f-4af3-a930-f8f22539a7e2","Type":"ContainerDied","Data":"eaadd8e1a09a72a2eb996b5ea5f96ea0cc33ca5705489fa3340a5d4ba216f979"} Mar 10 12:24:05 crc kubenswrapper[4794]: I0310 12:24:05.144545 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552424-5d5m7" Mar 10 12:24:05 crc kubenswrapper[4794]: I0310 12:24:05.230115 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5l8v\" (UniqueName: \"kubernetes.io/projected/f4490ab4-756f-4af3-a930-f8f22539a7e2-kube-api-access-m5l8v\") pod \"f4490ab4-756f-4af3-a930-f8f22539a7e2\" (UID: \"f4490ab4-756f-4af3-a930-f8f22539a7e2\") " Mar 10 12:24:05 crc kubenswrapper[4794]: I0310 12:24:05.236323 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4490ab4-756f-4af3-a930-f8f22539a7e2-kube-api-access-m5l8v" (OuterVolumeSpecName: "kube-api-access-m5l8v") pod "f4490ab4-756f-4af3-a930-f8f22539a7e2" (UID: "f4490ab4-756f-4af3-a930-f8f22539a7e2"). InnerVolumeSpecName "kube-api-access-m5l8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:24:05 crc kubenswrapper[4794]: I0310 12:24:05.332690 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5l8v\" (UniqueName: \"kubernetes.io/projected/f4490ab4-756f-4af3-a930-f8f22539a7e2-kube-api-access-m5l8v\") on node \"crc\" DevicePath \"\"" Mar 10 12:24:05 crc kubenswrapper[4794]: I0310 12:24:05.750196 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552424-5d5m7" event={"ID":"f4490ab4-756f-4af3-a930-f8f22539a7e2","Type":"ContainerDied","Data":"6be5a599007fafbc0a4f21495cd9c100e4573c24c5a992592f540de70db12f2e"} Mar 10 12:24:05 crc kubenswrapper[4794]: I0310 12:24:05.750256 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6be5a599007fafbc0a4f21495cd9c100e4573c24c5a992592f540de70db12f2e" Mar 10 12:24:05 crc kubenswrapper[4794]: I0310 12:24:05.750274 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552424-5d5m7" Mar 10 12:24:06 crc kubenswrapper[4794]: I0310 12:24:06.221550 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552418-4pq5p"] Mar 10 12:24:06 crc kubenswrapper[4794]: I0310 12:24:06.230781 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552418-4pq5p"] Mar 10 12:24:08 crc kubenswrapper[4794]: I0310 12:24:08.009789 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23668ea6-27b5-43f7-a940-f1b3fb3a5baf" path="/var/lib/kubelet/pods/23668ea6-27b5-43f7-a940-f1b3fb3a5baf/volumes" Mar 10 12:24:21 crc kubenswrapper[4794]: I0310 12:24:21.809074 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jrmhn"] Mar 10 12:24:21 crc kubenswrapper[4794]: E0310 12:24:21.810146 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4490ab4-756f-4af3-a930-f8f22539a7e2" containerName="oc" Mar 10 12:24:21 crc kubenswrapper[4794]: I0310 12:24:21.810162 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4490ab4-756f-4af3-a930-f8f22539a7e2" containerName="oc" Mar 10 12:24:21 crc kubenswrapper[4794]: I0310 12:24:21.810427 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4490ab4-756f-4af3-a930-f8f22539a7e2" containerName="oc" Mar 10 12:24:21 crc kubenswrapper[4794]: I0310 12:24:21.812145 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:21 crc kubenswrapper[4794]: I0310 12:24:21.823051 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jrmhn"] Mar 10 12:24:21 crc kubenswrapper[4794]: I0310 12:24:21.895392 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45736978-28be-4eb8-9f58-1d6b56f2dd47-utilities\") pod \"certified-operators-jrmhn\" (UID: \"45736978-28be-4eb8-9f58-1d6b56f2dd47\") " pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:21 crc kubenswrapper[4794]: I0310 12:24:21.895511 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45736978-28be-4eb8-9f58-1d6b56f2dd47-catalog-content\") pod \"certified-operators-jrmhn\" (UID: \"45736978-28be-4eb8-9f58-1d6b56f2dd47\") " pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:21 crc kubenswrapper[4794]: I0310 12:24:21.895925 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvqpn\" (UniqueName: \"kubernetes.io/projected/45736978-28be-4eb8-9f58-1d6b56f2dd47-kube-api-access-kvqpn\") pod \"certified-operators-jrmhn\" (UID: \"45736978-28be-4eb8-9f58-1d6b56f2dd47\") " pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:22 crc kubenswrapper[4794]: I0310 12:24:22.014596 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45736978-28be-4eb8-9f58-1d6b56f2dd47-catalog-content\") pod \"certified-operators-jrmhn\" (UID: \"45736978-28be-4eb8-9f58-1d6b56f2dd47\") " pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:22 crc kubenswrapper[4794]: I0310 12:24:22.014816 4794 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-kvqpn\" (UniqueName: \"kubernetes.io/projected/45736978-28be-4eb8-9f58-1d6b56f2dd47-kube-api-access-kvqpn\") pod \"certified-operators-jrmhn\" (UID: \"45736978-28be-4eb8-9f58-1d6b56f2dd47\") " pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:22 crc kubenswrapper[4794]: I0310 12:24:22.014951 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45736978-28be-4eb8-9f58-1d6b56f2dd47-utilities\") pod \"certified-operators-jrmhn\" (UID: \"45736978-28be-4eb8-9f58-1d6b56f2dd47\") " pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:22 crc kubenswrapper[4794]: I0310 12:24:22.016295 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45736978-28be-4eb8-9f58-1d6b56f2dd47-utilities\") pod \"certified-operators-jrmhn\" (UID: \"45736978-28be-4eb8-9f58-1d6b56f2dd47\") " pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:22 crc kubenswrapper[4794]: I0310 12:24:22.017780 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45736978-28be-4eb8-9f58-1d6b56f2dd47-catalog-content\") pod \"certified-operators-jrmhn\" (UID: \"45736978-28be-4eb8-9f58-1d6b56f2dd47\") " pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:22 crc kubenswrapper[4794]: I0310 12:24:22.046688 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvqpn\" (UniqueName: \"kubernetes.io/projected/45736978-28be-4eb8-9f58-1d6b56f2dd47-kube-api-access-kvqpn\") pod \"certified-operators-jrmhn\" (UID: \"45736978-28be-4eb8-9f58-1d6b56f2dd47\") " pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:22 crc kubenswrapper[4794]: I0310 12:24:22.141601 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:22 crc kubenswrapper[4794]: I0310 12:24:22.647686 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jrmhn"] Mar 10 12:24:22 crc kubenswrapper[4794]: I0310 12:24:22.958304 4794 generic.go:334] "Generic (PLEG): container finished" podID="45736978-28be-4eb8-9f58-1d6b56f2dd47" containerID="1446126b6565cab42af2e5a445642de2c44a8a1a0e9bc6f80bb910fbe2359fd0" exitCode=0 Mar 10 12:24:22 crc kubenswrapper[4794]: I0310 12:24:22.958492 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrmhn" event={"ID":"45736978-28be-4eb8-9f58-1d6b56f2dd47","Type":"ContainerDied","Data":"1446126b6565cab42af2e5a445642de2c44a8a1a0e9bc6f80bb910fbe2359fd0"} Mar 10 12:24:22 crc kubenswrapper[4794]: I0310 12:24:22.958612 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrmhn" event={"ID":"45736978-28be-4eb8-9f58-1d6b56f2dd47","Type":"ContainerStarted","Data":"ee61aa75c4cfd978300909b7eaf756840de2f11e5e05f7b8b9d16d4454fa521c"} Mar 10 12:24:24 crc kubenswrapper[4794]: I0310 12:24:24.982918 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrmhn" event={"ID":"45736978-28be-4eb8-9f58-1d6b56f2dd47","Type":"ContainerStarted","Data":"2a58d31ed43c09f0f437e37b4ca1a6e547633c4de097d29270f1ed17d143e350"} Mar 10 12:24:27 crc kubenswrapper[4794]: I0310 12:24:27.014486 4794 generic.go:334] "Generic (PLEG): container finished" podID="45736978-28be-4eb8-9f58-1d6b56f2dd47" containerID="2a58d31ed43c09f0f437e37b4ca1a6e547633c4de097d29270f1ed17d143e350" exitCode=0 Mar 10 12:24:27 crc kubenswrapper[4794]: I0310 12:24:27.014585 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrmhn" event={"ID":"45736978-28be-4eb8-9f58-1d6b56f2dd47","Type":"ContainerDied","Data":"2a58d31ed43c09f0f437e37b4ca1a6e547633c4de097d29270f1ed17d143e350"} Mar 10 12:24:28 crc kubenswrapper[4794]: I0310 12:24:28.027989 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrmhn" event={"ID":"45736978-28be-4eb8-9f58-1d6b56f2dd47","Type":"ContainerStarted","Data":"b2cbb8c1898bdda71807abb96cbc2736c2747217001a65e07d7d86744c9f0df5"} Mar 10 12:24:28 crc kubenswrapper[4794]: I0310 12:24:28.064696 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jrmhn" podStartSLOduration=2.591740452 podStartE2EDuration="7.064671453s" podCreationTimestamp="2026-03-10 12:24:21 +0000 UTC" firstStartedPulling="2026-03-10 12:24:22.961181586 +0000 UTC m=+9611.717352404" lastFinishedPulling="2026-03-10 12:24:27.434112587 +0000 UTC m=+9616.190283405" observedRunningTime="2026-03-10 12:24:28.052033421 +0000 UTC m=+9616.808204259" watchObservedRunningTime="2026-03-10 12:24:28.064671453 +0000 UTC m=+9616.820842281" Mar 10 12:24:32 crc kubenswrapper[4794]: I0310 12:24:32.144071 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:32 crc kubenswrapper[4794]: I0310 12:24:32.144617 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:32 crc kubenswrapper[4794]: I0310 12:24:32.200637 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:33 crc kubenswrapper[4794]: I0310 12:24:33.152186 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:34 crc kubenswrapper[4794]: I0310 12:24:34.314379 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jrmhn"] Mar 10 12:24:35 crc kubenswrapper[4794]: I0310 12:24:35.111480 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jrmhn" podUID="45736978-28be-4eb8-9f58-1d6b56f2dd47" containerName="registry-server" containerID="cri-o://b2cbb8c1898bdda71807abb96cbc2736c2747217001a65e07d7d86744c9f0df5" gracePeriod=2 Mar 10 12:24:35 crc kubenswrapper[4794]: I0310 12:24:35.629730 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:35 crc kubenswrapper[4794]: I0310 12:24:35.708370 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45736978-28be-4eb8-9f58-1d6b56f2dd47-utilities\") pod \"45736978-28be-4eb8-9f58-1d6b56f2dd47\" (UID: \"45736978-28be-4eb8-9f58-1d6b56f2dd47\") " Mar 10 12:24:35 crc kubenswrapper[4794]: I0310 12:24:35.708661 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvqpn\" (UniqueName: \"kubernetes.io/projected/45736978-28be-4eb8-9f58-1d6b56f2dd47-kube-api-access-kvqpn\") pod \"45736978-28be-4eb8-9f58-1d6b56f2dd47\" (UID: \"45736978-28be-4eb8-9f58-1d6b56f2dd47\") " Mar 10 12:24:35 crc kubenswrapper[4794]: I0310 12:24:35.708750 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45736978-28be-4eb8-9f58-1d6b56f2dd47-catalog-content\") pod \"45736978-28be-4eb8-9f58-1d6b56f2dd47\" (UID: \"45736978-28be-4eb8-9f58-1d6b56f2dd47\") " Mar 10 12:24:35 crc kubenswrapper[4794]: I0310 12:24:35.710267 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45736978-28be-4eb8-9f58-1d6b56f2dd47-utilities" (OuterVolumeSpecName: "utilities") pod "45736978-28be-4eb8-9f58-1d6b56f2dd47" (UID: "45736978-28be-4eb8-9f58-1d6b56f2dd47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:24:35 crc kubenswrapper[4794]: I0310 12:24:35.718736 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45736978-28be-4eb8-9f58-1d6b56f2dd47-kube-api-access-kvqpn" (OuterVolumeSpecName: "kube-api-access-kvqpn") pod "45736978-28be-4eb8-9f58-1d6b56f2dd47" (UID: "45736978-28be-4eb8-9f58-1d6b56f2dd47"). InnerVolumeSpecName "kube-api-access-kvqpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:24:35 crc kubenswrapper[4794]: I0310 12:24:35.769566 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45736978-28be-4eb8-9f58-1d6b56f2dd47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45736978-28be-4eb8-9f58-1d6b56f2dd47" (UID: "45736978-28be-4eb8-9f58-1d6b56f2dd47"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:24:35 crc kubenswrapper[4794]: I0310 12:24:35.811260 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvqpn\" (UniqueName: \"kubernetes.io/projected/45736978-28be-4eb8-9f58-1d6b56f2dd47-kube-api-access-kvqpn\") on node \"crc\" DevicePath \"\"" Mar 10 12:24:35 crc kubenswrapper[4794]: I0310 12:24:35.811294 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45736978-28be-4eb8-9f58-1d6b56f2dd47-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 12:24:35 crc kubenswrapper[4794]: I0310 12:24:35.811303 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45736978-28be-4eb8-9f58-1d6b56f2dd47-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.122905 4794 generic.go:334] "Generic (PLEG): container finished" podID="45736978-28be-4eb8-9f58-1d6b56f2dd47" containerID="b2cbb8c1898bdda71807abb96cbc2736c2747217001a65e07d7d86744c9f0df5" exitCode=0 Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.123048 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jrmhn" Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.123071 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrmhn" event={"ID":"45736978-28be-4eb8-9f58-1d6b56f2dd47","Type":"ContainerDied","Data":"b2cbb8c1898bdda71807abb96cbc2736c2747217001a65e07d7d86744c9f0df5"} Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.123759 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrmhn" event={"ID":"45736978-28be-4eb8-9f58-1d6b56f2dd47","Type":"ContainerDied","Data":"ee61aa75c4cfd978300909b7eaf756840de2f11e5e05f7b8b9d16d4454fa521c"} Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.123778 4794 scope.go:117] "RemoveContainer" containerID="b2cbb8c1898bdda71807abb96cbc2736c2747217001a65e07d7d86744c9f0df5" Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.148538 4794 scope.go:117] "RemoveContainer" containerID="2a58d31ed43c09f0f437e37b4ca1a6e547633c4de097d29270f1ed17d143e350" Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.155621 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jrmhn"] Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.164828 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jrmhn"] Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.172588 4794 scope.go:117] "RemoveContainer" containerID="1446126b6565cab42af2e5a445642de2c44a8a1a0e9bc6f80bb910fbe2359fd0" Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.238451 4794 scope.go:117] "RemoveContainer" containerID="b2cbb8c1898bdda71807abb96cbc2736c2747217001a65e07d7d86744c9f0df5" Mar 10 12:24:36 crc kubenswrapper[4794]: E0310 12:24:36.238910 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2cbb8c1898bdda71807abb96cbc2736c2747217001a65e07d7d86744c9f0df5\": container with ID starting with b2cbb8c1898bdda71807abb96cbc2736c2747217001a65e07d7d86744c9f0df5 not found: ID does not exist" containerID="b2cbb8c1898bdda71807abb96cbc2736c2747217001a65e07d7d86744c9f0df5" Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.238975 
4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2cbb8c1898bdda71807abb96cbc2736c2747217001a65e07d7d86744c9f0df5"} err="failed to get container status \"b2cbb8c1898bdda71807abb96cbc2736c2747217001a65e07d7d86744c9f0df5\": rpc error: code = NotFound desc = could not find container \"b2cbb8c1898bdda71807abb96cbc2736c2747217001a65e07d7d86744c9f0df5\": container with ID starting with b2cbb8c1898bdda71807abb96cbc2736c2747217001a65e07d7d86744c9f0df5 not found: ID does not exist" Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.239029 4794 scope.go:117] "RemoveContainer" containerID="2a58d31ed43c09f0f437e37b4ca1a6e547633c4de097d29270f1ed17d143e350" Mar 10 12:24:36 crc kubenswrapper[4794]: E0310 12:24:36.239429 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a58d31ed43c09f0f437e37b4ca1a6e547633c4de097d29270f1ed17d143e350\": container with ID starting with 2a58d31ed43c09f0f437e37b4ca1a6e547633c4de097d29270f1ed17d143e350 not found: ID does not exist" containerID="2a58d31ed43c09f0f437e37b4ca1a6e547633c4de097d29270f1ed17d143e350" Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.239475 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a58d31ed43c09f0f437e37b4ca1a6e547633c4de097d29270f1ed17d143e350"} err="failed to get container status \"2a58d31ed43c09f0f437e37b4ca1a6e547633c4de097d29270f1ed17d143e350\": rpc error: code = NotFound desc = could not find container \"2a58d31ed43c09f0f437e37b4ca1a6e547633c4de097d29270f1ed17d143e350\": container with ID starting with 2a58d31ed43c09f0f437e37b4ca1a6e547633c4de097d29270f1ed17d143e350 not found: ID does not exist" Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.239499 4794 scope.go:117] "RemoveContainer" containerID="1446126b6565cab42af2e5a445642de2c44a8a1a0e9bc6f80bb910fbe2359fd0" Mar 10 12:24:36 crc kubenswrapper[4794]: E0310 12:24:36.239756 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1446126b6565cab42af2e5a445642de2c44a8a1a0e9bc6f80bb910fbe2359fd0\": container with ID starting with 1446126b6565cab42af2e5a445642de2c44a8a1a0e9bc6f80bb910fbe2359fd0 not found: ID does not exist" containerID="1446126b6565cab42af2e5a445642de2c44a8a1a0e9bc6f80bb910fbe2359fd0" Mar 10 12:24:36 crc kubenswrapper[4794]: I0310 12:24:36.239794 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1446126b6565cab42af2e5a445642de2c44a8a1a0e9bc6f80bb910fbe2359fd0"} err="failed to get container status \"1446126b6565cab42af2e5a445642de2c44a8a1a0e9bc6f80bb910fbe2359fd0\": rpc error: code = NotFound desc = could not find container \"1446126b6565cab42af2e5a445642de2c44a8a1a0e9bc6f80bb910fbe2359fd0\": container with ID starting with 1446126b6565cab42af2e5a445642de2c44a8a1a0e9bc6f80bb910fbe2359fd0 not found: ID does not exist" Mar 10 12:24:38 crc kubenswrapper[4794]: I0310 12:24:38.009381 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45736978-28be-4eb8-9f58-1d6b56f2dd47" path="/var/lib/kubelet/pods/45736978-28be-4eb8-9f58-1d6b56f2dd47/volumes" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.129652 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jlcp2"] Mar 10 12:24:42 crc kubenswrapper[4794]: E0310 12:24:42.131133 4794 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="45736978-28be-4eb8-9f58-1d6b56f2dd47" containerName="extract-content" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.131172 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="45736978-28be-4eb8-9f58-1d6b56f2dd47" containerName="extract-content" Mar 10 12:24:42 crc kubenswrapper[4794]: E0310 12:24:42.131220 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45736978-28be-4eb8-9f58-1d6b56f2dd47" containerName="extract-utilities" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.131247 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="45736978-28be-4eb8-9f58-1d6b56f2dd47" containerName="extract-utilities" Mar 10 12:24:42 crc kubenswrapper[4794]: E0310 12:24:42.131259 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45736978-28be-4eb8-9f58-1d6b56f2dd47" containerName="registry-server" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.131265 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="45736978-28be-4eb8-9f58-1d6b56f2dd47" containerName="registry-server" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.131611 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="45736978-28be-4eb8-9f58-1d6b56f2dd47" containerName="registry-server" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.135610 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.155244 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jlcp2"] Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.266572 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d76b42cb-4d59-42de-8d78-1091404fc249-catalog-content\") pod \"redhat-operators-jlcp2\" (UID: \"d76b42cb-4d59-42de-8d78-1091404fc249\") " pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.267192 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44ff8\" (UniqueName: \"kubernetes.io/projected/d76b42cb-4d59-42de-8d78-1091404fc249-kube-api-access-44ff8\") pod \"redhat-operators-jlcp2\" (UID: \"d76b42cb-4d59-42de-8d78-1091404fc249\") " pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.267577 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d76b42cb-4d59-42de-8d78-1091404fc249-utilities\") pod \"redhat-operators-jlcp2\" (UID: \"d76b42cb-4d59-42de-8d78-1091404fc249\") " pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.370146 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44ff8\" (UniqueName: \"kubernetes.io/projected/d76b42cb-4d59-42de-8d78-1091404fc249-kube-api-access-44ff8\") pod \"redhat-operators-jlcp2\" (UID: \"d76b42cb-4d59-42de-8d78-1091404fc249\") " pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.370308 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d76b42cb-4d59-42de-8d78-1091404fc249-utilities\") pod \"redhat-operators-jlcp2\" (UID: 
\"d76b42cb-4d59-42de-8d78-1091404fc249\") " pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.370549 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d76b42cb-4d59-42de-8d78-1091404fc249-catalog-content\") pod \"redhat-operators-jlcp2\" (UID: \"d76b42cb-4d59-42de-8d78-1091404fc249\") " pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.371142 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d76b42cb-4d59-42de-8d78-1091404fc249-utilities\") pod \"redhat-operators-jlcp2\" (UID: \"d76b42cb-4d59-42de-8d78-1091404fc249\") " pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.371184 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d76b42cb-4d59-42de-8d78-1091404fc249-catalog-content\") pod \"redhat-operators-jlcp2\" (UID: \"d76b42cb-4d59-42de-8d78-1091404fc249\") " pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.404341 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44ff8\" (UniqueName: \"kubernetes.io/projected/d76b42cb-4d59-42de-8d78-1091404fc249-kube-api-access-44ff8\") pod \"redhat-operators-jlcp2\" (UID: \"d76b42cb-4d59-42de-8d78-1091404fc249\") " pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:24:42 crc kubenswrapper[4794]: I0310 12:24:42.497591 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:24:43 crc kubenswrapper[4794]: I0310 12:24:43.006762 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jlcp2"] Mar 10 12:24:43 crc kubenswrapper[4794]: I0310 12:24:43.221878 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlcp2" event={"ID":"d76b42cb-4d59-42de-8d78-1091404fc249","Type":"ContainerStarted","Data":"86e12263161323ac98e604860b48e81e0cff4c280a3e9e46d05e4a14e2831712"} Mar 10 12:24:44 crc kubenswrapper[4794]: I0310 12:24:44.235687 4794 generic.go:334] "Generic (PLEG): container finished" podID="d76b42cb-4d59-42de-8d78-1091404fc249" containerID="0892c7b8257e2c436c60fc6b67339924a069a82fe6bd809f031acdb001360b84" exitCode=0 Mar 10 12:24:44 crc kubenswrapper[4794]: I0310 12:24:44.235752 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlcp2" event={"ID":"d76b42cb-4d59-42de-8d78-1091404fc249","Type":"ContainerDied","Data":"0892c7b8257e2c436c60fc6b67339924a069a82fe6bd809f031acdb001360b84"} Mar 10 12:24:46 crc kubenswrapper[4794]: I0310 12:24:46.256583 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlcp2" event={"ID":"d76b42cb-4d59-42de-8d78-1091404fc249","Type":"ContainerStarted","Data":"df3f2b54c1670b4e79a8fa01d50b984a6f32dea7393d0d16f80fb4626298ef11"} Mar 10 12:24:50 crc kubenswrapper[4794]: I0310 12:24:50.306440 4794 generic.go:334] "Generic (PLEG): container finished" podID="d76b42cb-4d59-42de-8d78-1091404fc249" containerID="df3f2b54c1670b4e79a8fa01d50b984a6f32dea7393d0d16f80fb4626298ef11" exitCode=0 Mar 10 12:24:50 crc kubenswrapper[4794]: I0310 12:24:50.306503 4794 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlcp2" event={"ID":"d76b42cb-4d59-42de-8d78-1091404fc249","Type":"ContainerDied","Data":"df3f2b54c1670b4e79a8fa01d50b984a6f32dea7393d0d16f80fb4626298ef11"} Mar 10 12:24:51 crc kubenswrapper[4794]: I0310 12:24:51.320872 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlcp2" event={"ID":"d76b42cb-4d59-42de-8d78-1091404fc249","Type":"ContainerStarted","Data":"a27aa972f11c2c384256be00f312d812a515b314a5d8eeefc11161dff70b460f"} Mar 10 12:24:51 crc kubenswrapper[4794]: I0310 12:24:51.353643 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jlcp2" podStartSLOduration=2.854988566 podStartE2EDuration="9.353618214s" podCreationTimestamp="2026-03-10 12:24:42 +0000 UTC" firstStartedPulling="2026-03-10 12:24:44.237748453 +0000 UTC m=+9632.993919271" lastFinishedPulling="2026-03-10 12:24:50.736378061 +0000 UTC m=+9639.492548919" observedRunningTime="2026-03-10 12:24:51.351857268 +0000 UTC m=+9640.108028086" watchObservedRunningTime="2026-03-10 12:24:51.353618214 +0000 UTC m=+9640.109789032" Mar 10 12:24:52 crc kubenswrapper[4794]: I0310 12:24:52.498573 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:24:52 crc kubenswrapper[4794]: I0310 12:24:52.498936 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:24:52 crc kubenswrapper[4794]: I0310 12:24:52.968235 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:24:52 crc kubenswrapper[4794]: I0310 12:24:52.968305 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:24:53 crc kubenswrapper[4794]: I0310 12:24:53.934167 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jlcp2" podUID="d76b42cb-4d59-42de-8d78-1091404fc249" containerName="registry-server" probeResult="failure" output=< Mar 10 12:24:53 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Mar 10 12:24:53 crc kubenswrapper[4794]: > Mar 10 12:24:55 crc kubenswrapper[4794]: I0310 12:24:55.824987 4794 scope.go:117] "RemoveContainer" containerID="86208cc799bf4557ab40c9a4d8ca0d4131d78726809978b479c1702ee1b74b4c" Mar 10 12:25:02 crc kubenswrapper[4794]: I0310 12:25:02.568641 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:25:02 crc kubenswrapper[4794]: I0310 12:25:02.627364 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jlcp2" Mar 10 12:25:06 crc kubenswrapper[4794]: I0310 12:25:06.323810 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jlcp2"] Mar 10 12:25:06 crc kubenswrapper[4794]: I0310 12:25:06.324507 4794 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-operators-jlcp2" podUID="d76b42cb-4d59-42de-8d78-1091404fc249" containerName="registry-server" containerID="cri-o://a27aa972f11c2c384256be00f312d812a515b314a5d8eeefc11161dff70b460f" gracePeriod=2
Mar 10 12:25:06 crc kubenswrapper[4794]: I0310 12:25:06.497538 4794 generic.go:334] "Generic (PLEG): container finished" podID="d76b42cb-4d59-42de-8d78-1091404fc249" containerID="a27aa972f11c2c384256be00f312d812a515b314a5d8eeefc11161dff70b460f" exitCode=0
Mar 10 12:25:06 crc kubenswrapper[4794]: I0310 12:25:06.497579 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlcp2" event={"ID":"d76b42cb-4d59-42de-8d78-1091404fc249","Type":"ContainerDied","Data":"a27aa972f11c2c384256be00f312d812a515b314a5d8eeefc11161dff70b460f"}
Mar 10 12:25:06 crc kubenswrapper[4794]: I0310 12:25:06.947856 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlcp2"
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.025452 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d76b42cb-4d59-42de-8d78-1091404fc249-catalog-content\") pod \"d76b42cb-4d59-42de-8d78-1091404fc249\" (UID: \"d76b42cb-4d59-42de-8d78-1091404fc249\") "
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.025747 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d76b42cb-4d59-42de-8d78-1091404fc249-utilities\") pod \"d76b42cb-4d59-42de-8d78-1091404fc249\" (UID: \"d76b42cb-4d59-42de-8d78-1091404fc249\") "
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.025801 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44ff8\" (UniqueName: \"kubernetes.io/projected/d76b42cb-4d59-42de-8d78-1091404fc249-kube-api-access-44ff8\") pod \"d76b42cb-4d59-42de-8d78-1091404fc249\" (UID: \"d76b42cb-4d59-42de-8d78-1091404fc249\") "
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.027779 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d76b42cb-4d59-42de-8d78-1091404fc249-utilities" (OuterVolumeSpecName: "utilities") pod "d76b42cb-4d59-42de-8d78-1091404fc249" (UID: "d76b42cb-4d59-42de-8d78-1091404fc249"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.028233 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d76b42cb-4d59-42de-8d78-1091404fc249-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.037581 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d76b42cb-4d59-42de-8d78-1091404fc249-kube-api-access-44ff8" (OuterVolumeSpecName: "kube-api-access-44ff8") pod "d76b42cb-4d59-42de-8d78-1091404fc249" (UID: "d76b42cb-4d59-42de-8d78-1091404fc249"). InnerVolumeSpecName "kube-api-access-44ff8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.131209 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44ff8\" (UniqueName: \"kubernetes.io/projected/d76b42cb-4d59-42de-8d78-1091404fc249-kube-api-access-44ff8\") on node \"crc\" DevicePath \"\""
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.173944 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d76b42cb-4d59-42de-8d78-1091404fc249-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d76b42cb-4d59-42de-8d78-1091404fc249" (UID: "d76b42cb-4d59-42de-8d78-1091404fc249"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.233740 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d76b42cb-4d59-42de-8d78-1091404fc249-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.510845 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlcp2" event={"ID":"d76b42cb-4d59-42de-8d78-1091404fc249","Type":"ContainerDied","Data":"86e12263161323ac98e604860b48e81e0cff4c280a3e9e46d05e4a14e2831712"}
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.511085 4794 scope.go:117] "RemoveContainer" containerID="a27aa972f11c2c384256be00f312d812a515b314a5d8eeefc11161dff70b460f"
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.510927 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlcp2"
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.548499 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jlcp2"]
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.548561 4794 scope.go:117] "RemoveContainer" containerID="df3f2b54c1670b4e79a8fa01d50b984a6f32dea7393d0d16f80fb4626298ef11"
Mar 10 12:25:07 crc kubenswrapper[4794]: I0310 12:25:07.557996 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jlcp2"]
Mar 10 12:25:08 crc kubenswrapper[4794]: I0310 12:25:08.016213 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d76b42cb-4d59-42de-8d78-1091404fc249" path="/var/lib/kubelet/pods/d76b42cb-4d59-42de-8d78-1091404fc249/volumes"
Mar 10 12:25:08 crc kubenswrapper[4794]: I0310 12:25:08.102844 4794 scope.go:117] "RemoveContainer" containerID="0892c7b8257e2c436c60fc6b67339924a069a82fe6bd809f031acdb001360b84"
Mar 10 12:25:22 crc kubenswrapper[4794]: I0310 12:25:22.967768 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 12:25:22 crc kubenswrapper[4794]: I0310 12:25:22.968386 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 12:25:52 crc kubenswrapper[4794]: I0310 12:25:52.967886 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 12:25:52 crc kubenswrapper[4794]: I0310 12:25:52.968305 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 12:25:52 crc kubenswrapper[4794]: I0310 12:25:52.968369 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278"
Mar 10 12:25:52 crc kubenswrapper[4794]: I0310 12:25:52.969303 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 12:25:52 crc kubenswrapper[4794]: I0310 12:25:52.969388 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e" gracePeriod=600
Mar 10 12:25:53 crc kubenswrapper[4794]: E0310 12:25:53.205045 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:25:54 crc kubenswrapper[4794]: I0310 12:25:54.051893 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e" exitCode=0
Mar 10 12:25:54 crc kubenswrapper[4794]: I0310 12:25:54.051944 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"}
Mar 10 12:25:54 crc kubenswrapper[4794]: I0310 12:25:54.052281 4794 scope.go:117] "RemoveContainer" containerID="067da652f299f1725aeb1dffe145df44e864aa55b2260448aafac13d3c5a1934"
Mar 10 12:25:54 crc kubenswrapper[4794]: I0310 12:25:54.053173 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:25:54 crc kubenswrapper[4794]: E0310 12:25:54.053635 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:26:00 crc kubenswrapper[4794]: I0310 12:26:00.145665 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552426-x5rr5"]
Mar 10 12:26:00 crc kubenswrapper[4794]: E0310 12:26:00.146628 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d76b42cb-4d59-42de-8d78-1091404fc249" containerName="registry-server"
Mar 10 12:26:00 crc kubenswrapper[4794]: I0310 12:26:00.146646 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d76b42cb-4d59-42de-8d78-1091404fc249" containerName="registry-server"
Mar 10 12:26:00 crc kubenswrapper[4794]: E0310 12:26:00.146691 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d76b42cb-4d59-42de-8d78-1091404fc249" containerName="extract-content"
Mar 10 12:26:00 crc kubenswrapper[4794]: I0310 12:26:00.146696 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d76b42cb-4d59-42de-8d78-1091404fc249" containerName="extract-content"
Mar 10 12:26:00 crc kubenswrapper[4794]: E0310 12:26:00.146706 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d76b42cb-4d59-42de-8d78-1091404fc249" containerName="extract-utilities"
Mar 10 12:26:00 crc kubenswrapper[4794]: I0310 12:26:00.146712 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d76b42cb-4d59-42de-8d78-1091404fc249" containerName="extract-utilities"
Mar 10 12:26:00 crc kubenswrapper[4794]: I0310 12:26:00.146905 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d76b42cb-4d59-42de-8d78-1091404fc249" containerName="registry-server"
Mar 10 12:26:00 crc kubenswrapper[4794]: I0310 12:26:00.147671 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552426-x5rr5"
Mar 10 12:26:00 crc kubenswrapper[4794]: I0310 12:26:00.150361 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 12:26:00 crc kubenswrapper[4794]: I0310 12:26:00.150501 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 12:26:00 crc kubenswrapper[4794]: I0310 12:26:00.150621 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc"
Mar 10 12:26:00 crc kubenswrapper[4794]: I0310 12:26:00.159610 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndcp8\" (UniqueName: \"kubernetes.io/projected/a1297148-f11a-47c9-ac04-40f9fba0c8fe-kube-api-access-ndcp8\") pod \"auto-csr-approver-29552426-x5rr5\" (UID: \"a1297148-f11a-47c9-ac04-40f9fba0c8fe\") " pod="openshift-infra/auto-csr-approver-29552426-x5rr5"
Mar 10 12:26:00 crc kubenswrapper[4794]: I0310 12:26:00.160975 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552426-x5rr5"]
Mar 10 12:26:00 crc kubenswrapper[4794]: I0310 12:26:00.260786 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndcp8\" (UniqueName: \"kubernetes.io/projected/a1297148-f11a-47c9-ac04-40f9fba0c8fe-kube-api-access-ndcp8\") pod \"auto-csr-approver-29552426-x5rr5\" (UID: \"a1297148-f11a-47c9-ac04-40f9fba0c8fe\") " pod="openshift-infra/auto-csr-approver-29552426-x5rr5"
Mar 10 12:26:00 crc kubenswrapper[4794]: I0310 12:26:00.283056 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndcp8\" (UniqueName: \"kubernetes.io/projected/a1297148-f11a-47c9-ac04-40f9fba0c8fe-kube-api-access-ndcp8\") pod \"auto-csr-approver-29552426-x5rr5\" (UID: \"a1297148-f11a-47c9-ac04-40f9fba0c8fe\") " pod="openshift-infra/auto-csr-approver-29552426-x5rr5"
Mar 10 12:26:00 crc kubenswrapper[4794]: I0310 12:26:00.471864 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552426-x5rr5"
Mar 10 12:26:01 crc kubenswrapper[4794]: I0310 12:26:01.002967 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552426-x5rr5"]
Mar 10 12:26:01 crc kubenswrapper[4794]: I0310 12:26:01.021743 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 12:26:01 crc kubenswrapper[4794]: I0310 12:26:01.128734 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552426-x5rr5" event={"ID":"a1297148-f11a-47c9-ac04-40f9fba0c8fe","Type":"ContainerStarted","Data":"56233fcbc250ef40f428e9583988cf6ef280fa6e803c17bee6d5b4d2d91edbdb"}
Mar 10 12:26:03 crc kubenswrapper[4794]: I0310 12:26:03.158001 4794 generic.go:334] "Generic (PLEG): container finished" podID="a1297148-f11a-47c9-ac04-40f9fba0c8fe" containerID="b6972798a7ffe7f30f4f2e32a2789a0a3a21147f3624d1270de1408ccc9b73ed" exitCode=0
Mar 10 12:26:03 crc kubenswrapper[4794]: I0310 12:26:03.158124 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552426-x5rr5" event={"ID":"a1297148-f11a-47c9-ac04-40f9fba0c8fe","Type":"ContainerDied","Data":"b6972798a7ffe7f30f4f2e32a2789a0a3a21147f3624d1270de1408ccc9b73ed"}
Mar 10 12:26:05 crc kubenswrapper[4794]: I0310 12:26:05.179990 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552426-x5rr5" event={"ID":"a1297148-f11a-47c9-ac04-40f9fba0c8fe","Type":"ContainerDied","Data":"56233fcbc250ef40f428e9583988cf6ef280fa6e803c17bee6d5b4d2d91edbdb"}
Mar 10 12:26:05 crc kubenswrapper[4794]: I0310 12:26:05.180641 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56233fcbc250ef40f428e9583988cf6ef280fa6e803c17bee6d5b4d2d91edbdb"
Mar 10 12:26:05 crc kubenswrapper[4794]: I0310 12:26:05.224160 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552426-x5rr5"
Mar 10 12:26:05 crc kubenswrapper[4794]: I0310 12:26:05.371230 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndcp8\" (UniqueName: \"kubernetes.io/projected/a1297148-f11a-47c9-ac04-40f9fba0c8fe-kube-api-access-ndcp8\") pod \"a1297148-f11a-47c9-ac04-40f9fba0c8fe\" (UID: \"a1297148-f11a-47c9-ac04-40f9fba0c8fe\") "
Mar 10 12:26:05 crc kubenswrapper[4794]: I0310 12:26:05.379958 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1297148-f11a-47c9-ac04-40f9fba0c8fe-kube-api-access-ndcp8" (OuterVolumeSpecName: "kube-api-access-ndcp8") pod "a1297148-f11a-47c9-ac04-40f9fba0c8fe" (UID: "a1297148-f11a-47c9-ac04-40f9fba0c8fe"). InnerVolumeSpecName "kube-api-access-ndcp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 12:26:05 crc kubenswrapper[4794]: I0310 12:26:05.473857 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndcp8\" (UniqueName: \"kubernetes.io/projected/a1297148-f11a-47c9-ac04-40f9fba0c8fe-kube-api-access-ndcp8\") on node \"crc\" DevicePath \"\""
Mar 10 12:26:06 crc kubenswrapper[4794]: I0310 12:26:06.190703 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552426-x5rr5"
Mar 10 12:26:06 crc kubenswrapper[4794]: I0310 12:26:06.306004 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552420-f6knn"]
Mar 10 12:26:06 crc kubenswrapper[4794]: I0310 12:26:06.317360 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552420-f6knn"]
Mar 10 12:26:08 crc kubenswrapper[4794]: I0310 12:26:08.020028 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08df24c0-8c53-4f01-8edb-470609fbe0c1" path="/var/lib/kubelet/pods/08df24c0-8c53-4f01-8edb-470609fbe0c1/volumes"
Mar 10 12:26:09 crc kubenswrapper[4794]: I0310 12:26:09.000317 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:26:09 crc kubenswrapper[4794]: E0310 12:26:09.001402 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:26:21 crc kubenswrapper[4794]: I0310 12:26:21.000201 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:26:21 crc kubenswrapper[4794]: E0310 12:26:21.001088 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:26:33 crc kubenswrapper[4794]: I0310 12:26:32.999597 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:26:33 crc kubenswrapper[4794]: E0310 12:26:33.000542 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:26:45 crc kubenswrapper[4794]: I0310 12:26:45.000323 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:26:45 crc kubenswrapper[4794]: E0310 12:26:45.001880 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:26:55 crc kubenswrapper[4794]: I0310 12:26:55.954576 4794 scope.go:117] "RemoveContainer" containerID="9aacf0fe110cebb3b6671285b4d7559bdcbb073b5368b865aab15c591a242cef"
Mar 10 12:26:55 crc kubenswrapper[4794]: I0310 12:26:55.999420 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:26:56 crc kubenswrapper[4794]: E0310 12:26:55.999980 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:27:06 crc kubenswrapper[4794]: I0310 12:27:06.998695 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:27:07 crc kubenswrapper[4794]: E0310 12:27:06.999501 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:27:21 crc kubenswrapper[4794]: I0310 12:27:20.999830 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:27:21 crc kubenswrapper[4794]: E0310 12:27:21.000837 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:27:33 crc kubenswrapper[4794]: I0310 12:27:32.999907 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:27:33 crc kubenswrapper[4794]: E0310 12:27:33.000733 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:27:47 crc kubenswrapper[4794]: I0310 12:27:47.999210 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:27:48 crc kubenswrapper[4794]: E0310 12:27:48.000068 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:27:59 crc kubenswrapper[4794]: I0310 12:27:58.999112 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:27:59 crc kubenswrapper[4794]: E0310 12:27:59.000020 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:28:00 crc kubenswrapper[4794]: I0310 12:28:00.148137 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552428-gvn7p"]
Mar 10 12:28:00 crc kubenswrapper[4794]: E0310 12:28:00.148816 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1297148-f11a-47c9-ac04-40f9fba0c8fe" containerName="oc"
Mar 10 12:28:00 crc kubenswrapper[4794]: I0310 12:28:00.148834 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1297148-f11a-47c9-ac04-40f9fba0c8fe" containerName="oc"
Mar 10 12:28:00 crc kubenswrapper[4794]: I0310 12:28:00.149125 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1297148-f11a-47c9-ac04-40f9fba0c8fe" containerName="oc"
Mar 10 12:28:00 crc kubenswrapper[4794]: I0310 12:28:00.150224 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552428-gvn7p"
Mar 10 12:28:00 crc kubenswrapper[4794]: I0310 12:28:00.152290 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc"
Mar 10 12:28:00 crc kubenswrapper[4794]: I0310 12:28:00.152892 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 12:28:00 crc kubenswrapper[4794]: I0310 12:28:00.156612 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 12:28:00 crc kubenswrapper[4794]: I0310 12:28:00.163702 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552428-gvn7p"]
Mar 10 12:28:00 crc kubenswrapper[4794]: I0310 12:28:00.268121 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s52zt\" (UniqueName: \"kubernetes.io/projected/c8b35de4-fab9-46c5-a7f0-71ccf3c8309d-kube-api-access-s52zt\") pod \"auto-csr-approver-29552428-gvn7p\" (UID: \"c8b35de4-fab9-46c5-a7f0-71ccf3c8309d\") " pod="openshift-infra/auto-csr-approver-29552428-gvn7p"
Mar 10 12:28:00 crc kubenswrapper[4794]: I0310 12:28:00.370759 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s52zt\" (UniqueName: \"kubernetes.io/projected/c8b35de4-fab9-46c5-a7f0-71ccf3c8309d-kube-api-access-s52zt\") pod \"auto-csr-approver-29552428-gvn7p\" (UID: \"c8b35de4-fab9-46c5-a7f0-71ccf3c8309d\") " pod="openshift-infra/auto-csr-approver-29552428-gvn7p"
Mar 10 12:28:00 crc kubenswrapper[4794]: I0310 12:28:00.391064 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s52zt\" (UniqueName: \"kubernetes.io/projected/c8b35de4-fab9-46c5-a7f0-71ccf3c8309d-kube-api-access-s52zt\") pod \"auto-csr-approver-29552428-gvn7p\" (UID: \"c8b35de4-fab9-46c5-a7f0-71ccf3c8309d\") " pod="openshift-infra/auto-csr-approver-29552428-gvn7p"
Mar 10 12:28:00 crc kubenswrapper[4794]: I0310 12:28:00.472138 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552428-gvn7p"
Mar 10 12:28:00 crc kubenswrapper[4794]: I0310 12:28:00.979591 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552428-gvn7p"]
Mar 10 12:28:01 crc kubenswrapper[4794]: I0310 12:28:01.457168 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552428-gvn7p" event={"ID":"c8b35de4-fab9-46c5-a7f0-71ccf3c8309d","Type":"ContainerStarted","Data":"8127b943a8fa350e74ae05c159b9bdfd19b98989ff8609087d623024d058030d"}
Mar 10 12:28:02 crc kubenswrapper[4794]: I0310 12:28:02.466934 4794 generic.go:334] "Generic (PLEG): container finished" podID="c8b35de4-fab9-46c5-a7f0-71ccf3c8309d" containerID="f6440904606d9aebf7d741832031499be56f2f4d4e99cd8c68eebdc26c4944c1" exitCode=0
Mar 10 12:28:02 crc kubenswrapper[4794]: I0310 12:28:02.467052 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552428-gvn7p" event={"ID":"c8b35de4-fab9-46c5-a7f0-71ccf3c8309d","Type":"ContainerDied","Data":"f6440904606d9aebf7d741832031499be56f2f4d4e99cd8c68eebdc26c4944c1"}
Mar 10 12:28:05 crc kubenswrapper[4794]: I0310 12:28:05.032034 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552428-gvn7p"
Mar 10 12:28:05 crc kubenswrapper[4794]: I0310 12:28:05.192953 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s52zt\" (UniqueName: \"kubernetes.io/projected/c8b35de4-fab9-46c5-a7f0-71ccf3c8309d-kube-api-access-s52zt\") pod \"c8b35de4-fab9-46c5-a7f0-71ccf3c8309d\" (UID: \"c8b35de4-fab9-46c5-a7f0-71ccf3c8309d\") "
Mar 10 12:28:05 crc kubenswrapper[4794]: I0310 12:28:05.206630 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b35de4-fab9-46c5-a7f0-71ccf3c8309d-kube-api-access-s52zt" (OuterVolumeSpecName: "kube-api-access-s52zt") pod "c8b35de4-fab9-46c5-a7f0-71ccf3c8309d" (UID: "c8b35de4-fab9-46c5-a7f0-71ccf3c8309d"). InnerVolumeSpecName "kube-api-access-s52zt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 12:28:05 crc kubenswrapper[4794]: I0310 12:28:05.295566 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s52zt\" (UniqueName: \"kubernetes.io/projected/c8b35de4-fab9-46c5-a7f0-71ccf3c8309d-kube-api-access-s52zt\") on node \"crc\" DevicePath \"\""
Mar 10 12:28:05 crc kubenswrapper[4794]: I0310 12:28:05.513876 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552428-gvn7p" event={"ID":"c8b35de4-fab9-46c5-a7f0-71ccf3c8309d","Type":"ContainerDied","Data":"8127b943a8fa350e74ae05c159b9bdfd19b98989ff8609087d623024d058030d"}
Mar 10 12:28:05 crc kubenswrapper[4794]: I0310 12:28:05.514446 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8127b943a8fa350e74ae05c159b9bdfd19b98989ff8609087d623024d058030d"
Mar 10 12:28:05 crc kubenswrapper[4794]: I0310 12:28:05.514161 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552428-gvn7p"
Mar 10 12:28:06 crc kubenswrapper[4794]: I0310 12:28:06.107970 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552422-kkftv"]
Mar 10 12:28:06 crc kubenswrapper[4794]: I0310 12:28:06.120233 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552422-kkftv"]
Mar 10 12:28:08 crc kubenswrapper[4794]: I0310 12:28:08.015461 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bc1c151-74da-457b-ba7b-68ec353b9b97" path="/var/lib/kubelet/pods/0bc1c151-74da-457b-ba7b-68ec353b9b97/volumes"
Mar 10 12:28:14 crc kubenswrapper[4794]: I0310 12:28:14.000101 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:28:14 crc kubenswrapper[4794]: E0310 12:28:14.001404 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:28:24 crc kubenswrapper[4794]: I0310 12:28:24.999220 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:28:25 crc kubenswrapper[4794]: E0310 12:28:25.000100 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:28:29 crc kubenswrapper[4794]: I0310 12:28:29.591483 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_92c89072-b510-442b-9009-bfb1363a34ef/init-config-reloader/0.log"
Mar 10 12:28:29 crc kubenswrapper[4794]: I0310 12:28:29.746900 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_92c89072-b510-442b-9009-bfb1363a34ef/init-config-reloader/0.log"
Mar 10 12:28:29 crc kubenswrapper[4794]: I0310 12:28:29.750796 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_92c89072-b510-442b-9009-bfb1363a34ef/alertmanager/0.log"
Mar 10 12:28:29 crc kubenswrapper[4794]: I0310 12:28:29.803637 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_92c89072-b510-442b-9009-bfb1363a34ef/config-reloader/0.log"
Mar 10 12:28:29 crc kubenswrapper[4794]: I0310 12:28:29.938847 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_ff3182a7-ac39-40fa-a387-f9cc59b2782b/aodh-api/0.log"
Mar 10 12:28:29 crc kubenswrapper[4794]: I0310 12:28:29.984535 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_ff3182a7-ac39-40fa-a387-f9cc59b2782b/aodh-evaluator/0.log"
Mar 10 12:28:30 crc kubenswrapper[4794]: I0310 12:28:30.118062 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_ff3182a7-ac39-40fa-a387-f9cc59b2782b/aodh-listener/0.log"
Mar 10 12:28:30 crc kubenswrapper[4794]: I0310 12:28:30.158813 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_ff3182a7-ac39-40fa-a387-f9cc59b2782b/aodh-notifier/0.log"
Mar 10 12:28:30 crc kubenswrapper[4794]: I0310 12:28:30.179736 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55d6f8766b-cgldc_f8dbf764-272f-4b4e-b7c7-f005c3219494/barbican-api/0.log"
Mar 10 12:28:30 crc kubenswrapper[4794]: I0310 12:28:30.356392 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-86667778fb-jhxgj_657ac79d-5ffa-4505-8fad-1df78ff74c78/barbican-keystone-listener/0.log"
Mar 10 12:28:30 crc kubenswrapper[4794]: I0310 12:28:30.358799 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55d6f8766b-cgldc_f8dbf764-272f-4b4e-b7c7-f005c3219494/barbican-api-log/0.log"
Mar 10 12:28:30 crc kubenswrapper[4794]: I0310 12:28:30.461855 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-86667778fb-jhxgj_657ac79d-5ffa-4505-8fad-1df78ff74c78/barbican-keystone-listener-log/0.log"
Mar 10 12:28:30 crc kubenswrapper[4794]: I0310 12:28:30.551377 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7dcbcf68fc-sr8rx_68c9337f-a87e-4734-9150-fb69adfa8e63/barbican-worker/0.log"
Mar 10 12:28:30 crc kubenswrapper[4794]: I0310 12:28:30.624051 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7dcbcf68fc-sr8rx_68c9337f-a87e-4734-9150-fb69adfa8e63/barbican-worker-log/0.log"
Mar 10 12:28:30 crc kubenswrapper[4794]: I0310 12:28:30.774240 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-cl7pm_b3ed4d47-7c95-4755-a631-092391d64b11/bootstrap-openstack-openstack-cell1/0.log"
Mar 10 12:28:30 crc kubenswrapper[4794]: I0310 12:28:30.861459 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_871071a4-bdcf-4d0a-bb26-21205ac2c2da/ceilometer-central-agent/0.log"
Mar 10 12:28:30 crc kubenswrapper[4794]: I0310 12:28:30.916100 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_871071a4-bdcf-4d0a-bb26-21205ac2c2da/ceilometer-notification-agent/0.log"
Mar 10 12:28:30 crc kubenswrapper[4794]: I0310 12:28:30.994969 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_871071a4-bdcf-4d0a-bb26-21205ac2c2da/proxy-httpd/0.log"
Mar 10 12:28:31 crc kubenswrapper[4794]: I0310 12:28:31.049954 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_871071a4-bdcf-4d0a-bb26-21205ac2c2da/sg-core/0.log"
Mar 10 12:28:31 crc kubenswrapper[4794]: I0310 12:28:31.161404 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-lfjcb_9f97d295-b543-4b60-920f-37840bef42c1/ceph-client-openstack-openstack-cell1/0.log"
Mar 10 12:28:31 crc kubenswrapper[4794]: I0310 12:28:31.338362 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d52ca6f9-6457-431b-9b61-5af02167bb0c/cinder-api/0.log"
Mar 10 12:28:31 crc kubenswrapper[4794]: I0310 12:28:31.374384 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d52ca6f9-6457-431b-9b61-5af02167bb0c/cinder-api-log/0.log"
Mar 10 12:28:31 crc kubenswrapper[4794]: I0310 12:28:31.600546 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_07c8f180-62af-49c1-8e1d-3fec16164fee/probe/0.log"
Mar 10 12:28:31 crc kubenswrapper[4794]: I0310 12:28:31.648640 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_07c8f180-62af-49c1-8e1d-3fec16164fee/cinder-backup/0.log"
Mar 10 12:28:31 crc kubenswrapper[4794]: I0310 12:28:31.736033 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3fa5babe-464d-4dd0-a3be-fb5a0adc54d0/cinder-scheduler/0.log"
Mar 10 12:28:31 crc kubenswrapper[4794]: I0310 12:28:31.842654 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3fa5babe-464d-4dd0-a3be-fb5a0adc54d0/probe/0.log"
Mar 10 12:28:31 crc kubenswrapper[4794]: I0310 12:28:31.927116 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b/cinder-volume/0.log"
Mar 10 12:28:32 crc kubenswrapper[4794]: I0310 12:28:32.026460 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_8d6ba85c-b8f0-46dd-90a7-0d85c29d6b0b/probe/0.log"
Mar 10 12:28:32 crc kubenswrapper[4794]: I0310 12:28:32.199797 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-rcwxx_9938d61e-1f47-4555-a810-67e6c74dc947/configure-network-openstack-openstack-cell1/0.log"
Mar 10 12:28:32 crc kubenswrapper[4794]: I0310 12:28:32.262117 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-cms62_4a19915b-ffd3-4b57-a806-0dc4f67e3003/configure-os-openstack-openstack-cell1/0.log"
Mar 10 12:28:32 crc kubenswrapper[4794]: I0310 12:28:32.418349 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55f66c686f-d8wjz_68926d9f-d2a2-4ddb-a755-42a2bc0d614d/init/0.log"
Mar 10 12:28:32 crc kubenswrapper[4794]: I0310 12:28:32.613160 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55f66c686f-d8wjz_68926d9f-d2a2-4ddb-a755-42a2bc0d614d/init/0.log"
Mar 10 12:28:32 crc kubenswrapper[4794]: I0310 12:28:32.646738 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55f66c686f-d8wjz_68926d9f-d2a2-4ddb-a755-42a2bc0d614d/dnsmasq-dns/0.log"
Mar 10 12:28:32 crc kubenswrapper[4794]: I0310 12:28:32.708855 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-l9qnw_50dce0e2-2231-4163-8c82-8ee68e08cb57/download-cache-openstack-openstack-cell1/0.log"
Mar 10 12:28:33 crc kubenswrapper[4794]: I0310 12:28:33.025377 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f/glance-httpd/0.log"
Mar 10 12:28:33 crc kubenswrapper[4794]: I0310 12:28:33.174698 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e0a7c3c3-404b-4d50-a8f3-b06b83af4f0f/glance-log/0.log"
Mar 10 12:28:33 crc kubenswrapper[4794]: I0310 12:28:33.424768 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_89c1851f-3bf0-4d2b-b3da-db9f3c42cd51/glance-log/0.log"
Mar 10 12:28:33 crc kubenswrapper[4794]: I0310 12:28:33.503749 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_89c1851f-3bf0-4d2b-b3da-db9f3c42cd51/glance-httpd/0.log"
Mar 10 12:28:33 crc kubenswrapper[4794]: I0310 12:28:33.587367 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-f448c9dcf-rwf4x_49071c63-b14a-4c35-8d14-5c0e2b2deea3/heat-api/0.log"
Mar 10 12:28:33 crc kubenswrapper[4794]: I0310 12:28:33.779849 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7bff4bb77-gfswn_e9f71665-b982-459c-814e-fba7eedcb66b/heat-cfnapi/0.log"
Mar 10 12:28:33 crc kubenswrapper[4794]: I0310 12:28:33.827114 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6fbd4f9675-8th7w_2a7fce03-02d0-47e4-804e-2fa5ac594bdc/heat-engine/0.log"
Mar 10 12:28:34 crc kubenswrapper[4794]: I0310 12:28:34.176182 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-77df494f69-g2kl6_446085d1-b68e-40ef-ac9c-01bc709de2a3/horizon-log/0.log"
Mar 10 12:28:34 crc kubenswrapper[4794]: I0310 12:28:34.201823 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-77df494f69-g2kl6_446085d1-b68e-40ef-ac9c-01bc709de2a3/horizon/0.log"
Mar 10 12:28:34 crc kubenswrapper[4794]: I0310 12:28:34.219573 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-qbzgk_3922f569-e641-48f2-a436-015113e439ee/install-certs-openstack-openstack-cell1/0.log"
Mar 10 12:28:34 crc kubenswrapper[4794]: I0310 12:28:34.388674 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-p6cdg_d2743912-a485-45ae-b5ec-0addae4b7861/install-os-openstack-openstack-cell1/0.log"
Mar 10 12:28:34 crc kubenswrapper[4794]: I0310 12:28:34.511226 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29552401-pfl4r_477b563e-62bf-4c38-9004-f3a46f574174/keystone-cron/0.log"
Mar 10 12:28:34 crc kubenswrapper[4794]: I0310 12:28:34.512415 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-84ff5f675d-mcgwb_5c1d5e7f-f99a-46af-b1d6-9c016759827a/keystone-api/0.log"
Mar 10 12:28:34 crc kubenswrapper[4794]: I0310 12:28:34.661470 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_faba7423-9101-4887-9104-69b12739b3a3/kube-state-metrics/0.log"
Mar 10 12:28:34 crc kubenswrapper[4794]: I0310 12:28:34.790321 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-bpxnx_f01b2008-c26d-4f5f-90c6-438cd78f6836/libvirt-openstack-openstack-cell1/0.log"
Mar 10 12:28:34 crc kubenswrapper[4794]: I0310 12:28:34.983800 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_f56fdc61-c6d4-4840-bbce-da97847489bd/manila-api-log/0.log"
Mar 10 12:28:35 crc kubenswrapper[4794]: I0310 12:28:35.052526 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_f56fdc61-c6d4-4840-bbce-da97847489bd/manila-api/0.log"
Mar 10 12:28:35 crc kubenswrapper[4794]: I0310 12:28:35.106008 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_957e6f09-08a4-459b-9261-346d34354b23/manila-scheduler/0.log"
Mar 10 12:28:35 crc kubenswrapper[4794]: I0310 12:28:35.161970 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_957e6f09-08a4-459b-9261-346d34354b23/probe/0.log"
Mar 10 12:28:35 crc kubenswrapper[4794]: I0310 12:28:35.292809 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_a601e4ec-cffc-427d-890b-aeb8f9e7a224/manila-share/0.log"
Mar 10 12:28:35 crc kubenswrapper[4794]: I0310 12:28:35.370176 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_a601e4ec-cffc-427d-890b-aeb8f9e7a224/probe/0.log"
Mar 10 12:28:35 crc kubenswrapper[4794]: I0310 12:28:35.406600 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_2d7b8834-0c36-41e1-837a-7bbd54185723/adoption/0.log"
Mar 10 12:28:36 crc kubenswrapper[4794]: I0310 12:28:36.178354 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-78455f4569-hjwg5_d290cfd0-f0e3-4319-bb64-48ec66b84613/neutron-httpd/0.log"
Mar 10 12:28:36 crc kubenswrapper[4794]: I0310 12:28:36.225404 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-78455f4569-hjwg5_d290cfd0-f0e3-4319-bb64-48ec66b84613/neutron-api/0.log"
Mar 10 12:28:36 crc kubenswrapper[4794]: I0310 12:28:36.380601 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-gqrmr_5b31cf7c-573c-4588-8cc9-948d8616af9d/neutron-dhcp-openstack-openstack-cell1/0.log"
Mar 10 12:28:36 crc kubenswrapper[4794]: I0310 12:28:36.545526 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-l92sc_838354a0-fdf3-4dfd-89cc-1f9bdc800fbb/neutron-metadata-openstack-openstack-cell1/0.log"
Mar 10 12:28:36 crc kubenswrapper[4794]: I0310 12:28:36.678226 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-jpmfp_7903d1c4-a98b-4c62-85e4-54df0047589a/neutron-sriov-openstack-openstack-cell1/0.log"
Mar 10 12:28:36 crc kubenswrapper[4794]: I0310 12:28:36.915981 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2a47c198-d1ae-45d0-a160-f74c8d8a04f4/nova-api-api/0.log"
Mar 10 12:28:37 crc kubenswrapper[4794]: I0310 12:28:37.109304 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2a47c198-d1ae-45d0-a160-f74c8d8a04f4/nova-api-log/0.log"
Mar 10 12:28:38 crc kubenswrapper[4794]: I0310 12:28:38.042093 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_95009e92-9d1f-4334-9322-90ff0a8784bd/nova-cell0-conductor-conductor/0.log"
Mar 10 12:28:38 crc kubenswrapper[4794]: I0310 12:28:38.239265 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9bf5612b-a7fc-44b3-87cd-6ee5d892221a/nova-cell1-conductor-conductor/0.log"
Mar 10 12:28:38 crc kubenswrapper[4794]: I0310 12:28:38.281967 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_6ba30853-8729-4d4a-9e44-f4ec5e80c459/nova-cell1-novncproxy-novncproxy/0.log"
Mar 10 12:28:38 crc kubenswrapper[4794]: I0310 12:28:38.498950 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkhlzh_6e3a6dda-7237-47e6-b254-4f0b835f83c6/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log"
Mar 10 12:28:38 crc kubenswrapper[4794]: I0310 12:28:38.671325 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-6rfj4_0dbc468e-bf92-4cfa-81f1-a660334c4fd5/nova-cell1-openstack-openstack-cell1/0.log"
Mar 10 12:28:38 crc kubenswrapper[4794]: I0310 12:28:38.887591 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_02b78cfc-5b55-45a5-9417-e6d4e602995e/nova-metadata-metadata/0.log"
Mar 10 12:28:38 crc kubenswrapper[4794]: I0310 12:28:38.893745 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_02b78cfc-5b55-45a5-9417-e6d4e602995e/nova-metadata-log/0.log"
Mar 10 12:28:38 crc kubenswrapper[4794]: I0310 12:28:38.993532 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_30125730-8a03-45b3-b2ab-78206ddcde9e/nova-scheduler-scheduler/0.log"
Mar 10 12:28:39 crc kubenswrapper[4794]: I0310 12:28:39.144664 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-5f9d47f567-qfb6h_6d282049-4c70-4e8b-945e-3375dd7a0e95/init/0.log"
Mar 10 12:28:39 crc kubenswrapper[4794]: I0310 12:28:39.307615 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-5f9d47f567-qfb6h_6d282049-4c70-4e8b-945e-3375dd7a0e95/init/0.log"
Mar 10 12:28:39 crc kubenswrapper[4794]: I0310 12:28:39.360321 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-5f9d47f567-qfb6h_6d282049-4c70-4e8b-945e-3375dd7a0e95/octavia-api-provider-agent/0.log"
Mar 10 12:28:39 crc kubenswrapper[4794]: I0310 12:28:39.534924 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-g8rgx_0e5b10af-c321-47e8-9fa2-d143dbe38634/init/0.log"
Mar 10 12:28:39 crc kubenswrapper[4794]: I0310 12:28:39.639552 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-5f9d47f567-qfb6h_6d282049-4c70-4e8b-945e-3375dd7a0e95/octavia-api/0.log"
Mar 10 12:28:39 crc kubenswrapper[4794]: I0310 12:28:39.813782 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-g8rgx_0e5b10af-c321-47e8-9fa2-d143dbe38634/init/0.log"
Mar 10 12:28:39 crc kubenswrapper[4794]: I0310 12:28:39.854182 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-g8rgx_0e5b10af-c321-47e8-9fa2-d143dbe38634/octavia-healthmanager/0.log"
Mar 10 12:28:39 crc kubenswrapper[4794]: I0310 12:28:39.896284 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-jn45g_ec60c79b-10a2-4b22-988f-4bc082f6b5ec/init/0.log"
Mar 10 12:28:39 crc kubenswrapper[4794]: I0310 12:28:39.999703 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:28:39 crc kubenswrapper[4794]: E0310 12:28:39.999953 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:28:40 crc kubenswrapper[4794]: I0310 12:28:40.116351 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-nkn6m_15527b8a-9b20-4baa-b371-cb2395457d00/init/0.log"
Mar 10 12:28:40 crc kubenswrapper[4794]: I0310 12:28:40.154304 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-jn45g_ec60c79b-10a2-4b22-988f-4bc082f6b5ec/octavia-housekeeping/0.log"
Mar 10 12:28:40 crc kubenswrapper[4794]: I0310 12:28:40.228473 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-jn45g_ec60c79b-10a2-4b22-988f-4bc082f6b5ec/init/0.log"
Mar 10 12:28:40 crc kubenswrapper[4794]: I0310 12:28:40.412228 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-nkn6m_15527b8a-9b20-4baa-b371-cb2395457d00/octavia-rsyslog/0.log"
Mar 10 12:28:40 crc kubenswrapper[4794]: I0310 12:28:40.414900 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-nkn6m_15527b8a-9b20-4baa-b371-cb2395457d00/init/0.log"
Mar 10 12:28:40 crc kubenswrapper[4794]: I0310 12:28:40.456713 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-dcpwj_25562046-f83e-475d-bcf8-22c782c6595e/init/0.log"
Mar 10 12:28:40 crc kubenswrapper[4794]: I0310 12:28:40.664668 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-dcpwj_25562046-f83e-475d-bcf8-22c782c6595e/init/0.log"
Mar 10 12:28:40 crc kubenswrapper[4794]: I0310 12:28:40.746559 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39/mysql-bootstrap/0.log"
Mar 10 12:28:40 crc kubenswrapper[4794]: I0310 12:28:40.886513 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-dcpwj_25562046-f83e-475d-bcf8-22c782c6595e/octavia-worker/0.log"
Mar 10 12:28:40 crc kubenswrapper[4794]: I0310 12:28:40.918695 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39/mysql-bootstrap/0.log"
Mar 10 12:28:40 crc kubenswrapper[4794]: I0310 12:28:40.972777 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9144c4f6-a35f-4aa9-a1d3-fbce6ca09b39/galera/0.log"
Mar 10 12:28:41 crc kubenswrapper[4794]: I0310 12:28:41.090708 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c100d9a4-5eb2-48f1-a419-24f8758331e3/mysql-bootstrap/0.log"
Mar 10 12:28:41 crc kubenswrapper[4794]: I0310 12:28:41.330883 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c100d9a4-5eb2-48f1-a419-24f8758331e3/mysql-bootstrap/0.log"
Mar 10 12:28:41 crc kubenswrapper[4794]: I0310 12:28:41.351602 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3b5553b5-8c76-4d5d-9501-48df7e9b14d6/openstackclient/0.log"
Mar 10 12:28:41 crc kubenswrapper[4794]: I0310 12:28:41.352932 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c100d9a4-5eb2-48f1-a419-24f8758331e3/galera/0.log"
Mar 10 12:28:41 crc kubenswrapper[4794]: I0310 12:28:41.594849 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-dnwr6_e395ac11-306a-4cb7-8868-ac0c2108d63b/ovn-controller/0.log"
Mar 10 12:28:41 crc kubenswrapper[4794]: I0310 12:28:41.626440 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-h24fx_a0118024-1c72-4da4-8e7b-78190928f285/openstack-network-exporter/0.log"
Mar 10 12:28:41 crc kubenswrapper[4794]: I0310 12:28:41.851894 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-545pd_77cb5711-11e8-4897-b227-3579f08b54a6/ovsdb-server-init/0.log"
Mar 10 12:28:42 crc kubenswrapper[4794]: I0310 12:28:42.088883 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-545pd_77cb5711-11e8-4897-b227-3579f08b54a6/ovsdb-server-init/0.log"
Mar 10 12:28:42 crc kubenswrapper[4794]: I0310 12:28:42.104737 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-545pd_77cb5711-11e8-4897-b227-3579f08b54a6/ovsdb-server/0.log"
Mar 10 12:28:42 crc kubenswrapper[4794]: I0310 12:28:42.166069 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-545pd_77cb5711-11e8-4897-b227-3579f08b54a6/ovs-vswitchd/0.log"
Mar 10 12:28:42 crc kubenswrapper[4794]: I0310 12:28:42.287618 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_84da0233-958e-4c22-b27c-3a0881846fb3/adoption/0.log"
Mar 10 12:28:42 crc kubenswrapper[4794]: I0310 12:28:42.358779 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0436aee5-3421-4a09-b2c9-468430d109ec/openstack-network-exporter/0.log"
Mar 10 12:28:42 crc kubenswrapper[4794]: I0310 12:28:42.458457 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0436aee5-3421-4a09-b2c9-468430d109ec/ovn-northd/0.log"
Mar 10 12:28:42 crc kubenswrapper[4794]: I0310 12:28:42.676739 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c452b797-4adc-4fa8-9fd4-bd0397013cbf/openstack-network-exporter/0.log"
Mar 10 12:28:42 crc kubenswrapper[4794]: I0310 12:28:42.684577 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-nx6x4_40f28062-bca6-426e-a4b9-fff9e17e5a3d/ovn-openstack-openstack-cell1/0.log"
Mar 10 12:28:42 crc kubenswrapper[4794]: I0310 12:28:42.811152 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c452b797-4adc-4fa8-9fd4-bd0397013cbf/ovsdbserver-nb/0.log"
Mar 10 12:28:42 crc kubenswrapper[4794]: I0310 12:28:42.901769 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_55ecbbbb-548a-4a82-8302-984dc85503e8/ovsdbserver-nb/0.log"
Mar 10 12:28:42 crc kubenswrapper[4794]: I0310 12:28:42.949906 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_55ecbbbb-548a-4a82-8302-984dc85503e8/openstack-network-exporter/0.log"
Mar 10 12:28:43 crc kubenswrapper[4794]: I0310 12:28:43.128580 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_9ed6efd3-8e35-42f4-902a-d76dceaf7e3a/openstack-network-exporter/0.log"
Mar 10 12:28:43 crc kubenswrapper[4794]: I0310 12:28:43.189288 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_9ed6efd3-8e35-42f4-902a-d76dceaf7e3a/ovsdbserver-nb/0.log"
Mar 10 12:28:43 crc kubenswrapper[4794]: I0310 12:28:43.329659 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_78f18d30-feff-435e-a61b-8ac7020f133e/openstack-network-exporter/0.log"
Mar 10 12:28:43 crc kubenswrapper[4794]: I0310 12:28:43.393982 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_78f18d30-feff-435e-a61b-8ac7020f133e/ovsdbserver-sb/0.log"
Mar 10 12:28:43 crc kubenswrapper[4794]: I0310 12:28:43.487901 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_fb9f9744-a951-4f33-b9a7-2b2ed542dd84/openstack-network-exporter/0.log"
Mar 10 12:28:43 crc kubenswrapper[4794]: I0310 12:28:43.563084 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_fb9f9744-a951-4f33-b9a7-2b2ed542dd84/ovsdbserver-sb/0.log"
Mar 10 12:28:43 crc kubenswrapper[4794]: I0310 12:28:43.965043 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_74dc83e0-8027-42ed-b958-117726668c78/openstack-network-exporter/0.log"
Mar 10 12:28:44 crc kubenswrapper[4794]: I0310 12:28:44.016279 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_74dc83e0-8027-42ed-b958-117726668c78/ovsdbserver-sb/0.log"
Mar 10 12:28:44 crc kubenswrapper[4794]: I0310 12:28:44.348994 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cgbslf_5643a41f-c260-4e6a-881c-87f777fa58f3/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log"
Mar 10 12:28:44 crc kubenswrapper[4794]: I0310 12:28:44.382289 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7459f47788-bz25p_cebda460-fe0a-407c-bdd4-db77b6d6c14e/placement-api/0.log"
Mar 10 12:28:44 crc kubenswrapper[4794]: I0310 12:28:44.390450 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7459f47788-bz25p_cebda460-fe0a-407c-bdd4-db77b6d6c14e/placement-log/0.log"
Mar 10 12:28:44 crc kubenswrapper[4794]: I0310 12:28:44.544029 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_a1c58e50-939f-4c26-8214-ea21470b3f12/init-config-reloader/0.log"
Mar 10 12:28:44 crc kubenswrapper[4794]: I0310 12:28:44.765840 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_a1c58e50-939f-4c26-8214-ea21470b3f12/config-reloader/0.log"
Mar 10 12:28:44 crc kubenswrapper[4794]: I0310 12:28:44.791226 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_a1c58e50-939f-4c26-8214-ea21470b3f12/init-config-reloader/0.log"
Mar 10 12:28:44 crc kubenswrapper[4794]: I0310 12:28:44.817951 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_a1c58e50-939f-4c26-8214-ea21470b3f12/prometheus/0.log"
Mar 10 12:28:44 crc kubenswrapper[4794]: I0310 12:28:44.883738 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_a1c58e50-939f-4c26-8214-ea21470b3f12/thanos-sidecar/0.log"
Mar 10 12:28:45 crc kubenswrapper[4794]: I0310 12:28:45.039211 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_707decd3-40de-4bcf-873b-51fc83a7f136/setup-container/0.log"
Mar 10 12:28:45 crc kubenswrapper[4794]: I0310 12:28:45.214726 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_707decd3-40de-4bcf-873b-51fc83a7f136/setup-container/0.log"
Mar 10 12:28:45 crc kubenswrapper[4794]: I0310 12:28:45.301264 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_707decd3-40de-4bcf-873b-51fc83a7f136/rabbitmq/0.log"
Mar 10 12:28:45 crc kubenswrapper[4794]: I0310 12:28:45.320736 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_514f5ee1-5433-486a-ace2-ad62c7622526/setup-container/0.log"
Mar 10 12:28:45 crc kubenswrapper[4794]: I0310 12:28:45.557490 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_514f5ee1-5433-486a-ace2-ad62c7622526/setup-container/0.log"
Mar 10 12:28:45 crc kubenswrapper[4794]: I0310 12:28:45.635076 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_514f5ee1-5433-486a-ace2-ad62c7622526/rabbitmq/0.log"
Mar 10 12:28:46 crc kubenswrapper[4794]: I0310 12:28:46.347094 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-msjrc_f756112e-b33c-4c53-aecd-408a3f22f8cf/run-os-openstack-openstack-cell1/0.log"
Mar 10 12:28:46 crc kubenswrapper[4794]: I0310 12:28:46.378342 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-xrfp9_a9adb25a-7a73-4147-869e-f5bbe20c230a/reboot-os-openstack-openstack-cell1/0.log"
Mar 10 12:28:46 crc kubenswrapper[4794]: I0310 12:28:46.606235 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-px9pl_f8422031-955c-4264-86dc-c633abfa5290/ssh-known-hosts-openstack/0.log"
Mar 10 12:28:46 crc kubenswrapper[4794]: I0310 12:28:46.745425 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-z7mv2_55055e58-21d6-4d61-bb6e-ba2ff62acb9a/telemetry-openstack-openstack-cell1/0.log"
Mar 10 12:28:46 crc kubenswrapper[4794]: I0310 12:28:46.956746 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-4sdlv_2281ee85-db79-4fec-bb5e-ce0a4a4c61de/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log"
Mar 10 12:28:47 crc kubenswrapper[4794]: I0310 12:28:47.026612 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_bec2b4b2-9935-4ce8-b1e3-4e999d18a441/memcached/0.log"
Mar 10 12:28:47 crc kubenswrapper[4794]: I0310 12:28:47.097780 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-fv6vq_dd7d1c86-3b8c-4a4a-8c09-3769ebeabe6a/validate-network-openstack-openstack-cell1/0.log"
Mar 10 12:28:55 crc kubenswrapper[4794]: I0310 12:28:54.999571 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:28:55 crc kubenswrapper[4794]: E0310 12:28:55.000286 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:28:56 crc kubenswrapper[4794]: I0310 12:28:56.080839 4794 scope.go:117] "RemoveContainer" containerID="fcd8ceb0add1e96d91b1bab5214dde92b41e24d41802e67f633bba12e61751ee"
Mar 10 12:29:05 crc kubenswrapper[4794]: I0310 12:29:05.999210 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e"
Mar 10 12:29:06 crc kubenswrapper[4794]: E0310 12:29:06.000060 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa"
Mar 10 12:29:10 crc kubenswrapper[4794]: I0310 12:29:10.628919 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl_a9889ae6-f864-48e1-b8ed-42ae046882fa/util/0.log"
Mar 10 12:29:10 crc kubenswrapper[4794]: I0310 12:29:10.840207 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl_a9889ae6-f864-48e1-b8ed-42ae046882fa/pull/0.log"
Mar 10 12:29:10 crc kubenswrapper[4794]: I0310 12:29:10.846035 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl_a9889ae6-f864-48e1-b8ed-42ae046882fa/pull/0.log"
Mar 10 12:29:10 crc kubenswrapper[4794]: I0310 12:29:10.856184 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl_a9889ae6-f864-48e1-b8ed-42ae046882fa/util/0.log"
Mar 10 12:29:11 crc kubenswrapper[4794]: I0310 12:29:11.026601 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl_a9889ae6-f864-48e1-b8ed-42ae046882fa/pull/0.log"
Mar 10 12:29:11 crc kubenswrapper[4794]: I0310 12:29:11.036364 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl_a9889ae6-f864-48e1-b8ed-42ae046882fa/util/0.log"
Mar 10 12:29:11 crc kubenswrapper[4794]: I0310 12:29:11.048777 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba2s2pl_a9889ae6-f864-48e1-b8ed-42ae046882fa/extract/0.log"
Mar 10 12:29:11 crc kubenswrapper[4794]: I0310 12:29:11.444737 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-zcsq9_8c9a844a-91eb-4909-961a-79ab54ac592c/manager/0.log"
Mar 10 12:29:11 crc kubenswrapper[4794]: I0310 12:29:11.920917 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-2xrkm_491021e8-371d-44ff-bc8b-6cb379531865/manager/0.log"
Mar 10 12:29:12 crc kubenswrapper[4794]: I0310 12:29:12.071571 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-d9zzm_51d5cfa9-c743-4fe4-8965-e88568a7e266/manager/0.log"
Mar 10 12:29:12 crc kubenswrapper[4794]: I0310 12:29:12.289279 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-ldq59_6cc76a8e-a3ac-4499-ab5b-f3fba0d8702c/manager/0.log"
Mar 10 12:29:12 crc kubenswrapper[4794]: I0310 12:29:12.929565 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-tbxps_63cc23d4-8955-4725-886e-b1379acf91dc/manager/0.log"
Mar 10 12:29:13 crc kubenswrapper[4794]: I0310 12:29:13.049442 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-22vcf_690c6868-b841-43ec-9a82-42f6b2250153/manager/0.log"
Mar 10 12:29:13 crc kubenswrapper[4794]: I0310 12:29:13.486008 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-sf5hn_1201bc13-f478-4ce1-9e86-3b1fcf9bcde1/manager/0.log"
Mar 10 12:29:13 crc kubenswrapper[4794]: I0310 12:29:13.498104 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-p2ghv_f67ba1e8-d8af-4850-8133-6c50df162861/manager/0.log"
Mar 10 12:29:13 crc kubenswrapper[4794]: I0310 12:29:13.717113 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-zb879_f69e77cb-6b5c-4caf-a7de-7604ba460682/manager/0.log"
Mar 10 12:29:13 crc kubenswrapper[4794]: I0310 12:29:13.845710 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-5nt4h_f6c79bcc-08f3-4c8a-aa30-ce9db5e7bd27/manager/0.log"
Mar 10 12:29:14 crc kubenswrapper[4794]: I0310 12:29:14.110149 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-pgqx8_f22ad8cb-2f75-4695-b030-67a2991aa07c/manager/0.log"
Mar 10 12:29:14 crc kubenswrapper[4794]: I0310 12:29:14.398694 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-d8hqc_cd0343b5-41b0-44c4-8c72-997df328e4ef/manager/0.log"
Mar 10 12:29:14 crc kubenswrapper[4794]: I0310 12:29:14.463579 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-xqghl_8dc7b4da-0fd5-489a-a3c6-8c613fb3c5a9/manager/0.log"
Mar 10 12:29:14 crc kubenswrapper[4794]: I0310 12:29:14.656477 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6647d7885f664lg_7929c2d3-601b-4c69-970b-a69550d9852c/manager/0.log"
Mar 10 12:29:15 crc kubenswrapper[4794]: I0310 12:29:15.149273 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6cf8df7788-pbmxv_a51e8442-a0da-4aec-91c1-383cef679edf/operator/0.log"
Mar 10 12:29:15 crc kubenswrapper[4794]: I0310 12:29:15.325542 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-8fx8m_3b5b4e59-c15f-427b-a84f-08f07cda5dbc/registry-server/0.log"
Mar 10 12:29:15 crc kubenswrapper[4794]: I0310 12:29:15.342037 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-vh842_aea0d607-d1b3-4a15-993a-c571f49c1337/manager/0.log"
Mar 10 12:29:15 crc kubenswrapper[4794]: I0310 12:29:15.536832 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-xn8mp_a5511434-20de-4512-91d2-be8a49738d22/manager/0.log"
Mar 10 12:29:15 crc kubenswrapper[4794]: I0310 12:29:15.643222 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-s7nkh_0f893a2c-14c6-4d56-a798-30e94b0e89af/manager/0.log"
Mar 10 12:29:15 crc kubenswrapper[4794]: I0310 12:29:15.768426 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-wsbdq_a11f8148-52e7-47c7-8ff3-e9c172925ebe/operator/0.log"
Mar 10 12:29:15 crc kubenswrapper[4794]: I0310 12:29:15.946253 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-6664x_dd5bf891-8f83-47c7-9d66-06a814fcc5ee/manager/0.log"
Mar 10 12:29:16 crc kubenswrapper[4794]: I0310 12:29:16.209716 4794 log.go:25] "Finished
parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-gv6df_2ebeb7da-d08d-4e4a-a294-b5d0ce38d07f/manager/0.log" Mar 10 12:29:16 crc kubenswrapper[4794]: I0310 12:29:16.232918 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-h7m6z_5f503ef3-9c39-4afb-a266-431c3a44d21e/manager/0.log" Mar 10 12:29:16 crc kubenswrapper[4794]: I0310 12:29:16.460446 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-sdplr_220c126b-fc41-4aa9-89ef-fb9ad27e9719/manager/0.log" Mar 10 12:29:17 crc kubenswrapper[4794]: I0310 12:29:17.586855 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6679ddfdc7-bkrkg_ea65018d-9031-45ab-89c3-2846e861d0a2/manager/0.log" Mar 10 12:29:20 crc kubenswrapper[4794]: I0310 12:29:20.999533 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e" Mar 10 12:29:21 crc kubenswrapper[4794]: E0310 12:29:21.000147 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:29:34 crc kubenswrapper[4794]: I0310 12:29:33.999902 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e" Mar 10 12:29:34 crc kubenswrapper[4794]: E0310 12:29:34.000851 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:29:38 crc kubenswrapper[4794]: I0310 12:29:38.236783 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-q62bc_eb31e03a-1724-4272-bb5f-d0a7b6b80059/control-plane-machine-set-operator/0.log" Mar 10 12:29:38 crc kubenswrapper[4794]: I0310 12:29:38.431963 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-m2lbn_a1d20393-8c02-48fd-83ad-eb270b721313/kube-rbac-proxy/0.log" Mar 10 12:29:38 crc kubenswrapper[4794]: I0310 12:29:38.491789 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-m2lbn_a1d20393-8c02-48fd-83ad-eb270b721313/machine-api-operator/0.log" Mar 10 12:29:47 crc kubenswrapper[4794]: I0310 12:29:47.998768 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e" Mar 10 12:29:48 crc kubenswrapper[4794]: E0310 12:29:47.999607 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:29:52 crc kubenswrapper[4794]: I0310 12:29:52.502661 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-lhr9c_c8fa8c73-f52b-4cee-a7ff-1c0288142c4c/cert-manager-controller/0.log" Mar 10 12:29:53 crc kubenswrapper[4794]: I0310 12:29:53.297138 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-4dfk8_1aab61bc-b01a-4965-96bd-261ceaca0636/cert-manager-webhook/0.log" Mar 10 12:29:53 crc kubenswrapper[4794]: I0310 12:29:53.370833 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-h6lmz_c9d3b9ca-7539-4c04-a6f1-7332587d32ff/cert-manager-cainjector/0.log" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.157662 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d"] Mar 10 12:30:00 crc kubenswrapper[4794]: E0310 12:30:00.158845 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b35de4-fab9-46c5-a7f0-71ccf3c8309d" containerName="oc" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.158866 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b35de4-fab9-46c5-a7f0-71ccf3c8309d" containerName="oc" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.159171 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8b35de4-fab9-46c5-a7f0-71ccf3c8309d" containerName="oc" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.160166 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.162858 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.163405 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.169802 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552430-4xpf9"] Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.173768 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552430-4xpf9" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.177612 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.177755 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.177828 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.183853 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552430-4xpf9"] Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.198205 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d"] Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.249870 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/343c53f5-2a0d-4165-8d90-6ecd1a211769-config-volume\") pod \"collect-profiles-29552430-kw55d\" (UID: \"343c53f5-2a0d-4165-8d90-6ecd1a211769\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.250172 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/343c53f5-2a0d-4165-8d90-6ecd1a211769-secret-volume\") pod \"collect-profiles-29552430-kw55d\" (UID: \"343c53f5-2a0d-4165-8d90-6ecd1a211769\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.250278 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kdfp\" (UniqueName: \"kubernetes.io/projected/4fc05efc-78ab-4e02-8f24-cee5ee379ae9-kube-api-access-6kdfp\") pod \"auto-csr-approver-29552430-4xpf9\" (UID: \"4fc05efc-78ab-4e02-8f24-cee5ee379ae9\") " pod="openshift-infra/auto-csr-approver-29552430-4xpf9" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.250366 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwb2p\" (UniqueName: \"kubernetes.io/projected/343c53f5-2a0d-4165-8d90-6ecd1a211769-kube-api-access-nwb2p\") pod \"collect-profiles-29552430-kw55d\" (UID: \"343c53f5-2a0d-4165-8d90-6ecd1a211769\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.352164 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwb2p\" (UniqueName: \"kubernetes.io/projected/343c53f5-2a0d-4165-8d90-6ecd1a211769-kube-api-access-nwb2p\") pod \"collect-profiles-29552430-kw55d\" (UID: \"343c53f5-2a0d-4165-8d90-6ecd1a211769\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.352269 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/343c53f5-2a0d-4165-8d90-6ecd1a211769-config-volume\") pod \"collect-profiles-29552430-kw55d\" (UID: \"343c53f5-2a0d-4165-8d90-6ecd1a211769\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.352288 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/343c53f5-2a0d-4165-8d90-6ecd1a211769-secret-volume\") pod \"collect-profiles-29552430-kw55d\" (UID: \"343c53f5-2a0d-4165-8d90-6ecd1a211769\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.352400 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kdfp\" (UniqueName: \"kubernetes.io/projected/4fc05efc-78ab-4e02-8f24-cee5ee379ae9-kube-api-access-6kdfp\") pod \"auto-csr-approver-29552430-4xpf9\" (UID: \"4fc05efc-78ab-4e02-8f24-cee5ee379ae9\") " pod="openshift-infra/auto-csr-approver-29552430-4xpf9" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.353379 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/343c53f5-2a0d-4165-8d90-6ecd1a211769-config-volume\") pod \"collect-profiles-29552430-kw55d\" (UID: \"343c53f5-2a0d-4165-8d90-6ecd1a211769\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.358447 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/343c53f5-2a0d-4165-8d90-6ecd1a211769-secret-volume\") pod \"collect-profiles-29552430-kw55d\" (UID: \"343c53f5-2a0d-4165-8d90-6ecd1a211769\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.368411 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwb2p\" (UniqueName: \"kubernetes.io/projected/343c53f5-2a0d-4165-8d90-6ecd1a211769-kube-api-access-nwb2p\") pod \"collect-profiles-29552430-kw55d\" (UID: \"343c53f5-2a0d-4165-8d90-6ecd1a211769\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.378851 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kdfp\" (UniqueName: \"kubernetes.io/projected/4fc05efc-78ab-4e02-8f24-cee5ee379ae9-kube-api-access-6kdfp\") pod \"auto-csr-approver-29552430-4xpf9\" (UID: \"4fc05efc-78ab-4e02-8f24-cee5ee379ae9\") " pod="openshift-infra/auto-csr-approver-29552430-4xpf9" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.522248 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" Mar 10 12:30:00 crc kubenswrapper[4794]: I0310 12:30:00.541595 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552430-4xpf9" Mar 10 12:30:01 crc kubenswrapper[4794]: I0310 12:30:00.999683 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e" Mar 10 12:30:01 crc kubenswrapper[4794]: E0310 12:30:01.000543 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:30:01 crc kubenswrapper[4794]: I0310 12:30:01.142256 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d"] Mar 10 12:30:01 crc kubenswrapper[4794]: W0310 12:30:01.145712 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod343c53f5_2a0d_4165_8d90_6ecd1a211769.slice/crio-19cee2998f2ceac33d905033a6e6c6d8f1e00fcdc8e831c0b1df1f3ccf61cc9f WatchSource:0}: Error finding container 19cee2998f2ceac33d905033a6e6c6d8f1e00fcdc8e831c0b1df1f3ccf61cc9f: Status 404 returned error can't find the container with id 19cee2998f2ceac33d905033a6e6c6d8f1e00fcdc8e831c0b1df1f3ccf61cc9f Mar 10 12:30:01 crc kubenswrapper[4794]: W0310 12:30:01.268563 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fc05efc_78ab_4e02_8f24_cee5ee379ae9.slice/crio-bfff9a4ca99d0bf2312541d83f130caa5b6b893fabb49678cbc31bbc4b790fba WatchSource:0}: Error finding container bfff9a4ca99d0bf2312541d83f130caa5b6b893fabb49678cbc31bbc4b790fba: Status 404 returned error can't find the container with id bfff9a4ca99d0bf2312541d83f130caa5b6b893fabb49678cbc31bbc4b790fba Mar 10 12:30:01 crc kubenswrapper[4794]: I0310 12:30:01.270768 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552430-4xpf9"] Mar 10 12:30:01 crc kubenswrapper[4794]: I0310 12:30:01.708473 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" event={"ID":"343c53f5-2a0d-4165-8d90-6ecd1a211769","Type":"ContainerStarted","Data":"19cee2998f2ceac33d905033a6e6c6d8f1e00fcdc8e831c0b1df1f3ccf61cc9f"} Mar 10 12:30:01 crc kubenswrapper[4794]: I0310 12:30:01.709810 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552430-4xpf9" event={"ID":"4fc05efc-78ab-4e02-8f24-cee5ee379ae9","Type":"ContainerStarted","Data":"bfff9a4ca99d0bf2312541d83f130caa5b6b893fabb49678cbc31bbc4b790fba"} Mar 10 12:30:02 crc kubenswrapper[4794]: I0310 12:30:02.721349 4794 generic.go:334] "Generic (PLEG): container finished" podID="343c53f5-2a0d-4165-8d90-6ecd1a211769" containerID="ba8743c65babdcef9a1cf2f7e72132981f3d4b68e506b065ba2b0badbc2cd053" exitCode=0 Mar 10 12:30:02 crc kubenswrapper[4794]: I0310 12:30:02.721439 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" event={"ID":"343c53f5-2a0d-4165-8d90-6ecd1a211769","Type":"ContainerDied","Data":"ba8743c65babdcef9a1cf2f7e72132981f3d4b68e506b065ba2b0badbc2cd053"} Mar 10 12:30:03 crc kubenswrapper[4794]: I0310 12:30:03.737742 4794 generic.go:334] 
"Generic (PLEG): container finished" podID="4fc05efc-78ab-4e02-8f24-cee5ee379ae9" containerID="38cf60c102d6c688576922fb452c9c55b266883e283b4aa1617b025b9b7af8ed" exitCode=0 Mar 10 12:30:03 crc kubenswrapper[4794]: I0310 12:30:03.737797 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552430-4xpf9" event={"ID":"4fc05efc-78ab-4e02-8f24-cee5ee379ae9","Type":"ContainerDied","Data":"38cf60c102d6c688576922fb452c9c55b266883e283b4aa1617b025b9b7af8ed"} Mar 10 12:30:04 crc kubenswrapper[4794]: I0310 12:30:04.227657 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" Mar 10 12:30:04 crc kubenswrapper[4794]: I0310 12:30:04.271428 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwb2p\" (UniqueName: \"kubernetes.io/projected/343c53f5-2a0d-4165-8d90-6ecd1a211769-kube-api-access-nwb2p\") pod \"343c53f5-2a0d-4165-8d90-6ecd1a211769\" (UID: \"343c53f5-2a0d-4165-8d90-6ecd1a211769\") " Mar 10 12:30:04 crc kubenswrapper[4794]: I0310 12:30:04.271605 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/343c53f5-2a0d-4165-8d90-6ecd1a211769-config-volume\") pod \"343c53f5-2a0d-4165-8d90-6ecd1a211769\" (UID: \"343c53f5-2a0d-4165-8d90-6ecd1a211769\") " Mar 10 12:30:04 crc kubenswrapper[4794]: I0310 12:30:04.271714 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/343c53f5-2a0d-4165-8d90-6ecd1a211769-secret-volume\") pod \"343c53f5-2a0d-4165-8d90-6ecd1a211769\" (UID: \"343c53f5-2a0d-4165-8d90-6ecd1a211769\") " Mar 10 12:30:04 crc kubenswrapper[4794]: I0310 12:30:04.272291 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343c53f5-2a0d-4165-8d90-6ecd1a211769-config-volume" (OuterVolumeSpecName: "config-volume") pod "343c53f5-2a0d-4165-8d90-6ecd1a211769" (UID: "343c53f5-2a0d-4165-8d90-6ecd1a211769"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 12:30:04 crc kubenswrapper[4794]: I0310 12:30:04.272462 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/343c53f5-2a0d-4165-8d90-6ecd1a211769-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 12:30:04 crc kubenswrapper[4794]: I0310 12:30:04.277862 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343c53f5-2a0d-4165-8d90-6ecd1a211769-kube-api-access-nwb2p" (OuterVolumeSpecName: "kube-api-access-nwb2p") pod "343c53f5-2a0d-4165-8d90-6ecd1a211769" (UID: "343c53f5-2a0d-4165-8d90-6ecd1a211769"). InnerVolumeSpecName "kube-api-access-nwb2p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:30:04 crc kubenswrapper[4794]: I0310 12:30:04.373578 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwb2p\" (UniqueName: \"kubernetes.io/projected/343c53f5-2a0d-4165-8d90-6ecd1a211769-kube-api-access-nwb2p\") on node \"crc\" DevicePath \"\"" Mar 10 12:30:04 crc kubenswrapper[4794]: I0310 12:30:04.377556 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/343c53f5-2a0d-4165-8d90-6ecd1a211769-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "343c53f5-2a0d-4165-8d90-6ecd1a211769" (UID: "343c53f5-2a0d-4165-8d90-6ecd1a211769"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 12:30:04 crc kubenswrapper[4794]: I0310 12:30:04.476544 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/343c53f5-2a0d-4165-8d90-6ecd1a211769-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 12:30:04 crc kubenswrapper[4794]: I0310 12:30:04.751380 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" Mar 10 12:30:04 crc kubenswrapper[4794]: I0310 12:30:04.751419 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552430-kw55d" event={"ID":"343c53f5-2a0d-4165-8d90-6ecd1a211769","Type":"ContainerDied","Data":"19cee2998f2ceac33d905033a6e6c6d8f1e00fcdc8e831c0b1df1f3ccf61cc9f"} Mar 10 12:30:04 crc kubenswrapper[4794]: I0310 12:30:04.751475 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19cee2998f2ceac33d905033a6e6c6d8f1e00fcdc8e831c0b1df1f3ccf61cc9f" Mar 10 12:30:05 crc kubenswrapper[4794]: I0310 12:30:05.104701 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552430-4xpf9" Mar 10 12:30:05 crc kubenswrapper[4794]: I0310 12:30:05.198434 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kdfp\" (UniqueName: \"kubernetes.io/projected/4fc05efc-78ab-4e02-8f24-cee5ee379ae9-kube-api-access-6kdfp\") pod \"4fc05efc-78ab-4e02-8f24-cee5ee379ae9\" (UID: \"4fc05efc-78ab-4e02-8f24-cee5ee379ae9\") " Mar 10 12:30:05 crc kubenswrapper[4794]: I0310 12:30:05.204703 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc05efc-78ab-4e02-8f24-cee5ee379ae9-kube-api-access-6kdfp" (OuterVolumeSpecName: "kube-api-access-6kdfp") pod "4fc05efc-78ab-4e02-8f24-cee5ee379ae9" (UID: "4fc05efc-78ab-4e02-8f24-cee5ee379ae9"). InnerVolumeSpecName "kube-api-access-6kdfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:30:05 crc kubenswrapper[4794]: I0310 12:30:05.302085 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kdfp\" (UniqueName: \"kubernetes.io/projected/4fc05efc-78ab-4e02-8f24-cee5ee379ae9-kube-api-access-6kdfp\") on node \"crc\" DevicePath \"\"" Mar 10 12:30:05 crc kubenswrapper[4794]: I0310 12:30:05.307628 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx"] Mar 10 12:30:05 crc kubenswrapper[4794]: I0310 12:30:05.318888 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552385-rzmlx"] Mar 10 12:30:05 crc kubenswrapper[4794]: I0310 12:30:05.763672 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552430-4xpf9" event={"ID":"4fc05efc-78ab-4e02-8f24-cee5ee379ae9","Type":"ContainerDied","Data":"bfff9a4ca99d0bf2312541d83f130caa5b6b893fabb49678cbc31bbc4b790fba"} Mar 10 12:30:05 crc kubenswrapper[4794]: I0310 12:30:05.764755 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfff9a4ca99d0bf2312541d83f130caa5b6b893fabb49678cbc31bbc4b790fba" Mar 10 12:30:05 crc kubenswrapper[4794]: I0310 12:30:05.763717 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552430-4xpf9" Mar 10 12:30:06 crc kubenswrapper[4794]: I0310 12:30:06.012738 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca80d151-efc9-49bd-a807-a67542d2ffda" path="/var/lib/kubelet/pods/ca80d151-efc9-49bd-a807-a67542d2ffda/volumes" Mar 10 12:30:06 crc kubenswrapper[4794]: I0310 12:30:06.181911 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552424-5d5m7"] Mar 10 12:30:06 crc kubenswrapper[4794]: I0310 12:30:06.196103 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552424-5d5m7"] Mar 10 12:30:07 crc kubenswrapper[4794]: I0310 12:30:07.636026 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-54vjj_e60c58eb-6276-4eeb-b32b-21b8bb5c4d08/nmstate-console-plugin/0.log" Mar 10 12:30:07 crc kubenswrapper[4794]: I0310 12:30:07.855004 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-kl5qq_c22565b3-2d95-40ea-beb3-17d2daec0262/kube-rbac-proxy/0.log" Mar 10 12:30:07 crc kubenswrapper[4794]: I0310 12:30:07.865967 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mmjmm_eb034770-4c69-4e29-b08e-88af9c78c76e/nmstate-handler/0.log" Mar 10 12:30:08 crc kubenswrapper[4794]: I0310 12:30:08.006581 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-kl5qq_c22565b3-2d95-40ea-beb3-17d2daec0262/nmstate-metrics/0.log" Mar 10 12:30:08 crc kubenswrapper[4794]: I0310 12:30:08.011714 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4490ab4-756f-4af3-a930-f8f22539a7e2" path="/var/lib/kubelet/pods/f4490ab4-756f-4af3-a930-f8f22539a7e2/volumes" Mar 10 12:30:08 crc kubenswrapper[4794]: I0310 12:30:08.094082 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-hpmnv_03b196f2-90f3-4491-bdc7-298a917ccda7/nmstate-operator/0.log" Mar 10 12:30:08 crc kubenswrapper[4794]: I0310 
12:30:08.187776 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-x8x2b_25233eef-8e10-48bd-bd95-1366cc06f956/nmstate-webhook/0.log" Mar 10 12:30:12 crc kubenswrapper[4794]: I0310 12:30:12.999454 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e" Mar 10 12:30:13 crc kubenswrapper[4794]: E0310 12:30:13.000211 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:30:23 crc kubenswrapper[4794]: I0310 12:30:23.340254 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-ld8g6_6a7f71bd-0553-4555-b978-b4e470af8a84/prometheus-operator/0.log" Mar 10 12:30:23 crc kubenswrapper[4794]: I0310 12:30:23.387792 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-696c9775d9-klbpr_97b0679e-d4da-48f2-9f6e-62fbf2c3fb87/prometheus-operator-admission-webhook/0.log" Mar 10 12:30:23 crc kubenswrapper[4794]: I0310 12:30:23.523448 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t_e8018367-607a-4c95-8e53-f06d848933cb/prometheus-operator-admission-webhook/0.log" Mar 10 12:30:23 crc kubenswrapper[4794]: I0310 12:30:23.621661 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-2tfm5_53147876-0193-4a98-bf6d-fd9d34f1d84a/operator/0.log" Mar 10 12:30:23 crc kubenswrapper[4794]: I0310 12:30:23.737776 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-kt69n_5a0fb20a-63c7-403c-9154-744f6d841f43/perses-operator/0.log" Mar 10 12:30:23 crc kubenswrapper[4794]: I0310 12:30:23.999293 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e" Mar 10 12:30:23 crc kubenswrapper[4794]: E0310 12:30:23.999623 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:30:37 crc kubenswrapper[4794]: I0310 12:30:37.950968 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-8rmct_5cb2a87f-5847-4fba-8544-21262656a693/kube-rbac-proxy/0.log" Mar 10 12:30:37 crc kubenswrapper[4794]: I0310 12:30:37.999539 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e" Mar 10 12:30:37 crc kubenswrapper[4794]: E0310 12:30:37.999867 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:30:38 crc kubenswrapper[4794]: I0310 12:30:38.274606 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/cp-frr-files/0.log" Mar 10 12:30:38 crc kubenswrapper[4794]: I0310 12:30:38.294838 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-8rmct_5cb2a87f-5847-4fba-8544-21262656a693/controller/0.log" Mar 10 12:30:38 crc kubenswrapper[4794]: I0310 12:30:38.467419 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/cp-reloader/0.log" Mar 10 12:30:38 crc kubenswrapper[4794]: I0310 12:30:38.532981 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/cp-frr-files/0.log" Mar 10 12:30:38 crc kubenswrapper[4794]: I0310 12:30:38.570815 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/cp-metrics/0.log" Mar 10 12:30:38 crc kubenswrapper[4794]: I0310 12:30:38.599089 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/cp-reloader/0.log" Mar 10 12:30:38 crc kubenswrapper[4794]: I0310 12:30:38.742616 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/cp-frr-files/0.log" Mar 10 12:30:38 crc kubenswrapper[4794]: I0310 12:30:38.773933 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/cp-reloader/0.log" Mar 10 12:30:38 crc kubenswrapper[4794]: I0310 12:30:38.784084 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/cp-metrics/0.log" Mar 10 12:30:38 crc kubenswrapper[4794]: I0310 12:30:38.795647 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/cp-metrics/0.log" Mar 10 12:30:39 crc kubenswrapper[4794]: I0310 12:30:39.001161 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/cp-reloader/0.log" Mar 10 12:30:39 crc kubenswrapper[4794]: I0310 12:30:39.001225 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/cp-frr-files/0.log" Mar 10 12:30:39 crc kubenswrapper[4794]: I0310 12:30:39.020649 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/controller/0.log" Mar 10 12:30:39 crc kubenswrapper[4794]: I0310 12:30:39.044768 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/cp-metrics/0.log" Mar 10 12:30:39 crc kubenswrapper[4794]: I0310 12:30:39.187301 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/frr-metrics/0.log" Mar 10 12:30:39 crc kubenswrapper[4794]: I0310 12:30:39.247859 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/kube-rbac-proxy-frr/0.log" Mar 10 12:30:39 crc kubenswrapper[4794]: I0310 12:30:39.271582 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/kube-rbac-proxy/0.log" Mar 10 12:30:39 crc kubenswrapper[4794]: I0310 12:30:39.415690 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/reloader/0.log" Mar 10 12:30:39 crc kubenswrapper[4794]: I0310 12:30:39.505749 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-tvkkq_bac622e3-6c34-4772-a6c6-99112d6e77fb/frr-k8s-webhook-server/0.log" Mar 10 12:30:39 crc kubenswrapper[4794]: I0310 12:30:39.697478 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5854bdf585-wdfdn_b1881d46-f628-435d-8f46-23a62f6fcaee/manager/0.log" Mar 10 12:30:39 crc kubenswrapper[4794]: I0310 12:30:39.883667 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57ff7f97d6-nq8rw_1c4d30ef-4a9c-4150-a3e5-8f9ba4944fbc/webhook-server/0.log" Mar 10 12:30:40 crc kubenswrapper[4794]: I0310 12:30:40.102645 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cdztl_7ee9da20-7d24-4c95-a86b-7cde5025a756/kube-rbac-proxy/0.log" Mar 10 12:30:41 crc kubenswrapper[4794]: I0310 12:30:41.072220 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cdztl_7ee9da20-7d24-4c95-a86b-7cde5025a756/speaker/0.log" Mar 10 12:30:43 crc kubenswrapper[4794]: I0310 12:30:43.109419 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vftrb_9797c8e7-cef5-4987-9eb3-6d9214e0e871/frr/0.log" Mar 10 12:30:51 crc kubenswrapper[4794]: I0310 12:30:51.000403 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e" Mar 10 12:30:51 crc kubenswrapper[4794]: E0310 12:30:51.001905 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:30:56 crc kubenswrapper[4794]: I0310 12:30:56.173033 4794 scope.go:117] "RemoveContainer" containerID="674a9343c7dff95ddc611ad5b2416b8a8f4e3c42fa3374bece4903b286fca876" Mar 10 12:30:56 crc kubenswrapper[4794]: I0310 12:30:56.200325 4794 scope.go:117] "RemoveContainer" containerID="eaadd8e1a09a72a2eb996b5ea5f96ea0cc33ca5705489fa3340a5d4ba216f979" Mar 10 12:30:57 crc kubenswrapper[4794]: I0310 12:30:57.554669 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld_cf8896bc-caa9-4120-8390-a5ffe0897859/util/0.log" Mar 10 12:30:58 crc kubenswrapper[4794]: I0310 12:30:58.194485 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld_cf8896bc-caa9-4120-8390-a5ffe0897859/pull/0.log" Mar 10 12:30:58 crc kubenswrapper[4794]: I0310 12:30:58.206526 4794 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld_cf8896bc-caa9-4120-8390-a5ffe0897859/pull/0.log" Mar 10 12:30:58 crc kubenswrapper[4794]: I0310 12:30:58.210079 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld_cf8896bc-caa9-4120-8390-a5ffe0897859/util/0.log" Mar 10 12:30:58 crc kubenswrapper[4794]: I0310 12:30:58.418826 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld_cf8896bc-caa9-4120-8390-a5ffe0897859/pull/0.log" Mar 10 12:30:58 crc kubenswrapper[4794]: I0310 12:30:58.422994 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld_cf8896bc-caa9-4120-8390-a5ffe0897859/util/0.log" Mar 10 12:30:58 crc kubenswrapper[4794]: I0310 12:30:58.486172 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v5bld_cf8896bc-caa9-4120-8390-a5ffe0897859/extract/0.log" Mar 10 12:30:58 crc kubenswrapper[4794]: I0310 12:30:58.650857 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4_1ad61494-e023-4b0b-babf-452f7e2a5532/util/0.log" Mar 10 12:30:58 crc kubenswrapper[4794]: I0310 12:30:58.885352 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4_1ad61494-e023-4b0b-babf-452f7e2a5532/pull/0.log" Mar 10 12:30:58 crc kubenswrapper[4794]: I0310 12:30:58.886725 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4_1ad61494-e023-4b0b-babf-452f7e2a5532/util/0.log" Mar 10 12:30:58 crc kubenswrapper[4794]: I0310 12:30:58.949019 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4_1ad61494-e023-4b0b-babf-452f7e2a5532/pull/0.log" Mar 10 12:30:59 crc kubenswrapper[4794]: I0310 12:30:59.074797 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4_1ad61494-e023-4b0b-babf-452f7e2a5532/util/0.log" Mar 10 12:30:59 crc kubenswrapper[4794]: I0310 12:30:59.112625 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4_1ad61494-e023-4b0b-babf-452f7e2a5532/pull/0.log" Mar 10 12:30:59 crc kubenswrapper[4794]: I0310 12:30:59.141812 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57nmv4_1ad61494-e023-4b0b-babf-452f7e2a5532/extract/0.log" Mar 10 12:30:59 crc kubenswrapper[4794]: I0310 12:30:59.275681 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755_e6c90642-0761-4f5d-82bc-1546e809efe1/util/0.log" Mar 10 12:30:59 crc kubenswrapper[4794]: I0310 12:30:59.469658 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755_e6c90642-0761-4f5d-82bc-1546e809efe1/pull/0.log" Mar 10 12:30:59 crc kubenswrapper[4794]: I0310 12:30:59.488006 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755_e6c90642-0761-4f5d-82bc-1546e809efe1/util/0.log" Mar 10 12:30:59 crc kubenswrapper[4794]: I0310 12:30:59.504037 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755_e6c90642-0761-4f5d-82bc-1546e809efe1/pull/0.log" Mar 10 12:30:59 crc kubenswrapper[4794]: I0310 12:30:59.697135 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755_e6c90642-0761-4f5d-82bc-1546e809efe1/pull/0.log" Mar 10 12:30:59 crc kubenswrapper[4794]: I0310 12:30:59.702189 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755_e6c90642-0761-4f5d-82bc-1546e809efe1/extract/0.log" Mar 10 12:30:59 crc kubenswrapper[4794]: I0310 12:30:59.704707 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087v755_e6c90642-0761-4f5d-82bc-1546e809efe1/util/0.log" Mar 10 12:30:59 crc kubenswrapper[4794]: I0310 12:30:59.891299 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gz6k_ffcd51d8-f4d3-46b5-8462-43b6499bd37c/extract-utilities/0.log" Mar 10 12:31:00 crc kubenswrapper[4794]: I0310 12:31:00.073392 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gz6k_ffcd51d8-f4d3-46b5-8462-43b6499bd37c/extract-content/0.log" Mar 10 12:31:00 crc kubenswrapper[4794]: I0310 12:31:00.081594 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gz6k_ffcd51d8-f4d3-46b5-8462-43b6499bd37c/extract-utilities/0.log" Mar 10 12:31:00 crc kubenswrapper[4794]: I0310 12:31:00.101787 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gz6k_ffcd51d8-f4d3-46b5-8462-43b6499bd37c/extract-content/0.log" Mar 10 12:31:00 crc kubenswrapper[4794]: I0310 12:31:00.420671 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gz6k_ffcd51d8-f4d3-46b5-8462-43b6499bd37c/extract-content/0.log" Mar 10 12:31:00 crc kubenswrapper[4794]: I0310 12:31:00.658872 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gz6k_ffcd51d8-f4d3-46b5-8462-43b6499bd37c/extract-utilities/0.log" Mar 10 12:31:00 crc kubenswrapper[4794]: I0310 12:31:00.849080 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sfbwb_bfe02ea7-261f-448d-904e-07b6ad54a152/extract-utilities/0.log" Mar 10 12:31:01 crc kubenswrapper[4794]: I0310 12:31:01.070521 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sfbwb_bfe02ea7-261f-448d-904e-07b6ad54a152/extract-utilities/0.log" Mar 10 12:31:01 crc kubenswrapper[4794]: I0310 12:31:01.118514 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-sfbwb_bfe02ea7-261f-448d-904e-07b6ad54a152/extract-content/0.log" Mar 10 12:31:01 crc kubenswrapper[4794]: I0310 12:31:01.209418 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sfbwb_bfe02ea7-261f-448d-904e-07b6ad54a152/extract-content/0.log" Mar 10 12:31:01 crc kubenswrapper[4794]: I0310 12:31:01.383632 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sfbwb_bfe02ea7-261f-448d-904e-07b6ad54a152/extract-utilities/0.log" Mar 10 12:31:01 crc kubenswrapper[4794]: I0310 12:31:01.474648 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sfbwb_bfe02ea7-261f-448d-904e-07b6ad54a152/extract-content/0.log" Mar 10 12:31:01 crc kubenswrapper[4794]: I0310 12:31:01.733524 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58_cce41c0b-f107-450b-95f5-64dae06dbe14/util/0.log" Mar 10 12:31:01 crc kubenswrapper[4794]: I0310 12:31:01.960257 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58_cce41c0b-f107-450b-95f5-64dae06dbe14/util/0.log" Mar 10 12:31:02 crc kubenswrapper[4794]: I0310 12:31:02.017313 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58_cce41c0b-f107-450b-95f5-64dae06dbe14/pull/0.log" Mar 10 12:31:02 crc kubenswrapper[4794]: I0310 12:31:02.037671 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58_cce41c0b-f107-450b-95f5-64dae06dbe14/pull/0.log" Mar 10 12:31:02 crc kubenswrapper[4794]: I0310 12:31:02.204862 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58_cce41c0b-f107-450b-95f5-64dae06dbe14/util/0.log" Mar 10 12:31:02 crc kubenswrapper[4794]: I0310 12:31:02.275755 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58_cce41c0b-f107-450b-95f5-64dae06dbe14/extract/0.log" Mar 10 12:31:02 crc kubenswrapper[4794]: I0310 12:31:02.305835 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4rhj58_cce41c0b-f107-450b-95f5-64dae06dbe14/pull/0.log" Mar 10 12:31:02 crc kubenswrapper[4794]: I0310 12:31:02.448109 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gz6k_ffcd51d8-f4d3-46b5-8462-43b6499bd37c/registry-server/0.log" Mar 10 12:31:02 crc kubenswrapper[4794]: I0310 12:31:02.482419 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-ztm8n_e995e0b7-2d36-4d94-a424-f916e1cba0ac/marketplace-operator/0.log" Mar 10 12:31:02 crc kubenswrapper[4794]: I0310 12:31:02.692097 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-286c8_23e836be-bdcf-44e3-a585-15f8942bf972/extract-utilities/0.log" Mar 10 12:31:02 crc kubenswrapper[4794]: I0310 12:31:02.799805 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-286c8_23e836be-bdcf-44e3-a585-15f8942bf972/extract-utilities/0.log" Mar 10 12:31:02 crc kubenswrapper[4794]: I0310 12:31:02.914192 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-286c8_23e836be-bdcf-44e3-a585-15f8942bf972/extract-content/0.log" Mar 10 12:31:02 crc kubenswrapper[4794]: I0310 12:31:02.925027 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-286c8_23e836be-bdcf-44e3-a585-15f8942bf972/extract-content/0.log" Mar 10 12:31:03 crc kubenswrapper[4794]: I0310 12:31:03.177723 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-286c8_23e836be-bdcf-44e3-a585-15f8942bf972/extract-content/0.log" Mar 10 12:31:03 crc kubenswrapper[4794]: I0310 12:31:03.199794 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-286c8_23e836be-bdcf-44e3-a585-15f8942bf972/extract-utilities/0.log" Mar 10 12:31:03 crc kubenswrapper[4794]: I0310 12:31:03.362094 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sfbwb_bfe02ea7-261f-448d-904e-07b6ad54a152/registry-server/0.log" Mar 10 12:31:03 crc kubenswrapper[4794]: I0310 12:31:03.407003 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d2q2f_b8c626f0-3ae2-44f4-83f1-660a9f69eb2a/extract-utilities/0.log" Mar 10 12:31:03 crc kubenswrapper[4794]: I0310 12:31:03.597467 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-286c8_23e836be-bdcf-44e3-a585-15f8942bf972/registry-server/0.log" Mar 10 12:31:03 crc kubenswrapper[4794]: I0310 12:31:03.652422 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d2q2f_b8c626f0-3ae2-44f4-83f1-660a9f69eb2a/extract-content/0.log" Mar 10 12:31:03 crc kubenswrapper[4794]: I0310 12:31:03.653478 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d2q2f_b8c626f0-3ae2-44f4-83f1-660a9f69eb2a/extract-utilities/0.log" Mar 10 12:31:03 crc kubenswrapper[4794]: I0310 12:31:03.699741 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d2q2f_b8c626f0-3ae2-44f4-83f1-660a9f69eb2a/extract-content/0.log" Mar 10 12:31:03 crc kubenswrapper[4794]: I0310 12:31:03.884974 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d2q2f_b8c626f0-3ae2-44f4-83f1-660a9f69eb2a/extract-content/0.log" Mar 10 12:31:03 crc kubenswrapper[4794]: I0310 12:31:03.903529 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d2q2f_b8c626f0-3ae2-44f4-83f1-660a9f69eb2a/extract-utilities/0.log" Mar 10 12:31:04 crc kubenswrapper[4794]: I0310 12:31:04.545955 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d2q2f_b8c626f0-3ae2-44f4-83f1-660a9f69eb2a/registry-server/0.log" Mar 10 12:31:05 crc kubenswrapper[4794]: I0310 12:31:05.999080 4794 scope.go:117] "RemoveContainer" containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e" Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.403189 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" 
event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"650642fca6fd05c6d1ce8b4bb58bc6d6d784b308ef3ef228d986be779c79b82d"} Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.504290 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2svm7"] Mar 10 12:31:06 crc kubenswrapper[4794]: E0310 12:31:06.518775 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc05efc-78ab-4e02-8f24-cee5ee379ae9" containerName="oc" Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.518814 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc05efc-78ab-4e02-8f24-cee5ee379ae9" containerName="oc" Mar 10 12:31:06 crc kubenswrapper[4794]: E0310 12:31:06.518838 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343c53f5-2a0d-4165-8d90-6ecd1a211769" containerName="collect-profiles" Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.518889 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="343c53f5-2a0d-4165-8d90-6ecd1a211769" containerName="collect-profiles" Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.519210 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fc05efc-78ab-4e02-8f24-cee5ee379ae9" containerName="oc" Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.519230 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="343c53f5-2a0d-4165-8d90-6ecd1a211769" containerName="collect-profiles" Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.525058 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2svm7"] Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.525919 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.703056 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-utilities\") pod \"redhat-marketplace-2svm7\" (UID: \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\") " pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.703111 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-catalog-content\") pod \"redhat-marketplace-2svm7\" (UID: \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\") " pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.703631 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thv8j\" (UniqueName: \"kubernetes.io/projected/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-kube-api-access-thv8j\") pod \"redhat-marketplace-2svm7\" (UID: \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\") " pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.805602 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-utilities\") pod \"redhat-marketplace-2svm7\" (UID: \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\") " pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.805945 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-catalog-content\") pod \"redhat-marketplace-2svm7\" (UID: \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\") " pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.806154 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thv8j\" (UniqueName: \"kubernetes.io/projected/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-kube-api-access-thv8j\") pod \"redhat-marketplace-2svm7\" (UID: \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\") " pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.806239 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-utilities\") pod \"redhat-marketplace-2svm7\" (UID: \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\") " pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:06 crc kubenswrapper[4794]: I0310 12:31:06.806518 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-catalog-content\") pod \"redhat-marketplace-2svm7\" (UID: \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\") " pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:07 crc kubenswrapper[4794]: I0310 12:31:07.083925 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thv8j\" (UniqueName: \"kubernetes.io/projected/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-kube-api-access-thv8j\") pod \"redhat-marketplace-2svm7\" (UID: \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\") " pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:07 crc kubenswrapper[4794]: I0310 12:31:07.148102 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:07 crc kubenswrapper[4794]: I0310 12:31:07.626489 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2svm7"] Mar 10 12:31:07 crc kubenswrapper[4794]: W0310 12:31:07.639190 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c0ac05f_6749_4d49_b3ac_4d4e56ddbea3.slice/crio-5fd286aa94333f9dd201c7691fedc894953cc4d9e3219f889ca3c6b9cb44cd92 WatchSource:0}: Error finding container 5fd286aa94333f9dd201c7691fedc894953cc4d9e3219f889ca3c6b9cb44cd92: Status 404 returned error can't find the container with id 5fd286aa94333f9dd201c7691fedc894953cc4d9e3219f889ca3c6b9cb44cd92 Mar 10 12:31:08 crc kubenswrapper[4794]: I0310 12:31:08.425665 4794 generic.go:334] "Generic (PLEG): container finished" podID="4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" containerID="d7b4c19ba3d5aa0cc49d06bfc438608cc28d26be4eb5ce893dfbcf95bcef5e44" exitCode=0 Mar 10 12:31:08 crc kubenswrapper[4794]: I0310 12:31:08.425770 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2svm7" event={"ID":"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3","Type":"ContainerDied","Data":"d7b4c19ba3d5aa0cc49d06bfc438608cc28d26be4eb5ce893dfbcf95bcef5e44"} Mar 10 12:31:08 crc kubenswrapper[4794]: I0310 12:31:08.426249 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2svm7" event={"ID":"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3","Type":"ContainerStarted","Data":"5fd286aa94333f9dd201c7691fedc894953cc4d9e3219f889ca3c6b9cb44cd92"} Mar 10 12:31:08 crc kubenswrapper[4794]: I0310 12:31:08.428145 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 12:31:09 crc kubenswrapper[4794]: I0310 12:31:09.444190 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2svm7" event={"ID":"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3","Type":"ContainerStarted","Data":"b1b3a8edf151e95b6ffc57aea05db805d5d54ac7c77c57c37758bc73d570bf7b"} Mar 10 12:31:10 crc kubenswrapper[4794]: I0310 12:31:10.454670 4794 generic.go:334] "Generic (PLEG): container finished" podID="4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" containerID="b1b3a8edf151e95b6ffc57aea05db805d5d54ac7c77c57c37758bc73d570bf7b" exitCode=0 Mar 10 12:31:10 crc kubenswrapper[4794]: I0310 12:31:10.454859 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2svm7" event={"ID":"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3","Type":"ContainerDied","Data":"b1b3a8edf151e95b6ffc57aea05db805d5d54ac7c77c57c37758bc73d570bf7b"} Mar 10 12:31:11 crc kubenswrapper[4794]: I0310 12:31:11.467316 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2svm7" event={"ID":"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3","Type":"ContainerStarted","Data":"e2390405d1a7fb7043832abcf5454de296c748de423728e2d82acec83cf3225a"} Mar 10 12:31:11 crc kubenswrapper[4794]: I0310 12:31:11.489761 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2svm7" podStartSLOduration=2.842165164 podStartE2EDuration="5.489740196s" podCreationTimestamp="2026-03-10 12:31:06 +0000 UTC" firstStartedPulling="2026-03-10 12:31:08.42791722 +0000 UTC m=+10017.184088038" lastFinishedPulling="2026-03-10 12:31:11.075492252 +0000 UTC m=+10019.831663070" 
observedRunningTime="2026-03-10 12:31:11.488185867 +0000 UTC m=+10020.244356695" watchObservedRunningTime="2026-03-10 12:31:11.489740196 +0000 UTC m=+10020.245911014" Mar 10 12:31:17 crc kubenswrapper[4794]: I0310 12:31:17.149509 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:17 crc kubenswrapper[4794]: I0310 12:31:17.150111 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:17 crc kubenswrapper[4794]: I0310 12:31:17.606976 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:17 crc kubenswrapper[4794]: I0310 12:31:17.710123 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:17 crc kubenswrapper[4794]: I0310 12:31:17.863653 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2svm7"] Mar 10 12:31:19 crc kubenswrapper[4794]: I0310 12:31:19.600572 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2svm7" podUID="4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" containerName="registry-server" containerID="cri-o://e2390405d1a7fb7043832abcf5454de296c748de423728e2d82acec83cf3225a" gracePeriod=2 Mar 10 12:31:19 crc kubenswrapper[4794]: I0310 12:31:19.604716 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-ld8g6_6a7f71bd-0553-4555-b978-b4e470af8a84/prometheus-operator/0.log" Mar 10 12:31:19 crc kubenswrapper[4794]: I0310 12:31:19.650822 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-696c9775d9-tmx6t_e8018367-607a-4c95-8e53-f06d848933cb/prometheus-operator-admission-webhook/0.log" Mar 10 12:31:19 crc kubenswrapper[4794]: I0310 12:31:19.750103 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-696c9775d9-klbpr_97b0679e-d4da-48f2-9f6e-62fbf2c3fb87/prometheus-operator-admission-webhook/0.log" Mar 10 12:31:19 crc kubenswrapper[4794]: I0310 12:31:19.806564 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-2tfm5_53147876-0193-4a98-bf6d-fd9d34f1d84a/operator/0.log" Mar 10 12:31:19 crc kubenswrapper[4794]: I0310 12:31:19.819344 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-kt69n_5a0fb20a-63c7-403c-9154-744f6d841f43/perses-operator/0.log" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.242017 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.402907 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thv8j\" (UniqueName: \"kubernetes.io/projected/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-kube-api-access-thv8j\") pod \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\" (UID: \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\") " Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.403114 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-catalog-content\") pod \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\" (UID: \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\") " Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.403140 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-utilities\") pod \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\" (UID: \"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3\") " Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.403917 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-utilities" (OuterVolumeSpecName: "utilities") pod "4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" (UID: "4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.409225 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-kube-api-access-thv8j" (OuterVolumeSpecName: "kube-api-access-thv8j") pod "4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" (UID: "4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3"). InnerVolumeSpecName "kube-api-access-thv8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.439955 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" (UID: "4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.506228 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.506278 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.506291 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thv8j\" (UniqueName: \"kubernetes.io/projected/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3-kube-api-access-thv8j\") on node \"crc\" DevicePath \"\"" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.613930 4794 generic.go:334] "Generic (PLEG): container finished" podID="4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" containerID="e2390405d1a7fb7043832abcf5454de296c748de423728e2d82acec83cf3225a" exitCode=0 Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.613992 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2svm7" event={"ID":"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3","Type":"ContainerDied","Data":"e2390405d1a7fb7043832abcf5454de296c748de423728e2d82acec83cf3225a"} Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.614029 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2svm7" event={"ID":"4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3","Type":"ContainerDied","Data":"5fd286aa94333f9dd201c7691fedc894953cc4d9e3219f889ca3c6b9cb44cd92"} Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.614049 4794 scope.go:117] "RemoveContainer" containerID="e2390405d1a7fb7043832abcf5454de296c748de423728e2d82acec83cf3225a" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.614179 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2svm7" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.640923 4794 scope.go:117] "RemoveContainer" containerID="b1b3a8edf151e95b6ffc57aea05db805d5d54ac7c77c57c37758bc73d570bf7b" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.660047 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2svm7"] Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.666904 4794 scope.go:117] "RemoveContainer" containerID="d7b4c19ba3d5aa0cc49d06bfc438608cc28d26be4eb5ce893dfbcf95bcef5e44" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.678302 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2svm7"] Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.730608 4794 scope.go:117] "RemoveContainer" containerID="e2390405d1a7fb7043832abcf5454de296c748de423728e2d82acec83cf3225a" Mar 10 12:31:20 crc kubenswrapper[4794]: E0310 12:31:20.731003 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2390405d1a7fb7043832abcf5454de296c748de423728e2d82acec83cf3225a\": container with ID starting with e2390405d1a7fb7043832abcf5454de296c748de423728e2d82acec83cf3225a not found: ID does not exist" containerID="e2390405d1a7fb7043832abcf5454de296c748de423728e2d82acec83cf3225a" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.732050 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2390405d1a7fb7043832abcf5454de296c748de423728e2d82acec83cf3225a"} err="failed to get container status \"e2390405d1a7fb7043832abcf5454de296c748de423728e2d82acec83cf3225a\": rpc error: code = NotFound desc = could not find container \"e2390405d1a7fb7043832abcf5454de296c748de423728e2d82acec83cf3225a\": container with ID starting with e2390405d1a7fb7043832abcf5454de296c748de423728e2d82acec83cf3225a not found: ID does not exist" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.732072 4794 scope.go:117] "RemoveContainer" containerID="b1b3a8edf151e95b6ffc57aea05db805d5d54ac7c77c57c37758bc73d570bf7b" Mar 10 12:31:20 crc kubenswrapper[4794]: E0310 12:31:20.733651 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b3a8edf151e95b6ffc57aea05db805d5d54ac7c77c57c37758bc73d570bf7b\": container with ID starting with b1b3a8edf151e95b6ffc57aea05db805d5d54ac7c77c57c37758bc73d570bf7b not found: ID does not exist" containerID="b1b3a8edf151e95b6ffc57aea05db805d5d54ac7c77c57c37758bc73d570bf7b" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.733679 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b3a8edf151e95b6ffc57aea05db805d5d54ac7c77c57c37758bc73d570bf7b"} err="failed to get container status \"b1b3a8edf151e95b6ffc57aea05db805d5d54ac7c77c57c37758bc73d570bf7b\": rpc error: code = NotFound desc = could not find container \"b1b3a8edf151e95b6ffc57aea05db805d5d54ac7c77c57c37758bc73d570bf7b\": container with ID starting with b1b3a8edf151e95b6ffc57aea05db805d5d54ac7c77c57c37758bc73d570bf7b not found: ID does not exist" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.733697 4794 scope.go:117] "RemoveContainer" containerID="d7b4c19ba3d5aa0cc49d06bfc438608cc28d26be4eb5ce893dfbcf95bcef5e44" Mar 10 12:31:20 crc kubenswrapper[4794]: E0310 12:31:20.734220 4794 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d7b4c19ba3d5aa0cc49d06bfc438608cc28d26be4eb5ce893dfbcf95bcef5e44\": container with ID starting with d7b4c19ba3d5aa0cc49d06bfc438608cc28d26be4eb5ce893dfbcf95bcef5e44 not found: ID does not exist" containerID="d7b4c19ba3d5aa0cc49d06bfc438608cc28d26be4eb5ce893dfbcf95bcef5e44" Mar 10 12:31:20 crc kubenswrapper[4794]: I0310 12:31:20.734248 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b4c19ba3d5aa0cc49d06bfc438608cc28d26be4eb5ce893dfbcf95bcef5e44"} err="failed to get container status \"d7b4c19ba3d5aa0cc49d06bfc438608cc28d26be4eb5ce893dfbcf95bcef5e44\": rpc error: code = NotFound desc = could not find container \"d7b4c19ba3d5aa0cc49d06bfc438608cc28d26be4eb5ce893dfbcf95bcef5e44\": container with ID starting with d7b4c19ba3d5aa0cc49d06bfc438608cc28d26be4eb5ce893dfbcf95bcef5e44 not found: ID does not exist" Mar 10 12:31:22 crc kubenswrapper[4794]: I0310 12:31:22.010454 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" path="/var/lib/kubelet/pods/4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3/volumes" Mar 10 12:31:27 crc kubenswrapper[4794]: E0310 12:31:27.517091 4794 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.65:36372->38.102.83.65:40829: read tcp 38.102.83.65:36372->38.102.83.65:40829: read: connection reset by peer Mar 10 12:31:50 crc kubenswrapper[4794]: E0310 12:31:50.475398 4794 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.65:51952->38.102.83.65:40829: write tcp 38.102.83.65:51952->38.102.83.65:40829: write: broken pipe Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.151739 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552432-md94r"] Mar 10 12:32:00 crc kubenswrapper[4794]: E0310 12:32:00.152727 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" containerName="registry-server" Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.152741 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" containerName="registry-server" Mar 10 12:32:00 crc kubenswrapper[4794]: E0310 12:32:00.152758 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" containerName="extract-content" Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.152764 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" containerName="extract-content" Mar 10 12:32:00 crc kubenswrapper[4794]: E0310 12:32:00.152793 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" containerName="extract-utilities" Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.152800 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" containerName="extract-utilities" Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.153017 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0ac05f-6749-4d49-b3ac-4d4e56ddbea3" containerName="registry-server" Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.153826 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552432-md94r" Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.155601 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.156009 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.156229 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.163869 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552432-md94r"] Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.172592 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz257\" (UniqueName: \"kubernetes.io/projected/87efc794-55e3-4e68-a08c-07577cb32196-kube-api-access-cz257\") pod \"auto-csr-approver-29552432-md94r\" (UID: \"87efc794-55e3-4e68-a08c-07577cb32196\") " pod="openshift-infra/auto-csr-approver-29552432-md94r" Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.274409 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz257\" (UniqueName: \"kubernetes.io/projected/87efc794-55e3-4e68-a08c-07577cb32196-kube-api-access-cz257\") pod \"auto-csr-approver-29552432-md94r\" (UID: \"87efc794-55e3-4e68-a08c-07577cb32196\") " pod="openshift-infra/auto-csr-approver-29552432-md94r" Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.302813 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz257\" (UniqueName: \"kubernetes.io/projected/87efc794-55e3-4e68-a08c-07577cb32196-kube-api-access-cz257\") pod \"auto-csr-approver-29552432-md94r\" (UID: \"87efc794-55e3-4e68-a08c-07577cb32196\") " pod="openshift-infra/auto-csr-approver-29552432-md94r" Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.478907 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552432-md94r" Mar 10 12:32:00 crc kubenswrapper[4794]: I0310 12:32:00.980079 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552432-md94r"] Mar 10 12:32:01 crc kubenswrapper[4794]: I0310 12:32:01.109321 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552432-md94r" event={"ID":"87efc794-55e3-4e68-a08c-07577cb32196","Type":"ContainerStarted","Data":"d5169e39c3c38453b248b378c6ca27d36fd840dc0d18561d1bd0f160bbe3fc92"} Mar 10 12:32:03 crc kubenswrapper[4794]: I0310 12:32:03.139196 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552432-md94r" event={"ID":"87efc794-55e3-4e68-a08c-07577cb32196","Type":"ContainerStarted","Data":"05051da6ef953fb6c2e9dcc66ef2c032f1840564250939e3b311eef15a10a236"} Mar 10 12:32:04 crc kubenswrapper[4794]: I0310 12:32:04.164829 4794 generic.go:334] "Generic (PLEG): container finished" podID="87efc794-55e3-4e68-a08c-07577cb32196" containerID="05051da6ef953fb6c2e9dcc66ef2c032f1840564250939e3b311eef15a10a236" exitCode=0 Mar 10 12:32:04 crc kubenswrapper[4794]: I0310 12:32:04.164916 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552432-md94r" event={"ID":"87efc794-55e3-4e68-a08c-07577cb32196","Type":"ContainerDied","Data":"05051da6ef953fb6c2e9dcc66ef2c032f1840564250939e3b311eef15a10a236"} Mar 10 12:32:05 crc kubenswrapper[4794]: I0310 12:32:05.621372 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552432-md94r" Mar 10 12:32:05 crc kubenswrapper[4794]: I0310 12:32:05.696392 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz257\" (UniqueName: \"kubernetes.io/projected/87efc794-55e3-4e68-a08c-07577cb32196-kube-api-access-cz257\") pod \"87efc794-55e3-4e68-a08c-07577cb32196\" (UID: \"87efc794-55e3-4e68-a08c-07577cb32196\") " Mar 10 12:32:05 crc kubenswrapper[4794]: I0310 12:32:05.705160 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87efc794-55e3-4e68-a08c-07577cb32196-kube-api-access-cz257" (OuterVolumeSpecName: "kube-api-access-cz257") pod "87efc794-55e3-4e68-a08c-07577cb32196" (UID: "87efc794-55e3-4e68-a08c-07577cb32196"). InnerVolumeSpecName "kube-api-access-cz257". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:32:05 crc kubenswrapper[4794]: I0310 12:32:05.799957 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz257\" (UniqueName: \"kubernetes.io/projected/87efc794-55e3-4e68-a08c-07577cb32196-kube-api-access-cz257\") on node \"crc\" DevicePath \"\"" Mar 10 12:32:06 crc kubenswrapper[4794]: I0310 12:32:06.192180 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552432-md94r" event={"ID":"87efc794-55e3-4e68-a08c-07577cb32196","Type":"ContainerDied","Data":"d5169e39c3c38453b248b378c6ca27d36fd840dc0d18561d1bd0f160bbe3fc92"} Mar 10 12:32:06 crc kubenswrapper[4794]: I0310 12:32:06.192467 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5169e39c3c38453b248b378c6ca27d36fd840dc0d18561d1bd0f160bbe3fc92" Mar 10 12:32:06 crc kubenswrapper[4794]: I0310 12:32:06.192535 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552432-md94r" Mar 10 12:32:06 crc kubenswrapper[4794]: I0310 12:32:06.220746 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552426-x5rr5"] Mar 10 12:32:06 crc kubenswrapper[4794]: I0310 12:32:06.230382 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552426-x5rr5"] Mar 10 12:32:08 crc kubenswrapper[4794]: I0310 12:32:08.013242 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1297148-f11a-47c9-ac04-40f9fba0c8fe" path="/var/lib/kubelet/pods/a1297148-f11a-47c9-ac04-40f9fba0c8fe/volumes" Mar 10 12:32:56 crc kubenswrapper[4794]: I0310 12:32:56.383853 4794 scope.go:117] "RemoveContainer" containerID="b6972798a7ffe7f30f4f2e32a2789a0a3a21147f3624d1270de1408ccc9b73ed" Mar 10 12:33:15 crc kubenswrapper[4794]: I0310 12:33:15.893464 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8jfvn"] Mar 10 12:33:15 crc kubenswrapper[4794]: E0310 12:33:15.894549 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87efc794-55e3-4e68-a08c-07577cb32196" containerName="oc" Mar 10 12:33:15 crc kubenswrapper[4794]: I0310 12:33:15.894566 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="87efc794-55e3-4e68-a08c-07577cb32196" containerName="oc" Mar 10 12:33:15 crc kubenswrapper[4794]: I0310 12:33:15.894850 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="87efc794-55e3-4e68-a08c-07577cb32196" containerName="oc" Mar 10 12:33:15 crc kubenswrapper[4794]: I0310 12:33:15.896665 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:15 crc kubenswrapper[4794]: I0310 12:33:15.906079 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jfvn"] Mar 10 12:33:16 crc kubenswrapper[4794]: I0310 12:33:16.040033 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66gxw\" (UniqueName: \"kubernetes.io/projected/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-kube-api-access-66gxw\") pod \"community-operators-8jfvn\" (UID: \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\") " pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:16 crc kubenswrapper[4794]: I0310 12:33:16.040395 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-catalog-content\") pod \"community-operators-8jfvn\" (UID: \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\") " pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:16 crc kubenswrapper[4794]: I0310 12:33:16.040495 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-utilities\") pod \"community-operators-8jfvn\" (UID: \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\") " pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:16 crc kubenswrapper[4794]: I0310 12:33:16.143002 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-catalog-content\") pod \"community-operators-8jfvn\" (UID: \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\") " 
pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:16 crc kubenswrapper[4794]: I0310 12:33:16.143087 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-utilities\") pod \"community-operators-8jfvn\" (UID: \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\") " pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:16 crc kubenswrapper[4794]: I0310 12:33:16.143230 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66gxw\" (UniqueName: \"kubernetes.io/projected/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-kube-api-access-66gxw\") pod \"community-operators-8jfvn\" (UID: \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\") " pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:16 crc kubenswrapper[4794]: I0310 12:33:16.143652 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-catalog-content\") pod \"community-operators-8jfvn\" (UID: \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\") " pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:16 crc kubenswrapper[4794]: I0310 12:33:16.143668 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-utilities\") pod \"community-operators-8jfvn\" (UID: \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\") " pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:16 crc kubenswrapper[4794]: I0310 12:33:16.164179 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66gxw\" (UniqueName: \"kubernetes.io/projected/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-kube-api-access-66gxw\") pod \"community-operators-8jfvn\" (UID: \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\") " pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:16 crc kubenswrapper[4794]: I0310 12:33:16.254720 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:16 crc kubenswrapper[4794]: I0310 12:33:16.883214 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jfvn"] Mar 10 12:33:17 crc kubenswrapper[4794]: I0310 12:33:17.001676 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jfvn" event={"ID":"085ce94b-4a8f-4cf6-ae75-aedb2d65753f","Type":"ContainerStarted","Data":"49759e3d27092c26c3c8532b6df1aad12273e5730c7b37f57a9ad6a18a482477"} Mar 10 12:33:18 crc kubenswrapper[4794]: I0310 12:33:18.016685 4794 generic.go:334] "Generic (PLEG): container finished" podID="085ce94b-4a8f-4cf6-ae75-aedb2d65753f" containerID="fb481c9d239897085f4d5eb250606c5087a14ea6103d965d5e98eb069aa60b32" exitCode=0 Mar 10 12:33:18 crc kubenswrapper[4794]: I0310 12:33:18.016738 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jfvn" event={"ID":"085ce94b-4a8f-4cf6-ae75-aedb2d65753f","Type":"ContainerDied","Data":"fb481c9d239897085f4d5eb250606c5087a14ea6103d965d5e98eb069aa60b32"} Mar 10 12:33:20 crc kubenswrapper[4794]: I0310 12:33:20.042244 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jfvn" event={"ID":"085ce94b-4a8f-4cf6-ae75-aedb2d65753f","Type":"ContainerStarted","Data":"a0b33d9ff7659cd7849ce5cd88906097885239aca25bcacde818d48e595b4a15"} Mar 10 12:33:21 crc kubenswrapper[4794]: I0310 12:33:21.054620 4794 generic.go:334] "Generic (PLEG): container finished" podID="085ce94b-4a8f-4cf6-ae75-aedb2d65753f" containerID="a0b33d9ff7659cd7849ce5cd88906097885239aca25bcacde818d48e595b4a15" exitCode=0 Mar 10 12:33:21 crc kubenswrapper[4794]: I0310 12:33:21.054664 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jfvn" event={"ID":"085ce94b-4a8f-4cf6-ae75-aedb2d65753f","Type":"ContainerDied","Data":"a0b33d9ff7659cd7849ce5cd88906097885239aca25bcacde818d48e595b4a15"} Mar 10 12:33:22 crc kubenswrapper[4794]: I0310 12:33:22.074399 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jfvn" event={"ID":"085ce94b-4a8f-4cf6-ae75-aedb2d65753f","Type":"ContainerStarted","Data":"79201c64a021e5adb63f922d1039faadb91a91247b1f45819f9f01304452cc71"} Mar 10 12:33:22 crc kubenswrapper[4794]: I0310 12:33:22.098253 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8jfvn" podStartSLOduration=3.627633731 podStartE2EDuration="7.098231332s" podCreationTimestamp="2026-03-10 12:33:15 +0000 UTC" firstStartedPulling="2026-03-10 12:33:18.020483102 +0000 UTC m=+10146.776653940" lastFinishedPulling="2026-03-10 12:33:21.491080723 +0000 UTC m=+10150.247251541" observedRunningTime="2026-03-10 12:33:22.092018859 +0000 UTC m=+10150.848189687" watchObservedRunningTime="2026-03-10 12:33:22.098231332 +0000 UTC m=+10150.854402160" Mar 10 12:33:22 crc kubenswrapper[4794]: I0310 12:33:22.967991 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:33:22 crc kubenswrapper[4794]: I0310 12:33:22.968054 4794 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:33:26 crc kubenswrapper[4794]: I0310 12:33:26.255081 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:26 crc kubenswrapper[4794]: I0310 12:33:26.255682 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:26 crc kubenswrapper[4794]: I0310 12:33:26.318045 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:27 crc kubenswrapper[4794]: I0310 12:33:27.220991 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:27 crc kubenswrapper[4794]: I0310 12:33:27.285390 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8jfvn"] Mar 10 12:33:29 crc kubenswrapper[4794]: I0310 12:33:29.170057 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8jfvn" podUID="085ce94b-4a8f-4cf6-ae75-aedb2d65753f" containerName="registry-server" containerID="cri-o://79201c64a021e5adb63f922d1039faadb91a91247b1f45819f9f01304452cc71" gracePeriod=2 Mar 10 12:33:29 crc kubenswrapper[4794]: I0310 12:33:29.719889 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:29 crc kubenswrapper[4794]: I0310 12:33:29.870001 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-utilities\") pod \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\" (UID: \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\") " Mar 10 12:33:29 crc kubenswrapper[4794]: I0310 12:33:29.870392 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-catalog-content\") pod \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\" (UID: \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\") " Mar 10 12:33:29 crc kubenswrapper[4794]: I0310 12:33:29.870656 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66gxw\" (UniqueName: \"kubernetes.io/projected/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-kube-api-access-66gxw\") pod \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\" (UID: \"085ce94b-4a8f-4cf6-ae75-aedb2d65753f\") " Mar 10 12:33:29 crc kubenswrapper[4794]: I0310 12:33:29.871146 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-utilities" (OuterVolumeSpecName: "utilities") pod "085ce94b-4a8f-4cf6-ae75-aedb2d65753f" (UID: "085ce94b-4a8f-4cf6-ae75-aedb2d65753f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:33:29 crc kubenswrapper[4794]: I0310 12:33:29.871578 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 12:33:29 crc kubenswrapper[4794]: I0310 12:33:29.877490 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-kube-api-access-66gxw" (OuterVolumeSpecName: "kube-api-access-66gxw") pod "085ce94b-4a8f-4cf6-ae75-aedb2d65753f" (UID: "085ce94b-4a8f-4cf6-ae75-aedb2d65753f"). InnerVolumeSpecName "kube-api-access-66gxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:33:29 crc kubenswrapper[4794]: I0310 12:33:29.943019 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "085ce94b-4a8f-4cf6-ae75-aedb2d65753f" (UID: "085ce94b-4a8f-4cf6-ae75-aedb2d65753f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:33:29 crc kubenswrapper[4794]: I0310 12:33:29.973734 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66gxw\" (UniqueName: \"kubernetes.io/projected/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-kube-api-access-66gxw\") on node \"crc\" DevicePath \"\"" Mar 10 12:33:29 crc kubenswrapper[4794]: I0310 12:33:29.973963 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085ce94b-4a8f-4cf6-ae75-aedb2d65753f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.182888 4794 generic.go:334] "Generic (PLEG): container finished" podID="085ce94b-4a8f-4cf6-ae75-aedb2d65753f" containerID="79201c64a021e5adb63f922d1039faadb91a91247b1f45819f9f01304452cc71" exitCode=0 Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.182953 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jfvn" event={"ID":"085ce94b-4a8f-4cf6-ae75-aedb2d65753f","Type":"ContainerDied","Data":"79201c64a021e5adb63f922d1039faadb91a91247b1f45819f9f01304452cc71"} Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.183005 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jfvn" event={"ID":"085ce94b-4a8f-4cf6-ae75-aedb2d65753f","Type":"ContainerDied","Data":"49759e3d27092c26c3c8532b6df1aad12273e5730c7b37f57a9ad6a18a482477"} Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.183044 4794 scope.go:117] "RemoveContainer" containerID="79201c64a021e5adb63f922d1039faadb91a91247b1f45819f9f01304452cc71" Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.183270 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jfvn" Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.216927 4794 scope.go:117] "RemoveContainer" containerID="a0b33d9ff7659cd7849ce5cd88906097885239aca25bcacde818d48e595b4a15" Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.226108 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8jfvn"] Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.241146 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8jfvn"] Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.241157 4794 scope.go:117] "RemoveContainer" containerID="fb481c9d239897085f4d5eb250606c5087a14ea6103d965d5e98eb069aa60b32" Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.297343 4794 scope.go:117] "RemoveContainer" containerID="79201c64a021e5adb63f922d1039faadb91a91247b1f45819f9f01304452cc71" Mar 10 12:33:30 crc kubenswrapper[4794]: E0310 12:33:30.297898 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79201c64a021e5adb63f922d1039faadb91a91247b1f45819f9f01304452cc71\": container with ID starting with 79201c64a021e5adb63f922d1039faadb91a91247b1f45819f9f01304452cc71 not found: ID does not exist" containerID="79201c64a021e5adb63f922d1039faadb91a91247b1f45819f9f01304452cc71" Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.297984 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79201c64a021e5adb63f922d1039faadb91a91247b1f45819f9f01304452cc71"} err="failed to get container status \"79201c64a021e5adb63f922d1039faadb91a91247b1f45819f9f01304452cc71\": rpc error: code = NotFound desc = could not find container \"79201c64a021e5adb63f922d1039faadb91a91247b1f45819f9f01304452cc71\": container with ID starting with 79201c64a021e5adb63f922d1039faadb91a91247b1f45819f9f01304452cc71 not found: ID does not exist" Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.298018 4794 scope.go:117] "RemoveContainer" containerID="a0b33d9ff7659cd7849ce5cd88906097885239aca25bcacde818d48e595b4a15" Mar 10 12:33:30 crc kubenswrapper[4794]: E0310 12:33:30.298378 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0b33d9ff7659cd7849ce5cd88906097885239aca25bcacde818d48e595b4a15\": container with ID starting with a0b33d9ff7659cd7849ce5cd88906097885239aca25bcacde818d48e595b4a15 not found: ID does not exist" containerID="a0b33d9ff7659cd7849ce5cd88906097885239aca25bcacde818d48e595b4a15" Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.298434 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0b33d9ff7659cd7849ce5cd88906097885239aca25bcacde818d48e595b4a15"} err="failed to get container status \"a0b33d9ff7659cd7849ce5cd88906097885239aca25bcacde818d48e595b4a15\": rpc error: code = NotFound desc = could not find container \"a0b33d9ff7659cd7849ce5cd88906097885239aca25bcacde818d48e595b4a15\": container with ID starting with a0b33d9ff7659cd7849ce5cd88906097885239aca25bcacde818d48e595b4a15 not found: ID does not exist" Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.298455 4794 scope.go:117] "RemoveContainer" containerID="fb481c9d239897085f4d5eb250606c5087a14ea6103d965d5e98eb069aa60b32" Mar 10 12:33:30 crc kubenswrapper[4794]: E0310 12:33:30.298684 4794 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fb481c9d239897085f4d5eb250606c5087a14ea6103d965d5e98eb069aa60b32\": container with ID starting with fb481c9d239897085f4d5eb250606c5087a14ea6103d965d5e98eb069aa60b32 not found: ID does not exist" containerID="fb481c9d239897085f4d5eb250606c5087a14ea6103d965d5e98eb069aa60b32" Mar 10 12:33:30 crc kubenswrapper[4794]: I0310 12:33:30.298728 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb481c9d239897085f4d5eb250606c5087a14ea6103d965d5e98eb069aa60b32"} err="failed to get container status \"fb481c9d239897085f4d5eb250606c5087a14ea6103d965d5e98eb069aa60b32\": rpc error: code = NotFound desc = could not find container \"fb481c9d239897085f4d5eb250606c5087a14ea6103d965d5e98eb069aa60b32\": container with ID starting with fb481c9d239897085f4d5eb250606c5087a14ea6103d965d5e98eb069aa60b32 not found: ID does not exist" Mar 10 12:33:32 crc kubenswrapper[4794]: I0310 12:33:32.016487 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="085ce94b-4a8f-4cf6-ae75-aedb2d65753f" path="/var/lib/kubelet/pods/085ce94b-4a8f-4cf6-ae75-aedb2d65753f/volumes" Mar 10 12:33:47 crc kubenswrapper[4794]: I0310 12:33:47.383477 4794 generic.go:334] "Generic (PLEG): container finished" podID="bb3870f7-5f55-4bc7-b9d2-89e55a39db3c" containerID="98ec632d28d4582ae735bb4be261bb11501984e06604d07ae68dc771e311e2f9" exitCode=0 Mar 10 12:33:47 crc kubenswrapper[4794]: I0310 12:33:47.383613 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zvls/must-gather-v29r8" event={"ID":"bb3870f7-5f55-4bc7-b9d2-89e55a39db3c","Type":"ContainerDied","Data":"98ec632d28d4582ae735bb4be261bb11501984e06604d07ae68dc771e311e2f9"} Mar 10 12:33:47 crc kubenswrapper[4794]: I0310 12:33:47.385161 4794 scope.go:117] "RemoveContainer" containerID="98ec632d28d4582ae735bb4be261bb11501984e06604d07ae68dc771e311e2f9" Mar 10 12:33:47 crc kubenswrapper[4794]: I0310 12:33:47.509432 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9zvls_must-gather-v29r8_bb3870f7-5f55-4bc7-b9d2-89e55a39db3c/gather/0.log" Mar 10 12:33:52 crc kubenswrapper[4794]: I0310 12:33:52.968050 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:33:52 crc kubenswrapper[4794]: I0310 12:33:52.968658 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:33:55 crc kubenswrapper[4794]: I0310 12:33:55.578716 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9zvls/must-gather-v29r8"] Mar 10 12:33:55 crc kubenswrapper[4794]: I0310 12:33:55.579486 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9zvls/must-gather-v29r8" podUID="bb3870f7-5f55-4bc7-b9d2-89e55a39db3c" containerName="copy" containerID="cri-o://4e3d76e5154715389dfb45be595648f007d0bfea18b3afab2b1b01d3e1746a68" gracePeriod=2 Mar 10 12:33:55 crc kubenswrapper[4794]: I0310 12:33:55.588759 4794 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-must-gather-9zvls/must-gather-v29r8"] Mar 10 12:33:56 crc kubenswrapper[4794]: I0310 12:33:56.491684 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9zvls_must-gather-v29r8_bb3870f7-5f55-4bc7-b9d2-89e55a39db3c/copy/0.log" Mar 10 12:33:56 crc kubenswrapper[4794]: I0310 12:33:56.492994 4794 generic.go:334] "Generic (PLEG): container finished" podID="bb3870f7-5f55-4bc7-b9d2-89e55a39db3c" containerID="4e3d76e5154715389dfb45be595648f007d0bfea18b3afab2b1b01d3e1746a68" exitCode=143 Mar 10 12:33:56 crc kubenswrapper[4794]: I0310 12:33:56.842491 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9zvls_must-gather-v29r8_bb3870f7-5f55-4bc7-b9d2-89e55a39db3c/copy/0.log" Mar 10 12:33:56 crc kubenswrapper[4794]: I0310 12:33:56.843179 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zvls/must-gather-v29r8" Mar 10 12:33:56 crc kubenswrapper[4794]: I0310 12:33:56.962747 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb3870f7-5f55-4bc7-b9d2-89e55a39db3c-must-gather-output\") pod \"bb3870f7-5f55-4bc7-b9d2-89e55a39db3c\" (UID: \"bb3870f7-5f55-4bc7-b9d2-89e55a39db3c\") " Mar 10 12:33:56 crc kubenswrapper[4794]: I0310 12:33:56.970614 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5blf6\" (UniqueName: \"kubernetes.io/projected/bb3870f7-5f55-4bc7-b9d2-89e55a39db3c-kube-api-access-5blf6\") pod \"bb3870f7-5f55-4bc7-b9d2-89e55a39db3c\" (UID: \"bb3870f7-5f55-4bc7-b9d2-89e55a39db3c\") " Mar 10 12:33:56 crc kubenswrapper[4794]: I0310 12:33:56.990758 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3870f7-5f55-4bc7-b9d2-89e55a39db3c-kube-api-access-5blf6" (OuterVolumeSpecName: "kube-api-access-5blf6") pod "bb3870f7-5f55-4bc7-b9d2-89e55a39db3c" (UID: "bb3870f7-5f55-4bc7-b9d2-89e55a39db3c"). InnerVolumeSpecName "kube-api-access-5blf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:33:57 crc kubenswrapper[4794]: I0310 12:33:57.073058 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5blf6\" (UniqueName: \"kubernetes.io/projected/bb3870f7-5f55-4bc7-b9d2-89e55a39db3c-kube-api-access-5blf6\") on node \"crc\" DevicePath \"\"" Mar 10 12:33:57 crc kubenswrapper[4794]: I0310 12:33:57.165543 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb3870f7-5f55-4bc7-b9d2-89e55a39db3c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "bb3870f7-5f55-4bc7-b9d2-89e55a39db3c" (UID: "bb3870f7-5f55-4bc7-b9d2-89e55a39db3c"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:33:57 crc kubenswrapper[4794]: I0310 12:33:57.175413 4794 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb3870f7-5f55-4bc7-b9d2-89e55a39db3c-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 10 12:33:57 crc kubenswrapper[4794]: I0310 12:33:57.512633 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9zvls_must-gather-v29r8_bb3870f7-5f55-4bc7-b9d2-89e55a39db3c/copy/0.log" Mar 10 12:33:57 crc kubenswrapper[4794]: I0310 12:33:57.513216 4794 scope.go:117] "RemoveContainer" containerID="4e3d76e5154715389dfb45be595648f007d0bfea18b3afab2b1b01d3e1746a68" Mar 10 12:33:57 crc kubenswrapper[4794]: I0310 12:33:57.513375 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zvls/must-gather-v29r8" Mar 10 12:33:57 crc kubenswrapper[4794]: I0310 12:33:57.546197 4794 scope.go:117] "RemoveContainer" containerID="98ec632d28d4582ae735bb4be261bb11501984e06604d07ae68dc771e311e2f9" Mar 10 12:33:58 crc kubenswrapper[4794]: I0310 12:33:58.013983 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb3870f7-5f55-4bc7-b9d2-89e55a39db3c" path="/var/lib/kubelet/pods/bb3870f7-5f55-4bc7-b9d2-89e55a39db3c/volumes" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.151380 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552434-9df6r"] Mar 10 12:34:00 crc kubenswrapper[4794]: E0310 12:34:00.152320 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085ce94b-4a8f-4cf6-ae75-aedb2d65753f" containerName="registry-server" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.152356 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="085ce94b-4a8f-4cf6-ae75-aedb2d65753f" containerName="registry-server" Mar 10 12:34:00 crc kubenswrapper[4794]: E0310 12:34:00.152380 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085ce94b-4a8f-4cf6-ae75-aedb2d65753f" containerName="extract-content" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.152388 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="085ce94b-4a8f-4cf6-ae75-aedb2d65753f" containerName="extract-content" Mar 10 12:34:00 crc kubenswrapper[4794]: E0310 12:34:00.152416 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3870f7-5f55-4bc7-b9d2-89e55a39db3c" containerName="copy" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.152424 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3870f7-5f55-4bc7-b9d2-89e55a39db3c" containerName="copy" Mar 10 12:34:00 crc kubenswrapper[4794]: E0310 12:34:00.152452 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085ce94b-4a8f-4cf6-ae75-aedb2d65753f" containerName="extract-utilities" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.152460 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="085ce94b-4a8f-4cf6-ae75-aedb2d65753f" containerName="extract-utilities" Mar 10 12:34:00 crc kubenswrapper[4794]: E0310 12:34:00.152473 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3870f7-5f55-4bc7-b9d2-89e55a39db3c" containerName="gather" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.152480 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3870f7-5f55-4bc7-b9d2-89e55a39db3c" containerName="gather" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.152750 4794 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3870f7-5f55-4bc7-b9d2-89e55a39db3c" containerName="gather" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.152778 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="085ce94b-4a8f-4cf6-ae75-aedb2d65753f" containerName="registry-server" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.152808 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3870f7-5f55-4bc7-b9d2-89e55a39db3c" containerName="copy" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.153807 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552434-9df6r" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.155922 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.156371 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.156888 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.171979 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552434-9df6r"] Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.246528 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrn5p\" (UniqueName: \"kubernetes.io/projected/aebb4aff-2d26-4aa9-8afc-04bab275a150-kube-api-access-nrn5p\") pod \"auto-csr-approver-29552434-9df6r\" (UID: \"aebb4aff-2d26-4aa9-8afc-04bab275a150\") " pod="openshift-infra/auto-csr-approver-29552434-9df6r" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.349165 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrn5p\" (UniqueName: \"kubernetes.io/projected/aebb4aff-2d26-4aa9-8afc-04bab275a150-kube-api-access-nrn5p\") pod \"auto-csr-approver-29552434-9df6r\" (UID: \"aebb4aff-2d26-4aa9-8afc-04bab275a150\") " pod="openshift-infra/auto-csr-approver-29552434-9df6r" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.368798 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrn5p\" (UniqueName: \"kubernetes.io/projected/aebb4aff-2d26-4aa9-8afc-04bab275a150-kube-api-access-nrn5p\") pod \"auto-csr-approver-29552434-9df6r\" (UID: \"aebb4aff-2d26-4aa9-8afc-04bab275a150\") " pod="openshift-infra/auto-csr-approver-29552434-9df6r" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.481444 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552434-9df6r" Mar 10 12:34:00 crc kubenswrapper[4794]: I0310 12:34:00.970006 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552434-9df6r"] Mar 10 12:34:01 crc kubenswrapper[4794]: I0310 12:34:01.551495 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552434-9df6r" event={"ID":"aebb4aff-2d26-4aa9-8afc-04bab275a150","Type":"ContainerStarted","Data":"28ecd8892294ac8413fbf12a90ed31c5bf6741a84bc2d9c4fefa27ad8a4cd9b2"} Mar 10 12:34:03 crc kubenswrapper[4794]: I0310 12:34:03.570083 4794 generic.go:334] "Generic (PLEG): container finished" podID="aebb4aff-2d26-4aa9-8afc-04bab275a150" containerID="8764242b513cc8dc39c94faa867251a6adc5f3b734f031f0a6e4ffd90aa5405c" exitCode=0 Mar 10 12:34:03 crc kubenswrapper[4794]: I0310 12:34:03.570169 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552434-9df6r" event={"ID":"aebb4aff-2d26-4aa9-8afc-04bab275a150","Type":"ContainerDied","Data":"8764242b513cc8dc39c94faa867251a6adc5f3b734f031f0a6e4ffd90aa5405c"} Mar 10 12:34:04 crc kubenswrapper[4794]: I0310 12:34:04.992905 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552434-9df6r" Mar 10 12:34:05 crc kubenswrapper[4794]: I0310 12:34:05.150064 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrn5p\" (UniqueName: \"kubernetes.io/projected/aebb4aff-2d26-4aa9-8afc-04bab275a150-kube-api-access-nrn5p\") pod \"aebb4aff-2d26-4aa9-8afc-04bab275a150\" (UID: \"aebb4aff-2d26-4aa9-8afc-04bab275a150\") " Mar 10 12:34:05 crc kubenswrapper[4794]: I0310 12:34:05.155531 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aebb4aff-2d26-4aa9-8afc-04bab275a150-kube-api-access-nrn5p" (OuterVolumeSpecName: "kube-api-access-nrn5p") pod "aebb4aff-2d26-4aa9-8afc-04bab275a150" (UID: "aebb4aff-2d26-4aa9-8afc-04bab275a150"). InnerVolumeSpecName "kube-api-access-nrn5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:34:05 crc kubenswrapper[4794]: I0310 12:34:05.252040 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrn5p\" (UniqueName: \"kubernetes.io/projected/aebb4aff-2d26-4aa9-8afc-04bab275a150-kube-api-access-nrn5p\") on node \"crc\" DevicePath \"\"" Mar 10 12:34:05 crc kubenswrapper[4794]: I0310 12:34:05.588993 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552434-9df6r" event={"ID":"aebb4aff-2d26-4aa9-8afc-04bab275a150","Type":"ContainerDied","Data":"28ecd8892294ac8413fbf12a90ed31c5bf6741a84bc2d9c4fefa27ad8a4cd9b2"} Mar 10 12:34:05 crc kubenswrapper[4794]: I0310 12:34:05.589030 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28ecd8892294ac8413fbf12a90ed31c5bf6741a84bc2d9c4fefa27ad8a4cd9b2" Mar 10 12:34:05 crc kubenswrapper[4794]: I0310 12:34:05.589317 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552434-9df6r" Mar 10 12:34:06 crc kubenswrapper[4794]: I0310 12:34:06.051657 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552428-gvn7p"] Mar 10 12:34:06 crc kubenswrapper[4794]: I0310 12:34:06.061171 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552428-gvn7p"] Mar 10 12:34:08 crc kubenswrapper[4794]: I0310 12:34:08.018549 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8b35de4-fab9-46c5-a7f0-71ccf3c8309d" path="/var/lib/kubelet/pods/c8b35de4-fab9-46c5-a7f0-71ccf3c8309d/volumes" Mar 10 12:34:22 crc kubenswrapper[4794]: I0310 12:34:22.967161 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:34:22 crc kubenswrapper[4794]: I0310 12:34:22.967748 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:34:22 crc kubenswrapper[4794]: I0310 12:34:22.967813 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 12:34:22 crc kubenswrapper[4794]: I0310 12:34:22.968830 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"650642fca6fd05c6d1ce8b4bb58bc6d6d784b308ef3ef228d986be779c79b82d"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 12:34:22 crc kubenswrapper[4794]: I0310 12:34:22.968881 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://650642fca6fd05c6d1ce8b4bb58bc6d6d784b308ef3ef228d986be779c79b82d" gracePeriod=600 Mar 10 12:34:23 crc kubenswrapper[4794]: I0310 12:34:23.780635 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="650642fca6fd05c6d1ce8b4bb58bc6d6d784b308ef3ef228d986be779c79b82d" exitCode=0 Mar 10 12:34:23 crc kubenswrapper[4794]: I0310 12:34:23.780712 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"650642fca6fd05c6d1ce8b4bb58bc6d6d784b308ef3ef228d986be779c79b82d"} Mar 10 12:34:23 crc kubenswrapper[4794]: I0310 12:34:23.781198 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerStarted","Data":"5810f383c79e26ca37e093d22aadaab4bdd2ca7173396934e04cd19f1f76809c"} Mar 10 12:34:23 crc kubenswrapper[4794]: I0310 12:34:23.781228 4794 scope.go:117] "RemoveContainer" 
containerID="713a7f9d6f1bfd1d1e307cff8db8bc49857bd1b38078cd590d6dd88f1ae69a2e" Mar 10 12:34:56 crc kubenswrapper[4794]: I0310 12:34:56.495269 4794 scope.go:117] "RemoveContainer" containerID="f6440904606d9aebf7d741832031499be56f2f4d4e99cd8c68eebdc26c4944c1" Mar 10 12:35:11 crc kubenswrapper[4794]: I0310 12:35:11.927432 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5p9sk"] Mar 10 12:35:11 crc kubenswrapper[4794]: E0310 12:35:11.928442 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aebb4aff-2d26-4aa9-8afc-04bab275a150" containerName="oc" Mar 10 12:35:11 crc kubenswrapper[4794]: I0310 12:35:11.928458 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="aebb4aff-2d26-4aa9-8afc-04bab275a150" containerName="oc" Mar 10 12:35:11 crc kubenswrapper[4794]: I0310 12:35:11.928670 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="aebb4aff-2d26-4aa9-8afc-04bab275a150" containerName="oc" Mar 10 12:35:11 crc kubenswrapper[4794]: I0310 12:35:11.932572 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:11 crc kubenswrapper[4794]: I0310 12:35:11.959690 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5p9sk"] Mar 10 12:35:12 crc kubenswrapper[4794]: I0310 12:35:12.009539 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngnrc\" (UniqueName: \"kubernetes.io/projected/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-kube-api-access-ngnrc\") pod \"certified-operators-5p9sk\" (UID: \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\") " pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:12 crc kubenswrapper[4794]: I0310 12:35:12.009622 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-catalog-content\") pod \"certified-operators-5p9sk\" (UID: \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\") " pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:12 crc kubenswrapper[4794]: I0310 12:35:12.009672 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-utilities\") pod \"certified-operators-5p9sk\" (UID: \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\") " pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:12 crc kubenswrapper[4794]: I0310 12:35:12.111486 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngnrc\" (UniqueName: \"kubernetes.io/projected/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-kube-api-access-ngnrc\") pod \"certified-operators-5p9sk\" (UID: \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\") " pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:12 crc kubenswrapper[4794]: I0310 12:35:12.111819 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-catalog-content\") pod \"certified-operators-5p9sk\" (UID: \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\") " pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:12 crc kubenswrapper[4794]: I0310 12:35:12.111977 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-utilities\") pod \"certified-operators-5p9sk\" (UID: \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\") " pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:12 crc kubenswrapper[4794]: I0310 12:35:12.112611 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-utilities\") pod \"certified-operators-5p9sk\" (UID: \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\") " pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:12 crc kubenswrapper[4794]: I0310 12:35:12.113580 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-catalog-content\") pod \"certified-operators-5p9sk\" (UID: \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\") " pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:12 crc kubenswrapper[4794]: I0310 12:35:12.133119 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngnrc\" (UniqueName: \"kubernetes.io/projected/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-kube-api-access-ngnrc\") pod \"certified-operators-5p9sk\" (UID: \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\") " pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:12 crc kubenswrapper[4794]: I0310 12:35:12.260233 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:12 crc kubenswrapper[4794]: I0310 12:35:12.755533 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5p9sk"] Mar 10 12:35:13 crc kubenswrapper[4794]: I0310 12:35:13.341109 4794 generic.go:334] "Generic (PLEG): container finished" podID="9f8a6b64-93dc-4146-84d9-a15400d3dfd3" containerID="dad4a9a884c9688c0ea52ede62b957751d167080cbb58edf4b3f9c0e0d0b12dc" exitCode=0 Mar 10 12:35:13 crc kubenswrapper[4794]: I0310 12:35:13.341154 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5p9sk" event={"ID":"9f8a6b64-93dc-4146-84d9-a15400d3dfd3","Type":"ContainerDied","Data":"dad4a9a884c9688c0ea52ede62b957751d167080cbb58edf4b3f9c0e0d0b12dc"} Mar 10 12:35:13 crc kubenswrapper[4794]: I0310 12:35:13.342373 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5p9sk" event={"ID":"9f8a6b64-93dc-4146-84d9-a15400d3dfd3","Type":"ContainerStarted","Data":"73319e24fdb37791939556c6fc324a83ba644544ba191731243f37d685a134da"} Mar 10 12:35:14 crc kubenswrapper[4794]: I0310 12:35:14.353226 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5p9sk" event={"ID":"9f8a6b64-93dc-4146-84d9-a15400d3dfd3","Type":"ContainerStarted","Data":"c5805092535a4ed64a6241987fddb37e633d7a989161b56071cefe489f124634"} Mar 10 12:35:16 crc kubenswrapper[4794]: I0310 12:35:16.381067 4794 generic.go:334] "Generic (PLEG): container finished" podID="9f8a6b64-93dc-4146-84d9-a15400d3dfd3" containerID="c5805092535a4ed64a6241987fddb37e633d7a989161b56071cefe489f124634" exitCode=0 Mar 10 12:35:16 crc kubenswrapper[4794]: I0310 12:35:16.381147 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5p9sk" 
event={"ID":"9f8a6b64-93dc-4146-84d9-a15400d3dfd3","Type":"ContainerDied","Data":"c5805092535a4ed64a6241987fddb37e633d7a989161b56071cefe489f124634"} Mar 10 12:35:17 crc kubenswrapper[4794]: I0310 12:35:17.394006 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5p9sk" event={"ID":"9f8a6b64-93dc-4146-84d9-a15400d3dfd3","Type":"ContainerStarted","Data":"0fc9067cd6ae02524ac2f14bda216c509fcdf78fdce191c6318341288e615bcb"} Mar 10 12:35:17 crc kubenswrapper[4794]: I0310 12:35:17.422639 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5p9sk" podStartSLOduration=2.953124408 podStartE2EDuration="6.422622245s" podCreationTimestamp="2026-03-10 12:35:11 +0000 UTC" firstStartedPulling="2026-03-10 12:35:13.343396138 +0000 UTC m=+10262.099566956" lastFinishedPulling="2026-03-10 12:35:16.812893975 +0000 UTC m=+10265.569064793" observedRunningTime="2026-03-10 12:35:17.415579087 +0000 UTC m=+10266.171749905" watchObservedRunningTime="2026-03-10 12:35:17.422622245 +0000 UTC m=+10266.178793063" Mar 10 12:35:22 crc kubenswrapper[4794]: I0310 12:35:22.260796 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:22 crc kubenswrapper[4794]: I0310 12:35:22.261222 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:23 crc kubenswrapper[4794]: I0310 12:35:23.027670 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:23 crc kubenswrapper[4794]: I0310 12:35:23.083968 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:23 crc kubenswrapper[4794]: I0310 12:35:23.274903 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5p9sk"] Mar 10 12:35:24 crc kubenswrapper[4794]: I0310 12:35:24.477723 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5p9sk" podUID="9f8a6b64-93dc-4146-84d9-a15400d3dfd3" containerName="registry-server" containerID="cri-o://0fc9067cd6ae02524ac2f14bda216c509fcdf78fdce191c6318341288e615bcb" gracePeriod=2 Mar 10 12:35:25 crc kubenswrapper[4794]: I0310 12:35:25.490932 4794 generic.go:334] "Generic (PLEG): container finished" podID="9f8a6b64-93dc-4146-84d9-a15400d3dfd3" containerID="0fc9067cd6ae02524ac2f14bda216c509fcdf78fdce191c6318341288e615bcb" exitCode=0 Mar 10 12:35:25 crc kubenswrapper[4794]: I0310 12:35:25.491048 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5p9sk" event={"ID":"9f8a6b64-93dc-4146-84d9-a15400d3dfd3","Type":"ContainerDied","Data":"0fc9067cd6ae02524ac2f14bda216c509fcdf78fdce191c6318341288e615bcb"} Mar 10 12:35:25 crc kubenswrapper[4794]: I0310 12:35:25.575063 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:25 crc kubenswrapper[4794]: I0310 12:35:25.709738 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-catalog-content\") pod \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\" (UID: \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\") " Mar 10 12:35:25 crc kubenswrapper[4794]: I0310 12:35:25.709801 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-utilities\") pod \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\" (UID: \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\") " Mar 10 12:35:25 crc kubenswrapper[4794]: I0310 12:35:25.710069 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngnrc\" (UniqueName: \"kubernetes.io/projected/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-kube-api-access-ngnrc\") pod \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\" (UID: \"9f8a6b64-93dc-4146-84d9-a15400d3dfd3\") " Mar 10 12:35:25 crc kubenswrapper[4794]: I0310 12:35:25.710810 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-utilities" (OuterVolumeSpecName: "utilities") pod "9f8a6b64-93dc-4146-84d9-a15400d3dfd3" (UID: "9f8a6b64-93dc-4146-84d9-a15400d3dfd3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:35:25 crc kubenswrapper[4794]: I0310 12:35:25.715073 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-kube-api-access-ngnrc" (OuterVolumeSpecName: "kube-api-access-ngnrc") pod "9f8a6b64-93dc-4146-84d9-a15400d3dfd3" (UID: "9f8a6b64-93dc-4146-84d9-a15400d3dfd3"). InnerVolumeSpecName "kube-api-access-ngnrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:35:25 crc kubenswrapper[4794]: I0310 12:35:25.773808 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f8a6b64-93dc-4146-84d9-a15400d3dfd3" (UID: "9f8a6b64-93dc-4146-84d9-a15400d3dfd3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 12:35:25 crc kubenswrapper[4794]: I0310 12:35:25.812566 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngnrc\" (UniqueName: \"kubernetes.io/projected/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-kube-api-access-ngnrc\") on node \"crc\" DevicePath \"\"" Mar 10 12:35:25 crc kubenswrapper[4794]: I0310 12:35:25.812619 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 12:35:25 crc kubenswrapper[4794]: I0310 12:35:25.812629 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f8a6b64-93dc-4146-84d9-a15400d3dfd3-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 12:35:26 crc kubenswrapper[4794]: I0310 12:35:26.503542 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5p9sk" event={"ID":"9f8a6b64-93dc-4146-84d9-a15400d3dfd3","Type":"ContainerDied","Data":"73319e24fdb37791939556c6fc324a83ba644544ba191731243f37d685a134da"} Mar 10 12:35:26 crc kubenswrapper[4794]: I0310 12:35:26.503601 4794 scope.go:117] "RemoveContainer" containerID="0fc9067cd6ae02524ac2f14bda216c509fcdf78fdce191c6318341288e615bcb" Mar 10 12:35:26 crc kubenswrapper[4794]: I0310 12:35:26.503617 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5p9sk" Mar 10 12:35:26 crc kubenswrapper[4794]: I0310 12:35:26.534522 4794 scope.go:117] "RemoveContainer" containerID="c5805092535a4ed64a6241987fddb37e633d7a989161b56071cefe489f124634" Mar 10 12:35:26 crc kubenswrapper[4794]: I0310 12:35:26.537398 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5p9sk"] Mar 10 12:35:26 crc kubenswrapper[4794]: I0310 12:35:26.551407 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5p9sk"] Mar 10 12:35:26 crc kubenswrapper[4794]: I0310 12:35:26.554553 4794 scope.go:117] "RemoveContainer" containerID="dad4a9a884c9688c0ea52ede62b957751d167080cbb58edf4b3f9c0e0d0b12dc" Mar 10 12:35:28 crc kubenswrapper[4794]: I0310 12:35:28.012628 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f8a6b64-93dc-4146-84d9-a15400d3dfd3" path="/var/lib/kubelet/pods/9f8a6b64-93dc-4146-84d9-a15400d3dfd3/volumes" Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.154377 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552436-vjhkb"] Mar 10 12:36:00 crc kubenswrapper[4794]: E0310 12:36:00.155964 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f8a6b64-93dc-4146-84d9-a15400d3dfd3" containerName="registry-server" Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.155986 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8a6b64-93dc-4146-84d9-a15400d3dfd3" containerName="registry-server" Mar 10 12:36:00 crc kubenswrapper[4794]: E0310 12:36:00.156039 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f8a6b64-93dc-4146-84d9-a15400d3dfd3" containerName="extract-content" Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.156048 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8a6b64-93dc-4146-84d9-a15400d3dfd3" containerName="extract-content" Mar 10 12:36:00 crc kubenswrapper[4794]: E0310 12:36:00.156066 4794 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f8a6b64-93dc-4146-84d9-a15400d3dfd3" containerName="extract-utilities" Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.156074 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8a6b64-93dc-4146-84d9-a15400d3dfd3" containerName="extract-utilities" Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.156348 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f8a6b64-93dc-4146-84d9-a15400d3dfd3" containerName="registry-server" Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.157605 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552436-vjhkb" Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.160897 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.161383 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.161489 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.187999 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552436-vjhkb"] Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.258852 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cgkx\" (UniqueName: \"kubernetes.io/projected/13d016f2-6433-499f-978e-53e7b5f53c5f-kube-api-access-2cgkx\") pod \"auto-csr-approver-29552436-vjhkb\" (UID: \"13d016f2-6433-499f-978e-53e7b5f53c5f\") " pod="openshift-infra/auto-csr-approver-29552436-vjhkb" Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.361664 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cgkx\" (UniqueName: \"kubernetes.io/projected/13d016f2-6433-499f-978e-53e7b5f53c5f-kube-api-access-2cgkx\") pod \"auto-csr-approver-29552436-vjhkb\" (UID: \"13d016f2-6433-499f-978e-53e7b5f53c5f\") " pod="openshift-infra/auto-csr-approver-29552436-vjhkb" Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.382968 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cgkx\" (UniqueName: \"kubernetes.io/projected/13d016f2-6433-499f-978e-53e7b5f53c5f-kube-api-access-2cgkx\") pod \"auto-csr-approver-29552436-vjhkb\" (UID: \"13d016f2-6433-499f-978e-53e7b5f53c5f\") " pod="openshift-infra/auto-csr-approver-29552436-vjhkb" Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.485766 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552436-vjhkb" Mar 10 12:36:00 crc kubenswrapper[4794]: I0310 12:36:00.956517 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552436-vjhkb"] Mar 10 12:36:01 crc kubenswrapper[4794]: I0310 12:36:01.889088 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552436-vjhkb" event={"ID":"13d016f2-6433-499f-978e-53e7b5f53c5f","Type":"ContainerStarted","Data":"69f0a164fa8e5e4908bcd786de9c3d75a303349ac20f12baba35c39faae295a7"} Mar 10 12:36:02 crc kubenswrapper[4794]: E0310 12:36:02.410415 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13d016f2_6433_499f_978e_53e7b5f53c5f.slice/crio-29231456ee722f7a5f6045256b129c675d996f98cd492d99bf80231e0b673382.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13d016f2_6433_499f_978e_53e7b5f53c5f.slice/crio-conmon-29231456ee722f7a5f6045256b129c675d996f98cd492d99bf80231e0b673382.scope\": RecentStats: unable to find data in memory cache]" Mar 10 12:36:02 crc kubenswrapper[4794]: I0310 12:36:02.902661 4794 generic.go:334] "Generic (PLEG): container finished" podID="13d016f2-6433-499f-978e-53e7b5f53c5f" containerID="29231456ee722f7a5f6045256b129c675d996f98cd492d99bf80231e0b673382" exitCode=0 Mar 10 12:36:02 crc kubenswrapper[4794]: I0310 12:36:02.902737 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552436-vjhkb" event={"ID":"13d016f2-6433-499f-978e-53e7b5f53c5f","Type":"ContainerDied","Data":"29231456ee722f7a5f6045256b129c675d996f98cd492d99bf80231e0b673382"} Mar 10 12:36:04 crc kubenswrapper[4794]: I0310 12:36:04.320850 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552436-vjhkb" Mar 10 12:36:04 crc kubenswrapper[4794]: I0310 12:36:04.479118 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cgkx\" (UniqueName: \"kubernetes.io/projected/13d016f2-6433-499f-978e-53e7b5f53c5f-kube-api-access-2cgkx\") pod \"13d016f2-6433-499f-978e-53e7b5f53c5f\" (UID: \"13d016f2-6433-499f-978e-53e7b5f53c5f\") " Mar 10 12:36:04 crc kubenswrapper[4794]: I0310 12:36:04.778279 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13d016f2-6433-499f-978e-53e7b5f53c5f-kube-api-access-2cgkx" (OuterVolumeSpecName: "kube-api-access-2cgkx") pod "13d016f2-6433-499f-978e-53e7b5f53c5f" (UID: "13d016f2-6433-499f-978e-53e7b5f53c5f"). InnerVolumeSpecName "kube-api-access-2cgkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 12:36:04 crc kubenswrapper[4794]: I0310 12:36:04.786326 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cgkx\" (UniqueName: \"kubernetes.io/projected/13d016f2-6433-499f-978e-53e7b5f53c5f-kube-api-access-2cgkx\") on node \"crc\" DevicePath \"\"" Mar 10 12:36:04 crc kubenswrapper[4794]: I0310 12:36:04.925472 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552436-vjhkb" event={"ID":"13d016f2-6433-499f-978e-53e7b5f53c5f","Type":"ContainerDied","Data":"69f0a164fa8e5e4908bcd786de9c3d75a303349ac20f12baba35c39faae295a7"} Mar 10 12:36:04 crc kubenswrapper[4794]: I0310 12:36:04.925512 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69f0a164fa8e5e4908bcd786de9c3d75a303349ac20f12baba35c39faae295a7" Mar 10 12:36:04 crc kubenswrapper[4794]: I0310 12:36:04.925516 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552436-vjhkb" Mar 10 12:36:05 crc kubenswrapper[4794]: I0310 12:36:05.403925 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552430-4xpf9"] Mar 10 12:36:05 crc kubenswrapper[4794]: I0310 12:36:05.413375 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552430-4xpf9"] Mar 10 12:36:06 crc kubenswrapper[4794]: I0310 12:36:06.012018 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fc05efc-78ab-4e02-8f24-cee5ee379ae9" path="/var/lib/kubelet/pods/4fc05efc-78ab-4e02-8f24-cee5ee379ae9/volumes" Mar 10 12:36:52 crc kubenswrapper[4794]: I0310 12:36:52.968687 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:36:52 crc kubenswrapper[4794]: I0310 12:36:52.969234 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:36:56 crc kubenswrapper[4794]: I0310 12:36:56.650361 4794 scope.go:117] "RemoveContainer" containerID="38cf60c102d6c688576922fb452c9c55b266883e283b4aa1617b025b9b7af8ed" Mar 10 12:37:22 crc kubenswrapper[4794]: I0310 12:37:22.967386 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:37:22 crc kubenswrapper[4794]: I0310 12:37:22.968026 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:37:52 crc kubenswrapper[4794]: I0310 12:37:52.967121 4794 patch_prober.go:28] interesting pod/machine-config-daemon-69278 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 12:37:52 crc kubenswrapper[4794]: I0310 12:37:52.967579 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 12:37:52 crc kubenswrapper[4794]: I0310 12:37:52.967634 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-69278" Mar 10 12:37:52 crc kubenswrapper[4794]: I0310 12:37:52.968751 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5810f383c79e26ca37e093d22aadaab4bdd2ca7173396934e04cd19f1f76809c"} pod="openshift-machine-config-operator/machine-config-daemon-69278" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 12:37:52 crc kubenswrapper[4794]: I0310 12:37:52.968831 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" containerName="machine-config-daemon" containerID="cri-o://5810f383c79e26ca37e093d22aadaab4bdd2ca7173396934e04cd19f1f76809c" gracePeriod=600 Mar 10 12:37:53 crc kubenswrapper[4794]: E0310 12:37:53.103120 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" Mar 10 12:37:54 crc kubenswrapper[4794]: I0310 12:37:54.082019 4794 generic.go:334] "Generic (PLEG): container finished" podID="071a79a8-a892-4d38-a255-2a19483b64aa" containerID="5810f383c79e26ca37e093d22aadaab4bdd2ca7173396934e04cd19f1f76809c" exitCode=0 Mar 10 12:37:54 crc kubenswrapper[4794]: I0310 12:37:54.082075 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-69278" event={"ID":"071a79a8-a892-4d38-a255-2a19483b64aa","Type":"ContainerDied","Data":"5810f383c79e26ca37e093d22aadaab4bdd2ca7173396934e04cd19f1f76809c"} Mar 10 12:37:54 crc kubenswrapper[4794]: I0310 12:37:54.083176 4794 scope.go:117] "RemoveContainer" containerID="650642fca6fd05c6d1ce8b4bb58bc6d6d784b308ef3ef228d986be779c79b82d" Mar 10 12:37:54 crc kubenswrapper[4794]: I0310 12:37:54.084028 4794 scope.go:117] "RemoveContainer" containerID="5810f383c79e26ca37e093d22aadaab4bdd2ca7173396934e04cd19f1f76809c" Mar 10 12:37:54 crc kubenswrapper[4794]: E0310 12:37:54.084437 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-69278_openshift-machine-config-operator(071a79a8-a892-4d38-a255-2a19483b64aa)\"" pod="openshift-machine-config-operator/machine-config-daemon-69278" podUID="071a79a8-a892-4d38-a255-2a19483b64aa" 
Mar 10 12:38:00 crc kubenswrapper[4794]: I0310 12:38:00.149061 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552438-vhr2r"] Mar 10 12:38:00 crc kubenswrapper[4794]: E0310 12:38:00.150078 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13d016f2-6433-499f-978e-53e7b5f53c5f" containerName="oc" Mar 10 12:38:00 crc kubenswrapper[4794]: I0310 12:38:00.150093 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d016f2-6433-499f-978e-53e7b5f53c5f" containerName="oc" Mar 10 12:38:00 crc kubenswrapper[4794]: I0310 12:38:00.150413 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="13d016f2-6433-499f-978e-53e7b5f53c5f" containerName="oc" Mar 10 12:38:00 crc kubenswrapper[4794]: I0310 12:38:00.151355 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552438-vhr2r" Mar 10 12:38:00 crc kubenswrapper[4794]: I0310 12:38:00.153115 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 12:38:00 crc kubenswrapper[4794]: I0310 12:38:00.153577 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 12:38:00 crc kubenswrapper[4794]: I0310 12:38:00.153997 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-r5tkc" Mar 10 12:38:00 crc kubenswrapper[4794]: I0310 12:38:00.169089 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552438-vhr2r"] Mar 10 12:38:00 crc kubenswrapper[4794]: I0310 12:38:00.267557 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46rq8\" (UniqueName: \"kubernetes.io/projected/98acfe8d-17f0-4338-a4cd-55f56a0019c7-kube-api-access-46rq8\") pod \"auto-csr-approver-29552438-vhr2r\" (UID: \"98acfe8d-17f0-4338-a4cd-55f56a0019c7\") " pod="openshift-infra/auto-csr-approver-29552438-vhr2r" Mar 10 12:38:00 crc kubenswrapper[4794]: I0310 12:38:00.369457 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46rq8\" (UniqueName: \"kubernetes.io/projected/98acfe8d-17f0-4338-a4cd-55f56a0019c7-kube-api-access-46rq8\") pod \"auto-csr-approver-29552438-vhr2r\" (UID: \"98acfe8d-17f0-4338-a4cd-55f56a0019c7\") " pod="openshift-infra/auto-csr-approver-29552438-vhr2r" Mar 10 12:38:00 crc kubenswrapper[4794]: I0310 12:38:00.387859 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46rq8\" (UniqueName: \"kubernetes.io/projected/98acfe8d-17f0-4338-a4cd-55f56a0019c7-kube-api-access-46rq8\") pod \"auto-csr-approver-29552438-vhr2r\" (UID: \"98acfe8d-17f0-4338-a4cd-55f56a0019c7\") " pod="openshift-infra/auto-csr-approver-29552438-vhr2r" Mar 10 12:38:00 crc kubenswrapper[4794]: I0310 12:38:00.478045 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552438-vhr2r" Mar 10 12:38:00 crc kubenswrapper[4794]: I0310 12:38:00.929275 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552438-vhr2r"] Mar 10 12:38:00 crc kubenswrapper[4794]: I0310 12:38:00.931727 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 12:38:01 crc kubenswrapper[4794]: I0310 12:38:01.160216 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552438-vhr2r" event={"ID":"98acfe8d-17f0-4338-a4cd-55f56a0019c7","Type":"ContainerStarted","Data":"bd88d9f9c2e9613572d9495e051d4f31ee1c8a03c137875d035ab68adfca9d3b"}